Code location

One of the great things about Excel/VBA apps is the ease of deployment. Just email the workbook or add-in and you’re done (assuming your customer has Excel, of course).

From a security POV that is not ideal: the code could be changed to subvert its intended use, it could get infected, and a bunch of other bad stuff might happen. I don’t recall ever seeing an example of deliberate tampering, although I have seen infections from that crappy Laroux VBA virus years ago.

From a management POV it can be a challenge too: lots of different physical copies. How do you check they are all the same when they should be? How do you update them when business requirements change?

One of the things VSTO offers is the ability to connect a workbook to separate code that lives on a server somewhere. That solves many of those security and management issues, but at the cost of possibly more difficult deployment and less certain usage (e.g. if the server is unavailable).

So my question is, do you prefer the all-in-one VBA style, or the separate components VSTO style?

And, are they both valid and useful?

Personally, I think in general I prefer the all-inclusive style, but I can see situations where a single code resource on a server would be miles better. Although if most of the logic is in the worksheet cells (as I think it should be in Excel-based solutions), what you gain by centralising only a small part may not be worthwhile. I guess this is where Excel Services steps in, but that’s a topic for another day.

I find the possibility that VSTA, in some future Excel version, may offer us proper compiled embedded code (i.e. a bit more secure than VBA, maybe faster?) quite appealing. Do you? Or are you a ‘code-should-live-on-a-server’ person?

What do you think?

cheers

Simon

15 Responses to “Code location”

  1. Dennis Wallentin Says:

    Simon,

    >>although I have seen infections from that crappy Laroux VBA virus
    >>years ago.

    Well, MSFT’s marketing department seems to have big issues with it!!

    Different situations require different solutions based on different tools.

    When Vista becomes the most used Windows platform, the security aspect will be very important. When I say very important, I primarily mean how to support Vista’s security system (and not turn off the UAC).

    Speaking about security, the digital certificate has finally become interesting. Digitally signed .NET-based solutions work better with Vista’s security compared with the unmanaged options.

    The latest beta version of VSTO (“Orcas” Beta 2) is very promising, as it simplifies deployment and now also fully supports ClickOnce. The ClickOnce technology offers a level of maintenance that no other .NET-based deployment option can provide.
    VSTO offers server capabilities that none of the other available options can compete with. So in a well-controlled environment the upcoming VSTO version may play a major role.

    Yes, I support the approach of separating code from data. However, we have done it for the last 15 years, so it’s not a new approach that VSTO provides us with.

    >>the all-in-one VBA style
    In environments where a mix of Office versions exists and it’s about workbook-level solutions, this approach is the only doable one.

    Kind regards,
    Dennis

  2. Harlan Grove Says:

    Versions. My perspective is limited to large financial services companies. The one for which I work just finished upgrading everyone to Windows XP and Office 2003 this year. Since the plan is to maintain a ‘common platform’, I’d guess the next companywide upgrade won’t happen until 2010. Given the bad press and newsgroup complaints, I can’t see Vista replacing XP faster than XP replaced NT4/2K.

    Also, and this may be restricted to large companies, most heavily used apps here are being moved to Terminal Server, making deployment a nonissue. Server-based software implies a server infrastructure, and that implies a certain size of business threshold.

  3. Jon Peltier Says:

    Charles Williams of Decision Models (http://www.decisionmodels.com/downloads.htm#addload) has an add-in loader which

    “is designed to solve some of the problems of maintaining and loading Add-Ins on a network”

    I’ve fiddled with it a bit. It seems like it might be able to manage add-in versions and updates from a network.

  4. dermot Says:

    In my corporate finance environment, I’ve several times had the problem of managing multiple copies of workbooks and propagating fixes or enhancements.

    I’ve tried several approaches, including
    * a single master workbook which loads data from different workbooks/databases (this only works if you can use exactly the same workbook for all the copies you need to make)
    * a set of workbooks which “call home” to a central workbook when they start up, and that central workbook has the code and updating capabilities

    I just had an example where 27 copies were made of a template, data was loaded into each of them, and then an error was discovered. I’ve replaced all of them with a single master workbook which creates a hard-coded output file when it is run. This output file includes the input data, and it doubles as an input file, i.e. the data can be reloaded from it. This means I can load the data into my master workbook for each of my 27 models and create an output file for each. I can reload any of them later using the single master workbook, which gives me control over maintenance, as well as an audit trail by way of the output files.
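
    A minimal VBA sketch of that save/reload round trip, for illustration only and not dermot’s actual code; the sheet name (“Inputs”) and file format are assumptions:

        ' Sketch only: write a values-only output file that doubles as an input file.
        Sub SaveOutputFile(ByVal outputPath As String)
            Dim wbOut As Workbook
            Set wbOut = Workbooks.Add
            ' Copy the master's inputs as values only, so the output file
            ' carries no code and no links back to the master.
            ThisWorkbook.Worksheets("Inputs").UsedRange.Copy
            wbOut.Worksheets(1).Range("A1").PasteSpecial xlPasteValues
            wbOut.Worksheets(1).Name = "Inputs"
            Application.CutCopyMode = False
            wbOut.SaveAs outputPath, FileFormat:=xlWorkbookNormal
            wbOut.Close SaveChanges:=False
        End Sub

        ' Sketch only: pull the saved values back into the master workbook.
        Sub ReloadInputs(ByVal outputPath As String)
            Dim wbIn As Workbook
            Set wbIn = Workbooks.Open(outputPath, ReadOnly:=True)
            wbIn.Worksheets("Inputs").UsedRange.Copy
            ThisWorkbook.Worksheets("Inputs").Range("A1").PasteSpecial xlPasteValues
            Application.CutCopyMode = False
            wbIn.Close SaveChanges:=False
        End Sub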

  5. Harlan Grove Says:

    In re fixing files with a common format, I use a workbook containing a single worksheet that holds a table: worksheet name in col A, range address in col B, a 1/0 flag for whether it’s an array formula in col C, and the expected or corrected formula as text in R1C1 addressing in col D. The topmost rows contain the expected formulas and are used to check that a given file is one that needs fixing, followed by a blank row, followed by rows containing the formulas to be fixed.

    I use a macro to generate a VBA array of .XLS filenames on the user’s local and network home drives, then cycle through the array calling ExecuteExcel4Macro with external references constructed from the files’ pathnames and the top rows of the table, to check for markers that identify files needing fixing (usually I need to test no more than 3 cells). When it finds a file that needs fixing, it opens it and runs through the rest of the table, using the first two columns to set a reference to the worksheet/range that needs fixing, and the last two columns to change either that range’s .FormulaR1C1 or .FormulaArray property, then saves, closes, and moves on to the next file. (A rough sketch of this loop follows at the end of this comment.)

    I’ve had fewer problems with remote users using such workbooks than I have with them using print macros.
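
    A rough VBA sketch of the check-and-fix loop Harlan describes, for illustration only. It simplifies to a single marker row holding an expected value (ExecuteExcel4Macro can only read values, not formulas, from closed files), the sheet and table names are hypothetical rather than taken from his workbook, and error handling is omitted:

        ' Table layout assumed: col A = sheet name, col B = range address (A1 style),
        ' col C = 1 if array formula, col D = formula text in R1C1 (or, for the
        ' marker row, the expected value). Row 1 = marker, row 2 = blank, rows 3+ = fixes.
        Sub FixWorkbooks(files As Variant)
            Dim tbl As Worksheet, wb As Workbook, mrk As Range
            Dim i As Long, r As Long, lastRow As Long
            Dim marker As Variant, ref As String

            Set tbl = ThisWorkbook.Worksheets("FixTable")
            lastRow = tbl.Cells(tbl.Rows.Count, 1).End(xlUp).Row
            Set mrk = tbl.Range(tbl.Cells(1, 2).Value)   ' parse the marker cell's A1 address

            For i = LBound(files) To UBound(files)
                ' Read the marker cell from the closed file via an XLM external reference.
                ref = "'" & Left$(files(i), InStrRev(files(i), "\")) & "[" & _
                      Mid$(files(i), InStrRev(files(i), "\") + 1) & "]" & _
                      tbl.Cells(1, 1).Value & "'!R" & mrk.Row & "C" & mrk.Column
                marker = Application.ExecuteExcel4Macro(ref)

                If CStr(marker) = CStr(tbl.Cells(1, 4).Value) Then   ' file needs fixing
                    Set wb = Workbooks.Open(files(i), UpdateLinks:=0)
                    For r = 3 To lastRow
                        With wb.Worksheets(tbl.Cells(r, 1).Value).Range(tbl.Cells(r, 2).Value)
                            If tbl.Cells(r, 3).Value = 1 Then
                                .FormulaArray = tbl.Cells(r, 4).Value
                            Else
                                .FormulaR1C1 = tbl.Cells(r, 4).Value
                            End If
                        End With
                    Next r
                    wb.Close SaveChanges:=True
                End If
            Next i
        End Sub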

  6. Simon Says:

    I have a pretty comprehensive remote patch tool that allows users to script changes and copy/paste from master versions. It also does VBA changes. It sometimes upsets virus checkers, but then, doesn’t everything of any use these days?
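
    A minimal sketch of one way such scripted VBA changes can be done – swapping in a module from a master copy via the VBE object model. This is an illustration rather than Simon’s actual tool; it needs “Trust access to the Visual Basic project” enabled, and the names are hypothetical:

        ' Replace a standard module in the target workbook with the master's version.
        Sub ReplaceModule(targetWb As Workbook, masterWb As Workbook, ByVal modName As String)
            Dim tmpFile As String
            tmpFile = Environ$("TEMP") & "\" & modName & ".bas"

            ' Export the up-to-date module from the master copy...
            masterWb.VBProject.VBComponents(modName).Export tmpFile

            ' ...then remove the old version from the target and re-import it.
            With targetWb.VBProject.VBComponents
                .Remove .Item(modName)
                .Import tmpFile
            End With

            Kill tmpFile
        End Sub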

  7. Biggus Dickus Says:

    Funny – just yesterday I was doing a seminar on Excel design for a consulting firm whose boss I have been friends with for 20 years, and when the old macro security dialog popped up I immediately made them set their security to low (as I always do ;-)). Then I explained to them about the Laroux virus and how I haven’t seen it or any others in years. I also explained how MS is still concerned about that, to the point where they are willing to put their milk-cow (Excel) at risk in order to protect their relationship with the Security Nazis at the major corporations.

    I wonder if that whole thing isn’t overblown, because I have NEVER seen a client insist on a VBA-less install of Office, nor have I ever been forced to use certificates, even in the largest corporations. I think people have gone on to other things, and, as in life, I don’t change my behaviour because a meteorite may fall on my head.

    Frankly I think MS has basically decided that all their products are going to managed code and that’s that. So while I would like to be able to just email files around the world and have them run, I expect to have to get involved with IT security, deal with complex deployment scenarios and MSIs and manifests and all that good s**t. But at the same time I expect it will mean the death of what I do. Oh well – that’s life I guess. It’s not my product.

    Dick

  8. Jon Peltier Says:

    Why set security to Low? Even with a one in ten thousand chance of a malicious macro, clicking on Enable is not a big deal. In fact, it’s not the malicious ones that trouble me, but the incompetent ones, the ones which make troublesome adjustments to my Excel or Windows environment. These are more like one in one thousand. The warning reminds me to check the code.

  9. Harlan Grove Says:

    From my perspective, we’re slowly heading back to centralized IT. When you think about it, it’d be cheaper to manage terminal server farms of a few hundred machines than thousands of desktop machines. It’d also be easier to enforce security. And deployment would cease to be a concern. As long as access to the apps and data on those servers is secured (and ID/password is a bit archaic and not really secure), any client machine running suitable GUI terminal software could replace desktops. And machines running JUST suitable GUI terminal software, booting from CDs and requiring NO local hard disk, are already possible, and they’d be REALLY cheap to maintain.

    I’d guess that’d even make deployment of outside developers’ products simpler as long as you could get customer specs about what they have on their machines. All that leaves is how to handle per use rather than per seat licensing and payment.

  10. Dennis Wallentin Says:

    I agree with Jon about security settings. That’s why I use digitally signed solutions, which only need to be accepted the first time they run.

    Harlan, the cost-saving-driven process seems to be geared towards centralized IT with dumb terminals. The funny thing is that it’s the same point where the IT evolution started out in the early 80s, i.e. it closes the circle. Do you see it as a threat, or not?

    Kind regards,
    Dennis

  11. Biggus Dickus Says:

    “Why set security to Low?”

    Because the users complain about the dialog coming up all the time. I know I could sign everything, etc., etc., but I think I explained my thinking on that in my earlier post – it’s called “acceptable and realistic risk”.

    Let’s agree to disagree on this ;-)

    Dick

  12. Simon Says:

    Jon I’m with you on the security thing.
    Pretty much every s/s I open fires that box and it doesn’t bother me one bit. I like the control to say, no VBA thanks, every now and again.
    And it is the crap ones rather than the malicious ones.
    I hear what you are saying too Dick, you just probably haven’t met some of the blundering users that I have (like the chimp that sent me the Laroux virus years ago, in some pointless fantasy football thing I didn’t even want).
    cheers
    simon

  13. Harlan Grove Says:

    Recentralization is inevitable. Modern PCs are now capable of doing things only mainframes could do 25 years ago, as well as lots of things mainframes still can’t do. Unfortunately, with this extra capability comes greater complexity, and departmental ‘developers’ are often working way beyond their competence. Add to that vastly greater heterogeneity of hardware and software configurations, and the headaches of netware deployment and dependencies.

    Other than using my laptop in airplanes in flight, I don’t use any PC that’s not connected to a network of some sort. That’s becoming the case for nearly all worktime/work-related PC use. The obvious economic benefits of recentralizing IT when nearly all users are on networks will make this irresistible.

    None of you should worry about dumb terminals. The machine end-users will have on their desks is irrelevant. The software that would be running centrally is likely to be Windows-based, and very likely Office-based for many years. The problem you face is who’s going to be buying your software, how much support you’re going to need to provide to get it installed with possibly dozens if not hundreds of other specialized packages, and how to get paid when your software packages are installed on far fewer machines than the number of users who’d use them at the same time.

  14. Marcus Says:

    ‘code-should-live-on-a-server’
    I try to be. Many of the add-ins I develop are broken down into components. The only generic portion is an xla which calls a central database, retrieves the user’s privileges and builds a custom menu based on those privileges. Each menu item is usually a separate xla.

    Any xla’s required by the user are copied from a server to the c: drive. Software updates simply require copying the updated xla to the file server. Each time a user starts Excel they receive a fresh copy. I’ve detailed this in a prior posting.
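
    A minimal sketch of that copy-then-load pattern, for illustration only. The server and local paths are hypothetical, the hard-coded add-in list stands in for the privileges lookup Marcus describes, and error handling is omitted:

        Sub RefreshAndLoadAddins()
            Const SERVER_DIR As String = "\\fileserver\addins\"
            Const LOCAL_DIR As String = "C:\ExcelAddins\"
            Dim names As Variant, i As Long
            Dim fso As Object

            Set fso = CreateObject("Scripting.FileSystemObject")
            If Not fso.FolderExists(LOCAL_DIR) Then fso.CreateFolder LOCAL_DIR

            ' In the real design this list would be built from the user's privileges;
            ' here it is hard-coded for illustration.
            names = Array("Reports.xla", "DataLoad.xla")

            For i = LBound(names) To UBound(names)
                ' Overwrite the local copy with the server version every time,
                ' so the user always runs the latest release.
                fso.CopyFile SERVER_DIR & names(i), LOCAL_DIR & names(i), True
                Workbooks.Open LOCAL_DIR & names(i)   ' an .xla opens hidden, like an add-in
            Next i
        End Sub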

    The worst are stand-alone workbooks, particularly when they get emailed about. For most projects I try to collate the data in a relational database (typically Access). In these situations you can use version control between the workbook and a value stored in the database to weed out expired versions.
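
    A minimal sketch of that version check, assuming a hypothetical Config table in the Access back end that holds the current release number:

        Sub CheckWorkbookVersion()
            Const DB_PATH As String = "\\fileserver\data\AppData.mdb"
            Const THIS_VERSION As Long = 12   ' stamped into each released copy of the workbook

            Dim cn As Object, rs As Object, currentVersion As Long

            Set cn = CreateObject("ADODB.Connection")
            cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & DB_PATH

            Set rs = cn.Execute("SELECT CurrentVersion FROM Config")
            currentVersion = rs.Fields("CurrentVersion").Value
            rs.Close
            cn.Close

            If THIS_VERSION < currentVersion Then
                MsgBox "This copy of the workbook has expired. " & _
                       "Please get the latest version.", vbExclamation
                ThisWorkbook.Close SaveChanges:=False
            End If
        End Sub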

    Biggus: “MS has basically decided that all their products are going to managed code and that’s that”
    Horse to water, Biggus. As you’re probably aware, many large corporates, particularly banks, have so much invested in VBA code that any transition away from VBA will either be extremely painful or simply won’t happen (at least in the short to medium term).

    Biggus: “I expect it will mean the death of what I do”
    As Mark Twain quipped, “The rumors of my death have been greatly exaggerated”. I think there’s a lot of life in the ol’ girl yet. In the medium term (5 years or so) I see myself getting busier. This trend has already started for me.

    There is a huge installed base of VBA code and I believe there are fewer new entrants coming into the VBA market. Remember, it may not always be new development. In a recent project I reverse-engineered a suite of workbooks and migrated the code to VB6 so a Monte Carlo simulation could be run in a ‘production environment’.

    Regards – Marcus

  15. Biggus Dickus Says:

    Marcus:

    “There is a huge installed base of VBA code and I believe there are fewer new entrants coming into the VBA market.”

    Yes I’ve discussed this over on OZ before. I know that works to OUR benefit in the near to medium term – but it’s a shame overall for the industry and for our clients.

    I have a theory that spreadsheets will never go away, and that if you take away the ability to automate them relatively easily then people will reach for a spreadsheet that does allow it. Is it possible that MS could lose their Excel position unless they do this right? Maybe. MS may have to eat some crow somewhere down the line and reintegrate a new macro language that is sandboxed but still functional.

    All most people want (including me) is to be able to automate spreadsheets. Frankly VBA is kinda overkill – especially if IT is the problem preventing us from having the tools we need for spreadsheet automation. Look at Access 2007 – they included new “sandboxed” macro functionality. Maybe this would make sense for Excel too?

    I’m certainly not afraid of learning new s**t – but it has to do the job for me – that’s the main reason I have not run off with VSTO.

    Dick
