Spreadsheet proliferation

On the distribution front, many of us seem to be targeting a handful of users for most Excel/VBA applications. In itself that's quite interesting, but I think I missed a question.

How many copies of your spreadsheet application end up ‘live’ on the network?

Much of my stuff is for 1 or 2 users, but they may end up with 10, 20 or 30 versions per reporting cycle.

One example I mentioned in the distribution post was meant to be for 30 (named and trained) users. But at least double that number use it, and they each probably have literally hundreds of copies. Maybe 1-5 are live at any time, but they probably create 20/30/40 or maybe more per year.

The thing I am thinking about is the maintenance headache when you realise you need to make a change to a live spreadsheet. If it's just a couple of users, well then that seems easy enough, but if they are sitting on 50 copies that all need updating and checking for consistency, then that's a much bigger job.

This is one area where VSTO/.Net could add some real value. But equally, splitting data and logic into separate components could help. Excel Services could also help (if people are connected, I think?).

I have a VBA based ‘patch’ tool that I use for scripting version management stuff so I can keep these distributed apps in line. It does pretty much everything, although I have found some of the VBA replacement stuff to be mildly unstable at times.
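By way of illustration, the core of that kind of patch tool is module replacement via the VBE extensibility model. This is a stripped-down sketch, not my actual tool – the paths and names are made up – and it needs the 'Trust access to the VBA project object model' option enabled on each machine. This self-modifying-code area is exactly where the instability tends to show up.

```vb
' Sketch only: replace a stale code module in a target workbook
' with a fresh .bas export. Requires "Trust access to the VBA
' project object model" to be enabled in the Trust Center.
Sub PatchModule(ByVal targetPath As String, _
                ByVal moduleName As String, _
                ByVal newCodePath As String)
    Dim wb As Workbook
    Dim proj As Object ' late-bound VBIDE.VBProject
    Set wb = Workbooks.Open(targetPath)
    Set proj = wb.VBProject
    ' Remove the old copy, then import the replacement module
    proj.VBComponents.Remove proj.VBComponents(moduleName)
    proj.VBComponents.Import newCodePath
    wb.Close SaveChanges:=True
End Sub
```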

How do you manage this proliferation? Do you have tools? Do you rely on good (de-coupled) design? Do you just re-issue a new master? And if so, how do users move their existing analysis to the new version?

Have you found an approach that totally makes the issue go away? Have you been able to use it for justification to migrate a spreadsheet to another technology?

Cheers

Simon


8 Responses to “Spreadsheet proliferation”

  1. Marcus Says:

    Hi Simon,

    I think I mentioned a while ago my general approach – where possible I separate the data from the spreadsheet and often work with add-ins. It typically works like this.

    > All that is distributed to a user is an add-in (usually xla). When Excel loads this xla refers to a central repository (typically Access but also SQL Server) and retrieves what options that user has. This is table driven so any changes occur centrally.

    > The xla then builds a custom menu for that user and downloads each additional xla that user will need. Each xla is stored centrally on the server, compressed in a CAB file, and unzipped on download. The whole process takes a couple of seconds.

    > When a user selects a required menu option it loads the respective xla. This might display a UserForm, generate a report or load a s/s model.

    > Virtually all data is stored on the server.

    > All reports are based on templates which are stored centrally.

    > All data capture is done via UserForms and the data is stored on the server.

    > All interactions with the repository are done via QueryDefs so changes can be centralised. When SQL Server is used, I have an intermediate Access dbs to act as a middle man. This way I can update queries without going through the IT dept.

    > Each time a user (re)starts Excel they get a fresh copy of everything except the xla which generates the menu.

    The main place this structure breaks down is when users are geographically dispersed to the extent that they are on different domains. Sometimes model templates have been stored on an intranet, but this quickly results in versionitis as users like to keep their own personal copy.
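    In outline, the menu-building step works along these lines (a simplified sketch – the server path, table and field names here are invented for illustration):

```vb
' Sketch of the startup xla's menu build: read this user's options
' from a central Access table via ADO, then build a custom menu.
' Table assumed for illustration: tblMenuOptions(UserName, Caption, XlaFile).
Sub BuildUserMenu()
    Dim cn As Object, rs As Object
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
            "Data Source=\\server\share\AppConfig.mdb"
    Set rs = cn.Execute( _
        "SELECT Caption, XlaFile FROM tblMenuOptions " & _
        "WHERE UserName = '" & Environ$("USERNAME") & "'")

    Dim menu As CommandBarPopup
    Set menu = Application.CommandBars("Worksheet Menu Bar") _
        .Controls.Add(Type:=msoControlPopup, Temporary:=True)
    menu.Caption = "&Reports"

    Do Until rs.EOF
        With menu.Controls.Add(Type:=msoControlButton, Temporary:=True)
            .Caption = rs.Fields("Caption").Value
            ' Each menu item runs a macro in its downloaded xla
            .OnAction = "'" & rs.Fields("XlaFile").Value & "'!LoadModel"
        End With
        rs.MoveNext
    Loop
    cn.Close
End Sub
```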

  2. Ross Says:

    Marcus, that sounds like a good system you have going there. Don't think I could be bothered to put it into a .cab file – are they that big?
    Have you had problems when people want to, say, take a live model to a customer – i.e. want data in a spreadsheet on their local PC with no access to a back end?

    I built a dictor application a while ago and knocked up a “file system”. I basically said if the cell background colour is x (values only) then copy the value of that cell and its address to my data workbook. I tarted this up so it works, well, just like any other Office file system. I changed the “data file” xls file extension to something custom and used a VB front loader with file associations, so that a double click of the data file ended up opening it with the application – it's quite neat, if a little slow (progress bars all round!). It works quite well – there have been a few issues where some files have corrupted – I think it's something to do with the cell's ability to read its own colour – it seems to get confused sometimes :-)

    I'm not a big fan of Excel's XML – don't fancy writing out a big XSD for a complex model! (I know nothing about XML, so it might be easier than I think.)

    Basically I think this method – not data/logic, but app/data file – is quite good; then you can update the app, just like a pukka programme, with service packs and so forth – well, ish…

  3. Jon Peltier Says:

    I’ve used version management in a couple applications, storing the build number and date in hidden names. Depending on the app, a mismatch between the add-in and template would result in a warning or even deletion of the old app and popping up a message with a link to downloadable updates. If an add-in detects an older file version, it may create a new copy based on the current template, copy over the data, save the old file as a template, and continue with the analysis. This can be made to work very smoothly with the user hardly noticing the switch.
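    In skeleton form the check looks something like this (an illustrative sketch, not the production code – the name "BuildNumber" and the message are invented):

```vb
' Sketch: build number stored in a hidden defined name, present in
' both the add-in and the template, e.g. created once with:
'   Names.Add Name:="BuildNumber", RefersTo:="=42", Visible:=False
Function BuildOf(ByVal wb As Workbook) As Long
    On Error Resume Next          ' a missing name reads as build 0
    ' RefersTo is a string like "=42"; strip the "=" and convert
    BuildOf = CLng(Mid$(wb.Names("BuildNumber").RefersTo, 2))
End Function

Sub EnsureCurrent(ByVal dataWb As Workbook)
    If BuildOf(dataWb) < BuildOf(ThisWorkbook) Then
        ' Create a fresh copy from the current template, copy the
        ' data across, archive the old file, then carry on.
        MsgBox "This file is build " & BuildOf(dataWb) & _
               "; upgrading to build " & BuildOf(ThisWorkbook) & "."
    End If
End Sub
```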

  4. Dermot Balson Says:

    I’ve basically used two designs
    (a) a single workbook with logic but with no data (or else all the data), which retrieves data on demand. There is no need for any copies of the workbook.
    (b) multiple workbooks, each with their own data, all linked to a central XLA which runs when they open, and which updates them as it needs to

    Clearly, maintenance has to be anticipated in the original design, to avoid a total mess. VSTO sounds sexy enough, but imho there is usually a much simpler alternative, and simple is good.

  5. Harlan Grove Says:

    I'm skeptical of claims that maintenance tasks for which VBA was unstable would somehow be stable under VSTO/.Net. Do you mean using VBA macros to alter VBA modules in other workbooks? Brute force works: use macros to copy data from old version workbooks into new version workbooks, with the new version workbooks already containing the new/revised VBA modules.

    As for validating version currency, force users to run models with VBA/macros enabled by using stub UDFs to return Empty, e.g.,

    Function stub(ParamArray a()) As Variant
        If IsError(Evaluate("ObscureHiddenName")) Then stub = CVErr(xlErrNull)
    End Function

    Add or concatenate stub() calls to all critical result formulas, and don’t define ObscureHiddenName in the ‘template’, but set it to something innocuous in the workbook’s Open event handler. If users don’t enable VBA, all these formulas return #NAME? errors. If they enable VBA but disable event handlers before opening the file, these formulas return #NULL! errors. Then add code in the Open event handler to check the workbook’s version against the current version stored in a central location.

    This is subject to all the usual caveats about inability to protect anything effectively in workbooks.
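    The Open event side of this, in outline (the central version file and the "AppVersion" name here are purely illustrative):

```vb
' Outline of the Open handler described above. Defining the hidden
' name clears the #NULL! errors from stub(); the version source (a
' one-line text file on a share) is an invented example.
Private Sub Workbook_Open()
    ' Define the obscure name so the stub formulas evaluate cleanly
    ThisWorkbook.Names.Add Name:="ObscureHiddenName", _
        RefersTo:="=0", Visible:=False

    ' Compare this copy's version stamp against the central one
    Dim current As String
    current = CreateObject("Scripting.FileSystemObject") _
        .OpenTextFile("\\server\share\current_version.txt").ReadLine
    If ThisWorkbook.Names("AppVersion").RefersTo <> _
       "=""" & current & """" Then
        MsgBox "Version " & current & " is available; this copy is out of date."
    End If
End Sub
```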

  6. Charlie Says:

    In one of my designs, and the one that is the most widely used, 20 users across Canada, the input data was nicely separated from the calculations and output. The challenge was ensuring that the old data could be read into the new version of the model – there were version stamps on the data files and in general it worked well.

    One issue was that the input data was not always data – sometimes it was formulas referring to the current sheet, or a related sheet, or an external workbook. Saving just the input data with these characteristics was not always successful when loading into a new version of the model.

    The next version of this model will move away from saving the data separately from the calculations and output, simply because the users need to build complex input and output sections, and keeping everything in one workbook satisfies that. But then migrating the data to a new version of the model will be non-trivial – I haven't worked out a solution yet.

    A lot of my work now is dealing with existing models where the input has not been cleanly separated from the analysis and output. The same issue of migrating input data that is not just clean data, but partly formulas, when the next version of the model comes out is a huge challenge.

    On a related note, and maybe something for another post: how do others test their models? The issue of migrating old test suites when the model changes is essentially the same as above. The challenge in my applications is that the users know the models and the inputs/outputs better than I ever will – and yet I would like to have thrown test data at it before I ship it back for beta testing. I have built tools to capture old test data and then migrate it to the new version, but it is not something I would consider releasing to users to solve the above version data migration challenge.

  7. Dennis Wallentin Says:

    As for VSTO, there are two relevant aspects in this context:

    VSTO provides us with
    – the ServerDocument class
    – the IntelliMirror shared-directory deployment model

    However, if we don't have a basic understanding of the above, it becomes rather pointless to discuss and review it, right?

    Kind regards,
    Dennis

  8. Simon Says:

    Some great approaches here, I think I should probably try a bit harder to build this sort of maintainability into some of my stuff.
    I still fall for that ‘Quick and dirty please, we only need it for 6 months, then the replacement will be implemented.’
    I have found working with VBA (editing code) is more stable from C# than from VBA, and it doesn't fire virus alerts. Can't remember what the story was from VB6.
    Like Charlie, though, I do a lot of support of other people's work, so I have to 'bind as I find'. But I'm sure I could do more.
    I also need to support disconnected users so I couldn’t be too strict about updates.
    Thanks for the suggestions and explanations folks, very useful indeed.
    Cheers
    Simon
