A few comments have touched on parts of this, so I thought I would post what I have seen, and what I think could be improved.
Many important commercial spreadsheets I see seem to go through the following lifecycle (at least once):
- A requirement becomes apparent
- Previous work is adapted if possible
- If not, a new model is built, rarely completely from scratch
- The results are reviewed for reasonableness by the business person who developed the model
- It's used live, usually with gradually diminishing scepticism
- Other users feed back corrections and enhancements
- Eventually abandoned as business needs change
This is fairly close to the standard Systems Development Life Cycle, but to my mind a few elements are missing.
There is often very limited effort around developing useful requirements, and rarely any initial design. There is also rarely any conscious technology choice: it was always going to be in Excel. A little effort at the initial design stage would often highlight the fact that a spreadsheet may not be ideal. I remember one Excel-based unit costing system that was never going to work. A simple migration to Access and the whole thing was almost trivial – and not a three-line, multi-nested array formula in sight. I accept the whole skill-set argument – but that is why short-term flexible consultants and contractors exist (and training courses).
We are always encouraged to reuse previous work, and in many cases reuse is a Good Thing. But what if you didn't do the necessary due diligence to check that what you are reusing is both correct and appropriate? What if, by dragging in old stuff, you bring in a couple of hundred broken names and a load of useless styles, or some crazy VBA? (Did I ever tell you about the guy who trapped Ctrl+C to trigger some VBA that deleted a load of stuff on the active sheet? When I did a copy on his machine it killed my spreadsheet.)
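One cheap piece of due diligence before dragging in an old workbook is to scan it for broken defined names. A minimal sketch, assuming the file is a modern .xlsx (which is just a zip archive containing xl/workbook.xml, where defined names live) – the function here is my own illustration, not a real tool:

```python
import xml.etree.ElementTree as ET

# Default namespace used by xl/workbook.xml in .xlsx files (SpreadsheetML).
NS = "{http://schemas.openxmlformats.org/spreadsheetml/2006/main}"

def broken_defined_names(workbook_xml: str) -> list:
    """Return the defined names whose definitions contain a #REF! error.

    Takes the text of xl/workbook.xml, extracted from an .xlsx file
    (e.g. via the zipfile module).
    """
    root = ET.fromstring(workbook_xml)
    broken = []
    for dn in root.iter(NS + "definedName"):
        # The element's text is the formula the name refers to;
        # a deleted target leaves a #REF! error behind.
        if dn.text and "#REF!" in dn.text:
            broken.append(dn.get("name"))
    return broken
```

Run it over a candidate workbook before reusing anything from it; a couple of hundred hits is a strong signal to rebuild rather than copy.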
Reviewing the results for reasonableness is unlikely to generate much valuable insight. If you 'check it's OK', you'll usually find it is; if you look for the errors or the weak points, you're far more likely to find something worth correcting. I prefer to test with meaningless data so I can focus on the logic. A final review does no harm, unless it is in place of proper testing – and proper testing involves people other than the developer.
Using your customers as beta testers is fine, as long as they know it. Doing so without warning them is likely to reduce your credibility. I often worry that consumers of spreadsheet systems are a bit gullible; I prefer to have a developer pick my work apart, as they tend to be more thorough – a real tester might be even better. As long as it's formatted nicely and not too blatantly wrong, customers seem to have little difficulty accepting spreadsheets they are given.
As ballpark figures, the stuff I work on takes 2-3 months to develop, test and deploy, and its lifetime is roughly 2-5 years. There is a ton of information available that is useful for those early months, but almost none that covers how to manage a live spreadsheet-based system (a phase that is probably ten times longer than the development). I would suggest that most real-world errors are caused by poor version control: a correct but inappropriate version of a workbook or worksheet gets used. Abandonment (i.e. leaving it on the network but not using it) rather than graceful retirement (archiving, maybe leaving a shortcut in the original location) means there are normally more wrong versions to choose from than correct ones. Anyone else ever run last year's data set with this year's balances to produce almost-believable nonsense? I think a decent SharePoint implementation would go a long way to solving these version issues.
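One low-tech guard against the last-year's-data scenario is to stamp each data set with the period it covers and refuse to run when the stamp doesn't match. A minimal sketch (the stamping convention and function name are my invention, not anything standard):

```python
def assert_vintage(workbook_period: str, run_period: str) -> None:
    """Abort if the period stamped into the workbook's data doesn't
    match the period this run is supposed to cover.

    workbook_period would typically be read from a dedicated cell
    when the model loads.
    """
    if workbook_period != run_period:
        raise ValueError(
            "Version mismatch: data is stamped %r but this run expects %r"
            % (workbook_period, run_period)
        )
```

The check costs nothing, and it turns almost-believable nonsense into a hard stop instead of a published report.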
I tried to warn one client about having multiple copies of very similar spreadsheets, he cut me off with ‘Disc space is cheap, I don’t care about redundancy’. As I was explaining the reconciliation nightmare that would follow, and the unnecessary risk he was carrying, he was already out the door. (Maybe I should have gone on a Dale Carnegie Win Friends and Influence People course?)
[I keep wondering about productionising the tatty version-compare tools I use as a commercial venture. Would anyone buy such a tool (if it was any good, of course)? Would you prefer an add-in or a standalone exe? Would you swap a relative for one, or just want to swap some (secondhand) chewing gum? Anyone using something similar already? I've seen a few online, but they don't seem to do what I want.]
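For what it's worth, the core of such a compare tool is small. A sketch, assuming each sheet version has already been read into a plain {cell_ref: value} dict by whatever means you have to hand – this is my own illustration, not the tool I actually use:

```python
def diff_sheets(old: dict, new: dict) -> dict:
    """Compare two versions of a sheet, each given as {cell_ref: value}.

    Returns the cells that were added, removed, or changed between
    the two versions.
    """
    old_cells, new_cells = set(old), set(new)
    return {
        "added":   {c: new[c] for c in new_cells - old_cells},
        "removed": {c: old[c] for c in old_cells - new_cells},
        "changed": {c: (old[c], new[c])
                    for c in old_cells & new_cells if old[c] != new[c]},
    }
```

The hard part in practice is everything around this core: loading the workbooks, lining up inserted rows and columns, and presenting the differences usefully – which is presumably why the online offerings vary so much.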
Anyone else had similar experience? Think I’ve missed anything? Think I’ve got it wrong? Anyone got a better approach than warnings that fall on deaf ears?