Consistency or quality?

I’m working on a remote spreadsheet patch that will be applied to a few thousand workbooks worldwide for one client. I didn’t write the base spreadsheet originally, but I have maintained it for years. It’s a standard big gnarly finance model with a few hundred thousand formulas and six thousand lines of VBA.

The remote patch tool I use can make pretty much any change, including swapping worksheets, replacing formulas and array formulas, etc. It can also add/remove/edit the VBA, and it cleans the VBA on each application.

This particular patch is a pretty deep-surgery one; normally it’s just salary rates, currency rates and some other financial reference data changes. This time, though, the business needs have changed and some reports need re-working.

One of the frustrations is the hardcoded range addresses in the VBA.

Public Const countycol = 64
Public Const currency = "J33:L66"
etc.

I have to change these each patch – if only they had used range names…
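
To be clear about what I mean, here is roughly the difference (just a sketch; the sheet name, constant and range name below are invented, not the real ones):

' Today: the address is baked into the code and gets edited every patch
Public Const CURRENCY_RANGE = "J33:L66"

Sub ShowTheDifference()
    ' Hard-coded approach: breaks as soon as anyone inserts a row above row 33
    Debug.Print Worksheets("Model").Range(CURRENCY_RANGE).Address

    ' Defined-name approach: set the name up once in a patch...
    ThisWorkbook.Names.Add Name:="CurrencyRates", RefersTo:="='Model'!$J$33:$L$66"

    ' ...then the code refers to the name and survives row/column inserts
    Debug.Print Worksheets("Model").Range("CurrencyRates").Address
End Sub

The name needs adding once by a patch, but after that rows and columns can move and the VBA doesn’t care.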

So now I need to add some more.

  1. Should I stay consistent and use the same crappy hardcoded approach or
  2. Should I add my new ones as defined names so they at least work right and won’t need further maintenance?
  3. Should I rip out the working code because it’s ugly and redo it in what I currently think is a better way?
  4. Other options?

I’m currently planning on going with 1 unless someone comes up with something pretty convincing in favour of one of the other approaches. Here’s why:

  • Any further maintenance will have to update the current ones anyway; a couple more will only add seconds to the next patch.
  • Starting to change the mechanics now could backfire in future patches.
  • I think I prefer a consistent(ly poor) approach to hit-and-miss quality.

2. will probably not get updated right, as the assumption until now has been that there is no need to insert rows in the worksheet because the ranges are defined in code.

3. might be nice, but the client won’t pay because they won’t see any benefit.

4.?

What do you think? Any other reasons to go with option 1? Or reasons to avoid it?

Does anyone else do this remote patching? What approach do you follow?

(We tried to export the data to a new model, with the new logic, rather than apply new logic to existing data, but the users’ data proved to be too unstructured for that to work reliably.)

cheers

Simon

18 Responses to “Consistency or quality?”

  1. C#Man (aka MacroMan) Says:

    I say you build a web app. with C#/ASP.Net.

  2. Simon Says:

    If you had any idea how much cash had already been flushed down that particular toilet…

  3. Marcus Says:

    Hi Simon,

    Short answer: it depends. If this is a one-off I’d be more inclined to remain consistent, if only because it reduces the probability of inadvertently breaking something else in the model. If this is something you know you’ll be maintaining over a period of time then I’d be more inclined towards some refactoring and minor re-engineering.

    A few thousand workbooks, ouch. If they’ve got that many users, surely they’d have pockets deep enough to centralise the reference data and allow the spreadsheet to phone home periodically. It would take as much re-engineering as redoing the whole app properly, though.
    But as you’ve stated, the client would have to buy in to the ROI. I’ve seen organisations retain 10 staff to copy and paste spreadsheet reports all day because they had the OpEx (operational budget) to do it, but they couldn’t (wouldn’t) spend the equivalent to implement an automated solution as they couldn’t get the CapEx (capital budget) for the project. It all came down to how the balance sheet and P&L looked for the year (read: do I get my bonus).
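
    For what it’s worth, the phone-home part needn’t be much code. Something along these lines would do it (a sketch only; the connection string, table and sheet names are all invented):

    Sub RefreshReferenceData()
        ' Sketch: pull the latest salary/currency reference data from a central store.
        ' Connection string, table and sheet names are made up for illustration.
        Dim cn As Object, rs As Object
        Set cn = CreateObject("ADODB.Connection")
        cn.Open "Provider=SQLOLEDB;Data Source=RefServer;" & _
                "Initial Catalog=FinanceRef;Integrated Security=SSPI"
        Set rs = cn.Execute("SELECT RateName, RateValue FROM CurrencyRates")
        ' Drop the recordset under the header row on the reference sheet
        ThisWorkbook.Worksheets("RefData").Range("A2").CopyFromRecordset rs
        rs.Close
        cn.Close
    End Sub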

    P.S. Did you roll this tool yourself or is it 3rd party?

    Cheers – Marcus

  4. gobansaor Says:

    I once worked on a PDP-11 RSTS/E Basic Plus (think of it as VBA without Excel) payroll system which I had inherited, and it consisted of some of the most tortuous code you could imagine.

    I went down option 2, adding in new features or significant changes using more elegant code. It paid off in the end, as the code that tended to change the most ended up either as “good code” or as reference data, with a good testing framework as an added benefit. The payroll system continued in existence for another 15 years (which I guess is about the maximum lifespan of Excel VBA solutions too!).

    As a “negative”, the code changes allowed me to hand over the maintenance to others (which suited me at the time), but perhaps you should look on option 1 as a “little job for life”. I knew I had to do something to allow me to move on from this system when I once spent a whole day tracing a particular bug: I eventually came to the problematic line of code, and beside it was the only comment line in the whole system, which stated “I found it!!!”. Pity he/she didn’t fix it as well!

    Tom

  5. alastair Says:

    I much prefer to put the data into an mdb, which means I can supply an upgraded workbook without worrying about the data. As for hard-coding cell references, I am incredibly inconsistent with this. Range names are OK, but they can still be a pain to update and they leave a footprint for the user.
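
    Roughly like this (the path, table and sheet names are invented):

    Sub LoadDataFromMdb()
        ' Sketch of the idea: the data lives in an mdb, so the workbook itself
        ' can be swapped for an upgraded one without touching the data.
        Dim cn As Object, rs As Object
        Set cn = CreateObject("ADODB.Connection")
        cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\ModelData\model.mdb"
        Set rs = cn.Execute("SELECT * FROM UserInputs")
        ThisWorkbook.Worksheets("Inputs").Range("A2").CopyFromRecordset rs
        rs.Close
        cn.Close
    End Sub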

    I would at least reconsider the code and, more importantly, the business logic of such an old legacy app. I would tend to view a large unwieldy thing like this as an opportunity to quote for some consultancy to review options. After all, they may be wasting significant time in running this thing, and a different approach might pay for itself – well worth at least a couple of days to investigate.

  6. Simon Says:

    All good points.
    Sadly confidentiality prevents me saying too much, but let’s just say this thing has more lives than the average cat, and has outlived many well-designed and well-implemented attempts to replace it. You know the story: it’s been scheduled for retirement for years.

    I have also handed over the patching many times, as it’s designed to be used by basic-level users (it’s a Codematic tool).

    I’m pretty anti changing tested, working production code; I’ve seen some bad outcomes from ‘tidying up’ code. I wouldn’t say patching VBA with VBA is the most robust process either, so minimal changes is my motto.

    Any other thoughts?

  7. tfsjohan Says:

    In my experience badly written code is really hard to refactor since it’s so unpredictable. Before you begin it might look like it would work to just change hardcoded references to named ranges etc., but once you get your hands dirty enough you will probably discover at least 10 different things that will not work as you expected.

    Been there. Done that. :( I took over a project that had something like 50 global variables that changed all the time, and it was impossible to follow the flow of the code. Yikes!

    That said, I totally agree with Marcus. If this is something that you will continue to work with, I would probably prefer to do some refactoring and cleaning anyway. As long as your client is on board and is willing to pay to be more agile in the future – why not?

    Here are my top reasons why Excel sometimes sucks:

    1. No separation of document and template. I prefer Word templates, where the code stays in the template and the document is just information.

    2. No separation of data and presentation. Close to impossible to combine data from different workbooks. The database is your friend!

    3. Too static. If you don’t know at design time how much data you’ll have, everything gets screwed up. This is a little better with the Excel 2003/2007 list/table feature, but still… If MS Query wasn’t so unstable, maybe that would help, but…

    On the other hand, other times this is exactly what’s so great about Excel… :)

    // Johan

  8. Jon Peltier Says:

    My approach with several projects has been to overhaul one section at a time as it comes in for changes. At a certain point you’ll know whether the changed version is a major or only a minor improvement. If it’s major, then the next time changes come in, propose to upgrade the whole thing.

    In the cases where I’ve just kept doing the same old bad stuff, I usually end up regretting not fixing it.

  9. Harlan Grove Says:

    If it were me and I was sure that I’d be the only person maintaining it barring being hit by the proverbial bus, I’d replace the hardcoding in stages, probably one module at a time. OTOH, if there were some chance that someone else would be involved, I’d stick with consistency.

    6000-odd lines of code? All in one module?

  10. Simon Says:

    No, the 6k is scattered (almost randomly) across the 30 sheets, half a dozen modules and half a dozen forms (which no one uses, but which have to stay as much of the business logic is behind them). Plenty of globals of course – but at least they are declared. One similar-sized monster (different client) didn’t use Option Explicit – now that was a twat to maintain!

    The trouble is this doesn’t come in for changes; I write a script and distribute it to all the users to apply to all their relevant spreadsheets. I have found big VBA changes to be a stability PITA when done from VBA. In fact I’m amazed it works at all, and of course virus scanners hate it.
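
    For the curious, that kind of VBA-on-VBA patching boils down to the VBE extensibility calls. A heavily simplified sketch (the module and constant names are invented, and “Trust access to the VBA project” has to be switched on):

    Sub PatchConstant(wb As Workbook)
        ' Sketch: find and replace a hard-coded constant line in a code module.
        Dim cm As Object   ' VBIDE.CodeModule, late bound to avoid a reference
        Set cm = wb.VBProject.VBComponents("modRefData").CodeModule
        Dim i As Long
        For i = 1 To cm.CountOfLines
            If InStr(cm.Lines(i, 1), "Public Const CURRENCY_RANGE") > 0 Then
                cm.ReplaceLine i, "Public Const CURRENCY_RANGE = ""J33:L70"""
            End If
        Next i
    End Sub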

    Johan – good point on bad code being hard to refactor: it’s hard to test, so it’s hard to change safely.

  11. Jon Peltier Says:

    Johan –

    You can achieve your items 1 & 2 if you use a set of files. Use an Excel add-in for the code and much of the business logic. Use an Excel template for data/option entry and display. Use another workbook or a database for housing the data.

    Item 3 can be somewhat realized through use of modularization, so you could readily go from a fixed worksheet table, to an Excel 2003 List/2007 Table, to a database.
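
    As a rough sketch of the split (all paths and names below are made up): the add-in holds the code, the template holds the layout, and the data sits somewhere else.

    Sub NewReport()
        ' Lives in the add-in (.xla): spin up a fresh workbook from the template,
        ' then populate it from the separate data store. Paths and names are invented.
        Dim wb As Workbook, wbData As Workbook
        Set wb = Workbooks.Add(Template:="\\server\templates\MonthlyReport.xlt")
        Set wbData = Workbooks.Open("\\server\data\ModelData.xls", ReadOnly:=True)
        wb.Worksheets("Report").Range("A2").Value = _
            wbData.Worksheets("Data").Range("TotalSales").Value
        wbData.Close SaveChanges:=False
    End Sub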

  12. Biggus Dickus Says:

    This is a classic.

    Apparently you have set this model up so no one can insert or delete rows or columns, therefore I would guess also that each worksheet and the workbook are protected with passwords – either that, or you have the world’s most boring users.

    If you can rely on things in the current model being where they’re supposed to be then I would make a new master, clean up all the range names in your code, clean up whatever code, add any new reports, formats and formulae you want. Save this as a Master file.

    Then my patch would open the user’s file, then the master file, and then cycle through each input worksheet, copying the values of any cells with hard-coded values (as opposed to formulae) to the exact same cell in the new master. I would use TWksht.Cells(x, y).Value = SWksht.Cells(x, y).Value. This way you won’t copy any formats.

    The other way would be to select all the cells you want to copy from old to new on each worksheet in the master, give that non-contiguous selection a local worksheet-level name (sheetname!rangename), and then cycle through each of those cells, setting the value of the target cells to the value of the source cells.

    This way you will end up with a clean workbook, with code you can trust and all the new bells and whistles you want to add.

    I find it’s easier to grab the values from the old file into a new clean file than to run around cleaning up the old file.
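
    In rough code the first way looks something like this (a sketch of the above; the constants-only copy and the matching sheet names/layouts are simplifying assumptions):

    Sub PatchFromMaster(wbUser As Workbook, wbMaster As Workbook)
        ' Sketch: copy the hard-coded input values (constants, not formulae,
        ' and no formats) from the user's old file into the clean new master,
        ' cell for cell. Assumes matching sheet names and layouts.
        Dim wsOld As Worksheet, rngInputs As Range, c As Range
        Application.Calculation = xlCalculationManual
        Application.EnableEvents = False
        For Each wsOld In wbUser.Worksheets
            Set rngInputs = Nothing
            On Error Resume Next   ' a sheet may contain no constant cells at all
            Set rngInputs = wsOld.UsedRange.SpecialCells(xlCellTypeConstants)
            On Error GoTo 0
            If Not rngInputs Is Nothing Then
                For Each c In rngInputs
                    wbMaster.Worksheets(wsOld.Name).Cells(c.Row, c.Column).Value = c.Value
                Next c
            End If
        Next wsOld
        Application.EnableEvents = True
        Application.Calculation = xlCalculationAutomatic
    End Sub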

    If someone has gone in and changed the source file “illegally” you will be able to tell easily enough and you can push back to the client and tell them somebody “cooked” their file and there’s nothing you can do for them unless they put it back in order.

    Just my thoughts.

    Dick
    P.S. I always set Application.Calculation = xlCalculationManual and Application.EnableEvents = False before running this sucker…

  13. Simon Says:

    Dick
    It’s not mine, let me be clear; this is not a system I would build, I just keep it limping along.

    The users insist on being able to insert their own support and workings sheets, hence the export-data-to-a-new-model approach won’t work, and that’s why we have to update the logic in their existing models. And they can and do insert rows and columns.

    Every time the client says can you patch this, I say yep, and we really need to invest a few quid on a revamp. And they always say there is no point, it’s about to be replaced with an enterprise-class app. Ever since 1998, that is.

  14. Biggus Dickus Says:

    “Every time the client says can you patch this, I say yep, and we really need to invest a few quid on a revamp. And they always say there is no point, it’s about to be replaced with an enterprise-class app. Ever since 1998, that is.”

    Man, have I heard that before… When they tell me an app is only short-term, I assume that means about ten years ;-) I have one that’s 23 years old and won’t go away…

  15. Simon Says:

    A bit like when you get dog poop on your shoe? – impossible to get rid of.

  16. Marcus Says:

    Hmm. I think what Biggus is saying is that they start as tactical and end up being strategic.

  17. Simon Says:

    The most common strategy I see is one of overusing tactical solutions.

  18. Biggus Dickus Says:

    I like Simon’s “poop” reference – nothing like an “Excremental Analogy” I always say…..

    “Hmm. I think what Biggus is saying is that they start as tactical and end up being strategic.”

    It’s usually actually “It works so let’s leave it alone…”

    Dick
