Next Monday: D Day for SOX gravy train

It seems some legal beagles in the US are taking the organisation behind Sarbanes-Oxley to court on the basis that the way it is set up is unconstitutional.

While mainstream IT thinks about that from a security-industry point of view, there is the infant spreadsheet-management industry to consider too.

SOX, and section 404 in particular, has been used to encourage organisations to take some responsibility for their crappy spreadsheets. Ideally they would de-crappify them, but in my experience companies prefer just to list them and claim they are now ‘managing them’.

It will be interesting to see what impact, if any, this legal challenge has. I think the current mood is for more legislation and control (certainly in the UK!), but maybe those who claim SarBox has damaged US business competitiveness will hold sway.

I have seen a few remediation/migration-type roles on Jobserve recently, so maybe orgs are taking this seriously now. There are a lot of ingrained habits to change though.

Are you getting much SOX/Remediation business?



4 Responses to “Next Monday: D Day for SOX gravy train”

  1. Harlan Grove Says:

    From my in-house perspective and extrapolating promiscuously, organizations are either not allowing new spreadsheets to be added to reporting processes or are not doing much more than cataloging them.

    There are still too many instances of mixed user entry and formula results in the same worksheet, but the main thing I haven’t seen is SheetChange event handlers trapping and undoing cut-and-paste operations and displaying warnings about why it’s bad and why it’s not allowed. Of course it’d be much, Much, MUCH better for MSFT to provide a protection setting that converted cut-and-paste operations into copy-and-paste. This could be done with a SheetChange event handler: record the post-paste values or formulas in the destination range (and maybe also the formatting), undo the paste operation and set CutCopyMode to False, then replace the stored post-paste values or formulas in the destination range. Excel should provide a protection mode to do this automatically.
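    A rough, purely illustrative sketch of the steps Harlan describes, in workbook-level VBA. It is untested, and reliably detecting that a change really came from a cut-and-paste is the hard part, so the `WasCutPaste` helper below is hypothetical:

    ```vba
    ' Sketch only: turn a cut-and-paste into a copy-and-paste, per the
    ' steps above. WasCutPaste is a hypothetical helper; telling a
    ' cut-paste apart from an ordinary edit is the unsolved part.
    Private Sub Workbook_SheetChange(ByVal Sh As Object, ByVal Target As Range)
        Dim v As Variant
        If Not WasCutPaste(Target) Then Exit Sub
        Application.EnableEvents = False    ' avoid re-triggering this handler
        v = Target.Formula                  ' record post-paste contents
        Application.Undo                    ' restore the cut source cells
        Application.CutCopyMode = False     ' cancel any pending cut
        Target.Formula = v                  ' re-apply, as if copied
        Application.EnableEvents = True
    End Sub
    ```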

    There are also too many spreadsheets that rely on Excel’s laughably weak Data Validation functionality. Data validation doesn’t process pasted values, which makes it unreliable in any multitasking environment, such as any version of Windows since Windows 1.x. SheetChange and SheetCalculate handlers, and/or validation formulas in ancillary cells or as defined names, are necessary for robust data validation. Some models I’ve seen do use macro-driven final validation; fewer use robust interactive validation.
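    For instance (my illustration, not Harlan’s), an ancillary-cell check recalculates however the value arrived, typed or pasted, so it catches what Data Validation misses:

    ```
    =IF(OR(B2="", AND(ISNUMBER(B2), B2>0)), "", "B2 must be a positive number")
    ```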

    Finally, there’s overuse of ISERROR. The proper way to test that a cell contains a positive value isn’t X99>0, because that’s true if the nice user enters an apostrophe or a space char (or any other string) in X99 (unless you want to beg for trouble by using 123 transition formula evaluation); it’s N(X99)>0. How many times do the rest of you see N(..) being used in production spreadsheets?
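    To spell that quirk out (my example): Excel’s comparison operators order any text above any number, so a stray string passes the naive test:

    ```
    =X99>0       TRUE even when X99 holds "abc" or a lone space
    =N(X99)>0    FALSE in those cases, since N() maps text to 0
    ```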

    I don’t know about the rest of you, but in my experience cut and paste, ineffective user entry validation and failing to allow for Excel’s, er, ability to make order comparisons between values of different types have been responsible for about as many runtime errors as off-by-one indexing and incorrect references in formulas. But I haven’t seen much effort to control for these potential problems.

  2. ross Says:

    >>There are also too many spreadsheets that rely on Excel’s laughably weak Data Validation functionality.

    Thank God someone agrees with me. Data Validation in Excel sucks! Thank you Harlan.

  3. Marcus from London Says:

    Spot on Harlan – Excel’s Data Validation does little more than provide unwary users with a false sense of security.

    N(..). Hmm, let’s see. I’m involved in maintaining a model which has over 405,000 formulas across 114 worksheets. Use of N(): zero. Plenty of A1>0 and A1=”” though.

  4. Harlan Grove Says:

    The problem with A1=”” is that it won’t trap A1 containing one or more spaces.

    I’ve seen a wide and impressive variety of inventive user entries. N(..) and TRIM(..) functions are much underused when it comes to addressing such inventiveness.
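    For example (my illustration of the point above):

    ```
    =A1=""          misses a cell holding "   " (spaces only)
    =TRIM(A1)=""    catches empty and all-space entries alike
    ```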

    I screwed up my ISERROR point. I meant to add that using ISERROR to trap expected conditions such as no values to average often leads to ignoring more serious errors. For example,


    traps no numbers in SomeRange, but it also traps error values in SomeRange, as well as corruption (#REF!) or deletion (#NAME?) of SomeRange or even, if SomeRange is defined as an intersection, #NULL!. ISERROR usually traps too much, and it certainly hides bugs early on and lets them grow into real monsters.
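    The formula itself appears to have been lost in posting; presumably it was something along these lines (my reconstruction of the pattern being criticised):

    ```
    =IF(ISERROR(AVERAGE(SomeRange)), 0, AVERAGE(SomeRange))
    ```

    A narrower guard such as =IF(COUNT(SomeRange)=0, 0, AVERAGE(SomeRange)) traps only the expected empty-range case and lets genuine errors surface.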
