Working v performance

I had an interesting discussion with a fellow dev a while ago.

I said ‘our application is working, now all we need to do is sort out the performance’

He said ‘our application is not working till the performance is sorted’

I hadn’t really thought about it like that; my priority is always to understand and model the business issues first, and worry about performance later as part of productionisation (real word?). Obviously I don’t ever ignore performance, but I do prioritise understanding initially.

My view is this: ‘slow but proven correct’ can still be useful; ‘fast but wrong’ is no use. Mind you, on its own either one may not be a useful building block. I also see going for the logic first as a bit of risk mitigation.

What are your views/approaches?

I guess it’s maybe different depending on what you are doing? (I’m doing a lot of pure performance-critical stuff at the moment.)



19 Responses to “Working v performance”

  1. MikeC Says:

    I would tend to agree with your views on this Simon.
    Maybe it’s the sort of thing I tend to build, but I look at getting the base-level function correct first, and don’t worry too much about how long it takes or how it looks during that phase. Once I have everything working and doing what it should do, then I look at increasing speed, reducing resource requirements, and all the pretty refinements that people want, such as formatting the output, blah blah blah.

    During the initial build, as long as it puts a date value in the right place, I’m not going to worry whether the cells the dates land in are formatted correctly, because that’s a “detail” rather than a “vital” – I can add the extra to make it appear as 01/05/07 rather than 39203 at any time, and I’d rather get the next functional requirement in place first.

    I look at it as building a house – you don’t hang paintings on the walls before you put the foundations in place…

  2. Rob Bruce Says:

    Of course ‘productionisation’ is a word. It’s used extensively in an area of data publishing I worked in where consultants would work through reports with clients and we would ‘productionise’ their logic to enable the reports to be published on a regular basis. This seems similar to your usage.

    I always performance tune early. That way, if an issue arises that necessitates a partial redesign, the consequences are much less severe.

  3. Biggus Dickus Says:

    I’ve heard that talk less lately as the computers and the networks get faster and faster.

    I have been known to tell the user – “So do it manually and see how long it takes…” Actually that’s a legit point in most reporting-focused apps where there is a lot of calculation going on and it just might take time.

    In an on-line situation then performance IS an issue and I do include that in my development. In the end though if the performance “sucks” that’s because there was nothing I could do about it more than I did and that’s just part of the process – sorry ’bout that ;-).

    (do I have a bad attitude or what ….)

  4. Ross Says:

    I think they’re lucky if it works!

    No, sure, sometimes performance is part of the development – I’m working on a project at the mo, trying models in Excel to test the logic, but will need to build the full thing in VS. Other times “performance” is nice but not really worth the investment.
    Performance can be defined in lots of ways of course, not just speed: bulletproof, usable, nice-looking etc. It’s just one thing that has to go into the overall pot; the bottom line is ROI.

  5. MikeC Says:

    Dick – you’re my inspiration… I wanna be just like you when I grow up…

  6. Jon Peltier Says:

    I generally break my apps into small pieces, mostly to make it easy to track what I’m doing. As I work on each piece, the first effort is towards getting the answer correct, since that’s assumed. The performance comes in as I and the client test the intermediate project.

    This approach assures correctness, and it also assures that performance issues are not left until last.

  7. Lord Says:

    Working first, speed second, and elegance third, but you do need rapid feedback so that it doesn’t entail a redesign later. Something simple that works fast can be modified to provide enhanced functionality, but trying to provide full functionality before getting anything working and fast is a mistake.

  8. Charles Says:

    My method is to design for performance first, then code/test so that it works, then do remedial performance fixing if required (which usually will not be required!).

    If you don’t design for performance up-front it’s usually too late for anything other than relatively small improvements after you have a working version, although if you have a working version with dog performance I suppose it could be useful for regression testing the redesign you ought to do.

  9. Steve Hammond Says:

    I look at design … not like building a house or a blueprint (foundation first and the rest comes later) but rather like coming up with a concept and seeing the rules and practices put in place. The rules must have definition that crosses between platforms, which can take more time, but if the client does not have the time, finances, or existing structure in place, then I see it as an indication of what type of application they are truly looking for.

    Even when the concept is being put together I like to keep performance in mind but simplicity of the concept has more clout. One reason for this is that I find it’s not the current design that bogs performance as much as the additions that come later.

  10. Harlan Grove Says:

    Performance and Excel?

    If calculation speed is critical, don’t use spreadsheets.

  11. sam Says:


    I agree … beyond 30k rows and 50 columns even the faster versions of SUMPRODUCT simply are not fast enough … in fact the speed is terrible …

    But “don’t use spreadsheets” … hmm … well, there is nothing else around that is as simple to set up … and slowly but surely the thing that has improved in Excel since the days of Excel 5 is calculation stability and calc speed …


  12. Harlan Grove Says:

    Simplicity and speed are usually a trade-off. I have no idea how Excel handles memory fetching internally, but I do know that on modern CPUs numerical performance is hugely improved by prefetching arrays into the CPU cache. I also know that when you roll your own calculations you can control prefetching, but there’s no way for anyone outside of Redmond to control what Excel does at that low a level.

    Even with Excel there are rules-of-thumb. Avoid volatile functions as much as possible. I have an ongoing gripe with a rather large model (>20MB) that contains about 20K cells with volatile function calls – INDIRECT and OFFSET mostly. Then there are the other cells that refer to those cells. The thing takes the better part of a minute to recalc, so there are event handlers to switch recalc to manual and handle recalculation upon specific triggers. The supposed benefits of using Excel for the ease of its formulas are very much offset by the clumsiness of event handler coding and coordination.

    OFFSET is never necessary. INDIRECT is only necessary for certain forms of 3D referencing. May I suggest that avoiding OFFSET entirely and minimizing use of INDIRECT should be INITIAL implementation guidelines? And that any NOW() or TODAY() calls be replaced by defined names updated by Open, BeforeSave and BeforePrint event handlers?
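
    For instance, the usual dynamic range name can be written either way (the sheet and column here are illustrative, not from any particular model):

        Volatile:      =OFFSET(Sheet1!$A$1,0,0,COUNTA(Sheet1!$A:$A),1)
        Non-volatile:  =Sheet1!$A$1:INDEX(Sheet1!$A:$A,COUNTA(Sheet1!$A:$A))

    Both resolve to the same reference, but the OFFSET version recalculates after every change anywhere in the workbook, while the INDEX version recalculates only when its precedents change.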

  13. Biggus Dickus Says:


    I’m afraid OFFSET is my favourite function in Excel (I’m serious for a change). OFFSET allows me to create calendarized reports driven by user selection from combo boxes listing the months of the year. This is a VERY common need and OFFSET is designed for just that.

    Could you please explain why “OFFSET is never necessary”?

  14. Harlan Grove Says:


    An OFFSET reference never returns anything different than the equivalent INDEX reference.

    The former is volatile, the latter isn’t. Makes you wonder whether the former has to be volatile.

  15. Harlan Grove Says:

    OK, one amendment. OFFSET can refer to ranges outside its first argument’s range, INDEX can’t.

  16. sam Says:

    I ditched OFFSET when I read that it is volatile and INDEX is not …

    I now define dynamic names using INDEX/COUNTA rather than OFFSET/COUNTA and that helps.

    I definitely avoid =ROW(), COLUMN(), NOW() and TODAY() wherever possible.

    I also switched to =SUMPRODUCT(N(Condition1),N(Condition2),…,(ArrayToSum)) after I learnt that this is faster than =SUMPRODUCT(Condition1*Condition2*…,ArrayToSum) – tip from Bob Umlas …
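
    To make that concrete (ranges and criteria invented purely for illustration):

        Slower:  =SUMPRODUCT((A2:A10000="East")*(B2:B10000=2007),C2:C10000)
        Faster:  =SUMPRODUCT(N(A2:A10000="East"),N(B2:B10000=2007),C2:C10000)

    The N() form passes each condition as a separate numeric array instead of multiplying the boolean arrays together first.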

  17. Dick Says:

    I forgot where I first read this, but it is now my “coding motto”:
    1) Make it work
    2) Make it elegant
    3) Make it fast

  18. Phil Bewig Says:

    Every spreadsheet has bugs. Every large spreadsheet has many bugs. If you fix the ten worst bugs in the system, it will still have ten more “worst” bugs that need to be fixed.

    If you could take your favorite large spreadsheet, and magically either fix the ten current “worst” bugs, or by some alternate magic make it ten times faster, which would you choose?

    “Too slow” is just as much an error as “wrong answer”.

  19. Patrick O'Beirne Says:

    Harlan asks why OFFSET has to be volatile.
    I’d guess it’s so that the reference returned by OFFSET is re-interpreted before the rest of the expression is calculated, something like INDIRECT.
    It’s what allows OFFSET to be used to get a sum ending at the cell above the formula.
    The B67 inside OFFSET is not seen as a circular reference, where in INDEX it would be.
    Although for that specific case, INDEX can be used too (thanks to DailyDoseOfExcel and Dave McRitchie’s mvps pages).
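
    For instance (cell addresses here are illustrative, not the original example), a total entered in B67 that sums everything above itself can be written as:

        =SUM(B1:OFFSET(B67,-1,0))

    and a non-volatile INDEX form for the same case, which avoids referring to B67 at all:

        =SUM(B1:INDEX(B1:B66,ROWS(B1:B66)))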
