Archive for August, 2007

Excel versions – a brief description

Wednesday, 8th August, 2007

Here is my take on the various versions; add your own views and/or disagree as appropriate.

Excel 97

Big upgrade from 95: 16k to 64k rows, dual file format (remember them?), and VBA now in the VBA IDE, not just ‘sheets’. COM add-ins via xla/xll wrapper only. A few things from more recent VBA don’t work in 97 – like non-modal forms. Plenty of people have stopped supporting this version now. Codematic still does, not that anyone asks (not in the last 12 months anyway). Many of us are still trading off the knowledge we gained working with this version (Biggus?), mainly because most changes since have been refinements rather than massive, fundamental changes. And because actually not that many new developers are rushing in – see Dick here.

Excel 2000

VBA upgraded from the VB4/5 (?) engine to VB6 – a big improvement. COM add-ins for commands; worksheet functions still via xla/xll wrapper. A bit unstable. A very common upgrade (possibly from 95?) and still standard in many large orgs.

Excel 2002 (xp)

Like 2k but more stable, with improved error-checking features. Automation add-ins now possible, callable via COM directly from worksheet functions (performance uninspiring). Not enough improvements for many orgs to upgrade from 2k, but a few skipped up from 95/97.

Excel 2003

Like 2k again but more stable (rock solid in my experience – you can crash it, but it’s generally pretty tough), with an easy-on-the-eye graduated UI. Many large orgs have upgraded to this in the last 2 years. A few tidy-up changes from 2002, but probably not enough to justify a migration for many orgs. XML beginning to be worth using, allowing server access without instantiating Excel (server-side Excel not being recommended by MS). Enterprise management features now working pretty well (policy stuff). This is my preferred version – a good vintage, you might say.

Excel 2007

I haven’t used this for fee-paying work so can’t really comment in depth. Initial reactions were disappointment, the crazy UI of course being the biggest; although there seemed to be lots of great new features, I couldn’t find them within my frustration threshold. Maybe I classify as too busy/impatient/easily frustrated to migrate. The other big disappointment was the lack of .net integration. I really thought the framework would have been distributed as part of the O2007 install, which would have really opened up the market for third-party .net add-ins. And might have helped me recoup my investment in learning C#. (Bitter? Me?) I kinda expected performant .net UDFs (I haven’t tested whether they are any better than in 2003 – they might be) as well as realistic .net component deployment. XLM is still going strong though – well, it’s still there, just. I hope we get a usable .net story before XLM is retired.

Excel 2007 uptake forecast

I can’t see many large orgs that recently went to 2003 moving to 2007 in the next 2 years (that’s a 3-4 year lag). Those left on 97 are already unsupported, so unlikely to migrate to 2007. Many large orgs only went to 2000 2-4 years ago (i.e. a 3-5 year lag); they won’t be rushing to 2007. Most personal users will get 2007 by default with new PCs. Smaller orgs may not be in a position to benefit from the powerful integration features in O2007, or may not have the server infrastructure. So it looks to me like the market will be medium-sized orgs, maybe those large orgs with tired 2k installs, plus small orgs and personal buyers of new PCs. What do you reckon? Anyone got firm data either way? (If not, just make it up like I have!)

The improved integration with SharePoint and other server stuff may well be the source of pressure to migrate to 2007. This server-side influence is a bit of a new thing, so I’m really (really, really) interested to see the impact. This, I think, will tell us a lot about our market’s future.

I’m looking forward to seeing how things pan out over the next 12 months; as far as I am aware, MS are pleased with 2007 uptake so far.

What were the key features (good and bad) for you in each version? (Don’t bitch about 2007 too much – I have another post about that coming up soon ;-))

Cheers

Simon

Excel version usage

Tuesday, 7th August, 2007

Saw this recently. It’s a common question: how many people are using the various versions of Excel?

Some software vendor kindly gave details from installs of their add-in here:

http://discuss.joelonsoftware.com/default.asp?biz.5.517173.9

Martin Green (who spoke at last year’s UK Excel User conference) ran a survey here:

http://www.fontstuff.com/comment/comment05.htm

Neither considers 2007 (who does?) – oh, that’s harsh ;-)

Summary if you are too busy to click the links:

  • 97 – 5%
  • 2k – 15%
  • XP – 20%
  • 2k3 – 60%

Both sources said they might be skewed towards more tech-savvy users.

The numbers seem reasonable to me – what do you think?

cheers

Simon

Defcon

Monday, 6th August, 2007

Anyone else following this?

Defcon is a security conference (/hacker convention) where some of the brightest minds from both sides of the security industry get together to see what’s new (3-5,000 of them! Registration is cash only – excellent).

Their wireless network is described as the most hostile environment in the world.

I’ve already started saving to go next year. Flights are 300 quid cheaper than to Seattle (which has increased massively – 500 GBP in March, 800 in September!). I’ll take an old (i.e. disposable) laptop, I think.

Anyway, the big news this year is that some reporter tried to infiltrate the conference in breach of their privacy rules. She was apparently using a hidden camera to try to get something newsworthy. She got hounded out, but the general view is that that is the least of her troubles – think credit rating, identity theft etc. etc. Press are allowed, but have to comply with some seemingly reasonable rules, as the event attracts undercover Feds as well as some bad lads.

Here is a link

Motto: be careful who you mess with!

Oops – looks like her LinkedIn profile has gone already.

cheers

Simon

Dead machine part 2

Sunday, 5th August, 2007

[Just an FYI – don’t let this post stop the debate on tech books]

Careless careless careless.

http://forums.zonelabs.com/zonelabs/board/message?board.id=MalwareDiscussion&message.id=2466

Why oh why didn’t I google it first? Too complacent, I guess (and it was getting late).

Seems like Zone Alarm killed my machine. That’s Zone Alarm *ANTI* Virus.

They recommended I delete my video driver! Thanks, guys!

And I thought Symantec was bad for taking up all my system resources – I don’t remember them telling me to kill my machine.

I’m happy to accept a proportion of the blame for being so naive and dumbly doing what they advised, but it still seems a bit poor from their side.

On a lighter note, one of the guys on that forum followed the ZA advice and killed his machine, realised what he had done, restored it, re-ran the scan, then in a ‘Homer Simpsonesque moment’ thought he was attacked by the same trojan again and obediently killed his box again!

Looks like I’m gonna be changing AV vendor sooner rather than later!

I ran Windows in Safe Mode (F8 on start-up), re-installed the video drivers and rebooted normally. Everything is now working fine. I’m in the process of updating my virus defs before I get caught out like the bloke above!

So from 2 broken machines, I now have 2 working ones. Cewl. I haven’t done any fee-paying work, but now I have the machines to do it on. I actually went out yesterday to buy a new machine, but luckily (as it turns out) I couldn’t bring myself to pay so much more than internet price for so much less power, just to bring it home the same day.

And yes, I guess this one wasn’t really MS/Windows’ fault either. In fact I think the AV community has kicked up a stink because MS specifically tried to prevent this type of AV-driven blunder from killing Vista.

Lessons learnt: be even more sceptical, and check properly – don’t delete .infs willy-nilly (or was it .oem?).

I normally refresh my defs and immediately run a scan; in future I’m thinking I might update, wait 24 hours, check Google, then run the scan. Anyone think that makes sense, or already doing that?

Cheers

Simon

Tech books

Saturday, 4th August, 2007

I’ve got lots of tech books, and have even read many of them. I’m beginning to spot a trend – I wonder if anyone else has noticed? Or if I am wrong? Or why?

Most recent tech books in the Office development space have been written by Microsoft staffers. Why is that?

In the olden days we had JW, the Dummies and Bible series, PED etc., and for sure most of these are getting updated for 2007. But nearly all the .net and VSTO stuff is from MS staffers. That seems odd. Neutral industry experts have served us well over the years (IMO), so why has it changed?

Not convinced something happened?

VSTO-Mere-Mortals – Paul works for MS as does Kathleen

Visual-Studio-Tools-Office – Both Erics work for MS

MS-Net-Development-MS-Office – Andrew works for MS

Professional-Excel-Services – Shahar – MS right?

Beginning-Excel-Services – by MS’s lead developers of Excel Services.

(BTW, I think I might try to get into Excel Services – tip for the future: I think there is a living to be made here) (from 2010, probably!)

See the pattern?

I’m not saying all books in this space are by MS staff, but a large proportion seem to be.

I can’t think of one VBA-based book (ack too! [spits in disdain]) that was written by an MS staffer. Certainly the classics are either from Baarns, JW or part of the PED team.

Many of the pure .net books are written by non-MS’ers; it seems to be just the .net/Office space.

I wonder if this stuff is sooo bleeding edge that no one in industry is really using it to the extent they could write a useful book? Or using it at all?

Or have MS simply started encouraging staffers to write?

All the non-MS book authors always got good access to the product teams to ensure the books were out around RTM time, so how come it suddenly went in-house?

The thing that worries me about this trend is how real-world applicable these books are if they are not based on varied, ‘in the trenches’ experience. One of the great strengths of the PED guys is you know that’s how they make their living.

If the MS books are based on the same use cases that drove the product development, it may give a false impression of product coverage. The gaps (in the product or the book) could well bite you way down the line, when you are already committed.

I remember an ASP project years ago where some books were so superficial we really got caught out (a gaping security blunder), while other books said ‘this is the way they say to do it, here’s why you should never do that, and here’s what to do instead’. We preferred the latter – big time!

I have read several of the books above and have no complaints at all. But then I have never delivered a commercial product based on Excel and .net to a fee-paying customer, so I would suggest I don’t know enough to have a valid opinion. (Actually, thinking about it, I have delivered a .net, Excel and Access app commercially, but I still don’t feel qualified.)

What do you think? Let me know if I got any authors wrong. Is this trend happening in all tech books?

Cheers

Simon

XLA design 2

Friday, 3rd August, 2007

So in the last post we basically got to:

  1. Don’t do temp work in the XLA (even though it almost certainly will not get saved (even if we try, according to Ross!)).
  2. Do the work in a temp workbook that is not hidden, so it appears when ScreenUpdating gets turned back on – see the sketch below. (Does anyone else find that ScreenUpdating sometimes doesn’t come back automatically?)
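
For what it’s worth, here’s a minimal sketch of option 2 (the sub name, sheet name and contents are invented for illustration):

    'Minimal sketch of option 2: build in a fresh, visible workbook,
    'not in the XLA (names and contents made up)
    Sub BuildReport()
        Dim wbTemp As Workbook
        On Error GoTo errHandler
        Application.ScreenUpdating = False
        Set wbTemp = Workbooks.Add          'appears when updating resumes
        With wbTemp.Worksheets(1)
            .Name = "Output"
            .Range("A1").Value = "whatever the add-in builds"
        End With
    cleanUp:
        Application.ScreenUpdating = True   'restore explicitly, belt and braces
        Exit Sub
    errHandler:
        'dump the half-built workbook rather than leave it in the user's face
        If Not wbTemp Is Nothing Then wbTemp.Close SaveChanges:=False
        Resume cleanUp
    End Sub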

Now I’m not going to disagree with that but…

If things do go wrong the user is going to get left with an inconsistent (but visible) workbook sitting in their environment that they may not know what to do with.

If it’s in the XLA and it fails, it will do so quietly, or maybe with a polite warning. The user may be able to re-run it and everything works, or they restart Excel and re-open the add-in and everything should be sweet (in theory!). That seems potentially a better option to me, especially as you can now grab the restart to fix things up.

I’m thinking of some transitory problem like the network bouncing, or a resource temporarily locked.

The other issue I have is messing with the user’s Excel environment. I like the PED approach of dropping an xla in XLSTART to fix things up after a crash, but I must confess I have never done that in practice. Have any of you? Has anyone ever copied the .xlb and restored that on failure/recovery? Finding it (or them) is hard, I know!
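
I haven’t done it either, but as I understand it the idea is roughly this – a tiny xla sitting in XLSTART whose only job is to put the environment back (just a sketch of my understanding, not PED’s actual code):

    'In ThisWorkbook of a small recovery xla in XLSTART. Runs at every
    'Excel start and restores anything the main add-in may have left
    'broken after a crash (restore as much as we can, ignore failures)
    Private Sub Workbook_Open()
        On Error Resume Next
        With Application
            .ScreenUpdating = True
            .EnableEvents = True
            .DisplayAlerts = True
            .StatusBar = False
            .Calculation = xlCalculationAutomatic
        End With
        'could also check for a 'crashed' flag file left by the main
        'add-in, and restore a backed-up .xlb here
    End Sub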

The other potential option I keep coming back to (thinking about) is automating another Excel instance, doing all the background stuff in there and then bringing it back to the user’s current environment if it succeeds. Of course, if that goes bang they are going to end up with loads of invisible Excels dragging their system down. I’ve done this with VB/.net apps, and managing those zombie Excels can be a pain. Has anyone done this from an XLA? Maybe it’s inappropriate if you are generally working on the active workbook?
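
Something like this is what I mean (a sketch only – I haven’t tried it from an XLA, and the names and temp path are made up):

    'Do the heavy lifting in a second, invisible Excel instance and only
    'bring the result back if it all worked (names/path invented)
    Sub BuildInBackgroundInstance()
        Dim xlBack As Excel.Application
        Dim sFile As String
        On Error GoTo errHandler
        sFile = Environ$("TEMP") & "\result.xls"
        Set xlBack = New Excel.Application   'invisible by default
        xlBack.DisplayAlerts = False         'no prompts we can't see
        With xlBack.Workbooks.Add
            .Worksheets(1).Range("A1").Value = "built out of sight"
            .SaveAs Filename:=sFile
            .Close SaveChanges:=False
        End With
        xlBack.Quit
        Set xlBack = Nothing
        Workbooks.Open sFile                 'back in the user's instance
        Exit Sub
    errHandler:
        'quit the background instance or it becomes one of those zombies
        If Not xlBack Is Nothing Then xlBack.Quit
        Set xlBack = Nothing
    End Sub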

I’m really trying to come up with a rock-solid add-in approach that isn’t going to give the users any frights or surprises. It’s not enough to be careful: add-ins do crash, and when they do they usually take the whole Excel instance with them. Well, actually that’s not quite true – COM add-ins tend to fail on their own and don’t bring Excel down.

I’m beginning to think there is no clear winner. It seems there are pros and cons to both the ‘hide-it-till-it’s-ready’ approach and the ‘show-all-workings’ approach. Do you think target user ability might be a factor, as well as the add-in’s job?

As a matter of interest, if you are not using the worksheets in your xla for anything much, why don’t you use a COM add-in instead? Is it deployment/tool access (VB6 or Dev Office), or other factors?

Cheers

Simon

XLA design

Thursday, 2nd August, 2007

I have been revisiting the design of some XLAs I wrote a while ago.

What I have tended to do is populate the worksheet part of the XLA and at the last minute copy the worksheets out to a new workbook.

I actually think it may be better to create the new workbook early on, populate its worksheets directly and at the last minute unhide it.
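
To make the difference concrete, here is a stripped-down sketch of each (sheet names and contents invented):

    'Approach 1: build in the XLA's own sheets, copy out at the end
    '("Staging" is a made-up sheet name)
    Sub BuildThenCopyOut()
        With ThisWorkbook.Worksheets("Staging")
            .Range("A1").Value = "result"
            .Copy                            'no args = copy to a new workbook
        End With
    End Sub

    'Approach 2: create the target workbook up front, keep it hidden,
    'populate it directly, and unhide it at the last minute
    Sub BuildInHiddenTarget()
        Dim wbNew As Workbook
        Set wbNew = Workbooks.Add
        wbNew.Windows(1).Visible = False     'hide while we work
        wbNew.Worksheets(1).Range("A1").Value = "result"
        wbNew.Windows(1).Visible = True      'the last-minute unhide
    End Sub

The second version never writes to the XLA itself, which is the point.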

Which approach do you tend to follow?

The reason I think the second may be better is that it stops the add-in being repeatedly written to and possibly ending up in a corrupt state should things go wrong. Not that any add-in I ever wrote crashed Excel, no sir-ee, Sir! ;-) ;-)

It also occurs to me that the second approach will make the XLA easier to migrate to VB6 + .xlt, which is something I have begun to do.

Do you have a general preference? Or a better way?

Cheers

Simon

Possibly rescued machine

Thursday, 2nd August, 2007

I have been messing with the old box that always blue-screened in Windows but seemed to run OK with Ubuntu live. First off, it did eventually die running the live CD, so it wasn’t just a Windows problem – in fact it wasn’t a Windows problem at all (assuming it keeps running whilst I type this, of course).

The Ubuntu CD came with a ‘check memory’ option. Now, I thought PCs did this on start-up anyway, but if this box was doing a memory check it wasn’t doing it very well: the Linux tests highlighted lots of memory failures. I took out all the sticks of RAM except one (the biggest), and lo, all is well (so far).

As a matter of interest, all the Windows blue screens were ‘to protect my PC’, and reported a different failing file each time.

So: sorry to Microsoft and Windows for incorrectly pointing the finger of blame, WTF to PC boot-up ‘memory checks’, and well done to Linux for having some easy-to-find, proper memory-checking utils.

I still have the broken lapper to deal with, but I was thinking I should probably do some real work at some point.

cheers

Simon

Another dead box

Wednesday, 1st August, 2007

I managed to kill my Windows internet machine yesterday. That keeps my average at 1-2 dead machines per year. I’ll hold my hands up and admit it’s quite often my own fault.

I’m not keen to take the blame for yesterday’s adventures though. I virus-scanned the machine (with Zone Alarm, which is rapidly losing its status as my fave AV); it claimed one .dropper virus and quarantined a file. It also said I should delete the file, so I did – doh!

I won’t be so trusting next time!

I know this is a software thing and not hardware because it’s a dual boot, and it runs fine in Linux.

I have another machine that died last year; I have just left it gathering dust because it really looked like a hardware fault. Today it dawned on me that if it runs a live Linux CD then it can’t be a hardware problem, and is probably another broken Windows installation.

So here I am in Firefox on Ubuntu Edgy Eft, no blue screens in sight, which leads me to the inescapable conclusion that I have 2 broken Windows installations, not 1.

So, hot tip: use a live CD as a quick check of hardware health.

And a question: what’s your preferred antivirus?

(And/or what’s your preferred internet PC (and/or development PC) setup?)

Cheers

Simon