Wednesday, March 11th, 2009 05:25 pm

Disney and Scholastic share a Software Hall of Shame raspberry today.

For what?  Disney Magic Artist Deluxe and Scholastic I Spy Fantasy.

Why?

Because they're both children's games — young children, as in Wendy's is giving I Spy Fantasy away in Kids' Meals — that require Administrator privileges to run.

FAIL.

Don't these people ever think before they write code?
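
The usual culprit, as far as I can tell, is a game that insists on writing its saves and settings into its own install directory under Program Files. Here's a minimal sketch of the fix in Python, assuming that's all the Administrator requirement is for (the function and file names here are invented for illustration): write per-user data somewhere a plain user can actually write.

```python
import os
from pathlib import Path

def save_dir(app_name: str) -> Path:
    """Per-user, non-elevated location for saves and settings."""
    # %APPDATA% is writable by an ordinary user; fall back to the home
    # directory when not on Windows.
    base = os.environ.get("APPDATA") or Path.home()
    path = Path(base) / app_name
    path.mkdir(parents=True, exist_ok=True)
    return path

if __name__ == "__main__":
    d = save_dir("ISpyFantasy")          # name invented for illustration
    (d / "progress.sav").write_bytes(b"\x00")
    print("save data went to", d)
```

Do that, and the game runs fine without elevation.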

Thursday, March 12th, 2009 11:06 am (UTC)
I still disagree and I still don't think we're talking at cross purposes.

In my example with the botched SQL update, the client knew the job had to be done, but didn't believe the operation was critical. It was a routine operation with a low chance of exploding. The client didn't understand the risks they were facing and didn't understand the consequences if those risks came to pass (mostly, "we run around and scream wildly"). They were willing to pay a low rate for a couple of hours of work because they were satisfied the job didn't warrant more than that.

As you say, "I check everything under my control." It's a great policy and I agree with it. I just emphatically disagree that "experience/training makes those kinds of checks incidental." Experience and training can reduce the time you have to spend on this overhead while still maintaining your level of professional diligence, but except on trivially small projects I don't see how it can ever be reduced to the point where it could be called incidental.
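
To make concrete the kind of check I mean, here's a minimal sketch in Python against an invented sqlite3 schema: run the update inside a transaction and verify the affected row count before committing, so a botched WHERE clause fails in a controlled way instead of silently mangling data.

```python
import sqlite3

# Invented schema, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100.0), (2, 50.0)])
conn.commit()

EXPECTED_ROWS = 1  # how many rows this particular update should touch

cur = conn.execute("UPDATE accounts SET balance = balance - 10 WHERE id = ?", (1,))
if cur.rowcount == EXPECTED_ROWS:
    conn.commit()
else:
    # Controlled failure: nothing half-applied, and somebody finds out.
    conn.rollback()
    raise RuntimeError(f"expected {EXPECTED_ROWS} rows, touched {cur.rowcount}")
```

It's a few lines, but it's a few lines on every single operation, and somebody has to decide what EXPECTED_ROWS should be. That's the overhead I'm talking about.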

I share the spirit of your first law. It's not my first law, but it's pretty high on the list. I usually phrase it as "no crisis ever came from a controlled failure." Software failure is not necessarily a bad thing. There was a plane crash recently near Amsterdam where a Boeing autopilot throttled back the engines on approach when the radio altimeter reading dropped abruptly from about 2,000 ft above ground level to -8 ft AGL. If the autopilot software had assumed the altimeter was capable of being batshit insane from time to time and reacted appropriately, the disaster would probably have been avoided. The altimeter's failure was not the source of the crisis; the autopilot's inability to control that failure was.
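
To sketch what "reacting appropriately" might look like, here's a toy Python version with an invented plausibility threshold, not anything from the actual autopilot: treat an implausible jump as a sensor fault, hold the last good value, and raise a flag instead of acting on the bad reading.

```python
MAX_PLAUSIBLE_DELTA_FT = 500.0  # invented per-sample limit, not a real spec

def filter_altitude(last_good: float, reading: float) -> tuple[float, bool]:
    """Return (value to act on, fault flag)."""
    if abs(reading - last_good) > MAX_PLAUSIBLE_DELTA_FT:
        # Controlled failure: hold the last sane value and flag the fault
        # rather than flying the airplane on a number we don't believe.
        return last_good, True
    return reading, False

altitude = 2000.0
for raw in (1980.0, 1960.0, -8.0, 1940.0):
    altitude, fault = filter_altitude(altitude, raw)
    print(f"{raw:8.1f} -> {altitude:8.1f}  {'FAULT' if fault else 'ok'}")
```

The -8 reading gets rejected and flagged; the system keeps operating on the last sane value while screaming for attention. That's a controlled failure.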