unixronin: Galen the technomage, from Babylon 5: Crusade (Default)
Unixronin
Monday, May 5th, 2008 07:43 pm

There are a lot of very sharp, bright people out there.

Unfortunate corollary¹:  There are a lot of depressingly stupid people out there.

[1]  Thanks to the law of averages.

Thursday, May 8th, 2008 12:53 am (UTC)
Whereas I'm thinking of two areas:
1) Where hazard or mistake can occur due to large quantities of knowledge being needed to handle complexity.
2) Where the result of someone else's folly is your pain.

In the first, as I attempted to say with the computer example above, systems tend to grow to the point where even average or reasonable knowledge is no longer sufficient to keep them under strict control, and thus "foolproofing", which an expert might consider a downgrade of the experience, becomes necessary.

In the second, well, I can't think of a good example at the moment. I have one specific to work that caused me to go off on this tangent last night, but it's not something for this venue, and I'm not certain it's truly fitting for the topic. *shrug*

Regardless, I agree that there are many instances in life where this society's attempts to make things idiot-proof, or to post specific warnings against unreasonable acts, have reached bottom and begun to dig. On the other hand, there are instances where not trying to disambiguate every erroneous possibility, or to prevent every error from happening, is contraindicated, since the errors can have consequences that outweigh the work put in.
Thursday, May 8th, 2008 01:34 am (UTC)
Ah, I see. You're considering both inadvertent user error due to inadequate understanding, and conscious acts resulting from complete failure to think through one's actions. When you say "the result of someone else's folly is your pain", I suspect it would be true to say that this is not so much a case of direct stupidity as of negligence on the other person's part. (Granted, that negligence may be stupid in its magnitude.)

There are situations and applications where a "zero defects" standard is critical. They are surprisingly rare. There are very few situations in which a potential defect, if the possibility is anticipated, cannot be provided for via a safety device, an adequate engineering margin of safety, a failover device, or some other technological precaution. The deadly defects, in this case, are the unanticipated ones — perhaps because a system is too complex to fully predict its behavior in edge cases, perhaps because the properties of a material or device are not fully understood. (For example, the extensive use of Kapton as lightweight insulation in aerospace applications, based on its extremely good resistance to high temperatures — except that nobody knew that above a certain critical temperature it becomes a conductor, since, having never suspected the possibility, no one ever thought to test for it.) The problem is that, by definition, unanticipated failure modes cannot be predicted, and so it's impossible to guarantee that you're prepared for them all. (Example: United Airlines flight 232. The DC-10 had three completely separate, fully redundant hydraulic control systems. McDonnell Douglas apparently never considered the possibility that a catastrophic failure of the tail engine could simultaneously disable all three.)

The thing is, these cases really resolve not to simple stupidity, but to failure to manage (and/or perhaps to fully grasp) overwhelming complexity.


[I think this is the sort of thing you mean. I'm not 100% certain.]
Friday, May 9th, 2008 02:23 am (UTC)
I'm doing wonderfully well at expressing myself of late; this thread is seeming proof in itself.

You are correct: I'm including mistakes of inadequate knowledge, intentional acts, and careless error.

I'm thinking not of situations where zero tolerance is the necessary criterion, but instead of areas where the lack of foolproofing causes frustration. The examples I can come up with lead mostly to annoyance, not to pain. For instance, I dealt today with what I swear was a zero-degrees-of-freedom regulation for sampling. That level of restriction, while likely leading to superior results, also leads to a wonderful threat of several million dollars in fines for the inability to meet its o'erweening criteria.

I feel like I'm writing the nonsense poetry I enjoyed as a child. "I am not certain quite / that even now I've got it right." Or clear. But I'll refrain from arguing longer. I think we're largely, if not almost entirely, in agreement that the level of foolproofing and liability-addressing has long since passed the reasonable. But I find the response that all foolproofing is an unworthy goal to be almost knee-jerk, and thus wrong.
Friday, May 9th, 2008 10:36 am (UTC)
But I find the response that all foolproofing is an unworthy goal to be almost knee-jerk, and thus wrong.
Ah, see, now I never said one shouldn't make reasonable efforts to make things foolproof. Just that when you make all reasonable efforts to make something foolproof, and someone still manages to screw up through some truly impressive feat of total boneheadedness, the correct response is not for a crowd of lawyers to run up and offer to sue people, but for everyone to ask "Well, what'd you go and do a damn fool thing like that for?" And yeah, when you have a regulation that specifies how something is supposed to be done to such a degree of detail that it becomes impossible to actually comply with the regulation in question, then it's time to scrap the regulation, do it over, and fire the nitpicking idiot who drafted it.