There are a lot of very sharp, bright people out there.
Unfortunate corollary¹: There are a lot of depressingly stupid people out there.
[1] Thanks to the law of averages.
no subject
One good measure of intelligence is how many people you can get to do your work for you. ;)
no subject
...rather than actually getting something progressive accomplished. cleaning up other people's messes is just treading water.
no subject
See also:
"Any time you think you've finally made something idiot-proof, the Universe comes up with a better idiot."
Personally, I think the safety-ninny culture in the US is breeding the human race for stupidity.
no subject
the problem with not attempting to make things foolproof when given reasonable evidence of human foolishness is then you become liable to litigation :P
skinning your knees as a toddler teaches you not to fall down.
or, well, i watched my son learn to walk and realized that throwing your hands up to catch yourself when you fall is a learned reflex, as his many early cranial scrapes attest. make the entire world soft enough and no one would learn to do that any more....
no subject
Hazard due to egregious misuse of otherwise perfectly sound products simply lends itself to some of the easiest examples, and then brings up the compounding vice of the idiot suing the manufacturer of the product for failing to both anticipate said idiot's particular brand of idiocy in advance and warn said idiot in self-luminous letters at least three inches high that — for example — "severe harm or injury may result" if you stick your head into a running wood-chipper to see where the blockage is.
no subject
1) Where hazard or mistake can occur due to large quantities of knowledge being needed to handle complexity.
2) Where the result of someone else's folly is your pain.
In the first case, as I attempted to say with the computer example above, systems tend to grow to a point where even average or reasonable knowledge is no longer sufficient for strict control. At that point "foolproofing" becomes necessary, even though an expert might consider it a downgrade of the experience.
In the second, well I can't think of a good example of the second at the moment. I have one specific to work that caused me to go on this tangent last night, but it's not something for this venue, and I'm not certain it's truly fitting for the topic. *shrug*
Regardless, I agree that there are many instances in life where this society's attempts to make things idiot-proof, or to create specific warnings against unreasonable acts, have reached bottom and begun to dig. On the other hand, there are instances where failing to disambiguate every erroneous possibility, or to prevent every error from happening, is contraindicated, since the errors can have consequences that outweigh the work required to prevent them.
no subject
There are situations and applications where a "zero defects" standard is critical. They are surprisingly rare. There are very few situations in which a potential defect, if the possibility is anticipated, cannot be provided for via a safety device, an adequate engineering margin of safety, a failover device, or some other technological precaution. The deadly defects, in this case, are the unanticipated ones — perhaps because a system is too complex to fully predict its behavior in edge cases, perhaps because the properties of a material or device are not fully understood. (For example, the extensive use of Kapton as lightweight insulation in aerospace applications, based on its known extremely good resistance to high temperatures, except that nobody knew that above a certain critical temperature it becomes a conductor, since — having never suspected the possibility — no one ever thought to test for it.) The problem is that, by definition, unanticipated failure modes cannot be predicted, and so it's impossible to guarantee that you're prepared for them all. (Example: United Airlines flight 232. The DC-10 had three completely separate, fully redundant control systems. McDonnell-Douglas apparently never considered the possibility that a catastrophic failure of the tail engine could simultaneously disable all three.)
The thing is, these cases really resolve not to simple stupidity, but to failure to manage (and/or perhaps to fully grasp) overwhelming complexity.
[I think this is the sort of thing you mean. I'm not 100% certain.]
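The flight 232 point is the classic common-mode failure: redundancy multiplies reliability only when failures are independent. A toy sketch of the arithmetic (all probabilities are made-up illustration values, not real aviation figures):

```python
# Three redundant channels, each failing independently
# on a given flight with probability p:
p = 1e-3
independent = p ** 3  # all three fail together: ~1e-9

# But if all three channels share one component (say, hydraulic
# lines routed past the same engine) that itself fails with
# probability q, the whole system fails whenever that shared
# part does, regardless of the redundancy:
q = 1e-4
common_mode = q + (1 - q) * p ** 3  # dominated by q: ~1e-4

print(f"independent: {independent:.1e}")
print(f"common mode: {common_mode:.1e}")
```

With these illustrative numbers, one shared vulnerable component makes the "triply redundant" system roughly 100,000 times more likely to fail than naive multiplication suggests.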
no subject
You are correct, I'm including mistakes of inadequate knowledge, intentional acts, and careless error.
I'm thinking not of situations where zero tolerance is the necessary criterion, but instead of areas where a lack of foolproofing causes frustration. The examples I can come up with lead mostly to annoyance, not to pain. For instance, I dealt today with what I swear was a zero-degrees-of-freedom regulation for sampling. This level of restriction, while likely producing superior results, also carries a wonderful threat of several million dollars in fines for the inability to meet its o'erweening criteria.
I feel like I'm writing the nonsense poetry I enjoyed as a child: "I am not certain quite / that even now I've got it right." Or clear. But I'll refrain from arguing longer. I think we're largely, if not almost entirely, in agreement that the level of foolproofing and liability-proofing has long since passed the reasonable. But I find the response that all foolproofing is an unworthy goal to be almost knee-jerk, and thus wrong.
no subject
Even the smartest people are stupid at times.
You're doing it wrong
You've made multiple errors:
1. You're confusing average with median. Depending on the distribution, more than half could be on either side of the average.
2. Even if it were a symmetric distribution with the median at 100, some of the population would score exactly 100, leaving fewer than half strictly on either side.
3. Graham's law states that the rate of diffusion of a gas is inversely proportional to the square root of its molecular weight.
:-P
Re: You're doing it wrong
And I think you're referencing different Grahams. :)
no subject
I should NOT be reading this when I'm attempting to get a classful of 2nd and 3rd year Medical students through a set of tests!
no subject
:)
no subject
i've known enough people like that to recognize that it happens. not that i can still logically understand why anyone would want to live breathing their own smoke but it sure seems like they enjoy it.