I mentioned not long ago that I was shopping for a new car, thanks to my GM's rust, failing transmission, and failing fuel pump. I ended up buying a used 2010 Toyota Corolla.
One of the things purchasing the vehicle did was remind me how important it is to always fully specify requirements, for hardware and software alike. You see, it simply never occurred to me to put on my checklist the requirement "Does the headlight switch work?" The answer is no! Understand that it is not a broken switch; it is designed not to work!
The car turns on the headlights anytime the engine is running and it is dark, per a light sensor and some algorithm hidden away in the Engine Control Unit. Here in cold Pennsylvania I get my car out of the garage to warm it up before venturing out on the daily commute to work, while playing Russian Roulette with the local deer population (hunters complaining there are not enough deer are not hunting around here!). Anyway, my now impossible-to-turn-off headlights shine directly into the neighbor's bedroom window as the car warms up. While I've seen far too many people driving around here with their headlights off in the dark, would not a simple LED on the instrument panel have been sufficient? A complex device like an automobile should never override the judgment of its user. Do we really need technology to control impulses and enforce good behavior? What happens when all the technology fails?
I think "Smart Cars" are about to get to smart. I once almost had a accident on Interstate 80, in a construction zone.
Somehow an overpacked car pulled out from between some construction equipment, right in front of me. The driver could not see out any window except the one directly in front of him, and he was pulling across the interstate traffic rather than merging with the flow; he could not see out the passenger window that was facing me. He shot out about twenty feet ahead of me while I was doing 45 MPH (remember, it was a construction zone).
The correct solution to the problem was to floor the gas, so that I could get in front of him while there was still space, and pull off onto the right-hand berm of the road.
Toyota's system, described in "Toyota cars to monitor driver's eyes for safety," would have guaranteed a crash in that situation by applying the brakes.
"Toyota will start building a safety system into some of its cars this year that monitors if a driver is clearly watching the road during situations when a crash may occur. The system is based around a camera that watches the driver's upper and lower eye lids to evaluate how attentive he or she is to the road ahead. It builds on a current system that measures the driver's head direction when driving. The car's safety system continuously monitors the road ahead using a radar system, and if it determines a crash may be possible, it matches this with the driver evaluation gathered from the camera. If the driver doesn't appear to be paying attention it sounds a buzzer and warning light. If things progress and a crash becomes probable then it also tries to gain attention of the driver by quickly applying and releasing the brakes. At this point the car's pre-crash brake assist system is also readied. When a crash is judged to be unavoidable the safety system engages the brakes and seat-belts for the collision."
Moving from requirements to software quality: the January 2010 issue of PragPub Magazine, from the Pragmatic Programmers, has an article you should check out: "Rediscovering QA: Three Unusual Attributes for Software Quality Assurance" by Chris McMahon.
PragPub Magazine is edited by Michael Swaine, the long-time editor of Dr. Dobb's Journal, which gives PragPub the flavor of Dr. Dobb's in its heyday.
Also this week Walter Bright wrote a piece, Patterns of Bugs, where he covers several common bug patterns he has encountered over the years. Take note of the article's comments, as they point to some interesting-sounding papers and tools. For example, one of the commenters on Walter's piece, Kennn [sic], points out Andrew Koenig's classic paper C Traps and Pitfalls, hosted at Literate Programming.
Literate Programming is one of those things that sounds great in theory but that I've personally seen fall apart in practice. In a nutshell, Literate Programming is where you write a document explaining the program, with the source code embedded in it as named chunks; tools then "tangle" the executable code out of that document and "weave" the prose into typeset documentation.
The Open Source schematic capture package gEDA was originally written using noweb (which has nothing to do with the World Wide Web). Many people wanted to contribute to the gEDA project, but few wanted to be bothered to learn this obscure format. Only after noweb was abandoned in favor of straight C code did the project start to advance significantly.
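For anyone who has never seen it, a noweb file interleaves prose with named code chunks; the notangle tool extracts compilable code and noweave typesets the documentation. A trivial made-up example (in a real .nw file the chunk markers start in column one; they are indented here only for display):

    This chunk is the program's entry point. It prints a greeting
    and exits cleanly.

    <<hello.c>>=
    #include <stdio.h>

    int main(void)
    {
        <<print the greeting>>
        return 0;
    }
    @

    <<print the greeting>>=
    printf("Hello, literate world!\n");
    @

Running notangle -Rhello.c hello.nw > hello.c yields the compilable C file, while noweave produces the matching document. You can see how a whole project written this way raises the bar for casual contributors.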
Walter makes a case for creating an Open Source repository of bugs, where people can see the mistakes others have made. Add a rating system, as in which bug happens the most, like '=' rather than '==', and we have the makings of yet another social networking site to consume our time, distract us from our goals in life, and make someone extremely wealthy. Who wants in? :-)
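For anyone who hasn't been bitten by that particular pitfall, it looks like this in C:

    #include <stdio.h>

    int main(void)
    {
        int ready = 0;

        /* Bug: '=' assigns 1 to ready, and the expression
           evaluates to 1, so this branch is ALWAYS taken. */
        if (ready = 1)
            printf("launch!\n");

        /* Intended test: '==' compares without modifying ready. */
        if (ready == 1)
            printf("launch!\n");

        return 0;
    }

Modern compilers will warn about the first form (GCC does with -Wall), which is one reason this classic has become a little less common over the years.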
Perhaps NASA's Lessons Learned site would be a good model? Contrast the space and aviation industries, which learn from their mistakes and share the knowledge, with the medical industry of hospitals and doctors, who get to bury their mistakes. Those mistakes get covered up, literally and figuratively, so no one learns from them and they are repeated over and over. For inexplicable reasons my wife's hobby is collecting medical and insurance horror stories, and there are more every day. She could make a career out of it if anyone paid for such a thing. I keep trying to get her to blog about it, but so far I can't get her interested.
I wonder if I'll void the remaining warranty by installing an old-fashioned, low-tech toggle switch...