This week (June 15th) wrapped up the 2010 Safe & Secure Systems & Software Symposium - S5 sponsored by the Air Force Research Laboratory Control Systems Development and Applications Branch (AFRL/RBCC), located at Wright-Patterson Air Force Base, purported home of Hangar 18.
The AFRL Control Systems Development and Applications Branch performs basic research, exploratory development, and advanced development of flight control system technologies for highly reliable, fault-tolerant vehicle stabilization and flight path control. It develops concepts, components, criteria, analytical methods, and design tools for flight vehicle applications, with emphasis on fault-tolerant control system architectures, control automation, and flight-critical hardware and software such as actuators, sensors, and computational elements, as well as on-board and networked vehicle management systems. The branch integrates flight control with airframe, avionics, propulsion, utilities, mission, and vehicle flight path management, and conducts the full spectrum of development and application activities, from initial concept definition through system mechanization, laboratory evaluation, and flight validation.
The AFRL/S5 event brought together industry, academia and government to collaborate on the common goal of improving the airworthiness and assurance certification process of future aerospace flight control systems with both incremental and revolutionary technological innovations in safety and security verification and validation (V&V) techniques that support maintaining cost and risk at acceptable levels.
The executive summary list for the S5 event includes:
- Improving V&V for flight critical/safety and mission/information security
- Designing for Airworthiness Certification
- Software for Complex Systems
- Fundamentals for the Future
A good paper to start with, to get the flavor of the event and to see the relevance to Embedded Systems, is Software Survivability: Where Safety and Security Converge by Karen Mercedes Goertzel of Booz Allen Hamilton.
The S5 agenda contains links to all of the presented papers (the paper titles are links at that site, though that is not obvious in all browsers).
The Goertzel paper puts particular emphasis on something that few of us give much consideration, and that all of us need to change: Threats and Attacks on our systems. Threats and Attacks "require human intention and intelligence in planning and execution", which stands in contrast to the ordinary Hazards that we do consider; the stuff of life that just happens, like lightning strikes, failed components, and lead-free tin whiskers.
Today's Adversaries are:
- Knowledgeable: They know more about our software than we do, including its vulnerabilities.
- Skilled and sophisticated: Not just "script kiddies". Attackers know how to exploit vulnerabilities, and how to augment direct attacks with social engineering and surreptitious malware (worms, Trojans, bots, spyware).
- Quick: "Zero day" is the rule, with new attacks appearing before vulnerabilities are discovered by developers, let alone patched.
- Motivated and well-resourced: Not just recreational hackers, but organized criminals, nation-state Info Warriors, Cyber Terrorists.
Our adversaries are not motivated by next quarter's bottom line, unlike our short-sighted corporate leaders. They are motivated by greed (hmm... that makes it sound like our corporate leaders are our adversaries, doesn't it?) or by the desire to do us harm. The Bad Guys usually have better resources and a single task to accomplish, unlike most Embedded System development groups, and their deadlines are not driven by trade show schedules. The Bad Guys take advantage of the mantra "There is never time to do it right now, but there will always be time to do it over someday in the future" that far too many organizations use as their design standard.
Goertzel also mentions (see Naval Sea Systems Command (NAVSEA)/Naval Ordnance Safety and Support Activity (NOSSA)) one of my pet peeves, "unnecessary functionality", as one of the "Hazards and risks that arise in software". The Creeping Feature Creature can be a powerful taskmaster that leads to increased time to market and development costs that impact the bottom line, all for features that will never be used by a customer but seem "really cool" to management and developers.
Goertzel does give some recommendations on how to improve Software Safety and Security. Alas, none of them seem practical in the corporate world due to time, budget, and size constraints.
Spend some time reading all of the other papers, to see where Safety Critical System development is headed.