Sunday, June 20, 2010

Safety and Security Considerations for Component-Based Engineering of Software-Intensive Systems; Engineering software for survivability intro.

In the document known as the As-Is State Report [sic] from the Navy Software Process Improvement Initiative (SPII), the Assistant Secretary of the Navy for Research stated in 2007 that all systems are to be considered Software Intensive unless a strong case can be made to the contrary. The Navy has been working with other branches of Government to develop plans related to Software Safety.

The Navy and an obscure corner of the Department of Homeland Security known as Build Security In, a project of the Strategic Initiatives Branch of the National Cyber Security Division (NCSD), have released a new draft [June 11, 2010] of Safety and Security Considerations for Component-Based Engineering of Software-Intensive Systems.

The Naval Sea Systems Command (NAVSEA) Composition draft is based on the earlier Department of Defense Joint Software Systems Safety Engineering Handbook, Draft Version 0.95, 2009.

Specifically, the paper discusses:

  • The types of anomalous, unsafe, and non-secure behaviors that can emerge when components interact in component-based systems;
  • Analysis and assessment techniques that can be used to predict where and how such anomalous behaviors are likely to occur;
  • Architectural engineering countermeasures that can be used by the system's developer either to prevent such behaviors or to contain and minimize their impact, thereby mitigating the risk they pose to the safe, secure operation of the system;
  • Architectural engineering techniques and tools that can be used to mitigate many emergent risks and hazards.

The referenced documents on system and software development alone make the paper worth a look.

"Because properties such as safety and security are not intrinsic to individual components in isolation-these properties emerge from the interactions between components or the interactions between a component and its environment or a human user-individual component testing and analysis (e.g., through static and dynamic analysis, fault injection, fuzzing, etc.) can only provide incomplete and indirect evidence of how the component might behave when interoperating with other components within a component-based system. Emergent properties can only be demonstrated through testing that involves component interactions, e.g., pair-wise component testing and testing of whole component assemblies."

I'm not sure I agree with the paper's premise that buffer overflows should be mitigated by 'sandboxing' areas of code that could have buffer overflows. That might end up being a really big 'sandbox' in some cases. Why not design the code to not allow buffer overflows in the first place?
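For example, a bounded copy into a fixed-size buffer costs almost nothing. A minimal sketch of my own, not code from the paper:

    #include <stdio.h>
    #include <string.h>

    #define DEST_SIZE 16  /* fixed-size destination buffer */

    /* Copy at most dest_size-1 bytes and always NUL-terminate, so the
       destination cannot overflow regardless of input length. */
    static void safe_copy(char *dest, size_t dest_size, const char *src)
    {
        if (dest == NULL || dest_size == 0 || src == NULL)
            return;
        strncpy(dest, src, dest_size - 1);
        dest[dest_size - 1] = '\0';
    }

    int main(void)
    {
        char dest[DEST_SIZE];
        safe_copy(dest, sizeof dest, "input longer than sixteen bytes");
        printf("%s\n", dest);  /* truncated, not overflowed */
        return 0;
    }

Over-long input gets truncated instead of scribbling past the end of the buffer; no sandbox required.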

Other sections I find much easier to stomach, such as:

  • Hazards and risks that arise from composition.
  • Parameter-passing issues.
  • Timing and sequencing issues: the right data at the wrong time. Firing your 50mm gun before aiming it is not good.
  • Resource conflicts: race conditions and deadlocks (a small sketch follows this list).
  • Unanticipated execution of unused/dormant code. One of my favorite pet-peeves is seeing 'dead code' in live products.
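Here is a minimal pthreads sketch of the race-condition case; a toy example of my own, not from the Draft. With the lock/unlock pair in place the final count is deterministic; remove it and the two threads race on the shared counter:

    #include <pthread.h>
    #include <stdio.h>

    /* Shared counter; without the mutex the two threads race and
       the final count is unpredictable. */
    static long counter = 0;
    static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

    static void *worker(void *arg)
    {
        (void)arg;
        for (int i = 0; i < 100000; i++) {
            pthread_mutex_lock(&lock);   /* remove this pair to see the race */
            counter++;
            pthread_mutex_unlock(&lock);
        }
        return NULL;
    }

    int main(void)
    {
        pthread_t t1, t2;
        pthread_create(&t1, NULL, worker, NULL);
        pthread_create(&t2, NULL, worker, NULL);
        pthread_join(t1, NULL);
        pthread_join(t2, NULL);
        printf("counter = %ld (expect 200000)\n", counter);
        return 0;
    }

Compile with -pthread. Deadlock is the mirror image: two components taking the same two locks in opposite orders; the usual cure is a single agreed lock ordering.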

Section five gives some guidance on FMEA, FMECA, and Hazard Analysis for Component-Based Software:

"As noted in the Joint Software Systems Safety Engineering Handbook, because software has no physical failure modes, Failure Modes and Effects Analysis (FMEA), Failure Mode, Effects, and Criticality Analysis (FMECA),61 and related analysis can be difficult to apply to software intensive systems. This said, software does have functions that can be implemented incorrectly, operate erroneously, or fail to operate at all for various reasons. For component-based software, FMEA/FMECA that strives to identify the causes and potential severity (or criticality, in FMECA parlance) of failures of software functions needs to consider not just errors within individual software components, but errors that may arise from "mismatches", such as expected but-not-received, unexpected, or incorrectly-formatted input from or output to other components."

The Draft goes on to explain ways of mitigating risks.
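One of those mitigations amounts to plain old defensive validation at each component boundary: reject expected-but-malformed input where it arrives, instead of letting a mismatch propagate downstream. A small sketch of the idea with a made-up message format; every name here is mine, not the Draft's:

    #include <stdbool.h>
    #include <stdint.h>
    #include <stdio.h>

    /* A hypothetical inter-component message; the fields are invented. */
    struct sensor_msg {
        uint8_t  version;   /* expected protocol version */
        uint16_t reading;   /* raw sensor value */
    };

    #define MSG_VERSION  1
    #define READING_MAX  4095  /* e.g., a 12-bit ADC */

    /* Validate a message at the component boundary before using it. */
    static bool msg_is_valid(const struct sensor_msg *m)
    {
        if (m == NULL) return false;
        if (m->version != MSG_VERSION) return false;
        if (m->reading > READING_MAX) return false;
        return true;
    }

    int main(void)
    {
        struct sensor_msg bad = { .version = 2, .reading = 9999 };
        if (!msg_is_valid(&bad))
            fprintf(stderr, "rejected malformed message\n");
        return 0;
    }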

Unlike most documents out of Government and Military academia, this one at least mentions Embedded Software:

"Embedded Software: Software physically incorporated into a physical system, device, or piece of equipment whose function is not purely data processing, or external but integral to that system/device/equipment. The functionality provided by embedded software is usually calculation in support of sensing, monitoring, or control of some physical element of the equipment, device, or system, e.g., mathematical calculations of distances used in calibrating the targeting mechanisms of a weapon system; interpretation and comparison of heat sensor readings in a nuclear-powered submarine engine against a set of safe temperature thresholds."

Alas, they still do not grasp the constrained resources of most Embedded Systems.
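For what it's worth, the heat-sensor comparison from that definition is easy to do in the style constrained targets force on you: fixed thresholds, no heap, nothing sized at run time. A sketch, with invented names and limits:

    #include <stdint.h>

    /* Safe temperature thresholds in tenths of a degree C;
       the values are invented for illustration. */
    #define TEMP_WARN_LIMIT  850  /* 85.0 C */
    #define TEMP_TRIP_LIMIT  950  /* 95.0 C */

    enum temp_state { TEMP_OK, TEMP_WARN, TEMP_TRIP };

    /* Compare one raw sensor reading against the thresholds. */
    static enum temp_state check_temp(int16_t tenths_c)
    {
        if (tenths_c >= TEMP_TRIP_LIMIT) return TEMP_TRIP;
        if (tenths_c >= TEMP_WARN_LIMIT) return TEMP_WARN;
        return TEMP_OK;
    }

    int main(void)
    {
        /* Stand-in samples; a real system would read an ADC register. */
        return (check_temp(870) == TEMP_WARN &&
                check_temp(400) == TEMP_OK) ? 0 : 1;
    }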

Appendix C, "Engineering software for survivability", gives a good, short synopsis of what should be considered when designing any Embedded System that needs to keep running, no matter what.
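In embedded practice, "keeps running, no matter what" usually starts with a hardware watchdog timer. A bare-bones sketch; the memory-mapped kick register, its address, and the magic value below are all hypothetical:

    #include <stdbool.h>
    #include <stdint.h>

    /* Hypothetical memory-mapped watchdog 'kick' register; the address
       and magic value are invented for illustration. */
    #define WDOG_KICK_REG   (*(volatile uint32_t *)0x40001000u)
    #define WDOG_KICK_MAGIC 0xA5A5A5A5u

    static bool do_one_unit_of_work(void)
    {
        /* Placeholder for real control/monitoring work. */
        return true;
    }

    int main(void)
    {
        for (;;) {
            /* Pet the watchdog only after the work actually completes;
               if the loop ever hangs, the hardware resets the system. */
            if (do_one_unit_of_work())
                WDOG_KICK_REG = WDOG_KICK_MAGIC;
        }
    }

Crude, but that is the mindset Appendix C is after: assume things will go wrong and make recovery automatic.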
