Software System Safety
In software engineering, Software System Safety optimizes system safety in the design, development, use, and maintenance of software systems and their integration with safety-critical hardware systems in an operational environment.
In the past, industry generally regarded increased productivity as the most important aspect of software engineering, and little consideration was given to the reliability or safety of the software product. In recent years, software and hardware have taken on the command and control of complex and costly systems upon which human lives may depend. Engineers must recognize that software is just another system component, and that this component can contain errors or defects which can cause undesired events in the hardware system it is controlling. System Safety Engineers should work with Systems Engineers and domain experts to decompose requirements and to ensure that safety-critical functionality is correctly implemented in software by Software Engineers. A software safety process following industry best practices, such as IEEE STD 1228-1994 or an equivalent standard, should be developed and adhered to, providing methods and techniques to identify potential software design inadequacies and errors which can cause hazards or produce undesired events.
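To make the point concrete, the following minimal sketch in C shows the kind of defensive command validation such a process is meant to surface in design and review. All names, limits, and the fail-safe choice are hypothetical illustrations, not taken from IEEE STD 1228-1994 or any particular system; the sketch only shows a software component refusing to pass a defective value through to safety-critical hardware.

    /* Hypothetical example: validating a commanded valve position before it
     * reaches safety-critical hardware.  An upstream software defect (for
     * example, a bad unit conversion) must not become an undesired hardware
     * event, so invalid or out-of-range commands fall back to a defined safe
     * state instead of being driven onto the actuator. */
    #include <math.h>
    #include <stdio.h>

    #define VALVE_POS_MIN   0.0   /* percent open; assumed hardware limits */
    #define VALVE_POS_MAX 100.0

    typedef enum { CMD_ACCEPTED, CMD_REJECTED_FAILSAFE } cmd_status_t;

    /* Stand-in for the low-level driver call. */
    static void valve_drive(double percent_open)
    {
        printf("driving valve to %.1f%% open\n", percent_open);
    }

    /* Reject NaN and out-of-range commands; command the assumed safe state
     * (fully closed) instead of the erroneous value. */
    static cmd_status_t valve_command(double percent_open)
    {
        if (isnan(percent_open) ||
            percent_open < VALVE_POS_MIN || percent_open > VALVE_POS_MAX) {
            valve_drive(VALVE_POS_MIN);
            return CMD_REJECTED_FAILSAFE;
        }
        valve_drive(percent_open);
        return CMD_ACCEPTED;
    }

    int main(void)
    {
        valve_command(42.0);    /* nominal command is passed through   */
        valve_command(250.0);   /* defective upstream value is trapped */
        return 0;
    }

Whether the safe state is a single predefined position, a clamp to the nearest limit, or something else is itself a hazard-analysis decision; the appropriate behavior depends on the system and must come out of the safety analysis, not the code.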
Overview
Software System Safety, an element of the total safety and software development program, cannot be allowed to function independently of the total effort. Both simple systems and highly integrated multiple systems are experiencing extraordinary growth in the use of computers and software to monitor and/or control safety-critical subsystems or functions. A software specification error, design flaw, or the lack of generic safety-critical requirements can contribute to or cause a system failure or erroneous human decision. To achieve an acceptable level of safety for software used in critical applications, Software System Safety engineering must be given primary emphasis early in the requirements definition and system conceptual design process. Safety-critical software must then receive continuous management emphasis and engineering analysis throughout the development and operational lifecycles of the system.
Goals
- Safety, consistent with mission requirements, is designed into the software in a timely, cost-effective manner.
- On complex systems involving many interactions, safety-critical functionality should be identified and thoroughly analyzed before hazards are derived and design safeguards for mitigation are developed.
- Safety-critical function lists and preliminary hazard lists should be determined proactively and should influence the requirements that will be implemented in software.
- Hazards associated with the system and its software are identified, evaluated, and eliminated, or the associated risk is reduced to an acceptable level, throughout the lifecycle.
- Reliance on administrative procedures for hazard control is minimized.
- The number and complexity of safety-critical interfaces are minimized.
- The number and complexity of safety-critical computer software components are minimized.
- Sound human engineering principles are applied to the design of the software-user interface to minimize the probability of human error.
- Failure modes, including hardware, software, human, and system failure modes, are addressed in the design of the software (see the sketch after this list).
- Sound software engineering practices and documentation are used in the development of the software.
- Safety issues are addressed as part of the software testing effort at all levels of testing.
- Software is designed for ease of maintenance and modification or enhancement.
- Software with safety-critical functionality must be thoroughly verified with objective analysis and, preferably, test evidence that all safety requirements have been met per established criteria.
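As a hedged illustration of the failure-mode goal above, the sketch below shows one common software design pattern for tolerating a single hardware failure: a two-out-of-three vote over redundant sensor channels. The channel values are invented for the example, and no particular voting scheme is mandated by the sources cited here.

    /* Hypothetical example: median ("two-out-of-three") voting over redundant
     * sensor channels, so that a single stuck or drifting sensor is out-voted
     * by the two agreeing channels. */
    #include <stdio.h>

    static double vote_2oo3(double a, double b, double c)
    {
        /* Return the median of the three readings. */
        if ((a >= b && a <= c) || (a <= b && a >= c)) return a;
        if ((b >= a && b <= c) || (b <= a && b >= c)) return b;
        return c;
    }

    int main(void)
    {
        /* Channel B has failed high; the voter still yields a usable value. */
        double pressure = vote_2oo3(101.2, 999.9, 100.8);
        printf("voted pressure: %.1f\n", pressure);   /* prints 101.2 */
        return 0;
    }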
This article incorporates text from http://www.monmouth.army.mil/cecom/safety/sys_service/software_handbook.htm, a public domain work of the United States Government.
IEEE STD 1228-1994, IEEE Standard for Software Safety Plans.