DO-178B

Software Considerations in Airborne Systems and Equipment Certification

  • Abbreviations: DO-178B (RTCA), ED-12B (EUROCAE)
  • Latest version: December 1, 1992
  • Organizations: RTCA, EUROCAE
  • Predecessor: DO-178A
  • Successor: DO-178C
  • Domain: Aviation

DO-178B, Software Considerations in Airborne Systems and Equipment Certification, is a guideline dealing with the safety of safety-critical software used in certain airborne systems. It was jointly developed by the safety-critical working group RTCA SC-167 of the Radio Technical Commission for Aeronautics (RTCA) and WG-12 of the European Organisation for Civil Aviation Equipment (EUROCAE). RTCA published the document as RTCA/DO-178B, and EUROCAE published it as ED-12B. Although technically a guideline, it was the de facto standard for developing avionics software systems until it was superseded in 2012 by DO-178C.

The Federal Aviation Administration (FAA) applies DO-178B as the document it uses for guidance to determine if the software will perform reliably in an airborne environment,[1] when specified by the Technical Standard Order (TSO) for which certification is sought. In the United States, the introduction of TSOs into the airworthiness certification process, and by extension DO-178B, is explicitly established in Title 14: Aeronautics and Space of the Code of Federal Regulations (CFR), also known as the Federal Aviation Regulations, Part 21, Subpart O.

Software level


The Software Level, also termed the Design Assurance Level (DAL) or Item Development Assurance Level (IDAL) as defined in ARP4754 (DO-178C only mentions IDAL as synonymous with Software Level[2]), is determined from the safety assessment process and hazard analysis by examining the effects of a failure condition in the system. The failure conditions are categorized by their effects on the aircraft, crew, and passengers.

  • Catastrophic – Failure may cause a crash. Error or loss of critical function required to safely fly and land aircraft.
  • Hazardous – Failure has a large negative impact on safety or performance, or reduces the ability of the crew to operate the aircraft due to physical distress or a higher workload, or causes serious or fatal injuries among the passengers. (Safety-significant)
  • Major – Failure is significant, but has a lesser impact than a Hazardous failure (for example, leads to passenger discomfort rather than injuries) or significantly increases crew workload (safety related)
  • Minor – Failure is noticeable, but has a lesser impact than a Major failure (for example, causing passenger inconvenience or a routine flight plan change)
  • No effect – Failure has no impact on safety, aircraft operation, or crew workload.

DO-178B alone is not intended to guarantee software safety. Safety attributes in the design, and those implemented as functionality, must be supported by additional mandatory system safety tasks that produce objective evidence of meeting explicit safety requirements. Typically, software safety plans per IEEE STD-1228-1994 are allocated, and software safety analysis tasks are performed in sequential steps (requirements analysis, top-level design analysis, detailed design analysis, code-level analysis, test analysis, and change analysis). These software safety tasks and artifacts are integral supporting parts of the process for determining hazard severity and the DAL, which are documented in the system safety assessment (SSA).

The certification authorities require, and DO-178B specifies, that the correct DAL (software level A through E) be established using these comprehensive analysis methods. Any software that commands, controls, or monitors safety-critical functions should receive the highest level, Level A. The software safety analyses drive the system safety assessments, which determine the DAL, which in turn drives the appropriate level of rigor in DO-178B. The system safety assessments, combined with methods such as SAE ARP4754A, determine the post-mitigation DAL and may allow a reduction in the DO-178B software level objectives to be satisfied, provided that redundancy, design safety features, or other architectural forms of hazard mitigation are captured as requirements driven by the safety analyses. The central theme of DO-178B is therefore design assurance and verification after the prerequisite safety requirements have been established.

The number of objectives to be satisfied (eventually with independence) is determined by the software level A-E. The phrase "with independence" refers to a separation of responsibilities where the objectivity of the verification and validation processes is ensured by virtue of their "independence" from the software development team. For objectives that must be satisfied with independence, the person verifying the item (such as a requirement or source code) may not be the person who authored the item and this separation must be clearly documented.[3] In some cases, an automated tool may be equivalent to independence.[4] However, the tool itself must then be qualified if it substitutes for human review.
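The independence constraint described above can be illustrated with a minimal sketch. The data model and names here are hypothetical; DO-178B prescribes the objective, not any particular record format:

```python
from dataclasses import dataclass

@dataclass
class VerificationRecord:
    item: str                    # e.g. a requirement ID or a source file
    author: str
    verifier: str
    requires_independence: bool  # True for objectives marked "with independence"

def independence_satisfied(rec: VerificationRecord) -> bool:
    """Where independence is required, the verifier must not be the author."""
    return (not rec.requires_independence) or rec.verifier != rec.author

# An author reviewing their own work fails the check; a second reviewer passes it.
bad = VerificationRecord("SRS-042", "alice", "alice", True)
ok = VerificationRecord("SRS-042", "alice", "bob", True)
print(independence_satisfied(bad), independence_satisfied(ok))  # False True
```

In practice such a check would run over the project's verification records so that missing independence is caught before the records are submitted as certification evidence.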

Level | Failure condition | Objectives[5] | With independence | Failure rate
------|-------------------|---------------|-------------------|-------------
A     | Catastrophic      | 66            | 25                | 10⁻⁹/h
B     | Hazardous         | 65            | 14                | 10⁻⁷/h
C     | Major             | 57            | 2                 | 10⁻⁵/h
D     | Minor             | 28            | 2                 | 10⁻³/h
E     | No effect         | 0             | 0                 | n/a
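The table above can be encoded as a simple lookup; the following sketch is illustrative only and is not part of the standard:

```python
# Software level table (per DO-178B Annex A), encoded as a lookup:
# failure condition -> (level, objectives, with-independence, failure rate).
DAL_TABLE = {
    "Catastrophic": ("A", 66, 25, "1e-9/h"),
    "Hazardous":    ("B", 65, 14, "1e-7/h"),
    "Major":        ("C", 57, 2,  "1e-5/h"),
    "Minor":        ("D", 28, 2,  "1e-3/h"),
    "No effect":    ("E", 0,  0,  None),
}

def software_level(failure_condition: str) -> str:
    """Map a failure-condition category to its software level (DAL)."""
    return DAL_TABLE[failure_condition][0]

def objectives(failure_condition: str) -> int:
    """Total DO-178B objectives to be satisfied at that level."""
    return DAL_TABLE[failure_condition][1]

print(software_level("Hazardous"), objectives("Hazardous"))  # B 65
```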

Processes and documents


Processes are intended to support the objectives, according to the software level (A through D; Level E was outside the purview of DO-178B). Processes are described as abstract areas of work in DO-178B, and it is up to the planners of a real project to define and document the specifics of how a process will be carried out. On a real project, the actual activities that will be done in the context of a process must be shown to support the objectives. These activities are defined by the project planners as part of the Planning process.

This objective-based nature of DO-178B allows a great deal of flexibility in regard to following different styles of software life cycle. Once an activity within a process has been defined, it is generally expected that the project respect that documented activity within its process. Furthermore, processes (and their concrete activities) must have well defined entry and exit criteria, according to DO-178B, and a project must show that it is respecting those criteria as it performs the activities in the process.

The flexible nature of DO-178B's processes and entry/exit criteria makes it difficult to implement the first time, because these aspects are abstract and there is no "base set" of activities from which to work. DO-178B was not intended to be prescriptive, and there are many possible and acceptable ways for a real project to define these aspects. This can be difficult the first time a company attempts to develop a civil avionics system under the standard, and it has created a niche market for DO-178B training and consulting.

For a generic DO-178B-based process, a visual summary is provided, including the Stages of Involvement (SOIs) defined by the FAA in its "Guidance and Job Aids for Software and Complex Electronic Hardware".

Planning


System requirements are typically input to the entire project.

The last three documents (standards) are not required for software level D.

Development


DO-178B is not intended as a software development standard; it is software assurance using a set of tasks to meet objectives and levels of rigor.

The development process output documents:

Traceability from system requirements to all source code or executable object code is typically required (depending on software level).

Typically used software development process:

Verification


Document outputs made by this process:

Analysis of all code and traceability from tests and results to all requirements is typically required (depending on software level).
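Traceability from tests back to requirements can be checked mechanically. The following is a minimal sketch under assumed data shapes (DO-178B does not mandate any particular trace format):

```python
def untraced_requirements(requirements, trace):
    """Return requirement IDs that no test case traces to.

    requirements: iterable of requirement IDs
    trace: mapping of test-case ID -> set of requirement IDs it verifies
    """
    covered = set()
    for reqs in trace.values():
        covered |= set(reqs)
    return set(requirements) - covered

# Hypothetical example: HLR-2 has no test tracing to it and would be flagged.
reqs = ["HLR-1", "HLR-2", "HLR-3"]
trace = {"TC-1": {"HLR-1"}, "TC-2": {"HLR-1", "HLR-3"}}
print(untraced_requirements(reqs, trace))  # {'HLR-2'}
```

A real project would run the same kind of check in both directions, also flagging tests (or code) that trace to no requirement.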

This process typically also involves:

Other names for tests performed in this process can be:

Configuration management


Documents maintained by the configuration management process:

This process handles problem reports, changes and related activities. The configuration management process typically provides archive and revision identification of:

  • Source code development environment
  • Other development environments (e.g., test and analysis tools)
  • Software integration tool
  • All other documents, software and hardware
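Revision identification of the items listed above is often supported by content hashing. The following sketch assumes each configuration item is available as bytes; the function and names are illustrative, not a DO-178B-prescribed mechanism:

```python
import hashlib

def baseline_id(items):
    """Derive a repeatable identifier for a configuration baseline.

    items: mapping of configuration-item name -> bytes content.
    Any change to an item name or its content yields a different identifier.
    """
    h = hashlib.sha256()
    for name in sorted(items):   # sort so the ID is order-independent
        h.update(name.encode())
        h.update(items[name])
    return h.hexdigest()[:12]
```

Identical baselines always hash to the same identifier, so a changed identifier signals that some configuration item was modified and a problem report or change record should exist for it.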

Quality assurance


Output documents from the quality assurance process:

  • Software quality assurance records (SQAR)
  • Software conformity review (SCR)
  • Software accomplishment summary (SAS)

This process performs reviews and audits to show compliance with DO-178B. The interface to the certification authority is also handled by the quality assurance process.

Certification liaison


Typically a Designated Engineering Representative (DER) reviews technical data as part of the submission to the FAA for approval.

Tools


Software can automate, assist, or otherwise support the DO-178B processes. All tools used for DO-178B development must be part of the certification process. Tools that generate embedded code are qualified as development tools, subject to the same constraints as the embedded code itself. Tools used to verify the code (simulators, test execution tools, coverage tools, reporting tools, etc.) must be qualified as verification tools, a much lighter process consisting of comprehensive black-box testing of the tool.

A third-party tool can be qualified as a verification tool, but development tools must have been developed following the DO-178 process. Companies providing these kinds of tools as COTS are subject to audits by the certification authorities, to whom they must give complete access to source code, specifications, and all certification artifacts.

Outside this scope, the output of any tool used must be manually verified by humans.

Requirements management


Requirements traceability is concerned with documenting the life of a requirement. It should be possible to trace each requirement back to its origin, and every change made to a requirement should therefore be documented in order to achieve traceability. Even the use of the requirement after the implemented features have been deployed and used should be traceable.
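Documenting the life of a requirement can be sketched as an append-only change history. This is an illustrative data model, not a format prescribed by any standard:

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str
    text: str
    origin: str                          # where the requirement came from
    history: list = field(default_factory=list)

    def revise(self, new_text: str, reason: str) -> None:
        """Record the superseded text and the reason, then apply the change."""
        self.history.append((self.text, reason))
        self.text = new_text

# Hypothetical example: a requirement revised under a change record.
r = Requirement("SRS-7", "The system shall log all faults.", "SSA hazard H-3")
r.revise("The system shall log all faults within 50 ms.", "PR-112: timing added")
print(len(r.history))  # 1
```

Because the history records both the superseded text and the rationale, a reader can trace any current requirement back through every change to its origin.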

Criticism


VDC Research notes that DO-178B has become "somewhat antiquated" in that it is not adapting well to the needs and preferences of today's engineers. In the same report, they also note that DO-178C seems well-poised to address this issue.[citation needed]

Resources

  • FAR Part 23/25 §1301/§1309
  • FAR Part 27/29
  • AC 23/25.1309
  • AC 20-115B
  • RTCA/DO-178B
  • FAA Order 8110.49 Software Approval Guidelines


References

  1. ^ "FAA Advisory Circular 20-115B" (PDF). Archived from the original (PDF) on 2008-08-27. Retrieved 2005-11-30.
  2. ^ RTCA/DO-178C "Software Considerations in Airborne Systems and Equipment Certification", p. 116. "One example is the term “item development assurance level” (IDAL), which for software is synonymous with the term “software level."
  3. ^ RTCA/DO-178B "Software Considerations in Airborne Systems and Equipment Certification", p. 82
  4. ^ RTCA/DO-178B "Software Considerations in Airborne Systems and Equipment Certification", p. 82
  5. ^ RTCA/DO-178B "Software Considerations in Airborne Systems and Equipment Certification", Annex A
