What industries do the DO-178B guidelines apply to?
The DO-178B guidelines apply to civil aviation for aircraft, helicopters and engines as mandated by the Federal Aviation Regulations. The guidelines also apply to systems and equipment used in aviation that utilize software or airborne electronic hardware.
When is DO-178B compliance required?
For systems and equipment using software to fulfill a safety related aircraft function. FAA Advisory Circular 20-115B cites RTCA/DO-178B as a means of compliance to the Federal Aviation Regulations Part 21, 23, 25, 27, 29 and 33.
DO-178B is used for all new software development as well as for software changes to legacy systems. The FAA defines DO-178B as a means, but not the only means, of compliance to the Federal Aviation Regulations. It is extremely rare for an alternative means of compliance to be used for software in avionics applications.
Does the amount of testing vary depending on Software Design Assurance Level (i.e. Level A, B, C etc.)?
Yes, and in two distinct ways:
1. The requirements that need test coverage:
- Level A, B & C require that the test cases provide normal range and robustness coverage of all software high level and low level requirements.
- Level D only requires test cases that provide normal range and robustness coverage of all software high level requirements.
2. The amount of testing needed to achieve the structural coverage objectives:
- Coverage metrics are collected during the execution of the software test procedures. The test cases needed to achieve modified condition/decision coverage (MC/DC) are more comprehensive than those needed for decision or statement coverage.
- Level A software requires modified condition/decision coverage, Level B software requires decision coverage and Level C software requires only statement coverage.
- Level D software does not require coverage metrics. In general, the test cases do not need to be as comprehensive as those for Level A, B or C.
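The difference between the structural coverage criteria can be seen on a small example. The function and test values below are a hypothetical sketch (they are not drawn from DO-178B itself); they show how MC/DC demands more test cases than decision coverage for the same two-condition guard.

```python
def arm_actuator(cmd_valid: bool, interlock_clear: bool) -> bool:
    # Hypothetical guard: the actuator arms only when both inputs hold.
    return cmd_valid and interlock_clear

# Decision coverage: the decision as a whole must take both outcomes.
decision_tests = [(True, True), (False, True)]
assert {arm_actuator(a, b) for a, b in decision_tests} == {True, False}

# MC/DC adds a case so that EACH condition is shown to independently
# change the outcome while the other condition is held constant:
#   (True, True) vs (False, True) -> cmd_valid flips the result
#   (True, True) vs (True, False) -> interlock_clear flips the result
mcdc_tests = [(True, True), (False, True), (True, False)]
assert arm_actuator(True, True) != arm_actuator(False, True)
assert arm_actuator(True, True) != arm_actuator(True, False)
```

Statement coverage, the weakest criterion, would be satisfied by any single test that reaches the return statement.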
What is the relationship between DO-178B compliance and the FAA?
The Federal Aviation Administration (FAA) has the authority to determine whether compliance to the Federal Aviation Regulations has been achieved.
For software, compliance to the Federal Aviation Regulations is defined in the FAA Advisory Circular 20-115B. The advisory circular states that software can use compliance to RTCA/DO-178B to show compliance to the Federal Aviation Regulations.
Since the FAA determines compliance to Federal Aviation Regulations, then for software the FAA determines compliance to RTCA/DO-178B.
RTCA/DO-178B was created by an RTCA committee. The FAA had participants in the committee.
Could an aircraft manufacturer achieve DO-178B compliance and approval using only in-house resources?
Yes, an aircraft manufacturer can do the software development and verification and have company DERs (Designated Engineering Representative) perform the compliance findings for the software. The resources would need to be appropriately qualified and skilled for the tasks. The company DER would need the appropriate delegation and authorization.
Could an avionics software producer achieve DO-178B compliance and approval using only in-house resources?
Yes. In fact many TSO (Technical Standard Order) equipment manufacturers do the software development and use in house company DERs (Designated Engineering Representative) to perform the compliance findings for the software.
Some equipment manufacturers have in house DERs and, with suitable arrangements with the aircraft manufacturer, use their own DER to approve the software. Note that any system with software also requires an installation approval to use the system and software on the aircraft. There are no generic stand-alone approvals for software. Some portions of the approval may be reused or credited as previously developed software.
What is the role and responsibility of the DER before, during and after DO-178B compliance and approval?
Before – assist with planning and liaison with the FAA.
During – review data, answer questions, and mainly to audit for compliance in accordance with Chapter 2 of FAA Order 8110.49 using the Software Job Aid.
After – assist with compliance findings and approvals for software updates due to in service problems or when features are added.
Are there guidelines or strict requirements to achieve completeness for DO-178B tests?
The applicant and software developer would need to define the criteria by which test cases are selected. It is necessary to provide full test coverage (normal range and robustness) of software high level requirements for Level A/B/C/D and software low level requirements for Level A/B/C.
Another measure of completeness is structural coverage. Level A requires MC/DC, decision and statement coverage; Level B requires decision and statement coverage; Level C requires statement coverage. The test cases need to be comprehensive enough to fulfill the coverage metrics. Tests for Level A/B/C also need to provide coverage of the data coupling and control coupling. DO-178B Section 6.4.2 provides some guidance on test case selection. In general, well-defined test case selection criteria and comprehensive test cases will provide the requisite test coverage of the requirements and yield thorough structural coverage metrics. The test case selection criteria should be defined in a Software Verification Plan as described in RTCA/DO-178B Section 11.3 – c.3.
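As an illustration of normal range versus robustness coverage, consider a hypothetical requirement (not from any real program): "the function shall limit the commanded pitch rate to the range [-5.0, +5.0] deg/s". A sketch of the two kinds of test cases:

```python
def limit_pitch_rate(cmd_deg_per_s: float) -> float:
    # Hypothetical limiter implementing the requirement above.
    return max(-5.0, min(5.0, cmd_deg_per_s))

# Normal range test cases: valid inputs, including the range boundaries.
assert limit_pitch_rate(0.0) == 0.0
assert limit_pitch_rate(5.0) == 5.0
assert limit_pitch_rate(-5.0) == -5.0

# Robustness test cases: abnormal inputs the software must tolerate.
assert limit_pitch_rate(999.0) == 5.0      # far above range, clipped
assert limit_pitch_rate(-999.0) == -5.0    # far below range, clipped
assert limit_pitch_rate(float("inf")) == 5.0
```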
Are there only guidelines or strict requirements to prove correctness of the target software during DO-178B testing?
RTCA/DO-178B defines a set of objectives for test coverage of the software requirements. If the requirements are adequately defined and behaviorally oriented, then the verification tests will demonstrate that the software meets its requirements. Meeting the requirements should also serve to demonstrate correct behavior of the software.
RTCA/DO-178B objectives for software testing also include showing that the actual object code, executing on the target flight hardware, meets its requirements. Testing the software in the target environment will ensure that software is tested the same way it will be used.
Are there only guidelines or strict requirements to prove independence during the variety of tasks of DO-178B verification?
The glossary on page 82 of RTCA/DO-178B provides a definition of independence.
During the software planning process, consideration should be given to which activities need to be independently performed and how the independence will be documented. An effective way to demonstrate independence is to keep records of the author of the requirements, design and code documentation and to record the name of the reviewer on the review form or checklist. Independence of verification activities is one of the questions in the Software Job Aid for consideration during a Stage of Involvement (SOI) audit.
During DO-178B verification, are there implications with other standards (ISO 9000 and others), and if so, why?
There are no formal ties between RTCA/DO-178B and ISO standards.
Does DO-178B mandate the fix of problem reports (PRs) in a timely manner or not?
RTCA/DO-178B configuration management objectives require that problem reporting be established and that changes to baselines are performed with formal change control. The Software Configuration Index should include a list of problem reports resolved by a software release. The Software Accomplishment Summary must include a list of all known open problem reports at the time of the release and a rationale for leaving them unresolved. Most projects will not permit a large quantity of open problem reports in the final release.
In general, problem reports should be routinely assessed as part of configuration status accounting. Open problem reports should be scheduled for resolution and incorporation in upcoming baselines to minimize the rework they can entail. The sooner a problem is resolved, the less downstream effect it will have on subsequent activities. Project specific FAA Issue Papers, EASA Certification Review Items or customer contracts may require a specific timeframe to resolve all outstanding problem reports.
Does DO-178B mandate a plan with milestones and timelines to address all problem reports (PRs)?
No, this specific topic is not covered in RTCA/DO-178B.
Problem reports and their ultimate resolution are of utmost importance to aircraft manufacturers. They will often have program or contractual requirements for the management of open problem reports. FAA Notice N8110.110 devotes a chapter to the topic of problem reports. The Notice requires routine assessment of problem reports and active management of their resolution. The FAA will assess the quantity of unresolved problem reports and their potential impact on the aircraft or systems at the time of the approval of the software. Project specific FAA Issue Papers, EASA Certification Review Items or customer contracts may require a specific timeframe to resolve all outstanding problem reports.
Does DO-178B mandate fixing all problem reports (PRs) to achieve certification?
No, not specifically.
RTCA/DO-178B requires that all open problem reports be recorded in the Software Accomplishment Summary. The Software Accomplishment Summary is submitted to the FAA. The FAA must agree with the number and type of open problem reports and their potential to affect aircraft safety. FAA Notice N8110.110 Chapter 2 also provides additional guidance on problem reporting. Project specific FAA Issue Papers, EASA Certification Review Item or customer contracts may specify more strict guidance for problem reports.
What is the role of the DER during the planning phase of the DO-178B certification?
The DER, when so delegated, will perform the Stage of Involvement (SOI-1) audit during, or near the completion of, the planning phase. The DER can also assist with interpretation of guidance materials, Issue Papers and provide any necessary liaison with the FAA. The DER can also facilitate a specialist meeting with the FAA to review the applicant's proposed lifecycle and obtain any feedback to ensure that the PSAC (Plan for Software Aspects of Certification) will ultimately be approved. The DER will prepare an 8110-3 form to document the approval of the PSAC and submit the form to the FAA.
What are the criteria for need of qualification of the tools used during DO-178B certification?
All tools used for software development and software verification should be assessed for qualification, and the assessment should be documented in the Plan for Software Aspects of Certification.

Tools that can introduce an error into the flight software should be assessed as development tools. If the output of these tools is not verified by review, analysis or test, then the tool should be qualified. Generally, compilers, linkers and cross-assemblers are not qualified as development tools since their outputs are subject to verification by test. Source code generation tools require qualification if they are used to alleviate code reviews for Level A-C.

Tools that can fail to detect an error in the flight software should be assessed as verification tools. If the output of the tool is not verified by review, analysis or test, then it should be qualified. Typically, structural coverage analysis tools and tools that automatically check test results are qualified as verification tools.
How is qualification of tools used during DO-178B verification achieved?
Qualification for tools is defined in RTCA/DO-178B Section 12.2 and FAA Order 8110.49 Chapter 9.
For a verification tool, the use of the tool and the proposal for qualifying the tool should be defined in the additional considerations section of the Plan for Software Aspects of Certification. The Tool Operational Requirements need to be defined and documented. Test cases are created to show that the verification tool meets its operational requirements under normal operating conditions. The results of the testing should be recorded and documented.
All data for the verification process should be under configuration control as CC2 (Control Category 2) data. The results of the qualification activities should be summarized in the Software Accomplishment Summary. The qualification activities should be performed with independent verification of the tool data if the DO-178B objective requires independent verification.
Would Excel need to be "qualified" to be used for DO-178B verification purposes?
If Excel is used in a manner that could fail to detect an error in the flight code, it would need to be qualified. A qualification should be performed if, for example:
- Excel is used to calculate expected results for test cases, and the expected results are not reviewed.
- Excel is used to populate test harnesses with test inputs, and the inputs are not independently confirmed.
- Excel is used to compare expected results with actual results for test procedures, and the results are not reviewed.
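For instance, a small script that flags miscompares between expected and actual results — a hypothetical stand-in for the Excel comparison described above — is a verification tool: if its report is the only check on the test results, a failure in the script could mask an error in the flight code.

```python
import csv

def find_miscompares(expected_path: str, actual_path: str) -> list:
    # Return the IDs of test cases whose actual value differs from the
    # expected value. Both files are assumed (hypothetically) to be CSVs
    # with "case_id" and "value" columns.
    def load(path):
        with open(path, newline="") as f:
            return {row["case_id"]: row["value"] for row in csv.DictReader(f)}
    expected = load(expected_path)
    actual = load(actual_path)
    return [cid for cid, val in expected.items() if actual.get(cid) != val]
```

If no one reviews the output of such a tool, the qualification defined in DO-178B Section 12.2 (showing the tool meets its operational requirements under normal operating conditions) would apply.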
Does all of the software running on a processor need to be at the same Design Assurance Level?
No, not necessarily. Partitioning may be used to run functions of different Design Assurance Levels on the same processor. This is most commonly done with a partitioned real-time operating system (RTOS). Many times, equipment designers will have provisions for separate executables for software of different Design Assurance Levels. For example, the Level A flight code executes only during flight, while the Level E data loader executes only during specific maintenance procedures on the ground or at a repair facility. In general, the analysis and verification needed to prove time, data and memory partitioning for two or more functions performed simultaneously on a processor are not trivial.
Does all of the software running on a circuit card assembly need to be at the same Design Assurance Level?
This depends on the architecture of the circuit card. Oftentimes, separate processors are put on a circuit card to host different functions at different Design Assurance Levels. If the circuit card has the necessary isolation, then software of different Design Assurance Levels can be hosted. The partitioning for two or more functions performed simultaneously on a single processor is not trivial and would need to be proved.
How can existing software be upgraded to comply with DO-178B?
This depends on the quantity and quality of the documentation and lifecycle data produced in the original development. In general, the FAA eschews reverse engineering to produce compliance data for software. If this approach is proposed, it is recommended that the applicant seek FAA agreement with the approach before proceeding with the effort. The biggest problem with existing software is producing the functional requirements needed for the verification testing. Requirements written after the fact tend to be based on knowledge and/or description of the specific design or implementation – not the intended function. When this happens, the verification testing proves that the design or implementation is met, not that the implementation meets its functional requirements. If the software was originally developed and approved at a lower Design Assurance Level and needs to be upgraded to a higher Design Assurance Level, then service history assessment, additional testing and the activities required by the higher Design Assurance Level may be performed.
Where do traditional Preliminary and Critical Design Reviews (PDR/CDR) fit in the design life cycle timeline?
The Preliminary Design Review (PDR) is usually performed when the software or hardware requirements trace to and fully implement the system or parent requirements baseline. The Critical Design Review (CDR) is usually performed when the software or hardware design traces to and fully implements its requirements. A software CDR is performed prior to the formal coding phase. A hardware CDR is typically used as an approval gate to manufacture hardware to be used in formal tests.
When should the V&V team get involved on a project?
V&V should be involved right from the planning phase. The team should write their own verification plan and assess the project needs for tools, equipment and the overall approach for verification and testing.
- V&V can review plans and standards when independent review is required.
- V&V can also perform an overview of the system, hardware and software to look for overlaps in the testing. This may allow integrated tests to cover system, software and/or electronic hardware requirements all in one test case. Such an approach requires a disciplined effort in requirements capture. It also requires test equipment that can capture analog, digital waveforms, and software data simultaneously during a test scenario.
- V&V can be highly useful and effective in requirements reviews. They can provide early feedback on whether the requirements, as written, can be verified by analysis and/or test.
- V&V can start writing their test cases as soon as the requirements are available and ideally, reviewed and released.
What is validation and when should it occur? How much design can be completed before validation?
In RTCA/DO-254, validation is the review, analysis and/or test of derived requirements to ensure that they are correct and complete in the context of the parent requirements allocated to the hardware item (e.g., a PLD). Validation of the derived requirements for a programmable logic device (PLD) is typically performed during the review of the requirements document: all requirements denoted as "derived" are evaluated against the applicable criteria for completeness and correctness. Since the design is created to fulfill and trace to the requirements, including the derived requirements, any design work done before the derived requirements have been validated is done at risk. That is, if the validation of the derived requirements causes changes, then there may be a downstream effect on the respective part of the design. Aerospace Recommended Practice ARP-4754 defines methods for validation of system requirements. Requirements specified at a system level and then allocated for implementation in software or airborne electronic hardware do not need to be validated again at the software or airborne electronic hardware level.
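A simple status check over requirement records can make the "at risk" boundary visible. The record format below is hypothetical, not mandated by DO-254:

```python
# Hypothetical requirement records; "validated" holds a reference to the
# validation evidence (e.g. a review record ID) or None if not yet validated.
requirements = [
    {"id": "HW-REQ-101", "derived": False, "validated": None},
    {"id": "HW-REQ-102", "derived": True,  "validated": "RVW-017"},
    {"id": "HW-REQ-103", "derived": True,  "validated": None},
]

# Allocated (non-derived) requirements were validated at the system level,
# so only derived requirements are checked here.
unvalidated = [r["id"] for r in requirements
               if r["derived"] and r["validated"] is None]
assert unvalidated == ["HW-REQ-103"]
# Any design work tracing to HW-REQ-103 would be done at risk.
```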
How is the Implementation phase similar between hardware and software? How is it different?
For hardware the implementation phase produces test articles using manufacturing processes and techniques identical or similar to the production environment. Once the hardware has been tested and any final changes made, the data from the implementation phase is used in the production of the hardware. Software does not have an implementation phase defined in DO-178B. The integration activity is similar to the implementation defined in DO-254.
For software, once testing is complete, the executable object code and data necessary for load control of the software is provided to production. Note that the hardware would need to be produced in an implementation phase before the software testing can be formally concluded. That is, software needs to be tested on flight hardware. The flight hardware needs to be representative of the final configuration.
Where does DO-178B fit into the FAA TSO certification?
FAA Advisory Circular 20-115B for software allows the use of DO-178B for TSO compliance. If equipment is developed under TSO rules, the applicant can show compliance to DO-178B for software and DO-254 for airborne electronic hardware. Current policy allows for DERs to make compliance findings for DO-178B for TSOs. This policy is currently being reassessed by the FAA.
Are there any other top-tiered certifications that DO-178B can be applied to?
Currently, DO-178B and DO-254 are aerospace specifications for commercial aircraft. They are a collection of industry best practices.
Will DO-178C replace DO-178B or will either be accepted as part of the FAA certification process?
This is up to the FAA to determine. An FAA Advisory Circular will be issued when the FAA recognizes DO-178C as an acceptable means of compliance to the Federal Aviation Regulations. When DO-178B was published, a transition period was allowed for software developed to DO-178A. A similar transition process may be defined for DO-178C; it is too early to tell.
What are the most common mistakes a company makes while trying to get their product or system through certification?
- Underestimating the value of effective planning.
- Underestimating the value of well documented and well reasoned standards.
- Underestimating the value of well crafted requirements.
- Underestimating the scope of verification.
- Emphasizing delivery of software or hardware over producing well crafted requirements and design documentation.
- Overestimating the capabilities of outsourced or contracted work.
- Lack of proper oversight of outsourced or contracted activities.
- Not incorporating project specific requirements, such as FAA Issue Papers. The assumption that what was acceptable for the last development program will work on the current program is often incorrect.
- Not considering electronic hardware aspects and software design tools in common cause analysis.
- Looking for ways to avoid the real work that compliance entails.
I have software that is already developed for my product and now I have to certify it to conform to DO-178B standards. Is this possible and if so, how do I do it?
It may be possible and needs to be evaluated on a case by case basis. The success depends on the quantity and quality of the documentation and lifecycle data produced in the original development.
In general, the FAA eschews reverse engineering to produce compliance data for software; if this approach is proposed, it is recommended that the applicant seek FAA agreement before proceeding with the effort. As discussed above for upgrading existing software, the biggest problem is producing the functional requirements needed for the verification testing. Requirements written after the fact tend to describe the specific design or implementation rather than the intended function, so the verification testing then proves only that the design or implementation is met, not that the implementation meets its functional requirements.
How many stages of involvement are between the 'authority' (FAA) and the company that is trying to obtain certification and what documents are due at each S.O.I. audit?
The Stage of Involvement (SOI) audits are defined in the FAA Orders for software (8110.49) and airborne electronic hardware (8110.105). All four audits are performed for each program. The amount of direct FAA involvement is determined by a set of criteria called the level of FAA involvement (LOFI). SOI activity is also determined by the amount of delegation to a DER, program requirements and FAA Issue Papers on supplier oversight.
FAA Order 8110.49 defines 4 Stage of Involvement (SOI) audits for software:
- Software Planning Review (SOI #1)
- Software Development Review (SOI #2)
- Software Verification Review (SOI #3)
- Final Certification Software Review (SOI #4)
The lifecycle data inspected at each SOI audit is defined in the respective FAA Order for software (8110.49) and airborne electronic hardware (8110.105).
How many objectives are there in DO-178B that must be addressed?
There are 66 objectives defined in DO-178B.
- All 66 apply to Level A software.
- 65 apply to Level B software.
- 57 apply to Level C software.
- 28 apply to Level D software.
What is a DER? What does the DER have to do with DO-178B?
A Designated Engineering Representative (DER) is a representative of the FAA Administrator authorized by law to examine, test, and/or make inspections necessary to issue aircraft certificates. DERs are not employees of the U.S. Government and are not federally protected for the work they perform or the decisions they make.
For a given project, a DER may be delegated to make compliance findings to the applicable Federal Aviation Regulations on behalf of the FAA. For airborne electronic hardware, compliance to the Federal Aviation Regulations may be established by finding compliance to the objectives in DO-254 (as permitted by Advisory Circular 20-152). For software, compliance to the Federal Aviation Regulations may be established by finding compliance to the objectives in DO-178B (as permitted by Advisory Circular 20-115B). The DER then assesses compliance to DO-178B or DO-254 as a means of compliance to the Federal Aviation Regulations. The compliance findings are documented on FAA form 8110-3.
DERs can fulfill their role as a company DER or a consultant DER. A company DER has delegation for the products their employer produces. A consultant can get delegation for projects from different suppliers or airframe manufacturers. For companies with an organizational delegation, a DER can perform their duties as an Authorized Representative (AR) in a Delegation Option Authorization (DOA) or a Unit Member (UM) in an Organization Designation Authorization (ODA).
What are the DO-178B Criticality Levels? And what does that mean for testing at the different levels?
DO-178B has five design assurance levels – A/B/C/D/E. The design assurance level is assigned commensurate with the hazard the software can cause or contribute to.
Level A-C all require test coverage of high level and low level software requirements. Level D requires test coverage of high level software requirements only. Level E is exempt from DO-178B compliance; its testing is determined by the developer or contractual arrangements.
Structural coverage and test coverage of data and control coupling are required for Level A-C but not for Level D. The coverage metric is most rigorous for Level A (modified condition/decision coverage); Level B requires decision and statement coverage; Level C needs to achieve statement coverage only.
Is DO-178B used by the Military?
Military projects can elect to use DO-178B. Compliance to DO-178B is then determined by the project, not the FAA. Military projects often cite these guidelines since suppliers often already have experience with them. The FAA has a Military Certification Office (MCO) in Wichita, KS for military aircraft derived from commercial products. FAA certification can be performed for an already certified civilian aircraft adapted for military use. A fighter jet, however, could not be certified this way since it has no civilian application.