Applying DO178B
Software for airborne applications is highly safety critical, as any failure may result
in loss of human life. Certification authorities such as the FAA in the US and the JAA
in Europe enforce stringent software development practices to ensure safety.
RTCA DO-178B provides guidelines for all phases of the software development life
cycle for airborne applications and equipment certification. The aviation community
as a whole, and the FAA in particular, endorse these guidelines.
The safety criticality of airborne software poses significant challenges for its V&V,
because the airborne system as a whole must not fail and cause damage or
threat to human life. This paper explains these challenges in more detail from a
practitioner's perspective. The first few sections define the activities involved in the
verification of safety critical software and describe the processes recommended
by DO178B. In the last couple of sections, a small case study is presented. The paper
concludes with a brief description of Wipro Technologies' focus in this area.
Wipro Technologies
Innovative Solutions, Quality Leadership
White Paper Applying DO178B for IV & V of Safety critical Software
Table of Contents
Introduction
Verification and Validation
Failure Categorization and Software Levels
Categorization of Software Failure Conditions
V&V and Other Software Life Cycle Processes
DO-178B V&V Process
Testing
V&V – A typical case
Simulated Testing
About the Authors
About Wipro Technologies
Wipro in Embedded Systems
Introduction
The guidelines from RTCA for the implementation of safety critical software for airborne
equipment are fairly elaborate, covering all phases of development and testing.
However, from a practitioner's perspective, or for an organization entering this
business, these guidelines benefit from further explanation. This paper focuses on
the verification and validation part of the DO178B process recommended by RTCA.
- Validation involves assessing the degree to which a software system actually fulfills the
system requirements or the user needs. Validation activities refer primarily to the overall
system specification and the final code. A system that is consistent with its specifications
is dependable.
- Verification refers to the work product with respect to its immediate predecessor. For
example, this may involve checking the consistency of an implementation with a specifica-
tion that forms the reference for the check. The implementation is the “item being verified”
and the specification is the “reference”.
For example, an overall design could be the specification and a more detailed design
could be the implementation. In this case, checking whether the detailed design is
consistent with the overall design would then be the verification of the detailed design.
The same detailed design could play the role of the specification with respect to the
source code, which would be verified against the design. In every case, verification is a check of consistency
between two descriptions, in contrast to validation that compares a description against
actual needs.
The diagram below illustrates the verification and validation activities at
various phases of development. As seen in the diagram, verification activities check
consistency between descriptions at adjacent levels of output, and between
descriptions and the implementation.
Validation activities refer primarily to the overall system specification and the final code.
With respect to overall system specification, validation checks for discrepancies between
actual needs and the system specification as laid out by the analysts, to ensure that the
specification is an adequate guide to building a product that will fulfill its goals. With
respect to final code, validation aims at checking discrepancies between actual need and
the final product (the final code), to reveal possible failures of the development process
and to make sure that the product meets the actual end-user expectation. Validation
checks between the specification and the final product are primarily checks of decisions
that were left open in the specification, e.g., details of the user interface or product
features.
Validation against actual requirements involves a good deal of human judgment, so the
chances of ambiguity, misunderstanding and disagreement are high. To aid validation,
specifications should be sufficiently precise and unambiguous that there can be no
disagreement about whether a particular system behavior is acceptable.
[Figure: V-model of verification and validation. Verification links each work product
to its immediate predecessor (system requirements and constraints, requirement
specification, system analysis/integration); validation links acceptance tests and the
system conformity review back from the delivered software solution to actual needs.]
Failure Categorization and Software Levels
Categorization of Software Failure Conditions
DO-178B categorizes failure conditions according to their severity:
Catastrophic - Failure conditions that would prevent continued safe flight and landing.
Hazardous/Severe-Major - Failure conditions that would reduce the capability of the
aircraft or the ability of the crew to cope with adverse operating conditions to the
extent that there would be a large reduction in safety margins or functional capabilities,
physical distress or a workload such that the crew could not be relied on to perform
their tasks accurately or completely, or serious or fatal injury to a relatively small
number of the occupants.
Major - Failure conditions that reduce the capability of the aircraft or the ability of the
crew to cope with adverse operating conditions to a significant extent. For example, the
failure may result in a significant reduction in safety margins or functional capabilities, a
significant increase in crew workload, conditions impairing crew efficiency, or discomfort
to occupants, possibly including injuries.
Minor - Failure conditions which would not significantly reduce aircraft safety, and which
would involve crew actions that are well within their capabilities. Minor failure conditions
may include, for example, a slight reduction in safety margins or functional capabilities, a
slight increase in crew workload, such as, routine flight plan changes, or some inconve-
nience to the occupants.
No Effect - Failure conditions that do not affect the operational capability of the aircraft or
increase crew workload.
Software Levels
Corresponding to the failure conditions described above, software levels from A to E are
defined: level A software is software whose anomalous behavior could cause or contribute
to a catastrophic failure condition, while failure of level E software has no impact on
safety. Software level assessment is done as part of the system safety assessment
process. The software level also determines the effort required to demonstrate
compliance for certification, which varies with the failure condition category.
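As a rough sketch of this category-to-level mapping (DO-178B defines five categories in all, including Hazardous/Severe-Major between Catastrophic and Major; the enum names and helper function below are illustrative, not part of the standard, and in a real project the level comes from the system safety assessment, not from code):

```c
#include <assert.h>

/* DO-178B failure condition categories, most to least severe. */
typedef enum { CATASTROPHIC, HAZARDOUS, MAJOR, MINOR, NO_EFFECT } failure_condition;

/* Software levels assigned by the system safety assessment process. */
typedef enum { LEVEL_A, LEVEL_B, LEVEL_C, LEVEL_D, LEVEL_E } software_level;

/* Illustrative lookup: the level of the software corresponds to the
   worst failure condition its anomalous behaviour could contribute to. */
static software_level level_for(failure_condition fc)
{
    switch (fc) {
    case CATASTROPHIC: return LEVEL_A;
    case HAZARDOUS:    return LEVEL_B;
    case MAJOR:        return LEVEL_C;
    case MINOR:        return LEVEL_D;
    default:           return LEVEL_E; /* NO_EFFECT */
    }
}
```

The certification effort (number of objectives, and how many must be met with independence) rises with the level, peaking at level A.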
The Software Planning Process and the other Integral Processes, viz. the Software
Configuration Management Process, the Software Quality Assurance Process and the
Certification Liaison Process, do not fall under the purview of the Software Verification
and Validation Process. However, reviews and audits ensure the correctness of these
processes.
The diagram below shows the V&V requirements assuming a waterfall model. The same
can be extended to iterative development.
[Figure: the development process (requirements process, design process, coding and
integration process, integration process) with the verification process running alongside
each phase.]
Tables A-3 through A-7 in Annex A of the RTCA DO-178B document list the outputs of the
development process and of the V&V process itself. These outputs must be verified, and
the objectives applicable to the software level must be satisfied through the verification
process. For example, for level A software many of the objectives have to be met with
independence, whereas for levels B and C fewer objectives need to be satisfied with
independence.
[Figure: the DO-178B verification process, driven by the verification plan, applied to the
outputs of each development process, including the design process.]
The figure above represents the Verification Process as required by DO-178B and
indicates the verification activities at the end of each of the processes: the Requirements
Process, the Design Process, the Coding and Software Integration Process and the
Hardware Integration Process.
Verification of the requirements process involves review and analysis of the software
requirements captured in the Software Requirements Data. Comments are fed back to
the requirements process as appropriate.
Verification of Design process involves review and analysis of the design that is provided
in the Software Design Data. The review comments from this process are fed back to
previous life cycle activities as appropriate.
Verification of coding and integration process involves review and testing of the source
code implemented as per the Software Design Data. The review comments and errors
identified from this process are fed back to previous life cycle activities as appropriate.
Verification of the integration process involves testing the object code for compliance on
an Instruction Set Simulator, a target emulator, or the target board. The test results from
this process are fed back to previous life cycle activities as appropriate. In general, all
errors reported are managed and tracked to closure.
Software Verification Cases and Procedures as well as the Software Verification Results
are verified for completeness and correctness in the Verification of Verification Process
Results.
Reviews are conducted with respect to a reference. For example, the review of the design
document is done either against the Software Requirements Data or against a suitable
checklist prepared from it. Analyses are expected to provide repeatable evidence of
correctness. Detailed examination of the functionality, performance, traceability and
safety implications of a software component are typical subjects of analysis.
[Figure (after DO-178B): the software testing process, from the software requirements
through requirement-based test generation and structural coverage analysis to additional
verification and the end of testing.]
Testing
Testing is the verification of the source code at the unit, module and system levels. It has
two objectives:
- To demonstrate that the software satisfies its requirements.
- To provide a high degree of confidence in the software, i.e., that the errors leading to
failure conditions, as determined by the system safety assessment process, have been
removed.
The diagram above (reproduced from DO-178B) depicts the testing process.
Three types of testing are indicated:
- Hardware/Software Integration Testing: verifies that the software functions correctly in
the target environment.
- Software Integration Testing: verifies the interrelationships between the software
requirements and the software components, and verifies the implementation of the
software requirements and software components within the software architecture.
- Low-Level Testing: verifies the implementation of the software low-level requirements.
DO-178B makes it clear that the requirement-based coverage and structural coverage
obtained by hardware/software integration testing need not be duplicated in low-level
testing. However, no low-level test can provide the effectiveness of the high-level tests.
DO-178B stresses that the software requirements are the primary basis for test case
generation; the primary objective of test cases is to exercise functionality and to bring
out potential errors.
Requirement coverage analysis determines which requirements have not been tested.
Structural coverage analysis identifies code structures that have not been exercised.
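To illustrate why structural coverage analysis goes beyond executing every statement, consider a guard made of two conditions (the limiter function below is a hypothetical example, not from any standard or from the case study):

```c
#include <assert.h>

/* Hypothetical limiter: the commanded value is zeroed when either
   limit flag is set.  The `if` is one decision made of two conditions. */
static int clamp_cmd(int cmd, int over_limit, int failsafe)
{
    if (over_limit || failsafe) /* decision: over_limit OR failsafe */
        return 0;
    return cmd;
}
```

A single test with `over_limit` set executes every statement, yet never takes the false branch and never exercises `failsafe` on its own. Decision coverage additionally requires the (0, 0) case, and the MC/DC coverage required for level A software needs the set (1, 0), (0, 1), (0, 0), so that each condition is shown to independently affect the decision's outcome.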
Test Environment
For comprehensive testing of the software, more than one test environment may be
required. A recommended test environment is the software loaded onto the target
computer, with the target computer interacting with a high-fidelity simulation of its
actual environment. This may be considered an integrated target environment. Testing
in the target computer environment is important because some errors are detected
only in this kind of environment.
Requirement-based coverage and structural coverage are achieved only with precise
control and monitoring of the inputs. Such testing may call for a different test setup in
which small software components are tested in isolation. Test drivers and test tools
can be used in this kind of testing.
Verification of sequential work products with respect to a reference (the previous work
product in the life cycle) is an ongoing activity as software development progresses. The
requirements document, the design document and the source code are typical work
products. For example, the source code has to be reviewed against the design document
for functionality, and against the coding standards for adherence. Checklists prepared
from the design document and the coding standards can be used for the respective
reviews. Wherever review with a checklist is felt to be inadequate, as in the case of
design, analysis has to be done with respect to modularity, architecture, algorithms,
design of functionality, etc.
A Test Description for Functional Testing should be prepared to act as the guide for
functional testing. This document should list all the test cases with their test objectives,
test inputs and expected outputs, test setup, pass/fail criteria, traceability, etc. It is
essential that this test description document be derived from the requirements document.
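One lightweight way to keep the test input, expected output, pass/fail criterion and traceability together is to encode each functional test case alongside the requirement it verifies. The component under test, the threshold and the requirement identifiers below are invented for illustration:

```c
#include <assert.h>

/* Hypothetical component under test: raises an alert below 500 ft. */
static int alt_alert(int altitude_ft)
{
    return altitude_ft < 500;
}

/* A functional test case carries its traceability back to the SRS. */
struct test_case {
    const char *req_id;  /* requirement verified by this case */
    int input;           /* test input: altitude in feet */
    int expected;        /* expected output: 1 = alert, 0 = none */
};

static const struct test_case cases[] = {
    { "SRS-042", 400,   1 }, /* below threshold: alert raised */
    { "SRS-042", 500,   0 }, /* at threshold: no alert */
    { "SRS-043", 35000, 0 }  /* cruise altitude: no alert */
};

/* Run all cases; the pass/fail criterion is exact output match. */
static int run_cases(void)
{
    int failures = 0;
    unsigned i;
    for (i = 0; i < sizeof cases / sizeof cases[0]; i++)
        if (alt_alert(cases[i].input) != cases[i].expected)
            failures++;
    return failures;
}
```

A table in this shape can be generated mechanically into the test description document, so the document and the executed tests cannot drift apart.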
A Test Description for Unit Testing should explain the unit testing methods for achieving
structural coverage, including decision coverage. It may be difficult to generate the test
cases manually; a tool such as AdaTest may be used to facilitate scripting and execution
of test cases, and it also provides feedback on test statistics such as coverage and
complexity. Other tools, such as RTRT and Cantata, can be used as well. These tools can
be used for software integration testing in addition to unit testing. Test cases can be
generated in parallel with testing, based on the coverage information provided by the
tool. In practice, for documentation purposes, the set of test cases developed with the
tool for the required coverage can be documented to form the test description document
for unit testing.
Integration testing is the last step in the testing activity and may be done in different
ways. In this case, for example, it can be done in the host development environment
with simulated input conditions. The tests can then be repeated on a target emulator
and later on the target board itself. Such tests often form part of acceptance testing
when the development activity is outsourced to a vendor.
Simulated Testing
For reliable testing, the test environment has to be close to the real scenario. In practice,
sophisticated simulators are used. In less complex cases, as in this one, a test harness
that feeds predetermined input values from data files to the software under test at
regular intervals can be used. The output data can be stored and analyzed further.
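A minimal sketch of such a harness replays one input sample per line from a data file through the unit under test and logs each output for later analysis. The smoothing filter and the one-value-per-line file format here are assumptions for illustration, not details of the case study:

```c
#include <stdio.h>

/* Hypothetical unit under test: a first-order smoothing filter. */
static double filter_step(double prev, double sample)
{
    return prev + 0.25 * (sample - prev);
}

/* Replay one sample per line from `in` through the filter,
   writing each output to `out`.  Returns the sample count. */
static int replay(FILE *in, FILE *out)
{
    double state = 0.0, sample;
    int n = 0;
    while (fscanf(in, "%lf", &sample) == 1) {
        state = filter_step(state, sample);
        fprintf(out, "%f\n", state);
        n++;
    }
    return n;
}
```

Because the inputs are predetermined and the outputs are captured verbatim, the same data files can drive repeated runs on the host, on a target emulator and on the target board, and the recorded outputs can be compared across environments.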
References
1. RTCA DO-178B, Software Considerations in Airborne Systems and Equipment
Certification.
2. Mauro Pezzè and Michal Young, Software Testing and Analysis: Process, Principles,
and Techniques.
© Copyright 2002. Wipro Technologies. All rights reserved. No part of this document may be reproduced, stored in a retrieval system, or
transmitted in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without express written permission from
Wipro Technologies. Specifications subject to change without notice. All other trademarks mentioned herein are the property of their respective
owners.
America Europe
1995 El Camino Real, Suite 200 137, Euston Road
Santa Clara, CA 95050, USA London NW1 2AA, UK
Phone:+1 (408) 2496345 Phone:+ 44 (020) 73870606
Fax: +1 (408) 6157174/6157178 Fax: + 44 (020) 73870605
Japan India-Worldwide HD
Saint Paul Bldg, 5-14-11 Doddakannelli, Sarjapur Road
Higashi-Oi, Shinagawa-Ku, Bangalore-560 035, India
Tokyo 140-0011, Japan Phone: +91 (80) 8440011-15
Phone:+ 81 (03) 54627921 Fax: + 91 (80) 8440254
Fax: + 81 (03) 54627922
www.wipro.com
eMail: [email protected]