Software Technology
DO-178C demystified:
Strategies for efficient certification
The latest incarnation of the document was released in 2011. DO-178C/ED-12C [3] [4] was joined by a series
of supplements and guidance documents (DO-330 [5], DO-331 [6], DO-332 [7], DO-333 [8]) that further
extended its scope to cover newly emerged needs and technologies. LDRA has participated extensively in
both the DO-178B and DO-178C committees over nearly two decades.
Since 2020, ongoing work around the development of DO-178C has focused on maintaining the safety
and reliability of airborne software systems in the wake of further technological advances. Examples of
resulting guidance include “AC 20-193, Multi-core Processors” [9] and “AC 20-170A, Integrated Modular
Avionics” [10].
DO-178C in context
The illustration shows DO-178C and sister document DO-278A [11] in the context of the broader avionics
development and certification framework.
ARP4754B [12] provides the overarching process for system development and certification. Its sister
document, ARP4761A [13], describes methodologies for safety assessment.
These methodologies inform and integrate with the system development processes of ARP4754B, the
software development guidelines of DO-178C, and the hardware development guidelines of DO-254/ED-80
[14] [15] & DO-160G/ED-14G [16] [17], whilst DO-297/ED-124 [18] [19] provides complementary development
guidance for the implementation of IMA architectures.
As challenges emerge and technologies advance, so standards are introduced or adapted to accommodate
them.
The ARP4754B development process then allocates the associated Development Assurance Levels (DALs) to
the subsystems that implement the system’s electronic hardware and software requirements. DO-178C
establishes five “software levels” and modulates the objectives that must be satisfied at each level. This
means that the effort and expense of producing a system is proportionate to the potential consequences
of its failure.
DO-178C recognizes that software safety must be addressed systematically throughout the software life
cycle. This involves bi-directional life cycle traceability, together with software design, coding, validation,
and verification processes used to ensure correctness, control, and confidence in the software. Several
mechanisms are defined to help ensure that the processes are adhered to, and to provide evidence of that
adherence.
Evidential artifacts presented in a comprehensive, clear, and concise manner are a key part of any
successful compliant project.
LDRA Certification Services (LCS) [20] offers the LDRA Compliance Management System (LCMS) [21] to
address these challenges. LCMS comprises comprehensive compliance documentation and process and
document review tools, together with Level A FAA Designated Engineering Representative (DER) and EASA
Subject Matter Expert (SME) support. It integrates with the LDRA tool suite to provide a comprehensive
solution for developing and certifying safety-critical software.
Key elements of the DO-178C software life cycle include the practices of traceability and structural coverage
analysis. Bi-directional traceability must be established across
the life cycle.
The use of software tools offers particularly significant benefits during software development and software
verification as discussed in §5.0 and §6.0 of the standard, respectively.
DO-178C §5.3 specifies software coding process objectives. For example, developers of source code should
implement low-level requirements and conform to a set of software coding standards. LDRA static analysis
tools provide code scanning – an automated “inspection” of the source code.
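As a minimal illustration (a hypothetical fragment, not LDRA tool output), consider the kind of defect such a scan flags – here, a variable that can be read before it is assigned, contrary to most coding standards:

    int select_gain(int sensor)
    {
        int mode;               /* not initialized at declaration */
        if (sensor > 0) {
            mode = sensor / 4;
        }
        return mode;            /* flagged: 'mode' may be read uninitialized
                                   when sensor <= 0                         */
    }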
If everything follows the development life cycle in textbook fashion, such compliance checking is perhaps
a one-off, trivial task: the requirements will never change, and tests will never reveal a problem. Unhappily,
that is rarely the case.
The TBmanager component of the LDRA tool suite is a desktop traceability application, integrated with
code review, data and control coupling analysis, low-level testing, and code coverage tools.
LDRAvault is a web-based, enterprise-level application that aggregates and manages certification artifacts
across projects and programs, providing transparency into the development and verification process.
Imported reports and results across multiple users and projects are collated as snapshots, which form part
of visualizations such as heat maps and trend graphs.
Low-level tests verify the complete and exclusive implementation of the low-level requirements, as specified
in the Software Verification Plan (SVP), whereas software integration testing verifies the relationships
between software components with reference to the requirements and the software architecture. In
practice, the mechanisms used for low-level testing often lend themselves to integration testing and hence
verify behaviour in the context of a call tree.
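By way of a minimal sketch (generic C++ assertions with a hypothetical function and requirement, not tied to any particular test tool), a low-level test drives a component directly against its low-level requirement; exercising the same component via its callers extends the mechanism to integration testing:

    #include <cassert>

    /* Unit under test. Hypothetical low-level requirement: the commanded
       angle shall be limited to the range [-30.0, +30.0] degrees.        */
    double clamp_angle(double cmd)
    {
        if (cmd > 30.0)  { return 30.0; }
        if (cmd < -30.0) { return -30.0; }
        return cmd;
    }

    int main()
    {
        /* Low-level test cases derived from the requirement */
        assert(clamp_angle(45.0)  == 30.0);   /* upper limit  */
        assert(clamp_angle(-45.0) == -30.0);  /* lower limit  */
        assert(clamp_angle(10.0)  == 10.0);   /* pass-through */
        return 0;
    }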
As previously mentioned, keeping track of a project in flux can be challenging. Again, the TBmanager
component of the LDRA tool suite automates the maintenance of the bi-directional relationship between
the products of the different development phases, saving a great deal of time and helping eliminate
errors – not just as far as the development of the requirements and source code, but through to requirements-
based testing and test coverage for both high- and low-level requirements.
DO-178C §6.4.4 details requirements for the achievement of 100% MC/DC, decision, and statement
coverage, depending on DAL. Collation of structural coverage metrics is typically achieved by
“instrumenting” a copy of the source code (that is, superimposing function calls to collate coverage
data) and executing that instrumented code using requirement-based test cases.
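The sketch below suggests what such an instrumented copy might look like. Here cov_probe() is a hypothetical data-collection hook, standing in for the routines a real tool would generate:

    extern void cov_probe(int point_id);   /* hypothetical collection hook */

    /* Instrumented copy of a simple function; the cov_probe() calls are
       the superimposed statements that record which paths executed.      */
    int sign(int x)
    {
        cov_probe(1);                      /* function entry */
        if (x > 0) {
            cov_probe(2);                  /* true branch    */
            return 1;
        }
        cov_probe(3);                      /* false branch   */
        return (x < 0) ? -1 : 0;
    }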
System requirements can be shown to have been correctly decomposed, implemented, and verified by
combining a complete trace from requirements through to code and test cases, with the achievement of
comprehensive functional test coverage and structural coverage objectives.
MC/DC (Modified Condition/Decision Coverage) is a coverage metric for decisions involving multiple
conditions. It does not require every possible combination of conditions to be executed, but instead requires
as few as one more test than the number of conditions involved. For example, with 6 conditions there are 64
possible combinations – and yet only 7 tests are needed to achieve MC/DC coverage.
However, those tests must show that each of the conditions independently affect the result, and it is not
always obvious which input values might achieve that. The use of an MC/DC planner makes the selection of
appropriate values much easier.
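As a worked example, consider a hypothetical decision with three conditions, D = (a && b) || c. Eight combinations are possible, yet the four vectors below (one more than the number of conditions) achieve MC/DC, because each cited pair differs in exactly one condition and flips the decision outcome:

    bool decide(bool a, bool b, bool c) { return (a && b) || c; }

    /* MC/DC test vectors:          decision
       1:  a=T  b=T  c=F         -> true
       2:  a=F  b=T  c=F         -> false  (pair 1/2 shows a's independence)
       3:  a=T  b=F  c=F         -> false  (pair 1/3 shows b's independence)
       4:  a=F  b=T  c=T         -> true   (pair 2/4 shows c's independence) */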
DO-178C §6.4.4.2 adds that “…if the software level is A and a compiler, linker, or other means generates
additional code that is not directly traceable to Source Code statements, then additional verification should
be performed to establish the correctness of such generated code sequences.”
Object Code Verification (OCV) measures code coverage at both the source and the assembly level by
instrumenting each in turn.
Code inspection alone cannot confirm that requirements-based tests have exercised every interface. Data
coupling and control coupling analysis therefore need to be performed post execution, and the generated
artifacts reviewed against the system requirements and architecture.
The analysis of control and data coupling by traditional means can be tedious and challenging. LDRA’s
patented approach leverages static analysis and dynamic analysis in tandem to make that process more
efficient and less error prone.
Control Coupling
Control Coupling is defined in the glossary of DO-178C as “The manner or degree by which one software
component influences the execution of another software component.”
These analyses identify any gaps and guide targeted verification activities, ultimately providing evidence
that all couples have been validated.
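As a simple hypothetical example (function names invented for illustration), a control couple arises wherever one component decides whether or when another executes:

    void actuator_cmd(double demand);            /* defined in another component */

    void mode_logic(bool engaged, double demand)
    {
        if (engaged) {
            actuator_cmd(demand);   /* the control couple: requirements-based
                                       tests must be shown to exercise this
                                       call site                              */
        }
    }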
Data Coupling
Data coupling is defined in the glossary of DO-178C to be “The dependence of a software component on
data not exclusively under the control of that software component”. DO-178C §6.4.4.2.c requires “Analysis
to confirm that the requirements-based testing has exercised the data and control coupling between
components”.
Data coupling analysis is focused on the observation and analysis of data elements as they are set and
used (“set/use pairs”) across software component boundaries.
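A minimal sketch (hypothetical names and components) of a set/use pair crossing a component boundary – the pairing that data coupling analysis must confirm has been exercised by requirements-based tests:

    /* component_a.cpp: the "set" side                            */
    double g_fuel_flow = 0.0;      /* datum not exclusively under
                                      the control of either side  */
    void update_fuel_flow(double kg_per_s) { g_fuel_flow = kg_per_s; }

    /* component_b.cpp: the "use" side, in a separate component   */
    extern double g_fuel_flow;
    double endurance_s(double fuel_kg) { return fuel_kg / g_fuel_flow; }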
Worst Case Execution Time
Timing analysis aims to establish an upper execution time bound for each task (called the Worst Case
Execution Time, or WCET). The WCET depends on both the target hardware and the software.
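A minimal sketch of high-water-mark timing measurement, assuming a hypothetical read_cycle_counter() target facility; measurements of this kind support, rather than replace, WCET analysis:

    extern unsigned long read_cycle_counter();   /* hypothetical target timer */
    void task_under_test();

    unsigned long observed_max_cycles(void)
    {
        unsigned long worst = 0;
        for (int i = 0; i < 10000; ++i) {        /* repeat under varied conditions */
            const unsigned long t0 = read_cycle_counter();
            task_under_test();
            const unsigned long t1 = read_cycle_counter();
            if ((t1 - t0) > worst) {
                worst = t1 - t0;
            }
        }
        return worst;
    }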
It is increasingly common practice to deploy multicore processors (MCPs) in avionics applications. These
MCP devices almost always share hardware resources outside the processor cores, and time-related delays
occur as users wait for access to these Hardware Shared Resources (HSRs).
The LDRA tool suite facilitates a practical, A(M)C 20-193 compliant approach to the optimization of system
configuration, informed by interference analysis. It leverages execution time measurement using the TBrun
component of the LDRA tool suite, supported by the optional TBwcet module [22].
There are many highly effective robust partitioning mechanisms available. These mitigate many of the
most significant interference channels – but not all. The onus remains on the developer to demonstrate that
interference mitigation is effective, which can only be achieved through measurement.
Supplements to DO-178C
When DO-178C superseded DO-178B, four supplements were created to address the need for clarity and
guidance on modern software development practices within the context of aviation software certification.
They add, delete, or modify objectives, activities, and life cycle data in DO-178C to cater to the specific
needs of these technologies.
Certification authorities assess tool qualification on a project-by-project basis, so the responsibility
for showing the suitability of any tools falls to the organisation developing the application. However,
they can leverage Tool Qualification Support Packages (TQSPs). LDRA TQSPs contain a series of documents,
including tool operational requirements that identify the development process needs satisfied by the tool,
and test cases to demonstrate that the tool is operating to specification in the verification environment.
DO-331 also discusses the need for plans to identify the test and test coverage activities that will be
satisfied at the model level, and those that are to be exercised on the target.
Verification on target
DO-331 §MB.6.8.2 goes on to suggest that several “specific tests should still be performed in the target
environment”. It identifies the various forms of verification objectives that can only be met on the target.
It also lists various types of errors that can and cannot be revealed at the simulation level and can only be
detected on the target hardware.
Finally, DO-331 §MB.B.11 (FAQ #11) addresses model coverage activity, suggesting that model coverage
analysis cannot usually take the place of structural coverage analysis as per DO-178C §6.4.4.2.
The integration of test and modelling tools helps to achieve that seamlessly, including the static analysis of
generated code, the collection of code coverage from model execution, and the migration of model tests
into an appropriate form for execution on the target hardware.
• §OO.6.7.1 Verify local type consistency: Ensuring that “each class passes all the tests of all its
parent types which the class can replace” by means of Liskov Substitution Principle tests is usually the
most practical approach here.
• §OO.6.8.1 Verify the use of dynamic memory management is robust: A range of static and dynamic
analysis techniques can be deployed to satisfy DO-332 Table A-7 objective OO.6.8.1 and to address the
related vulnerabilities outlined in Annex OO.D.1.6.1.
Tracking memory allocation and deallocation helps to ensure the proper freeing of memory, as do
associated checks prior to dereferencing. Low-level testing provides a mechanism to explore various
allocation/deallocation scenarios to help ensure that vulnerabilities are addressed. Timing hooks within
low-level tests help characterize allocation/deallocation timing, and dynamic data flow analysis monitors
data elements at runtime to detect lost updates and stale references.
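One such technique, sketched here under the assumption that global allocation can be intercepted, counts outstanding allocations so that a low-level test can assert a scenario leaks no memory:

    #include <cstdlib>
    #include <new>

    static long g_live_allocations = 0;   /* net outstanding allocations */

    void* operator new(std::size_t size)
    {
        void* p = std::malloc(size);
        if (p == nullptr) {
            throw std::bad_alloc();
        }
        ++g_live_allocations;
        return p;
    }

    void operator delete(void* p) noexcept
    {
        if (p != nullptr) {
            --g_live_allocations;
            std::free(p);
        }
    }

    /* After a test scenario completes, a non-zero g_live_allocations
       count indicates a leak to investigate.                          */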
DO-333 outlines various formal methods tools, such as theorem provers, model checkers, and
abstract interpretation tools, and provides criteria for their qualification. It details how these formal
methods can be leveraged to verify complex systems, complementing the traditional approaches
established by DO-178C. Where formal methods are leveraged as part of a safety case, DO-333 helps
ensure that safety-critical systems meet the usual stringent reliability and performance standards.
Conclusions
The use of traceability, test management and static/dynamic analysis tools for an airborne software
project that meets the DO-178C certification requirements offers significant productivity and cost benefits.
Tools generally make compliance checking easier, less error prone and more cost effective. In addition,
they make the creation, management, maintenance and documentation of requirements traceability
straightforward and cost effective.
In particular, the provision of an automated, comprehensive software tool chain offering complete ‘end-
to-end’ traceability across the development life cycle, encompassing requirements, code, on-target tests,
artifacts, and objectives with both static and dynamic analysis capabilities is invaluable to developers and
project managers alike.
Development in compliance with DO-178C will never be an easy task. However, the right tools can be very
helpful in making such work as easy as it can possibly be.
Works cited
[1] RTCA, “RTCA,” [Online]. Available: http://www.rtca.org/. [Accessed 14th June 2024].
[2] EUROCAE, “EUROCAE,” [Online]. Available: https://eurocae.net/. [Accessed 14th June 2024].
[3] RTCA, Inc., RTCA DO-178C - Software Considerations in Airborne Systems and Equipment Certification,
RTCA, Inc., 2011.
[4] European Organization for Civil Aviation Equipment (EUROCAE), ED-12C - Software Considerations in