
DO-178B to 178C: Avoiding the Unlucky 13 Mistakes, November 2011

Vance Hilderman, President Atego HighRely (vance.hilderman@atego.com)


© 2011 Atego. All Rights Reserved.

Agenda

- The Unlucky 13: Predicted Top DO-178C Mistakes (Synopsis)
- DO-178B in Three Minutes
- DO-178B Weaknesses Today
- DO-178C in Five Minutes
- The Unlucky 13 DO-178C Mistakes: Details
- Questions & Answers


About Atego HighRely


- Direct affiliation with Atego tools: Atego Process Director, Artisan Studio, etc.
- The World's Fastest-Growing Avionics Services/Products Company:
  - 30% Avionics Software Engineering
  - 20% Avionics Systems Engineering
  - 20% Avionics Software/Hardware Testing
  - 10% Project Management
  - 10% Strategy, Gap Analysis, JumpCert
  - 10% DERs/Certification
- Largest repository of DO-178B & DO-254 white papers
- One-stop supplier for all your avionics development needs!

Are You Feeling Lucky?


If you are truly lucky, just dive into DO-178C and gamble.

Otherwise, watch out for the following predicted top mistakes: the Unlucky 13.


Prediction: Top DO-178C Mistakes (The Unlucky 13)


1. Failing to consider all of DO-178C as an integrated eco-system
2. Insufficient PSAC per ARP-4754A
3. Weak Quality Assurance
4. Inadequate and non-automated traceability within models
5. Inadequate level of detail within requirements
6. Lack of path coverage capture during functional tests
7. Tool Qualification: insufficient coverage & insufficient PSAC
8. Applying Formal Methods to reduce verification without a formal notation
9. Failing to treat parameters as full DO-178C software
10. Avoiding modeling, due to perceived qualification difficulty
11. Unrestricted C++ usage
12. Not considering Reusable Software Components (RSC)
13. Not considering additional Level A verification (MC/DC & object code correlation)


Review: DO-178B In Three Minutes


First, a Re-Cap

What is DO-178?
Certification standards for airborne equipment:

- DO-178 => Software
- DO-254 => Hardware

Regulated by the FAA. Covers the full engineering lifecycle:

- Planning (CM, QA, Development, Testing)
- Development (Requirements / Design / Implementation)
- Testing / Verification
- Certification


DO-178: Evolution History Through DO-178C


Doc       Year      Basis         Themes
DO-178    1980-82   498 & 2167A   Artifacts, documents, traceability, testing
DO-178A   1985      DO-178        Processes, testing, components, four criticality levels, reviews, waterfall methodology
DO-178B   1992      DO-178A       Integration, transition criteria, diverse development methods, data (not documents), tools
DO-178C   2008-     DO-178B       Reducing subjectivity; address modern software technologies

Three Key DO-178 Processes


1. Planning Process: occurs first
2. Development Process: follows Planning
3. Correctness Process: continuous throughout the project

Optimal DO-178 Engineering Route (by Vance Hilderman)


Flowchart summary (original diagram flattened). Planning phase: Safety Assessment & Requirements; System Requirements; Develop Plans, Standards, Checklists; Implement CM; Start QA; Develop Traceability; SOI #1. Development & Correctness phases: High-Level Requirements; Low-Level Requirements; Design; Code & Logic; Review; Integration; Verification & Validation; Conformity; Certification; SOI #2 through SOI #4.

DO-178B Five Key Plans

1. PSAC: Plan for Software Aspects of Certification
2. SQAP: Software Quality Assurance Plan
3. SCMP: Software Configuration Management Plan
4. SWDP: Software Development Plan
5. SWVP: Software Verification Plan

*** Plus three Standards: Requirements, Design, and Coding


Scope of DO-178B
Diagram: a typical avionics LRU contains a CPU, PLD, FPGA, ASIC, and software (RTOS, BSP, drivers, application software, math libraries). The software falls under DO-178B; the programmable logic devices (PLD, FPGA, ASIC) fall under DO-254.

DO-178B Weaknesses Today


(Hence Why We Need DO-178C)

Key DO-178B Weaknesses Today


- "Additional Considerations" sometimes not fully addressed
- Software treated as a standalone component, instead of an integral part of a system and aircraft
- Possible shortcuts to circumvent DO-178B's true intent
- Lack of consistent guidance for modern software technologies:
  - Model-based development & tools
  - Object-Oriented Technologies (OOT)
  - Formal requirements notation and proofs
- Incomplete Level A objective coverage
- Advancements in tools, implying more appropriate qualification

Thus: DO-178C

DO-178C In Five Minutes



DO-178C Preview
Almost 20 years since DO-178B was released, and the software landscape has changed. Advancements in:

- Tools & automation
- Modeling & Object-Oriented Technology
- Formal methodologies

The commercial world has embraced the above; avionics has slowly followed.


DO-178C Preview
- Since 2005, committees have met to discuss and update DO-178B
- Like 178B, includes industry & certification agencies
- Unlike 178B, more tool vendors, with an obvious focus on the acceptability of certain types of tools (particularly theirs)
- Predominantly America & Europe, in nearly equal measure; quarterly meetings


DO-178C Preview: Seven Sub-Groups (SGs)


- SG1: Document Integration
- SG2: Issues & Rationale
- SG3: Tool Qualification
- SG4: Model-Based Design (MBD) & Verification
- SG5: Object-Oriented (OO) Technology
- SG6: Formal Methods (FM)
- SG7: Safety-Related Considerations (and ground-based systems)


DO-178C Preview
Unlike the DO-178A to DO-178B update, the core update in 178C is modest. Instead, changes are handled via four Supplements, which clarify:

- Tools Supplement
- MBD Supplement
- OO Supplement
- FM Supplement


Tool Qualification
DO-178B / two criteria:

- Development
- Verification

DO-178C / three criteria:

1. Development
2. Verification that also augments other development or verification activities
3. Verification only

Five Tool Qualification Levels, determined by the criterion met and the software level (A, B, or C). The more demanding levels require Tool Operational Requirements (TOR), tool architecture, and additional verification; the least demanding require TOR verification only.


MBD & OO continued


DO-178B:

- No explicit provisions
- Assumes structured design
- OO acceptance, but user-defined (subjective)
- Maximize determinism & visibility
- Weak on OO and MBD traceability
- Weak on structural coverage application to OO & models

DO-178C:

- Allow controlled modeling & OO
- Bound MBD & OO acceptability
- Emphasize traceability
- Address memory management & exception handling
- Verify type consistency (substitutability): each subclass passes all tests applicable to its parent
- Verify all callable methods for each invocation
- Emphasize detailed MBD & OO design standards
- Allow defined generics
- Acceptable virtualization (code versus data)


Memory Management
DO-178B: No explicit provisions
DO-178C: Verify common vulnerabilities of memory managers:

- Fragmentation
- Ambiguous references
- Heap memory
- Deallocation
- Garbage collection (tightly constrained, but allowable)


Formal Methods
DO-178B: No explicit provisions (but formal methods were commonly applied, subjectively, and via ED-12B in Europe)
DO-178C: Recognizes acceptance of formal methods for:

- Requirements correctness, consistency, and reviews
- Source-code reviews, particularly of autocode generated from models (low-level requirements)
- Test cases covering low-level requirements
- Replacement of some forms of testing with formal-method-based reviews
- Potential to reduce testing via code analysis


Additional DO-178C Changes


Numerous seemingly minor changes to tighten criteria and reduce subjectivity

Collectively: a mind shift to improve avionics quality and determinism

DO-178C: a move to a System/Aircraft Eco-System


The Unlucky 13 DO-178C Mistakes


The Details

Top DO-178C Mistakes (The Unlucky 13)


1. Failing to consider all of DO-178C as an eco-system
2. Insufficient PSAC per ARP-4754A
3. Weak Quality Assurance
4. Inadequate and non-automated traceability within models
5. Inadequate level of detail within requirements
6. Lack of path coverage capture during functional tests
7. Tool Qualification: insufficient coverage & insufficient PSAC
8. Applying Formal Methods to reduce verification without a formal notation
9. Failing to treat parameters as full DO-178C software
10. Avoiding modeling, due to perceived qualification difficulty
11. Unrestricted C++ usage
12. Not considering Reusable Software Components (RSC)
13. Not considering additional Level A verification (MC/DC & object code correlation)


Mistake #1: Failing to Consider DO-178C Eco-System


- DO-178B users often picked and chose their preferred objectives, failing to consider other related DO-178B objectives
- DO-178C users will need to consider ALL objectives simultaneously
- DO-178C users: fully consider and utilize ARP 4754A
  - Consider interrelationships between software, hardware, system, and aircraft
  - Consider safety impacts (via ARP 4761)
  - Do the above through the entire software development lifecycle


The Immediate DO-178C Eco-System

Diagram: System Development per ARP 4754A allocates requirements to software (SW Rqmts, verified by tests under DO-178B/C) and to hardware (HW Rqmts, verified by tests under DO-254); the Safety Assessment per ARP 4761 drives the architecture and criticality level.

Mistake #2: Insufficient PSAC per ARP 4754A and ARP 4761
- Related to Mistake #1
- Goal: more consistent and justifiable criticality-level determination, and application of the corresponding lifecycle
- DO-178C users: heed ARP 4754A and address system-level development


Mistake #3: Weak Quality Assurance


DO-178B users sometimes employed weak QA:

- Processes/standards were relegated to engineering, not QA
- QA inspected instead of assessing and enforcing processes
- Failed to fully consider and audit suppliers

DO-178C users: apply empowered and strong QA


Software Quality Assurance Plan


Mistake #4: Inadequate and non-automated traceability within Models


- Software modelling is the future, particularly for complex systems
- Modelling can greatly improve software schedule and quality
- Modelling introduces potentially oblique traceability issues

DO-178C users:

- Adopt modelling and automated code generation
- Specify the modelling traceability strategy in the SDP
- Formalize modelling traceability review and usage within verification


Mistake #5: Inadequate Level of Detail Within Requirements


- Experience has fully shown that requirements are a key ingredient of quality software
- Studies reiterate that most software defects are due to weak requirements
- DO-178B users often took shortcuts on requirements:
  - Failed to fully elucidate low-level requirements
  - Filled in the gaps by adding excessive structural-coverage tests
- DO-178C will require most source-code branches to trace directly to low-level requirements
  - Bonus question: what, then, will be the difference between Level C and Level B?


Mistake #6: Lack of Path Coverage Capture During Functional Tests


- Related to Mistake #5
- DO-178C will necessitate requirements for most code paths
- Functional Testing = Requirements Testing
- Assess requirement quality by correlating to structural coverage
  - Aim for 90% decision coverage (DC) during functional testing

Diagram: functional tests (normal-range tests and robustness tests) feed structural coverage analysis.


Mistake #7: Tool Qualification: Insufficient Coverage & Insufficient PSAC


- DO-178B had simplistic tool qualification criteria (think 1990 and the state of software development twenty years ago)
- DO-178C recognizes major advances in avionics development tools; tool vendors were major contributors to DO-178C
- DO-178C will require more thoughtful application of Tool Qualification:
  - Consider the tool's application and the criticality level of the associated software
  - Specify all software-related tools in the PSAC
  - Justify which tools will not require qualification, and why

(Tool Qualification Plan)


Mistake #8: Applying Formal Methods without Formal Notation


- Advanced and complex systems have logic relationships transcending space and time; traditional textual requirements are insufficient to quantify such relationships
- DO-178C provides Formal Methods to assist; Formal Methods may greatly enhance verification
- However, Formal Methods should use a formal requirements notation: formal notation quantifies relationships and allows verification traceability
- Specify such usage in the SDP and SVP

SDP: Software Development Plan; SVP: Software Verification Plan
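For instance, a textual requirement such as "whenever weight-on-wheels is false, reverser deployment shall be inhibited" can be restated in a formal notation. This hypothetical example uses linear temporal logic, with invented variable names:

```latex
% Hypothetical requirement in LTL: globally, if weight-on-wheels (wow)
% is false, then deploy_reverser must be false.
\Box \, \big( \lnot \mathit{wow} \rightarrow \lnot \mathit{deploy\_reverser} \big)
```

A notation like this gives reviewers and analysis tools an unambiguous statement to check against, which is what makes the verification credit described above defensible.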


Mistake #9: Failing to treat Parameters as full DO-178C Software


Under DO-178B, virtually anything could be defined via parameters:

- Constants, objects, variables, data, logic
- Such parameters are difficult to trace and fully verify, and were often skipped in requirements

DO-178C will require parameters to be treated just like regular embedded logic:

- Full application of all DO-178C objectives to parameters


Mistake #10: Avoiding Modelling, Due to Perceived Qualification Difficulty


Modelling tools develop software, therefore:

- A modelling tool failure may cause an error to be inserted in the software
- Development tool qualification is quite difficult and expen$ive

Due to modelling tool qualification difficulty, and expanded DO-178C Tool Qualification criteria, inexperienced users may wrongly avoid modelling.

HOWEVER: modelling tools do NOT require qualification!

- Simply verify the outputs of the tool, and their traceability, to avoid qualification


Mistake #11: Unrestricted C++ Usage


- C++ is increasingly used in avionics, even at Level A
- Traditional DO-178 roadblocks to C++ usage are largely removed
- However, this does NOT mean the full C++ feature set should be used:
  - Inheritance, polymorphism, memory allocation, etc. are all problematic
- Restrict C++ usage to a defined, safe subset (think MISRA C++)

Software Coding Standards


Mistake #12: Not Considering Reusable Software Components (RSC)


- Software development is increasingly expensive
- The major key to long-term cost reduction? Reusable Software Components
- The Holy Grail: software reusability
- The Problem: software re-certification
- The Solution: understand & apply Advisory Circular 20-148
  - Even if AC 20-148 is not formally applied, use it as guidance

(Advisory Circular 20-148: Reusable Software Components)


Mistake #13: Not Considering Additional Level A Verification


Under DO-178B, there were relatively minor differences between Levels A & B:

- Source-to-binary correlation
- MC/DC coverage
- Added independence

However, Level A systems must be 100X more reliable than Level B systems. DO-178C's solution:

- More stringent MC/DC criteria
- More stringent source-to-object-code correlation
Software Verification Plan


Additional Information
Email: info@atego.com
Website: www.atego.com
DO-178B/254 white papers: www.atego.com/wp

DO-178B websites:
- www.do178site.com
- www.do178blog.com

DO-254 websites:
- www.do254site.com
- www.do254blog.com
