Presentation by Nick Tudor and Duncan Brown

DO178C Overview
Duncan Brown - Rolls-Royce
Nick Tudor - QinetiQ
Agenda
 Why we need a new document
 Structure of the Special Committee
 Progress to date
 Specific overview of SG6 – Formal Methods
Why we need a new document
The DO-178B / ED-12B
 Takes into account the inputs, constraints and requirements from all the stakeholders:
– Consensus between Airframe Manufacturers, Equipment suppliers and Certification Authorities
 DO-178B / ED-12B was written as much as possible as a "requirements oriented" document:
– It tries not to be prescriptive about the means, so it is less sensitive to technology evolution
 DO-178B / ED-12B is a "process oriented" document:
– In 1992, we could not imagine getting sufficient assurance on the software product just by assessing it
 More than 10 years of use have not revealed major safety flaws
So why change?
 Because FAA wants to...
 But also because we need it …
DO-178B / ED-12B was released in 1992:
– In 1992, software engineering was 24 years old...
– In 2005, software engineering is 50% older than it was in 1992…
 Changing for a better consensus, taking into account:
– Legacy from the clarification group
– Lessons learnt in applying DO-178B / ED-12B
– Newly available industrial means
A number of issues that need to be resolved
 EUROCAE WG 52 / RTCA SC 190 main points:
– Should safety-specific considerations be addressed in DO-178?
– Configuration control requirements are too high for tools
– Integration process vs. integration testing
– Finding common mode errors is not really addressed
– Not enough goal oriented:
 DO-178B/ED-12B forces the applicant to address the objectives directly, which may not be applicable for a given technology
 Not all the objectives in the Annex tables are true objectives: some are specific means of compliance (MC/DC, sketched below), so alternative means of compliance are not feasible
– COTS issue not addressed in the same way in DO-178B & DO-278
 Recent developments show that these issues are not theoretical…
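Since MC/DC is cited above as a means of compliance written into the objectives themselves, a minimal sketch of the criterion may help; the function, its name and the test vectors below are invented for illustration, not taken from DO-178B.

```cpp
#include <cassert>

// Hypothetical decision with three conditions: (a && b) || c.
bool command_cutoff(bool a, bool b, bool c) {
    return (a && b) || c;
}

int main() {
    // MC/DC requires each condition to be shown to independently
    // affect the decision's outcome. For n conditions, n + 1 test
    // cases can suffice; here, four:
    assert(command_cutoff(true,  true,  false) == true);   // baseline
    assert(command_cutoff(false, true,  false) == false);  // only a changed: outcome flips
    assert(command_cutoff(true,  false, false) == false);  // only b changed: outcome flips
    assert(command_cutoff(true,  false, true)  == true);   // only c changed: outcome flips
    return 0;
}
```

The complaint in the bullet above is that naming a specific measure like this inside the objective leaves no room for an alternative means of compliance.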
Fall 2004: RTCA Ad-hoc SW meeting
 Ad-hoc SW meeting convened by RTCA in October:
– US members: FAA, NASA, Boeing, Honeywell, Rockwell, P&W, United Technologies, Avidyne, consultants
– European guests (D. Hawken, P. Heller, R. Hannan, G. Ladier)
 161 issues identified on DO-178B:
– 71 issues: clarification or consistency
– 90 issues: new guidance, helpful or needed for most of them
 Amongst the top 20 issues:
– Model based development
– Development tool qualification criteria
– Analyzing computer resource usage and WCET (a worked margin check follows this slide)
– Separate document for CNS/ATM
– Security Guidance
– Static verification
[Slide callouts, reconstructed from the scattered text:]
– "Need guidance on how to build the models, validate models, etc."
– "Section 6 of 178B requires 'testing'. There are now other techniques that are more rigorous."
– "The criteria for development tool qualification is too […]. Not clear how to address structural coverage and dead code issues when using such tools."
– "Memory and timing margins, WCET, are difficult to analyze; no method is exhaustive. Do we now also need an 'objective' that addresses WCET? Need expressive requirements languages to be used to verify source code; analyzing using models."
– "Really needs to be addressed in the safety assessment and ARP 4754/4761 - then flows down to the software."
– "Consider adding DO-278 […] to DO-178[ ]."
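To make the WCET item concrete, here is a worked timing-margin check of the kind such an objective might ask for; the frame period, WCET figure and 20% threshold are invented for the sketch.

```cpp
#include <cassert>

int main() {
    const double frame_ms = 10.0;  // computation frame period (assumed)
    const double wcet_ms  = 7.5;   // analysed worst-case execution time (assumed)
    const double margin   = (frame_ms - wcet_ms) / frame_ms;

    // A project standard might demand, say, a 20% timing margin.
    // The point of the issue above is that DO-178B has no explicit
    // objective requiring such an analysis at all.
    assert(margin >= 0.20);  // 25% margin here, so the check passes
    return 0;
}
```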
Terms of reference
 The ad-hoc SW group has proposed to:
– Maintain the current objective-based approach for SW assurance
– Modify (minimum changes) DO-178B/ED-12B > DO-178C/ED-12C
– Develop rationale for each objective and package
– Develop guidelines that provide information (DO-248B/ED-94B)
– Develop supplements to document technology-specific or method-specific guidance
[Slide callouts: "In American English, Guidance is stronger than Guidelines"; "In French, Guidance is Directives, Guidelines are Recommandations".]
Supplements:
– May provide alternate means to satisfy DO-178C/ED-12C objectives
– Possible supplements: Tool qualification, Model-based development, Object-oriented technology, Formal methods, …
– On this basis, RTCA and EUROCAE agreed to commence new committees
Structure of the Special
Committee/Working Group
Joint between EUROCAE and RTCA
WG-71/SC-205 Structure
Joint committee WG-71/SC-205
Executive Committee
Joint Chair: Gerard Ladier [Airbus]
Joint Chair: Jim Krodel [P&W]
Joint Sec: Ross Hannan [Sigma]
Joint Sec: Mike DeWalt [CSI]
FAA Rep: Barbara Lingberg [FAA]
Sub groups:
SG1: Sub Group Coordination
SG2: Issue Review & Rationale
SG3: Tools
SG4: Model Based Design
SG5: Object Oriented Technology
SG6: Formal Methods
SG7: Safety CNS/ATM
Membership from airframers, avionics suppliers, certification authorities, engine manufacturers, CNS/ATM specialists, the Space community and consultants
Progress to date
Joint between EUROCAE and RTCA
Sub Group 1 – Document Integration
Chairs
Tom Ferrell (Ferrell and Associates Consulting) and
Ron Ashpole (Bewicks Consulting)
 Focussing on issues within the current document, e.g.
– The Annex A tables do not accurately reflect the real requirements in the main document; hence people who focus only on the Annex tables have issues with the DER
 Information Paper system for managing work products
 Technology Supplement Template
 “Hooks” in DO-178C core document to link in Technology
Supplements
 Changes to core document such as Errata
Sub Group 2 - Issues & Rationale
Chairs
Will Struck (FAA)
Ross Hannan (Sigma Associates)
 Believe it or not – they lost the rationale!
– Discussion on why we use MC/DC (or not)
– Will coordinate activities relating to dead/deactivated code (different for some technologies? see the sketch after this slide)
 Rationale for objectives should be released to WG May 07
 Also new objectives will be set as a result
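As a rough illustration of the dead/deactivated distinction SG2 will coordinate (the code and the build flag are hypothetical):

```cpp
// Deactivated code: intentionally present but disabled in this build
// configuration; the deactivation mechanism itself must be verified.
#if defined(VARIANT_WITH_ICE_PROTECTION)  // hypothetical build flag
void run_ice_protection();
#endif

// Dead code: unreachable by any configuration or input; structural
// coverage analysis will flag it, and the usual resolution is removal.
int clamp_duty_cycle(int d) {
    if (d > 100) { return 100; }
    if (d < 0)   { return 0; }
    if (d > 200) { return 200; }  // dead: already excluded by the first test
    return d;
}
```

Whether such code is acceptable, and how the analysis applies to code generated from models, is exactly the technology-specific question flagged above.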
Sub Group 3 – Tool Qualification
Chairs
Leanna Rierson (Digital Safety Consulting)
Frederick Pothon (ACG Solutions)
 Likely new approach for tool qualification:
– Aim is to develop a qualification approach that meets the needs of development tools and keeps the current approach for current classes of verification tools (as far as possible),
– enable re-use of tool qualification data,
– allow emerging tool technologies,
– identify users and developers,
– objective based approach,
– used by multiple domains (system/software).
– Propose a re-write of section 12.2, with the aim to assess the impact of the tool rather than to determine the level directly.
– New DO-XXX/ED-XXX document: objective based, by level
– Has some merit in that tool manufacturers don’t necessarily have to
be software certification experts.
Sub Group 4 – Model Based Design (&
Verification)
Chairs
Mark Lillis (Hamilton Sundstrand)
Pierre Lionne (EADS)
 Split in agendas
– Some wish to do things because they have a technology
– Others wish to go back to first principles (as advised by Exec)
 An opportunity is being lost, as what any MBD approach needs to demonstrate is not being discussed in the abstract
– Not addressing syntax/semantics
– Nothing said about relationship to existing objectives
– Diving into low level issues
 Biggest discussion topics to date have been
– What is the difference between High-Level and Low-Level
requirements?
– What is source code?
Sub Group 5 – OO Technologies
Chairs
Jim Chelini (Verocel Inc)
Peter Heller (Airbus)
 Unlikely to be much new because of FAA OOTiA work
– Aim would be to withdraw this following completion of DO-178C
and supplements
– However not clear if an OO supplement is required
 Much initial work was creating FAQs/Discussion Papers for DO-248 and minor changes for DO-178C; this was challenged, as the intent of the working group is to consolidate guidance.
 Will address some structural coverage issues specific to OO and polymorphism (see the sketch after this slide)
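A minimal sketch of the polymorphism point (the types and names are invented): one virtual call site can dispatch to several method bodies, so covering the call site says nothing about coverage of the individual overriders.

```cpp
struct Sensor {
    virtual ~Sensor() = default;
    virtual int read() const = 0;
};

struct PitotSensor : Sensor {
    int read() const override { return 1; }  // body A
};

struct StaticPortSensor : Sensor {
    int read() const override { return 2; }  // body B
};

int sample(const Sensor& s) {
    // One call site, several possible targets: statement coverage of
    // this line does not imply both read() bodies were exercised.
    return s.read();
}
```

Structural coverage criteria written with direct calls in mind therefore need reinterpretation for dynamic dispatch.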
Sub Group 6 – Formal Methods
Chairs
Kelly Hayhurst (NASA)
Duncan Brown (Rolls-Royce)
 Established a definite need for a Formal Methods
technology supplement
 Decided to separate the case studies and tutorial
information into a discussion paper
 Proposed a rewrite of Section 6 (Verification) because it makes the use of technology supplements easier (a sketch of the kind of argument involved follows this slide)
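As a flavour of what a formal-methods supplement might let an applicant claim, a minimal sketch (the requirement and the function are invented): a static range argument establishes a property for all inputs, where testing can only sample them.

```cpp
#include <cstdint>

// Invented low-level requirement: return the truncated mean of two
// sensor readings, with no possibility of signed overflow.
int32_t average(int32_t a, int32_t b) {
    // Widening to 64 bits makes (a + b) provably in range for every
    // pair of 32-bit inputs; a static analysis can discharge this for
    // all 2^64 input pairs, where a test campaign samples a handful.
    return static_cast<int32_t>((static_cast<int64_t>(a) + b) / 2);
}
```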
Sub Group 7 – CNS/ATM & Safety
Chairs
Don Heck (Boeing)
David Hawken (National Air Traffic Services Limited)
 Issues surrounding merge of DO-278 (CNS/ATM) and DO-178 (Airborne)
– Likely to happen
– Domain specific sections where applicable – annexes?
 Links to system safety considerations
 Decided that there is not to be Level A+
Why section 6 must change
The Problem
 DO-178B Section 6 calls explicitly for Review, Analysis and
Test rather than setting out the objectives for verification
and leaving the applicant to create the appropriate
verification plan.
 However, specific methods are deemed the only way to meet specific objectives.
 There are issues with specific technologies such as Formal
Methods, OO and MBD with respect to how the current
section 6 objectives can be met.
 The existing material can benefit from additional
clarification.
The Proposal
 To re-arrange the DO-178B section 6 material according to
life cycle data.
 To explicitly state the objectives for the verification of each
life cycle data item.
 To retain the complete testing process under the
verification of executable object code.
 To generalise the wording to use verification instead of
specific methods (Review, Analysis & Test).
 To ensure that DO-178C on its own is as forceful with
regard to testing as DO-178B.
The Verification Process
[Diagram, reconstructed as an outline of verification objectives per life cycle data item:]
System Requirements → High-Level Requirements (A-2: 1, 2):
– A-3.1 Compliance, A-3.2 Accuracy & Consistency, A-3.3 HW Compatibility, A-3.4 Verifiability, A-3.5 Conformance, A-3.6 Traceability, A-3.7 Algorithm Accuracy
High-Level Requirements → Software Architecture and Low-Level Requirements (A-2: 3, 4, 5):
– Low-Level Requirements: A-4.1 Compliance, A-4.2 Accuracy & Consistency, A-4.3 HW Compatibility, A-4.4 Verifiability, A-4.5 Conformance, A-4.6 Traceability, A-4.7 Algorithm Accuracy
– Software Architecture: A-4.8 Architecture Compatibility, A-4.9 Consistency, A-4.10 HW Compatibility, A-4.11 Verifiability, A-4.12 Conformance, A-4.13 Partition Integrity
Software Architecture and Low-Level Requirements → Source Code (A-2: 6):
– A-5.1 Compliance, A-5.2 Compliance, A-5.3 Verifiability, A-5.4 Conformance, A-5.5 Traceability, A-5.6 Accuracy & Consistency
Source Code → Executable Object Code (A-2: 7):
– A-5.7 Complete & Correct
– Against High-Level Requirements: A-6.1 Compliance, A-6.2 Robustness
– Against Low-Level Requirements: A-6.3 Compliance, A-6.4 Robustness
– A-6.5 Compatible With Target
Compliance: with requirements. Conformance: with standards.
The Verification Process – Levels A, B, C and D
[The same objectives diagram is repeated for each of Levels A to D; which objectives apply at each level was shown graphically and is not preserved in the extracted text.]
The Verification Process – Level E
[Only the Executable Object Code remains in the diagram at Level E.]
Comparison of Old -> New
Old:
6.0 SOFTWARE VERIFICATION PROCESS
6.1 Software Verification Process Objectives
6.2 Software Verification Process Activities
6.3 Software Reviews and Analyses
6.3.1 Reviews and Analyses of the High-Level Requirements
a. Compliance with system requirements
b. Accuracy and consistency
c. Compatibility with the target computer
d. Verifiability
e. Conformance to standards
f. Traceability
g. Algorithm aspects
6.3.2 Reviews and Analyses of the Low-Level Requirements
a. Compliance with high-level requirements
b. Accuracy and consistency
c. Compatibility with the target computer
d. Verifiability
e. Conformance to standards
f. Traceability
g. Algorithm aspects
New:
6.0 SOFTWARE VERIFICATION PROCESS
6.1 Software Verification Process Objectives
6.2 Software Verification Process Activities
6.3 Detailed Guidance for Verification Activities
6.3.1 Verification Activities for the High-Level Requirements
a. Compliance with system requirements
b. Accuracy and consistency
c. Compatibility with the target computer
d. Verifiability
e. Conformance to standards
f. Traceability
g. Algorithm aspects
6.3.2 Verification Activities for the Low-Level Requirements
a. Compliance with high-level requirements
b. Accuracy and consistency
c. Compatibility with the target computer
d. Verifiability
e. Conformance to standards
f. Traceability
g. Algorithm aspects
Comparison of Old -> New
Old:
6.3.3 Reviews and Analyses of the Software Architecture
a. Compliance with high-level requirements
b. Consistency
c. Compatibility with the target computer
d. Verifiability
e. Conformance to standards
f. Partitioning integrity
6.3.4 Reviews and Analyses of the Source Code
a. Compliance with low-level requirements
b. Compliance with the software architecture
c. Verifiability
d. Conformance to standards
e. Traceability
f. Accuracy and consistency
6.3.5 Reviews and Analyses of the Outputs of the Integration Process
New:
6.3.3 Verification Activities for the Software Architecture
a. Compliance with high-level requirements
b. Consistency
c. Compatibility with the target computer
d. Verifiability
e. Conformance to standards
f. Partitioning integrity
6.3.4 Verification Activities for the Source Code
a. Compliance with low-level requirements
b. Compliance with the software architecture
c. Verifiability
d. Conformance to standards
e. Traceability
f. Accuracy and consistency
6.3.5 Verification Activities for the Executable Object Code
a. Completeness and correctness
b. Compliance with the high-level requirements
c. Robustness for high and low-level requirements
d. Compliance with the low-level requirements
e. Compatibility with the target computer
6.3.5.1 Software Testing
6.3.5.2 Test Environment
Comparison of Old -> New
Old:
6.3.6 Reviews and Analyses of the Test Cases, Procedures, and Results
6.4 Software Testing Process {Transferred to section 6.3.5.1}
6.4.1 Test Environment {Transferred to section 6.3.5.2}
6.4.2 Requirements-Based Test Case Selection {Transferred to 6.3.5}
6.4.2.1 Normal Range Test Cases {Transferred to 6.3.5}
6.4.2.2 Robustness Test Cases {Transferred to 6.3.5}
6.4.3 Requirements-Based Testing Methods {Transferred to 6.3.5}
6.4.4 Test Coverage Analysis {Transferred to section 6.3.6.1}
6.4.4.1 Requirements-Based Test Coverage Analysis {Transferred to section 6.3.6.1.1}
6.4.4.2 Structural Coverage Analysis {Transferred to section 6.3.6.1.2}
6.4.4.3 Structural Coverage Analysis Resolution {Transferred to section 6.3.6.1.3}
New:
6.3.6 Verification Activities for the Analyses, Test Cases, Procedures and Results
a. Analysis and Test cases
b. Analysis and Test procedures
c. Analysis and Test results
6.3.6.1 Coverage Analysis
6.3.6.1.1 Requirements Coverage Analysis
6.3.6.1.2 Structural Coverage Analysis
6.3.6.1.3 Structural Coverage Analysis Resolution
Major Comments Raised
 The paper lowers the bar for testing significantly (to zero!)
 Review and analysis are the only applicable methods for
verification of higher level life cycle data.
 Testing is the only applicable method for meeting the
verification of the executable object code.
Summary
 The latest revision of the paper emphasises the reliance on testing where there are no other accepted means of verification.
 DO-178C used alone needs to be as forceful in the need
for testing as DO-178B
 It now says that “…testing is necessary to ensure that the
executable object code is compatible with the target
computer”
 Only the use of approved guidance in conjunction with DO-178C could alter the amount of testing required.
 Paper to be agreed by SG6 at interim meeting mid-year.
 Plenary consensus to be sought in Vienna in the Autumn.
 There is “significant” backing for this paper.
 This provides a way to use Formal Methods as part of certification; the technology supplement will provide the "how"
IP 601 Rev B (Draft)