5.4: Aerospace DO-178C Application
Introduction
This chapter demonstrates how ASPICE 4.0 processes integrate with DO-178C (Software Considerations in Airborne Systems and Equipment Certification) for aerospace software development. While ASPICE provides a comprehensive process framework for automotive embedded systems, DO-178C is the de facto standard for airborne software certification recognized by the Federal Aviation Administration (FAA), European Union Aviation Safety Agency (EASA), and other global aviation authorities.
Why Integrate ASPICE with DO-178C?
Organizations developing both automotive and aerospace systems benefit from a unified process framework that satisfies both standards. ASPICE 4.0's process-based approach aligns naturally with DO-178C's objectives-based structure, enabling:
- Process Reuse: Common development processes (requirements, architecture, design, verification) with standard-specific tailoring
- Evidence Harmonization: Work products that satisfy both ASPICE base practices and DO-178C objectives
- Tool Ecosystem: Shared toolchains with appropriate qualification levels for each domain
- Knowledge Transfer: Engineering teams moving between automotive and aerospace projects
- Audit Efficiency: Single process framework assessed against multiple standards
DO-178C Context
RTCA DO-178C (equivalent to EUROCAE ED-12C) was released in December 2011 as a revision to DO-178B. Key characteristics:
- Objectives-Based: Defines 71 objectives across software lifecycle processes
- Software Levels: Five levels (A-E) based on failure condition severity
- Evidence-Driven: Requires demonstrable evidence that objectives are satisfied
- Tool Qualification: Tools must be qualified per DO-330 when their output is used to eliminate, reduce, or automate DO-178C processes without that output being verified (DO-178C Section 12.2)
- Supplements: Modular guidance for model-based development (DO-331), object-oriented techniques (DO-332), and formal methods (DO-333)
Companion Standards:
- DO-330: Software Tool Qualification Considerations
- DO-178C Supplements: DO-331 (Model-Based), DO-332 (Object-Oriented), DO-333 (Formal Methods)
- ARP4754A: Guidelines for Development of Civil Aircraft and Systems (system-level)
- DO-254: Design Assurance Guidance for Airborne Electronic Hardware
Development Assurance Levels (DAL)
DO-178C defines five software levels based on the severity of failure conditions, as determined by the system safety assessment process (defined in ARP4754A).
DAL Definitions
| DAL | Failure Condition Category | Effect on Aircraft and Occupants | Probability Objective | Typical Examples |
|---|---|---|---|---|
| A | Catastrophic | Failure conditions that would prevent continued safe flight and landing | Extremely Improbable (<10⁻⁹ per flight hour) | Primary flight control (fly-by-wire), Engine control (FADEC), Auto-throttle in critical phases |
| B | Hazardous | Failure conditions that would have a large negative impact on safety or performance, or reduce crew ability to operate the aircraft due to physical distress or higher workload | Extremely Remote (<10⁻⁷ per flight hour) | Autopilot mode selection, Navigation system, Traffic collision avoidance (TCAS) |
| C | Major | Failure conditions that would significantly reduce aircraft safety margins or crew capabilities to cope with adverse conditions | Remote (<10⁻⁵ per flight hour) | Flight management system, Automatic landing system backup, Weather radar |
| D | Minor | Failure conditions that would slightly reduce aircraft safety margins or crew capabilities, or result in passenger discomfort | Probable (<10⁻³ per flight hour) | Cabin pressurization control, In-flight entertainment, Electronic flight bag |
| E | No Effect | Failure conditions that do not affect operational capability or crew workload | No probability requirement | Passenger Wi-Fi, Cabin lighting preferences |
Source: DO-178C Section 2.3, Table 2 (Relationship of Software Level to Failure Condition Category)
Verification Objectives by DAL
The number and rigor of verification objectives increase with DAL criticality:
| Process Area | DAL A | DAL B | DAL C | DAL D | DAL E |
|---|---|---|---|---|---|
| Software Planning Process | 7 objectives | 7 objectives | 7 objectives | 7 objectives | 0 objectives |
| Software Development Process | 8 objectives | 8 objectives | 8 objectives | 3 objectives | 0 objectives |
| Software Verification Process | 40 objectives | 37 objectives | 29 objectives | 14 objectives | 0 objectives |
| Software Configuration Management | 7 objectives | 7 objectives | 7 objectives | 7 objectives | 0 objectives |
| Software Quality Assurance | 6 objectives | 6 objectives | 6 objectives | 6 objectives | 0 objectives |
| Certification Liaison | 3 objectives | 3 objectives | 3 objectives | 3 objectives | 0 objectives |
| Total Objectives | 71 | 68 | 60 | 40 | 0 |
Source: DO-178C Annex A, Tables A-1 through A-10
Structural Coverage Requirements by DAL
| DAL Level | Statement Coverage | Decision Coverage | Modified Condition/Decision Coverage (MC/DC) | Data/Control Coupling | Object Code Review |
|---|---|---|---|---|---|
| A | Required | Required | Required | Required | Assembly review if compiler generates untraceable code |
| B | Required | Required | Not required | Required | Not required |
| C | Required | Not required | Not required | Required | Not required |
| D | Not required | Not required | Not required | Not required | Not required |
| E | Not required | Not required | Not required | Not required | Not required |
Key Coverage Definitions:
- Statement Coverage: Every statement in the code has been executed at least once during testing
- Decision Coverage: Every decision (branch) has been tested for both true and false outcomes
- MC/DC: Each condition in a decision has been shown to independently affect the decision outcome
- Data Coupling: Variables used in one part of code are properly passed to other parts
- Control Coupling: Correct execution flow between software components
Source: DO-178C Section 6.4 (Structural Coverage Analysis)
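The difference between decision coverage and MC/DC is easiest to see on a small compound decision. The sketch below uses a hypothetical engage interlock (not taken from DO-178C); the comments enumerate a minimal MC/DC test set.

```c
#include <stdbool.h>

/* Hypothetical autopilot engage interlock with one compound decision:
 *     wow_off && (alt_valid || ra_valid)
 *
 * Decision coverage needs only 2 tests (one true outcome, one false).
 * MC/DC additionally requires, for each of the 3 conditions, a pair of
 * tests in which only that condition changes and the outcome flips.
 * A minimal MC/DC set here is N+1 = 4 tests:
 *   (T,T,F) -> true    vs (F,T,F) -> false   : wow_off independent
 *   (T,T,F) -> true    vs (T,F,F) -> false   : alt_valid independent
 *   (T,F,T) -> true    vs (T,F,F) -> false   : ra_valid independent
 */
bool ap_engage_allowed(bool wow_off, bool alt_valid, bool ra_valid)
{
    return wow_off && (alt_valid || ra_valid);
}
```

Running the four vectors listed in the comment exercises MC/DC for this decision; the first and third alone would already satisfy decision coverage.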
ASPICE-to-DAL Mapping Matrix
This section maps ASPICE 4.0 processes to DO-178C DAL requirements, showing how ASPICE base practices contribute to satisfying DO-178C objectives.
System Engineering Processes (SYS.1-5)
| ASPICE Process | DO-178C Section | DAL A | DAL B | DAL C | DAL D | Mapping Notes |
|---|---|---|---|---|---|---|
| SYS.1 Requirements Elicitation | Section 2.2 (System Requirements Allocation) | Essential | Essential | Essential | Essential | System requirements allocated to software become high-level requirements in DO-178C |
| SYS.2 System Requirements Analysis | Section 2.3 (Software Level Determination) | Essential | Essential | Essential | Essential | System safety assessment determines DAL level; safety requirements flow to software |
| SYS.3 System Architectural Design | Section 2.4 (Architectural Considerations) | Essential | Essential | Essential | Recommended | Partitioning, redundancy, dissimilarity considerations |
| SYS.4 System Integration & Testing | Section 2.6 (System Considerations in Software Lifecycle) | Essential | Essential | Essential | Essential | Software integration testing feeds system validation |
| SYS.5 System Qualification Testing | ARP4754A (companion) | Essential | Essential | Essential | Essential | System-level validation includes software qualification |
Software Engineering Processes (SWE.1-6)
| ASPICE Process | DO-178C Objectives | DAL A Emphasis | DAL B Emphasis | DAL C Emphasis | DAL D Emphasis | Key Mapping |
|---|---|---|---|---|---|---|
| SWE.1 Software Requirements Analysis | Table A-3 (Verification of Outputs of Software Requirements Process) | 11 objectives | 11 objectives | 11 objectives | 3 objectives | ASPICE BP1-7 directly satisfy DO-178C high-level requirements objectives |
| SWE.2 Software Architectural Design | Table A-4 (Verification of Outputs of Software Design Process) | 9 objectives | 9 objectives | 7 objectives | 0 objectives | Software architecture must be traceable, verifiable, and conform to standards |
| SWE.3 Software Detailed Design & Unit Construction | Table A-4 (Software Design) + Table A-5 (Software Coding) | 12 objectives (combined) | 12 objectives | 10 objectives | 0 objectives | Low-level requirements + source code verification |
| SWE.4 Software Unit Verification | Table A-6 (Software Integration Process) | 4 objectives | 4 objectives | 4 objectives | 2 objectives | Unit testing contributes to structural coverage analysis |
| SWE.5 Software Integration & Integration Test | Table A-6 (Software Integration Process) | 4 objectives | 4 objectives | 4 objectives | 2 objectives | Integration testing demonstrates correct linkage and data flow |
| SWE.6 Software Qualification Testing | Table A-7 (Verification of Software Requirements + Software Lifecycle) | 16 objectives | 13 objectives | 8 objectives | 9 objectives | Requirements-based testing + robustness testing |
Supporting Processes (SUP.1, SUP.2, SUP.8, SUP.10)
| ASPICE Process | DO-178C Section | Mapping |
|---|---|---|
| SUP.1 Quality Assurance | Section 8 (Software Quality Assurance Process) | ASPICE QA reviews directly satisfy DO-178C QA objectives (Table A-8) |
| SUP.2 Verification | Section 6 (Software Verification Process) | ASPICE verification strategy implements DO-178C verification objectives |
| SUP.8 Configuration Management | Section 7 (Software Configuration Management) | ASPICE CM practices satisfy DO-178C CM objectives (Table A-9) |
| SUP.10 Change Request Management | Section 7.2.5 (Problem Reporting) | Problem reports tracked through ASPICE change management |
Management Processes (MAN.3)
| ASPICE Process | DO-178C Section | Mapping |
|---|---|---|
| MAN.3 Project Management | Section 4 (Software Planning Process) | Software Development Plan (SDP), Software Verification Plan (SVP), Software Configuration Management Plan (SCMP), Software Quality Assurance Plan (SQAP) |
Base Practice Emphasis by DAL
This section details which ASPICE base practices are critical for each DAL level, with specific emphasis on the rigor and independence requirements.
DAL A: Catastrophic Failure Prevention
Critical ASPICE Base Practices:
SWE.1 (Requirements Analysis):
- BP1: Specify software requirements - ALL requirements must be unambiguous, verifiable, traceable to system requirements
- BP2: Structure software requirements - Requirements must be categorized (functional, safety, performance, interface)
- BP3: Analyze software requirements for correctness - Formal analysis techniques required
- BP4: Analyze software requirements for testability - Each requirement must have defined verification criteria
- BP5: Establish bidirectional traceability - Full traceability from system requirements → high-level → low-level → code → test cases
- BP6: Ensure consistency - Requirements review by independent verification & validation (IV&V) team
- BP7: Communicate software requirements - Requirements baseline under strict configuration control
SWE.2 (Architectural Design):
- BP1: Develop software architectural design - Architecture must support partitioning (DO-178C Section 2.4.3)
- BP2: Allocate software requirements to architecture - Allocation must be traceable and complete
- BP3: Define interfaces - All software-to-software and software-to-hardware interfaces defined with protocols
- BP4: Describe dynamic behavior - State machines, timing diagrams, sequence diagrams for all critical behaviors
- BP5: Evaluate alternative architectures - Safety analysis (FMEA, FTA) performed on architecture options
- BP6: Establish bidirectional traceability - Architecture elements traceable to high-level requirements
- BP7: Ensure consistency - Architecture review includes safety analysis, resource analysis, timing analysis
- BP8: Communicate software architecture - Architecture documented in Software Design Description (SDD)
SWE.3 (Detailed Design & Construction):
- BP1-BP5: Low-level requirements must be verifiable at unit level
- BP6-BP10: Source code must adhere to strict coding standards (e.g., MISRA C, DO-178C coding standards)
- BP11: Unit traceability to low-level requirements
- BP12: Code reviews by independent team (independence per DO-178C Annex A)
SWE.4 (Unit Verification):
- BP1-BP6: Unit testing achieves 100% statement + decision + MC/DC coverage
- Unit tests demonstrate robustness to abnormal inputs
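Robustness testing at the unit level means exercising abnormal inputs, not just the valid range. A minimal sketch, assuming a hypothetical 3-bit mode command validator (the `MODE_CMD_*` range mirrors the case study later in this chapter):

```c
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical range check for a 3-bit mode command field.
 * Robustness tests exercise both range boundaries plus abnormal
 * inputs: zero, the first out-of-range code, and the all-ones
 * pattern a corrupted bus word might produce. */
#define MODE_CMD_MIN 0x01u
#define MODE_CMD_MAX 0x06u

bool mode_cmd_valid(uint8_t cmd)
{
    return (cmd >= MODE_CMD_MIN) && (cmd <= MODE_CMD_MAX);
}
```

A DAL A unit test suite would assert both boundaries (0x01, 0x06) accept and the abnormal values (0x00, 0x07, 0xFF) reject, which also drives both outcomes of each condition in the decision.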
SWE.5 (Integration Testing):
- BP1-BP7: Integration testing demonstrates correct data flow, control flow, timing
- Integration tests executed on target hardware (or high-fidelity simulator)
SWE.6 (Qualification Testing):
- BP1-BP9: Requirements-based testing with full requirements coverage
- Robustness testing with boundary conditions, error injection
- Regression testing after any code changes
- Testing performed by independent verification team
SUP.1 (Quality Assurance):
- BP1-BP9: Independent QA audits at all lifecycle phases
- QA reviews software plans, standards compliance, verification results
- QA has authority to prevent software release if objectives not satisfied
SUP.2 (Verification):
- All verification activities performed with independence per DO-178C Table A-7
- Verification results documented in Software Accomplishment Summary (SAS)
SUP.8 (Configuration Management):
- BP1-BP13: Full configuration control from requirements baseline through object code
- Change control board (CCB) approval for all changes
- Configuration status accounting tracks all software lifecycle data
DAL B: Hazardous Failure Prevention
Key Differences from DAL A:
- Reduced independence requirements (some reviews can be performed by same team, but not same person)
- Decision coverage required; MC/DC coverage not required (MC/DC applies to DAL A only)
- Assembly code review not required (unless compiler generates untraceable code)
- Problem reporting still required with same rigor
ASPICE Emphasis:
- SWE.1-SWE.6: All base practices required
- SUP.1: QA reviews may be performed by same organizational unit (not independent group)
- SUP.2: Verification can be performed by development team (peer reviews acceptable)
- SUP.8: Same CM rigor as DAL A
DAL C: Major Failure Prevention
Key Differences from DAL B:
- Decision coverage not required (statement coverage is sufficient)
- Data/control coupling analysis still required
- Some verification objectives become "with independence" instead of requiring separate team
ASPICE Emphasis:
- SWE.1-SWE.3: Full rigor on requirements and design
- SWE.4-SWE.6: Reduced coverage requirements (statement coverage only)
- SUP.1: QA reviews required but independence not mandatory
- SUP.2: Verification by development team acceptable
DAL D: Minor Failure Effects
Key Differences from DAL C:
- Low-level requirements not required (can code directly from high-level requirements)
- Source code verification required but structural coverage analysis not required
- Integration testing simplified
- Architecture not required to be documented
ASPICE Emphasis:
- SWE.1: High-level requirements with traceability to system requirements
- SWE.2: Architecture recommended but not required
- SWE.3: Source code must comply with coding standards
- SWE.4: Unit verification basic level
- SWE.5: Integration testing demonstrates correct linkage
- SWE.6: Requirements-based testing (no coverage analysis)
- SUP.8: Configuration management still required
DAL E: No Safety Effect
No DO-178C Objectives: Software with no safety effect requires no compliance with DO-178C objectives. However, organizations typically apply basic software engineering practices for quality and maintainability.
ASPICE Recommendations (not required for certification):
- SWE.1: Basic requirements documentation
- SWE.6: Basic functional testing
- SUP.8: Version control
Work Product Requirements per DAL
DO-178C requires specific software lifecycle data (work products) to demonstrate objective satisfaction. This section maps DO-178C required outputs to ASPICE work products.
Planning Phase Work Products
| DO-178C Work Product | ASPICE Equivalent | DAL A | DAL B | DAL C | DAL D | Content Requirements |
|---|---|---|---|---|---|---|
| Plan for Software Aspects of Certification (PSAC) | Project Plan (MAN.3 BP1) | Required | Required | Required | Required | Certification strategy, software lifecycle overview, compliance matrix |
| Software Development Plan (SDP) | Software Development Plan (SWE.1-6) | Required | Required | Required | Required | Development standards, tools, environment, lifecycle model |
| Software Verification Plan (SVP) | Verification Strategy (SUP.2 BP1) | Required | Required | Required | Required | Verification methods, criteria, independence, transition criteria |
| Software Configuration Management Plan (SCMP) | CM Plan (SUP.8 BP1) | Required | Required | Required | Required | CM methods, tools, baselines, change control, problem reporting |
| Software Quality Assurance Plan (SQAP) | QA Plan (SUP.1 BP1) | Required | Required | Required | Required | QA methods, reviews, audits, independence, records |
Requirements Phase Work Products
| DO-178C Work Product | ASPICE Equivalent | DAL A | DAL B | DAL C | DAL D | Content Requirements |
|---|---|---|---|---|---|---|
| Software Requirements Data (SRD) | Software Requirements Specification (SWE.1 BP7) | Required | Required | Required | Required | High-level requirements, derived requirements, interface requirements |
| Software Requirements Standards | Requirements Standards (SWE.1) | Required | Required | Required | Required | Requirements naming, structure, attributes, verification criteria |
Design Phase Work Products
| DO-178C Work Product | ASPICE Equivalent | DAL A | DAL B | DAL C | DAL D | Content Requirements |
|---|---|---|---|---|---|---|
| Software Design Description (SDD) | Software Architecture (SWE.2 BP8) + Detailed Design (SWE.3 BP5) | Required | Required | Required | Not Required | Architecture, low-level requirements, design rationale |
| Software Design Standards | Design Standards (SWE.2) | Required | Required | Required | Not Required | Design notation, complexity limits, interface specifications |
Implementation Phase Work Products
| DO-178C Work Product | ASPICE Equivalent | DAL A | DAL B | DAL C | DAL D | Content Requirements |
|---|---|---|---|---|---|---|
| Source Code | Source Code (SWE.3 BP10) | Required | Required | Required | Required | Compilable, traceable source code |
| Executable Object Code | Build Artifacts (SWE.5) | Required | Required | Required | Required | Loadable binary for target hardware |
| Software Code Standards | Coding Standards (SWE.3 BP6) | Required | Required | Required | Required | Language subset, naming conventions, complexity limits, MISRA compliance |
Verification Phase Work Products
| DO-178C Work Product | ASPICE Equivalent | DAL A | DAL B | DAL C | DAL D | Content Requirements |
|---|---|---|---|---|---|---|
| Software Verification Cases and Procedures (SVCP) | Test Specification (SWE.6 BP2) | Required | Required | Required | Required | Test cases, procedures, expected results, pass/fail criteria |
| Software Verification Results (SVR) | Test Results (SWE.6 BP5) | Required | Required | Required | Required | Actual results, deviations, coverage metrics, traceability |
| Software Life Cycle Environment Configuration Index (SECI) | Development Environment Spec (SWE.3) | Required | Required | Required | Required | Compilers, linkers, libraries, tools, versions |
| Software Configuration Index (SCI) | Configuration Status Accounting (SUP.8 BP6) | Required | Required | Required | Required | All software items under configuration control |
| Problem Reports | Problem Reports (SUP.10) | Required | Required | Required | Required | Defects, root cause, corrective action, verification of fix |
| Software Configuration Management Records (SCMR) | CM Records (SUP.8 BP9) | Required | Required | Required | Required | Change history, baselines, release notes |
| Software Quality Assurance Records (SQAR) | QA Records (SUP.1 BP7) | Required | Required | Required | Required | Audit reports, review findings, non-conformances |
| Software Accomplishment Summary (SAS) | Verification Report (SUP.2 BP6) | Required | Required | Required | Required | Summary of all objectives satisfied, deviations, compliance statement |
Verification Approach by DAL Level
DAL A Verification Strategy
Verification Independence Requirements (DO-178C Table A-7):
| Verification Activity | Independence Level |
|---|---|
| Software reviews | Performed by person(s) other than author |
| Software testing | Test cases developed by person(s) other than code author; test execution may be by same team |
| Software verification independence | Verification performed by person(s) other than developer |
ASPICE Implementation:
- SUP.2 BP4: Establish independent verification team (separate organizational unit)
- SWE.6 BP1: Test specification by independent test team
- SUP.1 BP5: QA reviews by independent QA group
Verification Methods:
- Requirements Reviews (SWE.1 verification):
- Correctness, unambiguity, completeness, consistency
- Traceability to system requirements
- Conformance to requirements standards
- Verifiability assessment
- Design Reviews (SWE.2, SWE.3 verification):
- Architecture correctness and completeness
- Low-level requirements correctness
- Design conformance to standards
- Traceability to high-level requirements
- Code Reviews (SWE.3 verification):
- Code correctness and completeness
- Conformance to coding standards
- Traceability to low-level requirements
- Absence of unintended functionality
- Integration Testing (SWE.5):
- Data flow correctness
- Control flow correctness
- Timing and resource usage
- Interface compatibility
- Requirements-Based Testing (SWE.6):
- Normal range testing (all requirements)
- Robustness testing (boundary conditions, error injection)
- Test case traceability to requirements
- Structural Coverage Analysis (SWE.6):
- Statement coverage: 100%
- Decision coverage: 100%
- MC/DC coverage: 100%
- Data coupling analysis
- Control coupling analysis
- Object Code Review (if required):
- Assembly-level analysis if compiler generates untraceable code
- Object code to source code traceability verification
Test Execution Environment:
- Testing on target hardware or high-fidelity simulator
- Test environment qualification if using simulation
DAL B Verification Strategy
Differences from DAL A:
- Reduced independence: Reviews and testing can be performed by same organizational unit (but not same person)
- Decision coverage required; MC/DC coverage not required
- Object code review not required (unless compiler issues)
ASPICE Implementation:
- SUP.2: Verification team may be part of development organization
- SWE.6: Peer reviews acceptable for test case development
- Coverage analysis: Statement + decision coverage (MC/DC applies to DAL A only)
DAL C Verification Strategy
Differences from DAL B:
- Decision coverage not required (statement coverage is sufficient)
- Data/control coupling analysis still required
- Independence requirements relaxed further
ASPICE Implementation:
- SWE.6 BP4: Coverage analysis achieves statement coverage only
- SUP.2: Verification by development team acceptable
- Reviews: Peer reviews within development team
DAL D Verification Strategy
Differences from DAL C:
- Structural coverage analysis not required
- Requirements-based testing only
- Low-level requirements not required
ASPICE Implementation:
- SWE.1: High-level requirements testing only
- SWE.6 BP1-BP3: Test specification based on high-level requirements
- SWE.6 BP5: Test results demonstrate requirements satisfaction
- No coverage analysis required
DO-330 Tool Qualification Overview
DO-330 (Software Tool Qualification Considerations) provides guidance for qualifying software tools used in DO-178C projects. Qualification is required when a tool eliminates, reduces, or automates a DO-178C process without its output being verified. DO-178C Section 12.2.2 classifies such tools by three criteria:
- Criteria 1: Tool output is part of the airborne software, so a tool error could insert an error into that software (e.g., a code generator whose output is not verified)
- Criteria 2: Tool automates verification and could fail to detect an error, and its output is used to justify eliminating or reducing other verification or development activities (e.g., a structural coverage tool)
- Criteria 3: Tool, within the scope of its intended use, could fail to detect an error (e.g., a test execution framework)
Tool Qualification Levels (TQL)
| TQL | Software Level (DAL) | Tool Criteria | Verification Impact | Qualification Rigor |
|---|---|---|---|---|
| TQL-1 | Level A | Criteria 1: Development Tool (output not independently verified) | Tool error propagates directly to airborne executable object code at DAL A | Highest: Full DO-330 development lifecycle applied to tool itself |
| TQL-2 | Level B | Criteria 1: Development Tool (output not independently verified) | Same as TQL-1, but at DAL B | High: Full lifecycle with some objective reductions |
| TQL-3 | Level C | Criteria 1: Development Tool (output not independently verified) | Same as TQL-1, but at DAL C | Medium: Reduced lifecycle objectives |
| TQL-4 | Level D (Criteria 1) or Level A/B (Criteria 2) | Criteria 1: Development Tool at Level D, or Criteria 2: Verification tool whose output justifies reducing other verification | Tool failure may cause undetected errors at DAL A/B | Medium: Operational requirements verification; tool development lifecycle not required |
| TQL-5 | Level C/D (Criteria 2) or any level (Criteria 3) | Criteria 2: Verification Tool at Level C/D, or Criteria 3: Tool that could fail to detect an error | Tool failure may cause undetected errors at DAL C/D | Low: Operational requirements verification only |
Source: DO-330 Section 2.3 (Tool Qualification Levels) and FAA AC 20-115D Table 2
Tool Classification Decision Tree
Is the tool used in software development or verification?
│
├─ Development Tool (e.g., code generator, compiler)
│   │
│   └─ Is the tool output verified by other means?
│       │
│       ├─ NO (output reaches the airborne software unverified: Criteria 1)
│       │   └─ TQL-1 (Level A), TQL-2 (Level B), TQL-3 (Level C), TQL-4 (Level D)
│       │
│       └─ YES
│           └─ No qualification required (normal verification catches tool errors)
│
└─ Verification Tool (e.g., static analyzer, test framework)
    │
    └─ Is the tool output used to eliminate or reduce other verification
       or development activities? (Criteria 2)
        │
        ├─ YES
        │   └─ TQL-4 (Level A/B), TQL-5 (Level C/D)
        │
        └─ NO (tool could only fail to detect an error: Criteria 3)
            └─ TQL-5 (any level)
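The criteria-and-level determination can be encoded as a small lookup. This sketch follows DO-178C Table 12-1; the enum and function names are illustrative, not from the standard.

```c
/* TQL determination per DO-178C Table 12-1.
 * criteria: 1 = development tool (output unverified),
 *           2 = verification tool justifying reduced verification,
 *           3 = tool that could only fail to detect an error. */
typedef enum { DAL_A, DAL_B, DAL_C, DAL_D } Dal;

int tql_for(int criteria, Dal level)
{
    if (criteria == 1) {
        switch (level) {
        case DAL_A: return 1;   /* TQL-1 */
        case DAL_B: return 2;   /* TQL-2 */
        case DAL_C: return 3;   /* TQL-3 */
        case DAL_D: return 4;   /* TQL-4 */
        }
    }
    if (criteria == 2) {
        /* Criteria 2 at A/B is TQL-4; at C/D it drops to TQL-5 */
        return (level == DAL_A || level == DAL_B) ? 4 : 5;
    }
    return 5;   /* Criteria 3: TQL-5 at every level */
}
```

For example, a code generator (Criteria 1) on a DAL A program needs TQL-1, while a coverage tool (Criteria 2) on the same program needs only TQL-4.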
Tool Qualification Artifacts
| Artifact | TQL-1 | TQL-2 | TQL-3 | TQL-4 | TQL-5 | Description |
|---|---|---|---|---|---|---|
| Tool Qualification Plan (TQP) | Required | Required | Required | Required | Required | Strategy for tool qualification |
| Tool Operational Requirements (TOR) | Required | Required | Required | Required | Required | How tool will be used in software lifecycle |
| Tool Development Plan | Required | Required | Required | Not Req. | Not Req. | Development process for tool itself |
| Tool Requirements | Required | Required | Required | Reduced | Reduced | Functional requirements of tool |
| Tool Design Description | Required | Required | Reduced | Not Req. | Not Req. | Tool architecture and design |
| Tool Source Code | Required | Required | Reduced | Not Req. | Not Req. | Tool implementation |
| Tool Verification Cases | Required | Required | Required | Required | Required | Test cases for tool operational requirements |
| Tool Verification Results | Required | Required | Required | Required | Required | Evidence of TOR satisfaction |
| Tool Configuration Management | Required | Required | Required | Required | Required | Tool version control and change management |
| Tool Qualification Data (TQD) | Required | Required | Required | Required | Required | Summary document submitted to certification authority |
Source: DO-330 Annex A, Tables A-1 through A-5
ASPICE Integration for Tool Qualification
Several ASPICE engineering and supporting processes can be leveraged for tool qualification:
- Tool Development: Treat tool as a software product following SWE.1-6 processes
- Tool Verification: Apply SUP.2 verification practices to tool itself
- Tool CM: Apply SUP.8 configuration management to tool versions
- Tool QA: Apply SUP.1 quality assurance audits to tool qualification process
Note: Tool qualification data package must be submitted to certification authority as part of Plan for Software Aspects of Certification (PSAC).
Case Study: DAL B Autopilot Mode Selection Firmware
This case study demonstrates how ASPICE 4.0 processes satisfy DO-178C DAL B objectives for a realistic avionics system.
Project Overview
- System: Autopilot Mode Controller (AMC) for a regional jet aircraft
- Function: Manage autopilot engagement, mode selection, and mode transitions
- Safety Classification: DAL B (Hazardous failure condition)
- Rationale: Loss of autopilot mode control can lead to pilot confusion, incorrect mode engagement, or uncommanded mode changes, resulting in loss of aircraft control
Failure Condition Analysis (per ARP4754A):
- Failure: Autopilot engages in incorrect mode (e.g., altitude hold when descending)
- Effect: Pilot may not detect incorrect mode for several seconds, leading to altitude deviation, loss of separation, potential collision
- Severity: Hazardous (serious injury to passengers possible)
- DAL Assignment: Level B
System Architecture: The following diagram shows the Autopilot Mode Controller system architecture, including its interfaces to the Mode Control Panel (MCP), Flight Control Computer (FCC), and display systems.
Supported Autopilot Modes:
- Heading Hold (HDG)
- Navigation (NAV) - follows flight plan
- Altitude Hold (ALT HOLD)
- Vertical Speed (V/S)
- Approach (APPR) - localizer + glideslope
- Go-Around (GA)
Mode Transition Rules:
- Pilot commands mode via Mode Control Panel (MCP)
- AMC validates mode transition is safe given current flight state
- AMC commands Flight Control Computer (FCC) to engage requested mode
- AMC annunciates active mode on Primary Flight Display (PFD)
- AMC monitors mode engagement status and disengages autopilot if anomaly detected
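The validation step in the sequence above (AMC checks that the commanded transition is safe in the current flight state) can be sketched as a guard function. Types, field names, and the numeric thresholds are illustrative; the thresholds mirror the transition guards listed with the mode state machine later in this case study.

```c
#include <stdbool.h>

/* Illustrative flight-state snapshot consumed by the guard logic */
typedef enum { MODE_INACTIVE, MODE_HDG, MODE_NAV, MODE_APPR } ApMode;

typedef struct {
    float altitude_agl_ft;    /* radio altitude above ground level */
    bool  flight_plan_valid;
    bool  gps_valid;
    bool  loc_valid;          /* localizer signal present */
    bool  gs_valid;           /* glideslope signal present */
    float dist_to_rwy_nm;
} FlightState;

/* Returns true if the commanded mode may be engaged in the
 * current flight state; unknown commands are rejected. */
bool amc_transition_allowed(ApMode cmd, const FlightState *fs)
{
    switch (cmd) {
    case MODE_HDG:  return fs->altitude_agl_ft > 500.0f;
    case MODE_NAV:  return fs->flight_plan_valid && fs->gps_valid;
    case MODE_APPR: return fs->loc_valid && fs->gs_valid &&
                           fs->dist_to_rwy_nm < 10.0f;
    default:        return false;
    }
}
```

Only after this guard passes would the AMC command the FCC, update the PFD annunciation, and start the engagement watchdog.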
SWE.2: Software Architectural Design (DAL B)
ASPICE Base Practices Applied:
BP1: Develop software architectural design
The following diagram shows the Autopilot Mode Controller architecture with DO-178C partitioning, illustrating the separation between application logic, RTOS, and hardware interfaces through memory protection and dissimilar communication paths.
Partitioning Strategy (DO-178C Section 2.4.3):
- Application layer partitioned from RTOS by memory protection unit (MPU)
- MCP interface and FCC interface use separate ARINC 429 channels (dissimilar paths)
- Mode state machine isolated in separate task with dedicated stack
Resource Budget:
- CPU: 15% of dual-core PowerPC MPC5777C @ 300 MHz (worst-case execution time analysis)
- RAM: 128 KB (data + stack)
- Flash: 256 KB (code + constants)
- ARINC 429 bandwidth: 20% of 100 kbps channel capacity
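The 15% CPU budget is checked against the task set with the standard utilization sum U = sum of Ci/Ti (worst-case execution time over period). The sketch below computes this; the task WCETs and periods in the test are hypothetical, not from the case study.

```c
/* Rate-monotonic utilization check against a CPU budget.
 * Each task contributes wcet/period to total utilization. */
typedef struct {
    double wcet_ms;     /* worst-case execution time per activation */
    double period_ms;   /* activation period */
} Task;

double cpu_utilization(const Task *tasks, int n)
{
    double u = 0.0;
    for (int i = 0; i < n; i++)
        u += tasks[i].wcet_ms / tasks[i].period_ms;
    return u;
}
```

In planning, this sum would be recomputed from the WCET analysis results at each baseline and compared against the 0.15 budget ceiling.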
BP2: Allocate software requirements to architecture
Requirements allocation table:
| High-Level Requirement | Architecture Module | Rationale |
|---|---|---|
| SRS-AMC-001 (Mode Selection) | Autopilot_ModeSelection | Centralized mode selection logic |
| SRS-AMC-002 (Mode Transitions) | Autopilot_ModeStateMachine | State machine handles all transitions |
| SRS-AMC-003 (Annunciation) | Autopilot_ModeAnnunciation | Dedicated annunciation module |
| SRS-AMC-004 (BIT) | Autopilot_BuiltInTest | Self-test and monitoring |
| SRS-AMC-005 (MCP Interface) | ARINC429_Driver (MCP) | Hardware-specific driver |
| SRS-AMC-006 (FCC Interface) | ARINC429_Driver (FCC) | Hardware-specific driver |
BP3: Define interfaces
Interface Control Document (ICD) excerpt:
/**
* @interface ARINC429_MCP_Interface
* @description Mode Control Panel ARINC 429 message interface
* @safety DAL B
*/
/**
* @brief MCP Autopilot Mode Command Message (Label 101)
* @rate 10 Hz (minimum)
* @direction MCP → AMC
*/
typedef struct {
    uint8_t  label;      // Decoded label field (word bits 1-8): Label 101 (0x65)
    uint8_t  sdi;        // Source/Destination Identifier (word bits 9-10)
    uint8_t  mode_cmd;   // Mode command (word bits 11-13, see MODE_CMD_xxx)
    uint16_t reserved;   // Reserved data field (word bits 14-29, set to 0)
    uint8_t  ssm;        // Sign/Status Matrix (word bits 30-31)
    uint8_t  parity;     // Odd parity (word bit 32)
} MCP_Autopilot_Mode_Cmd_t;  // Decoded (unpacked) view of the 32-bit ARINC 429 word
/**
* @brief Mode Command Values
*/
#define MODE_CMD_HDG 0x01 // Heading Hold
#define MODE_CMD_NAV 0x02 // Navigation
#define MODE_CMD_ALT_HOLD 0x03 // Altitude Hold
#define MODE_CMD_VS 0x04 // Vertical Speed
#define MODE_CMD_APPR 0x05 // Approach
#define MODE_CMD_GA 0x06 // Go-Around
/**
* @brief SSM Values
*/
#define SSM_FAILURE_WARNING 0x00
#define SSM_NO_COMPUTED_DATA 0x01
#define SSM_FUNCTIONAL_TEST 0x02
#define SSM_NORMAL_OPERATION 0x03
/**
* @brief Interface Functions
*/
/**
* @function ARINC429_MCP_ReceiveMessage
* @description Receive and validate MCP message
* @param[out] msg Pointer to message buffer
* @return 0 on success, error code on failure
* @timing Maximum 1 ms execution time
* @safety Validates parity, SSM, label; returns error if invalid
*/
int32_t ARINC429_MCP_ReceiveMessage(MCP_Autopilot_Mode_Cmd_t* msg);
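The @safety contract above (label, SSM, and parity validation) can be sketched as a standalone check over the raw 32-bit received word. This is an illustrative sketch, not the project's driver code: `ARINC429_ValidateWord` and `popcount32` are hypothetical names, and ARINC bit 1 is assumed to map to the least significant bit of the word.

```c
#include <assert.h>
#include <stdbool.h>
#include <stdint.h>

#define ARINC_LABEL_MCP_MODE 0x65U /* Label 101 per the ICD excerpt */
#define ARINC_SSM_NORMAL_OP  0x03U

/* Portable population count (no compiler builtins). */
static uint32_t popcount32(uint32_t w)
{
    uint32_t n = 0U;
    while (w != 0U)
    {
        w &= (w - 1U); /* clear the lowest set bit */
        n++;
    }
    return n;
}

/* Validate label (bits 1-8), SSM (bits 30-31), and odd parity (bit 32)
 * of a raw received word. Returns true only if every check passes. */
static bool ARINC429_ValidateWord(uint32_t word)
{
    const uint8_t label = (uint8_t)(word & 0xFFU);
    const uint8_t ssm   = (uint8_t)((word >> 29) & 0x03U);
    bool ok = true;

    if (label != ARINC_LABEL_MCP_MODE) { ok = false; } /* wrong label            */
    if (ssm != ARINC_SSM_NORMAL_OP)    { ok = false; } /* data not usable        */
    if ((popcount32(word) & 1U) != 1U) { ok = false; } /* odd-parity check fails */
    return ok;
}
```

A single flipped bit fails the odd-parity check, which is exactly the corruption class the @safety clause guards against.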
BP4: Describe dynamic behavior
Mode State Machine (UML State Diagram):
┌────────────────────────────────────────────────────────────────┐
│ Autopilot Mode State Machine │
├────────────────────────────────────────────────────────────────┤
│ │
│ [Initial State] │
│ │ │
│ ▼ │
│ ┌──────────┐ │
│ │ INACTIVE │◄─────────────────┐ │
│ └──────────┘ │ │
│ │ │ │
│ │ MODE_CMD_HDG │ DISENGAGE_CMD │
│ │ │ │
│ ▼ │ │
│ ┌──────────┐ │ │
│ │ HDG_HOLD │──────────────────┤ │
│ └──────────┘ │ │
│ │ │ │
│ │ MODE_CMD_NAV │ │
│ │ │ │
│ ▼ │ │
│ ┌──────────┐ │ │
│ │ NAV_MODE │──────────────────┤ │
│ └──────────┘ │ │
│ │ │ │
│ │ MODE_CMD_APPR │ │
│ │ │ │
│ ▼ │ │
│ ┌──────────┐ │ │
│ │APPR_MODE │──────────────────┘ │
│ └──────────┘ │
│ │
│ Transition Guards (safety logic): │
│ • HDG_HOLD requires: altitude > 500 ft AGL │
│ • NAV_MODE requires: valid flight plan, GPS signal │
│ • APPR_MODE requires: localizer signal, glideslope signal, │
│ distance to runway < 10 NM │
│ │
│ Transition Actions: │
│ • Send mode engagement command to FCC │
│ • Update annunciation on PFD │
│ • Start mode engagement watchdog timer (200 ms) │
│ │
│ Failure Handling: │
│ • If FCC does not confirm mode engagement within 200 ms: │
│ → Transition to INACTIVE │
│ → Annunciate "AUTOPILOT FAIL" │
│ → Log error to BIT system │
│ │
└────────────────────────────────────────────────────────────────┘
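The transition guards listed in the diagram can be collected into a single predicate so that each guard is independently testable (which also pays off later during MC/DC analysis). A hedged sketch — `Guard_Inputs_t`, `Autopilot_GuardSatisfied`, and the enum values are illustrative names, not the project's actual API:

```c
#include <assert.h>
#include <stdbool.h>

typedef struct {
    float altitude_agl_ft;        /* from air data computer          */
    bool  flight_plan_valid;      /* from FMS                        */
    bool  gps_valid;              /* GPS signal status               */
    bool  localizer_valid;        /* ILS localizer signal status     */
    bool  glideslope_valid;       /* ILS glideslope signal status    */
    float distance_to_runway_nm;  /* from navigation solution        */
} Guard_Inputs_t;

typedef enum { AP_HDG_HOLD, AP_NAV_MODE, AP_APPR_MODE } Ap_Target_t;

/* Evaluate the transition guard for a requested target mode. */
static bool Autopilot_GuardSatisfied(Ap_Target_t target, const Guard_Inputs_t *in)
{
    bool ok = false;
    switch (target)
    {
    case AP_HDG_HOLD:
        ok = (in->altitude_agl_ft > 500.0F);
        break;
    case AP_NAV_MODE:
        ok = in->flight_plan_valid && in->gps_valid;
        break;
    case AP_APPR_MODE:
        ok = in->localizer_valid && in->glideslope_valid &&
             (in->distance_to_runway_nm < 10.0F);
        break;
    default:
        ok = false; /* unknown target: fail-safe reject */
        break;
    }
    return ok;
}
```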
BP5: Evaluate alternative architectures
Architecture alternatives considered:
| Alternative | Pros | Cons | Selected? |
|---|---|---|---|
| Monolithic (single task) | Simple, low overhead | Difficult to partition, single point of failure | No |
| Layered (app + HAL + RTOS) | Clear separation of concerns, testable layers, supports partitioning | Moderate complexity | Yes |
| Component-Based (AUTOSAR) | Highly modular, industry standard | High overhead, not suitable for small system | No |
Safety analysis (Failure Modes and Effects Analysis - FMEA):
- Architecture review identified single point of failure: mode state machine
- Mitigation: Add watchdog timer monitoring state machine execution
- Mitigation: FCC cross-checks mode engagement (dissimilar redundancy)
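The watchdog mitigation can follow the common aliveness-counter pattern: the monitored task increments a counter every cycle, and an independent monitor task verifies forward progress. A minimal sketch under that assumption, with hypothetical names:

```c
#include <assert.h>
#include <stdbool.h>
#include <stdint.h>

static volatile uint32_t sm_aliveness_counter = 0U;

/* Called by the state machine task at the end of every 10 ms cycle. */
static void StateMachine_KickAliveness(void)
{
    sm_aliveness_counter++;
}

/* Called by an independent monitor task (e.g., at 1 Hz).
 * Returns false if the state machine made no progress since the
 * last check, in which case the monitor would disengage the autopilot. */
static bool Monitor_CheckAliveness(uint32_t *last_seen)
{
    const uint32_t now = sm_aliveness_counter;
    const bool alive = (now != *last_seen);
    *last_seen = now;
    return alive;
}
```

Separating the kick and the check into different tasks is what gives the mitigation its value: a hung state machine cannot also service its own watchdog.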
BP6: Establish bidirectional traceability
Architecture elements traced to requirements:
SRS-AMC-001 (Mode Selection) ←→ Autopilot_ModeSelection module
SRS-AMC-002 (Mode Transitions) ←→ Autopilot_ModeStateMachine module
SRS-AMC-003 (Annunciation) ←→ Autopilot_ModeAnnunciation module
BP7: Ensure consistency
Architecture review checklist:
- All high-level requirements allocated to architecture elements: [OK]
- No architecture element without traceability to requirements: [OK]
- Interface definitions complete and unambiguous: [OK]
- Resource budget feasible (CPU, RAM, Flash): [OK]
- Timing constraints met (worst-case execution time analysis): [OK]
- Safety requirements addressed (partitioning, redundancy): [OK]
Architecture baseline approved by:
- Software architect
- Independent verification engineer
- System safety engineer
BP8: Communicate software architecture
Software Design Description (SDD) document includes:
- Architecture diagrams (layered view, component view, deployment view)
- Module descriptions
- Interface control documents (ICDs)
- State machine diagrams
- Sequence diagrams for critical scenarios
- Resource budget analysis
- Safety analysis (FMEA)
SWE.3: Software Detailed Design & Unit Construction (DAL B)
Low-Level Requirements Example:
LLR-AMC-001: Mode State Machine Transition Logic
Parent: SRS-AMC-002 (Mode Transitions)
Detailed Requirements:
LLR-001.1: Upon receiving MODE_CMD_HDG from MCP, if current state is
INACTIVE and altitude > 500 ft AGL, transition to HDG_HOLD state.
LLR-001.2: Transition shall complete within 10 ms of receiving command.
LLR-001.3: During transition, send ARINC 429 message to FCC:
Label 102 (Autopilot Mode Engagement)
Data: MODE_HDG (0x01)
LLR-001.4: Start mode engagement watchdog timer (200 ms timeout).
LLR-001.5: If FCC confirms mode engagement (Label 103) within 200 ms,
annunciate "HDG" on PFD.
LLR-001.6: If FCC does not confirm within 200 ms, transition to INACTIVE,
annunciate "AP FAIL", log error code 0x05 to BIT system.
Verification:
- Unit Test: Test mode transition with valid altitude (600 ft)
- Unit Test: Test mode transition rejection with invalid altitude (400 ft)
- Unit Test: Test watchdog timeout handling (FCC response delayed 250 ms)
- Code Review: Verify transition logic matches state machine specification
Source Code Example (MISRA C:2012 compliant):
/**
* @file autopilot_mode_statemachine.c
* @brief Autopilot mode state machine implementation
* @safety DAL B
* @version 1.0
* @date 2026-01-03
*/
#include "autopilot_mode_statemachine.h"
#include "arinc429_driver.h"
#include "bit_logger.h"
/* MISRA C:2012 Rule 8.4: Function declarations in header */
/* MISRA C:2012 Rule 8.7: Static functions where possible */
/* File-local prototype: defined below, used in Autopilot_StateMachine_Update */
static void Autopilot_StateMachine_TransitionTo(Autopilot_Mode_State_t new_state);
/**
* @brief Current autopilot mode state
*/
static Autopilot_Mode_State_t current_state = AP_STATE_INACTIVE;
/**
* @brief Mode engagement watchdog timer
*/
static uint32_t mode_engagement_watchdog_ms = 0U;
/**
* @brief Altitude Above Ground Level (ft)
*/
static float32_t altitude_agl_ft = 0.0F;
/**
* @function Autopilot_StateMachine_Update
* @brief Process mode state machine (called at 100 Hz)
* @param[in] mode_cmd Mode command from MCP
* @return void
* @requirement LLR-AMC-001
*/
void Autopilot_StateMachine_Update(const MCP_Autopilot_Mode_Cmd_t* mode_cmd)
{
/* MISRA C:2012 Rule 14.3: Invariant conditions are by design */
/* MISRA C:2012 Rule 15.5: Single exit point not required for safety */
/* Null pointer check (defensive programming for DAL B) */
if (mode_cmd == NULL)
{
/* Log error and return */
BIT_LogError(BIT_ERROR_NULL_POINTER, 0x01);
return;
}
/* Update watchdog timer */
if (mode_engagement_watchdog_ms > 0U)
{
mode_engagement_watchdog_ms -= 10U; /* 100 Hz = 10 ms per cycle */
/* Check for watchdog timeout */
if (mode_engagement_watchdog_ms == 0U)
{
/* Mode engagement failed - transition to INACTIVE */
Autopilot_StateMachine_Disengage();
Autopilot_Annunciate("AP FAIL");
BIT_LogError(BIT_ERROR_MODE_ENGAGEMENT_TIMEOUT, 0x05);
}
}
/* State machine logic */
switch (current_state)
{
case AP_STATE_INACTIVE:
/* Process mode command */
if (mode_cmd->mode_cmd == MODE_CMD_HDG)
{
/* LLR-001.1: Check altitude guard */
if (altitude_agl_ft > 500.0F)
{
/* Transition to HDG_HOLD */
Autopilot_StateMachine_TransitionTo(AP_STATE_HDG_HOLD);
}
else
{
/* Reject command - altitude too low */
BIT_LogError(BIT_ERROR_ALTITUDE_TOO_LOW, 0x06);
}
}
/* Additional mode commands handled here... */
break;
case AP_STATE_HDG_HOLD:
/* Monitor HDG mode, process disengage command */
if (mode_cmd->mode_cmd == MODE_CMD_DISENGAGE)
{
Autopilot_StateMachine_Disengage();
}
break;
/* Additional states... */
default:
/* Invalid state - fail-safe to INACTIVE */
BIT_LogError(BIT_ERROR_INVALID_STATE, (uint8_t)current_state);
Autopilot_StateMachine_Disengage();
break;
}
}
/**
* @function Autopilot_StateMachine_TransitionTo
* @brief Transition to new autopilot mode
* @param[in] new_state Target state
* @return void
* @requirement LLR-AMC-001.3, LLR-AMC-001.4
*/
static void Autopilot_StateMachine_TransitionTo(Autopilot_Mode_State_t new_state)
{
ARINC429_Message_t fcc_msg;
/* Prepare FCC mode engagement message (Label 102) */
fcc_msg.label = 0x66U; /* Label 102 */
fcc_msg.data = (uint32_t)new_state;
fcc_msg.ssm = SSM_NORMAL_OPERATION;
/* Send message to FCC */
(void)ARINC429_FCC_SendMessage(&fcc_msg);
/* Start watchdog timer (200 ms) */
mode_engagement_watchdog_ms = 200U;
/* Update current state */
current_state = new_state;
}
/**
* @function Autopilot_StateMachine_ConfirmEngagement
* @brief Confirm mode engagement from FCC (Label 103 received)
* @param[in] confirmed_mode Mode confirmed by FCC
* @return void
* @requirement LLR-AMC-001.5
*/
void Autopilot_StateMachine_ConfirmEngagement(Autopilot_Mode_State_t confirmed_mode)
{
/* Verify confirmed mode matches current state */
if (confirmed_mode == current_state)
{
/* Cancel watchdog timer */
mode_engagement_watchdog_ms = 0U;
/* Annunciate mode on PFD */
switch (current_state)
{
case AP_STATE_HDG_HOLD:
Autopilot_Annunciate("HDG");
break;
case AP_STATE_NAV_MODE:
Autopilot_Annunciate("NAV");
break;
/* Additional modes... */
default:
/* No annunciation */
break;
}
}
else
{
/* Mode mismatch - disengage autopilot */
BIT_LogError(BIT_ERROR_MODE_MISMATCH, 0x07);
Autopilot_StateMachine_Disengage();
}
}
/* Additional functions... */
Coding Standards Compliance (MISRA C:2012):
- All MISRA C:2012 mandatory rules followed
- Deviations documented and justified (e.g., Rule 14.3 for safety-critical state machines)
- Static analysis performed with Parasoft C/C++test or PC-lint Plus
Unit Testing Strategy:
/**
* @file test_autopilot_statemachine.c
* @brief Unit tests for autopilot mode state machine
* @coverage Decision coverage is the DO-178C minimum for DAL B;
*           MC/DC is applied here as additional rigor (MC/DC is
*           required only for DAL A)
*/
#include <assert.h>
#include <string.h>
#include "autopilot_mode_statemachine.h"
/**
* @test Test_ModeTransition_HDG_ValidAltitude
* @requirement LLR-AMC-001.1
* @coverage Statement, Decision, MC/DC
*/
void Test_ModeTransition_HDG_ValidAltitude(void)
{
MCP_Autopilot_Mode_Cmd_t mode_cmd;
/* Setup: Set altitude above 500 ft */
Autopilot_SetAltitudeAGL(600.0F);
/* Ensure starting state is INACTIVE */
Autopilot_StateMachine_Reset();
/* Execute: Send HDG mode command */
mode_cmd.label = 0x65U;
mode_cmd.mode_cmd = MODE_CMD_HDG;
mode_cmd.ssm = SSM_NORMAL_OPERATION;
Autopilot_StateMachine_Update(&mode_cmd);
/* Verify: State transitioned to HDG_HOLD */
assert(Autopilot_StateMachine_GetState() == AP_STATE_HDG_HOLD);
/* Verify: FCC message sent (Label 102, Data = MODE_HDG) */
ARINC429_Message_t sent_msg;
assert(ARINC429_FCC_GetLastSentMessage(&sent_msg) == 0);
assert(sent_msg.label == 0x66U);
assert(sent_msg.data == (uint32_t)AP_STATE_HDG_HOLD);
/* Verify: Watchdog timer started (200 ms) */
assert(Autopilot_GetWatchdogTimer() == 200U);
}
/**
* @test Test_ModeTransition_HDG_InvalidAltitude
* @requirement LLR-AMC-001.1
* @coverage MC/DC (altitude guard condition)
*/
void Test_ModeTransition_HDG_InvalidAltitude(void)
{
MCP_Autopilot_Mode_Cmd_t mode_cmd;
/* Setup: Set altitude below 500 ft */
Autopilot_SetAltitudeAGL(400.0F);
/* Ensure starting state is INACTIVE */
Autopilot_StateMachine_Reset();
/* Execute: Send HDG mode command */
mode_cmd.mode_cmd = MODE_CMD_HDG;
Autopilot_StateMachine_Update(&mode_cmd);
/* Verify: State remains INACTIVE (transition rejected) */
assert(Autopilot_StateMachine_GetState() == AP_STATE_INACTIVE);
/* Verify: Error logged to BIT system */
uint8_t error_code;
assert(BIT_GetLastError(&error_code) == BIT_ERROR_ALTITUDE_TOO_LOW);
}
/**
* @test Test_ModeEngagement_Timeout
* @requirement LLR-AMC-001.6
* @coverage Statement, Decision
*/
void Test_ModeEngagement_Timeout(void)
{
MCP_Autopilot_Mode_Cmd_t mode_cmd;
/* Setup: Transition to HDG_HOLD */
Autopilot_SetAltitudeAGL(600.0F);
mode_cmd.mode_cmd = MODE_CMD_HDG;
Autopilot_StateMachine_Update(&mode_cmd);
assert(Autopilot_StateMachine_GetState() == AP_STATE_HDG_HOLD);
/* Execute: Simulate 250 ms passage without FCC confirmation */
for (uint32_t i = 0U; i < 25U; i++) /* 25 cycles * 10 ms = 250 ms */
{
mode_cmd.mode_cmd = MODE_CMD_NONE; /* No new command */
Autopilot_StateMachine_Update(&mode_cmd);
}
/* Verify: State transitioned back to INACTIVE (timeout) */
assert(Autopilot_StateMachine_GetState() == AP_STATE_INACTIVE);
/* Verify: "AP FAIL" annunciated */
char annunciation[32];
Autopilot_GetAnnunciation(annunciation);
assert(strcmp(annunciation, "AP FAIL") == 0);
/* Verify: Error code 0x05 logged */
uint8_t error_code;
assert(BIT_GetLastError(&error_code) == BIT_ERROR_MODE_ENGAGEMENT_TIMEOUT);
}
/* Additional test cases for MC/DC coverage... */
Coverage Analysis:
- Statement coverage: 100% (all lines executed by test cases)
- Decision coverage: 100% (all branches tested for true/false)
- MC/DC coverage: 100% (each condition independently affects decision outcome)
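For intuition on the MC/DC figure above: a two-condition decision `a && b` needs only three vectors, not four. Comparing (T,T) with (F,T) shows that `a` independently flips the outcome, and (T,T) with (T,F) shows the same for `b` — in general N+1 tests for N conditions in a simple conjunction. A minimal sketch (the guard is hypothetical, modeled on the altitude check in the state machine):

```c
#include <assert.h>
#include <stdbool.h>

/* Simplified two-condition guard, for MC/DC illustration only. */
static bool guard(bool cmd_valid, bool altitude_ok)
{
    return cmd_valid && altitude_ok;
}

/* Minimal MC/DC vector set for (a && b):
 *   (T,T) -> T   baseline
 *   (F,T) -> F   pairs with (T,T): cmd_valid alone flips the decision
 *   (T,F) -> F   pairs with (T,T): altitude_ok alone flips the decision
 * The fourth vector (F,F) adds no MC/DC credit. */
```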
Coverage Report (generated by coverage tool, e.g., LDRA TBrun, VectorCAST):
Module: autopilot_mode_statemachine.c
Statement Coverage: 342/342 (100.00%)
Decision Coverage: 58/58 (100.00%)
MC/DC Coverage: 116/116 (100.00%)
Uncovered Code: None
Test Cases Executed: 47
Pass: 47
Fail: 0
SWE.4-6: Verification (DAL B)
SWE.4: Software Unit Verification
Unit tests executed on host development system (x86 Linux) with hardware abstraction:
- All 47 unit test cases pass
- MC/DC coverage achieved (100%)
- Code review performed by independent engineer
- No defects found in code review
SWE.5: Software Integration & Integration Test
Integration testing performed on target hardware (PowerPC MPC5777C development board):
Integration Test Cases:
| Test ID | Description | Result |
|---|---|---|
| IT-001 | MCP interface integration (ARINC 429 receive) | Pass |
| IT-002 | FCC interface integration (ARINC 429 transmit) | Pass |
| IT-003 | Mode state machine integration with annunciation | Pass |
| IT-004 | BIT integration (error logging and retrieval) | Pass |
| IT-005 | Watchdog timer integration | Pass |
| IT-006 | End-to-end mode selection: INACTIVE → HDG → NAV → APPR | Pass |
Integration Test Results:
- All 32 integration test cases pass
- Timing analysis: Worst-case execution time (WCET) = 8.5 ms (within 50 ms budget)
- Resource usage: CPU 12%, RAM 96 KB, Flash 198 KB (within budget)
SWE.6: Software Qualification Testing
Requirements-based testing performed on a hardware-in-the-loop (HIL) simulator:
- HIL simulator includes simulated MCP, FCC, and aircraft dynamics
- All 125 system-level test cases executed
- Requirements coverage: 100% (all 87 software requirements tested)
Test Results Summary:
Software Qualification Test Report
Test Cases Executed: 125
Pass: 125
Fail: 0
Requirements Coverage: 87/87 (100%)
Robustness Testing:
- Boundary conditions: Pass (15/15 test cases)
- Error injection: Pass (8/8 test cases)
- Timing stress: Pass (5/5 test cases)
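The boundary-condition cases follow directly from the strict inequality in LLR-001.1 (`altitude > 500 ft AGL`): vectors just below, exactly at, and just above the threshold, with 500.0 ft itself expected to be rejected. A sketch using a hypothetical standalone helper:

```c
#include <assert.h>
#include <stdbool.h>

/* Altitude guard per LLR-001.1: strictly greater than 500 ft AGL.
 * Hypothetical helper, extracted here for boundary-value illustration. */
static bool altitude_guard_ok(float altitude_agl_ft)
{
    return (altitude_agl_ft > 500.0F);
}

/* Boundary vectors: 499.9 -> reject, 500.0 -> reject (strict), 500.1 -> accept */
```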
Structural Coverage (on target hardware):
- Statement Coverage: 100%
- Decision Coverage: 100%
- MC/DC Coverage: 100%
DO-330 Tool Qualification for AMC Project
Tools Used in AMC Project:
| Tool | Purpose | TQL Level | Qualification Status |
|---|---|---|---|
| GCC PowerPC Cross-Compiler (v11.2) | Compile C code to PowerPC object code | TQL-2 (DAL B code generator) | Qualified per DO-330 |
| Parasoft C/C++test | Static analysis (MISRA C compliance) + unit testing + coverage | TQL-4 (verifies code, detects errors) | Qualified per DO-330 |
| LDRA TBrun | Structural coverage analysis (MC/DC) | TQL-4 (verifies coverage) | Qualified per DO-330 |
| IBM DOORS | Requirements management and traceability | TQL-5 (does not detect errors, only manages data) | Basic operational verification |
| Git | Source code version control | Not qualified (not a DO-178C tool) | N/A |
Tool Qualification Example: GCC PowerPC Cross-Compiler (TQL-2)
Tool Operational Requirements (TOR):
TOR-GCC-001: Compiler shall generate PowerPC object code from C source code
TOR-GCC-002: Compiler shall preserve semantics of C source code in object code
TOR-GCC-003: Compiler shall generate code compliant with PowerPC EABI
TOR-GCC-004: Compiler shall not introduce unintended functionality
TOR-GCC-005: Compiler optimizations (-O2) shall not alter program behavior
Tool Verification Plan (TVP):
1. Compiler Test Suite Execution:
- Execute GCC regression test suite (50,000+ test cases)
- All tests must pass (0 failures)
2. Certification Test Suite:
- Execute DO-178C compiler test suite (Plum Hall STL, ACE test suite)
- Tests cover:
- Data type representation
- Operator semantics
- Control flow constructs
- Function call semantics
- Optimization correctness
3. Operational Verification:
- Compile sample AMC code modules with known behavior
- Execute on target hardware
- Verify object code behavior matches source code semantics
4. Traceability Verification:
- Generate assembly listing for all object code
- Manually inspect assembly to verify traceability to source code
- Document any compiler-generated code (prologues, epilogues, runtime library)
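Operational verification (step 3) typically compiles tiny functions whose results are fully pinned down by the C standard, then executes them on target with the production flags (-O2) and compares against the known answers. A hedged sketch of such a sample — the function names are illustrative, not part of the actual TVP:

```c
#include <assert.h>
#include <stdint.h>

/* C99 6.5.5: integer division truncates toward zero,
 * so -7 / 2 must be -3 (not -4) on a conforming compiler. */
static int32_t div_trunc(int32_t a, int32_t b)
{
    return a / b;
}

/* C99 6.2.5: unsigned arithmetic wraps modulo 2^N,
 * so (uint8_t)(250 + 10) must be 4. */
static uint8_t wrap_add(uint8_t a, uint8_t b)
{
    return (uint8_t)(a + b);
}
```

If the object code produced under -O2 yields any other results, the optimization-correctness requirement (TOR-GCC-005) is violated and the tool cannot be used as qualified.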
Tool Qualification Data (TQD) Submitted to FAA:
- Tool Qualification Plan (TQP)
- Tool Operational Requirements (TOR)
- Tool Verification Cases and Procedures (TVCP)
- Tool Verification Results (TVR)
- Tool Configuration Management Records
Certification Authority Acceptance:
- FAA reviews TQD package
- FAA conducts tool audit (optional)
- FAA issues Tool Qualification Approval Letter
Glossary
Key terms specific to this chapter (see Appendix G and Appendix H for the complete glossary):
| Term | Definition |
|---|---|
| DAL | Development Assurance Level (A-E) — assurance classification from ARP4754A; DO-178C uses the equivalent term "software level" to drive verification rigor |
| DO-178C | Software Considerations in Airborne Systems and Equipment Certification |
| DO-330 | Software Tool Qualification Considerations — companion standard for tool qualification |
| PSAC | Plan for Software Aspects of Certification — primary planning document for DO-178C compliance |
| SAS | Software Accomplishment Summary — final certification evidence document |
| MC/DC | Modified Condition/Decision Coverage — structural coverage required for DAL A |
| TQL | Tool Qualification Level (1-5) — DO-330 classification for tool qualification rigor |
| ARP4754A | Guidelines for Development of Civil Aircraft and Systems — system-level development assurance and requirements allocation |
| FADEC | Full Authority Digital Engine Control — safety-critical avionics example used throughout this chapter |
| ARINC 429 | Aviation data bus standard for aircraft avionics communication |
References & DO-178C Compliance Checklist
References
- RTCA DO-178C, "Software Considerations in Airborne Systems and Equipment Certification", December 2011
- RTCA DO-330, "Software Tool Qualification Considerations", December 2011
- RTCA DO-248C, "Supporting Information for DO-178C and DO-278A", December 2011
- SAE ARP4754A, "Guidelines for Development of Civil Aircraft and Systems", December 2010
- Automotive SPICE PAM 4.0, VDA Quality Management Center, November 2023
- FAA AC 20-115D, "Airborne Software Development Assurance Using EUROCAE ED-12( ) and RTCA DO-178( )", July 2017
- MISRA C:2012, "Guidelines for the Use of the C Language in Critical Systems", March 2013
DO-178C DAL B Compliance Checklist
Software Planning Process (Section 4, Table A-1):
- Software Development Plan (SDP) developed and approved
- Software Verification Plan (SVP) developed and approved
- Software Configuration Management Plan (SCMP) developed and approved
- Software Quality Assurance Plan (SQAP) developed and approved
- Software Requirements Standards defined
- Software Design Standards defined
- Software Code Standards defined
Software Development Process (Section 5, Tables A-2 through A-5):
- High-level requirements developed from system requirements
- High-level requirements are accurate, consistent, and traceable
- Software architecture developed and documented
- Low-level requirements derived from high-level requirements
- Source code developed from low-level requirements
- Source code is traceable, verifiable, and conforms to coding standards
- Executable object code generated from source code
- Bidirectional traceability established at all levels
Software Verification Process (Section 6, Tables A-3 through A-7):
- High-level requirements reviewed for correctness and consistency
- Software architecture reviewed for correctness and consistency
- Low-level requirements reviewed for correctness and consistency
- Source code reviewed for correctness and traceability
- Software integration testing performed on target hardware
- Requirements-based testing achieves 100% requirements coverage
- Structural coverage analysis achieves: Statement (100%), Decision (100%), MC/DC (100%) — MC/DC exceeds the DAL B minimum of decision coverage
- Robustness testing demonstrates correct handling of abnormal inputs
- Verification performed with independence (reviews by person other than author)
Software Configuration Management (Section 7, Table A-9):
- Configuration management process established and followed
- Baselines established for requirements, design, code, and test artifacts
- Change control process enforced (CCB approval for all changes)
- Problem reporting process established
- Configuration status accounting maintained
Software Quality Assurance (Section 8, Table A-8):
- QA process established and followed
- QA reviews conducted at all lifecycle phases
- QA audits verify compliance with plans and standards
- Non-conformances tracked and resolved
- QA records maintained
Certification Liaison (Section 9, Table A-10):
- Plan for Software Aspects of Certification (PSAC) developed and submitted to FAA
- Certification liaison established with FAA
- Software Accomplishment Summary (SAS) developed documenting all objectives satisfied
- Software lifecycle data submitted to FAA per PSAC agreement
- Tool qualification data submitted to FAA (for qualified tools)
Tool Qualification (DO-330):
- All tools classified by Tool Qualification Level (TQL)
- Tool Qualification Plans (TQP) developed for TQL-1 through TQL-5 tools
- Tool Operational Requirements (TOR) defined
- Tool Verification Cases and Procedures developed
- Tool Verification Results demonstrate TOR satisfaction
- Tool Qualification Data (TQD) submitted to FAA
ASPICE Integration:
- ASPICE process mapping to DO-178C objectives documented
- ASPICE work products satisfy DO-178C lifecycle data requirements
- ASPICE base practices satisfy DO-178C verification activities
- Dual compliance demonstrated (ASPICE capability + DO-178C objective satisfaction)
Conclusion
This chapter demonstrated how ASPICE 4.0 processes integrate seamlessly with DO-178C requirements for aerospace software development. The case study of a DAL B Autopilot Mode Controller showed:
- Complete Process Mapping: ASPICE SWE.1-6, SUP.1/2/8, and MAN.3 processes directly satisfy DO-178C objectives
- Work Product Harmonization: ASPICE work products (SRS, Architecture, Code, Test Results) fulfill DO-178C lifecycle data requirements
- Verification Rigor: ASPICE verification practices (reviews, testing, coverage analysis) meet DO-178C DAL B independence and coverage requirements
- Tool Qualification: DO-330 tool qualification process integrates with ASPICE tool management practices
Key Takeaways:
- Organizations developing both automotive (ASPICE) and aerospace (DO-178C) systems can maintain a unified process framework
- DAL level determines verification rigor: DAL A requires highest independence and MC/DC coverage, while DAL D requires minimal verification
- Tool qualification is critical: All tools that eliminate or reduce verification must be qualified per DO-330
- Independence requirements vary by DAL: DAL A/B require independent verification teams, while DAL C/D allow peer reviews
Next Steps for Practitioners:
- Conduct gap analysis between existing ASPICE processes and DO-178C objectives
- Tailor ASPICE base practices to meet DAL-specific requirements (especially independence and coverage)
- Develop tool qualification strategy early in project (tool qualification can take 6-12 months)
- Establish certification liaison with FAA or EASA early to align on PSAC and compliance approach
- Train engineering teams on both ASPICE and DO-178C requirements to ensure dual compliance
By following the guidance in this chapter, organizations can achieve efficient dual compliance with ASPICE 4.0 and DO-178C, reducing certification risk and enabling knowledge transfer across automotive and aerospace domains.