1.5: SYS.5 System Verification
Learning Objectives
After reading this section, you will be able to:
- Verify the integrated system against system requirements
- Develop qualification test strategies
- Apply AI for coverage analysis
- Document qualification evidence
Process Definition
Purpose
SYS.5 Purpose: To verify that the integrated system fulfills the system requirements.
Outcomes
| Outcome | Description |
|---|---|
| O1 | Verification measures are specified for system verification |
| O2 | Verification measures are selected according to the release scope |
| O3 | The integrated system is verified; results are recorded |
| O4 | Consistency and bidirectional traceability are established between verification measures and system requirements |
| O5 | Bidirectional traceability is established between verification results and verification measures |
| O6 | Verification results are summarized and communicated to all affected parties |
Base Practices with AI Integration
AI Automation Levels:
- L1 (Assist): AI suggests, human decides and executes
- L2 (Collaborate): AI drafts/executes, human reviews and approves
- L3 (Automate): AI executes autonomously, human monitors results
| BP | Base Practice | AI Level | AI Application |
|---|---|---|---|
| BP1 | Specify verification measures for system verification | L1-L2 | Test specification |
| BP2 | Select verification measures | L2 | Coverage optimization |
| BP3 | Perform verification of the integrated system | L2-L3 | Test execution |
| BP4 | Ensure consistency and establish bidirectional traceability | L2 | Trace generation |
| BP5 | Summarize and communicate results | L2 | Report generation |
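BP4's trace generation reduces to a consistency check between the set of verification measures and the system requirements they cover. A minimal sketch, with invented requirement and test IDs, might look like this:

```python
# Hypothetical sketch of a BP4 bidirectional traceability check.
# All requirement and test IDs below are illustrative, not from a real project.

def trace_gaps(requirements, tests):
    """Return (requirements with no test, tests tracing to no requirement)."""
    covered = {req for traced in tests.values() for req in traced}
    untested = [r for r in requirements if r not in covered]
    dangling = [tid for tid, traced in tests.items() if not traced]
    return untested, dangling

requirements = ["SYS-BCM-010", "SYS-BCM-011", "SYS-BCM-012"]
tests = {
    "SYS-QUAL-001": ["SYS-BCM-010", "SYS-BCM-011"],
    "SYS-QUAL-002": [],  # traces to nothing -> dangling test
}

untested, dangling = trace_gaps(requirements, tests)
print(untested)   # ['SYS-BCM-012']
print(dangling)   # ['SYS-QUAL-002']
```

Both directions matter: an untested requirement violates O4, while a test with no requirement link violates O5.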
System Verification vs. Integration Verification
Note: Both SYS.4 and SYS.5 can use HIL environments; the key distinction is perspective (integration correctness vs. system requirement satisfaction) rather than test environment.
| Aspect | SYS.4 Integration Verification | SYS.5 System Verification |
|---|---|---|
| Focus | Interface correctness | System requirement satisfaction |
| Reference | Architecture | System requirements |
| Level | Internal interfaces | System-level behavior |
| Perspective | Developer/Architect | Customer/QA |
| Environment | HIL typical | HIL, VIL, or real vehicle |
Qualification Test Strategy
Test Types
| Test Type | Purpose | AI Automation |
|---|---|---|
| Functional | Verify features work | L2 |
| Performance | Verify timing, throughput | L2-L3 |
| Environmental | Temperature, humidity | L3 (execution) |
| EMC | Electromagnetic compatibility | L1 (analysis) |
| Reliability | Long-term operation | L3 (execution) |
| Safety | Safety function verification | L1 (human critical) |
Coverage Strategy
Qualification test cases are mapped to system requirements to show traceability coverage and to identify gaps that require additional test measures. The metrics below summarize that mapping.
Coverage Metrics:
| Metric | Value |
|---|---|
| Requirements covered | 47/48 (98%) |
| Test cases executed | 156/160 (97.5%) |
| Test cases passed | 152/156 (97.4%) |
AI Analysis: 4 failing tests, 1 uncovered requirement. Suggested action: review STK-BCM-004.
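The percentages in the metrics table are simple ratios; a minimal sketch of how they could be computed and formatted (the counts are taken from the table above, the one-decimal formatting is an illustrative choice):

```python
# Minimal coverage-metric formatter; counts come from the metrics table.
def coverage(covered: int, total: int) -> str:
    """Format a coverage ratio as 'covered/total (percent)'."""
    return f"{covered}/{total} ({covered / total:.1%})"

print(coverage(47, 48))    # requirements covered
print(coverage(156, 160))  # test cases executed
print(coverage(152, 156))  # test cases passed
```

Note that 47/48 is 97.9% at one decimal; the table rounds it to 98%.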
Qualification Test Specification
Test Case Example
---
ID: SYS-QUAL-001
Title: Door Lock User Expectation Validation
Type: Qualification Test
Priority: Critical
Requirement: STK-BCM-001
---
## Objective
Validate that the door locking function meets user expectations as
defined in stakeholder requirements.
## Test Environment
- Production ECU
- Vehicle prototype or equivalent HIL
- Ambient temperature 20°C ± 5°C
## Test Procedure
### Scenario 1: Driver Door Lock
| Step | Action | Expected | Actual | Status |
|------|--------|----------|--------|--------|
| 1 | Open driver door | Door open | | |
| 2 | Close driver door | Door closed | | |
| 3 | Press lock button | Immediate feedback | | |
| 4 | Verify lock state | All doors locked | | |
| 5 | Measure response time | < 200ms | | |
### Scenario 2: Remote Lock
| Step | Action | Expected | Actual | Status |
|------|--------|----------|--------|--------|
| 1 | Stand 10m from vehicle | - | | |
| 2 | Press key fob lock | Visual confirmation | | |
| 3 | Verify lock state | All doors locked | | |
| 4 | Measure response time | < 500ms (incl. RF) | | |
## Pass Criteria
- Response time <= 200ms (direct), <= 500ms (remote)
- User perceives immediate response (subjective; make measurable via a user study or a defined threshold, e.g., < 300 ms perceived as immediate)
- No unintended behaviors
## Traceability
- Validates: STK-BCM-001
- Related: SYS-BCM-010, SYS-BCM-011
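The response-time pass criteria of SYS-QUAL-001 lend themselves to automated evaluation. A hedged sketch, assuming measured samples are available per scenario (the measurement values below are invented):

```python
# Illustrative evaluation of the SYS-QUAL-001 response-time pass criteria.
# Limits come from the Pass Criteria section; measurements are invented.

LIMITS_MS = {"direct": 200, "remote": 500}

def evaluate(measurements):
    """Return per-scenario pass/fail against the response-time limits."""
    return {
        scenario: all(sample <= LIMITS_MS[scenario] for sample in samples)
        for scenario, samples in measurements.items()
    }

measured = {
    "direct": [148, 162, 171],   # ms, driver door lock button
    "remote": [430, 512, 460],   # ms, key fob incl. RF; 512 exceeds 500
}
print(evaluate(measured))  # {'direct': True, 'remote': False}
```

The subjective "immediate response" criterion cannot be automated this way; it still needs a human-defined measurable threshold or user study.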
AI Integration for Qualification
L2: Coverage Analysis
AI Coverage Report:
───────────────────
Requirements Coverage Summary:
─────────────────────────────
Total stakeholder requirements: 48
Requirements with test coverage: 47 (98%)
Requirements fully verified: 45 (94%)
Gap Analysis:
─────────────
STK-BCM-004: Power consumption monitoring
Status: NO TEST COVERAGE
Suggested: Add SYS-QUAL-048 for power measurement
Priority: Medium
STK-BCM-015: Cold start performance
Status: PARTIAL COVERAGE
Issue: Only tested at -20°C, requirement specifies -40°C
Suggested: Extend temperature range in SYS-QUAL-012
Test Effectiveness:
──────────────────
Tests finding most defects: SYS-QUAL-001-010 (integration tests)
Redundant tests identified: SYS-QUAL-056, SYS-QUAL-057 (overlap)
Human Action: Review gaps, prioritize additions, eliminate redundancy
L1: Test Result Analysis
AI Failure Analysis:
────────────────────
Failed Test: SYS-QUAL-023 (Window anti-pinch)
Symptom: Window reversal delayed by 50ms
AI Analysis:
• Similar failure pattern in previous project BCM-2023
• Root cause was motor driver timing parameter
• Suggest checking HWE-BCM-205 configuration
Confidence: Medium (based on pattern matching)
Human Action: Investigate suggested root cause, verify fix
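The L1 pattern matching described above can be approximated with a simple similarity measure over failure descriptions. This sketch uses token-overlap (Jaccard) similarity; a production tool would use richer features, and the historical defect records here are invented:

```python
# Illustrative failure pattern matching against historical defect records.
# Defect IDs and descriptions are invented for the example.

def jaccard(a: str, b: str) -> float:
    """Token-overlap similarity between two free-text descriptions."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb)

history = {
    "BCM-2023-D17": "window reversal delayed motor driver timing parameter",
    "BCM-2022-D03": "door lock actuator stuck after cold start",
}
symptom = "window reversal delayed by 50ms"

best = max(history, key=lambda k: jaccard(symptom, history[k]))
score = jaccard(symptom, history[best])
print(best, round(score, 2))  # BCM-2023-D17 0.33
```

A low absolute score is why the AI reports only medium confidence: the match is a lead for the human investigator, not a root-cause determination.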
Work Products
| WP ID | Work Product | Content |
|---|---|---|
| 08-60 | Verification Measure | Test strategy, verification measures |
| 08-58 | Verification Measure Selection Set | Selected tests per release scope |
| 15-52 | Verification Results | Execution records, pass/fail status |
| 13-51 | Consistency Evidence | Traceability: measures ↔ requirements |
| 13-52 | Communication Evidence | Verification summary, stakeholder reports |
Note: Additional project-specific work products (e.g., coverage reports, defect logs) may be defined per organizational standards.
Common Challenges
| Challenge | AI Mitigation | Human Action |
|---|---|---|
| Incomplete coverage | Gap detection | Add missing tests |
| Environment limitations | Simulation recommendations | Approve alternatives |
| Subjective requirements | Test criteria suggestions | Define measurable criteria |
| Regression issues | Pattern matching | Root cause analysis |
Summary
SYS.5 System Verification:
- AI Level: L1-L2 (AI analysis, human validation)
- AI Value: Coverage analysis, failure pattern matching
- Human Essential: Verification judgment, release decisions
- Key Outputs: System verification results, coverage report
- Focus: System requirement satisfaction