3.4: Review and Report Templates

What You'll Learn

By the end of this section, you'll have:

  • Design review templates that capture findings, decisions, and action items
  • Lightweight code review checklists for PR workflows
  • Audit report templates for ASPICE pre-assessments
  • Weekly status reports that keep stakeholders informed

Introduction

"Did anyone review this?" is a question that should never come up during an assessment. But it does—constantly.

Reviews and reports are the unsung heroes of ASPICE compliance. They prove that your team isn't just producing work products but also verifying them. SUP.1 (Quality Assurance) and MAN.3 (Project Management) both depend on solid review and reporting practices.

This section gives you templates that make reviews efficient and reports informative.


Design Review Record Template (SUP.1 BP4)

Complete Design Review Template

---
title: "Design Review Record"
review_id: "DR-{{COMPONENT}}-{{DATE}}"
project: "{{PROJECT_NAME}}"
review_type: "Architecture Review | Detailed Design Review | Code Review"
date: {{DATE}}
aspice_process: "SUP.1 Quality Assurance"
---

# Design Review Record
## {{COMPONENT_NAME}} - {{REVIEW_TYPE}}

## 1. Review Information

| Field | Value |
|-------|-------|
| **Review ID** | DR-DOORLOCK-2025-12-17 |
| **Review Type** | Software Architecture Review |
| **Work Product(s)** | [SWAD-001] Software Architecture Document v1.5 |
| **Review Date** | 2025-12-17 |
| **Duration** | 2 hours |
| **Review Method** | Inspection (Formal) |
| **ASPICE Process** | SUP.1 BP4 (Perform quality assurance activities) |

---

## 2. Participants

| Name | Role | Organization | Signature |
|------|------|--------------|-----------|
| Alice Smith | Moderator | Architecture Team | __________ |
| Bob Johnson | Author | SW Development | __________ |
| Carol Davis | Reviewer (Safety) | Safety Engineering | __________ |
| Dave Wilson | Reviewer (QA) | Quality Assurance | __________ |
| Eve Martinez | Scribe | Documentation | __________ |

**Required Roles Present**: [PASS] Yes (Moderator, Author, ≥2 Reviewers, Scribe)

---

## 3. Review Objectives

**Primary Objectives**:
1. Verify software architecture satisfies all software requirements (SWE.2 BP2)
2. Evaluate architectural design decisions for rationale and trade-offs (SWE.2 BP6)
3. Ensure interfaces are well-defined and consistent (SWE.2 BP3)
4. Assess safety architecture compliance with ASIL-B requirements (ISO 26262)
5. Check traceability from software requirements to architectural elements (SWE.2 BP7)

**Entry Criteria**:
- [PASS] Software Architecture Document v1.5 distributed 3 days prior
- [PASS] All reviewers confirmed availability
- [PASS] Software Requirements Specification (SWRS) v1.0 available (baseline)
- [PASS] No open critical defects from previous review

**Exit Criteria**:
- All review checklist items addressed
- Critical/Major findings documented with action items
- Review decision recorded (Approve/Conditional/Reject)

---

## 4. Review Checklist

### 4.1 Requirements Coverage (SWE.2 BP2)

| Checklist Item | Status | Findings |
|----------------|--------|----------|
| All software requirements allocated to components? | [PASS] Yes | - |
| Allocation rationale documented? | [PASS] Yes | - |
| Non-functional requirements (performance, RAM, Flash) addressed? | [WARN] Partial | **FINDING-001**: Flash budget missing |
| Safety requirements allocated to safety-relevant components? | [PASS] Yes | - |

### 4.2 Architectural Design (SWE.2 BP1)

| Checklist Item | Status | Findings |
|----------------|--------|----------|
| Component decomposition logical and complete? | [PASS] Yes | - |
| Layered architecture (AUTOSAR) correctly applied? | [PASS] Yes | - |
| Component responsibilities clearly defined? | [PASS] Yes | - |
| Design patterns appropriate? | [PASS] Yes | Observer pattern for event handling |
| Alternative architectures evaluated? | [FAIL] No | **FINDING-002**: No alternatives documented |

### 4.3 Interface Specifications (SWE.2 BP3)

| Checklist Item | Status | Findings |
|----------------|--------|----------|
| All inter-component interfaces defined? | [PASS] Yes | RTE ports specified |
| Interface data types specified? | [PASS] Yes | AUTOSAR types used |
| Communication mechanisms clear (SR/CS)? | [PASS] Yes | - |
| Error handling at interfaces defined? | [WARN] Partial | **FINDING-003**: Timeout behavior unclear |

### 4.4 Dynamic Behavior (SWE.2 BP4)

| Checklist Item | Status | Findings |
|----------------|--------|----------|
| State machines documented? | [PASS] Yes | PlantUML diagrams provided |
| Sequence diagrams for key scenarios? | [PASS] Yes | Unlock scenario covered |
| Concurrency/synchronization addressed? | [PASS] Yes | AUTOSAR OS tasks defined |
| Timing constraints specified? | [PASS] Yes | 10ms task period |

### 4.5 Safety Architecture (ISO 26262 + SWE.2)

| Checklist Item | Status | Findings |
|----------------|--------|----------|
| ASIL decomposition documented? | [PASS] Yes | ASIL-B(D) applied |
| Safety mechanisms identified? | [PASS] Yes | Watchdog, checksums, plausibility |
| Freedom from interference analysis? | [FAIL] No | **FINDING-004**: FFI analysis missing |
| Diagnostic coverage calculated? | [PASS] Yes | 95% claimed |

### 4.6 Traceability (SWE.2 BP7)

| Checklist Item | Status | Findings |
|----------------|--------|----------|
| Backward traceability: Architecture → Requirements? | [PASS] Yes | Matrix provided |
| Forward traceability: Architecture → Design? | [WARN] Partial | **FINDING-005**: Detailed design TBD |
| Traceability matrix complete? | [PASS] Yes | - |

---

## 5. Findings

### FINDING-001: Flash Budget Missing [MAJOR]

**Category**: Completeness
**Severity**: Major
**Description**: Section 5 (Resource Allocation) lacks Flash memory budget breakdown.
**Location**: SWAD-001, Section 5, Page 18

**Recommendation**:
Add table showing Flash allocation per SWC and BSW module. Include margin analysis.

**Action Item**:
- **Owner**: Bob Johnson (Author)
- **Due Date**: 2025-12-20
- **Status**: Open

---

### FINDING-002: No Alternative Architectures Evaluated [MINOR]

**Category**: Process Compliance (SWE.2 BP6)
**Severity**: Minor
**Description**: Document does not discuss alternative architectural approaches.

**Recommendation**:
Add section 3.X "Architecture Alternatives Considered" with at least 2 alternatives
(e.g., bare-metal vs AUTOSAR, monolithic vs modular).

**Action Item**:
- **Owner**: Bob Johnson
- **Due Date**: 2025-12-22
- **Status**: Open

---

### FINDING-003: Interface Timeout Behavior Unclear [MAJOR]

**Category**: Design Quality
**Severity**: Major
**Description**: Section 4.2 (Interfaces) does not specify behavior when RTE communication timeouts occur.

**Recommendation**:
For each Required Interface, specify:
- Timeout threshold
- Fallback behavior (use last valid value, safe state, etc.)
- Error reporting mechanism

**Action Item**:
- **Owner**: Bob Johnson
- **Due Date**: 2025-12-20
- **Status**: Open

---

### FINDING-004: Freedom From Interference Analysis Missing [CRITICAL]

**Category**: Safety Compliance
**Severity**: Critical (ASIL-B requirement)
**Description**: ISO 26262-6 Clause 7 requires FFI analysis for ASIL-B. Document does not address memory/timing interference between ASIL-B and QM components.

**Recommendation**:
Add Section 7 "Freedom From Interference":
- Memory partitioning (MPU configuration)
- Timing isolation (AUTOSAR OS task priorities)
- Analysis report reference

**Action Item**:
- **Owner**: Carol Davis (Safety Engineer)
- **Due Date**: 2025-12-27
- **Status**: Open (Blocking Release)

---

### FINDING-005: Forward Traceability Incomplete [MINOR]

**Category**: Traceability
**Severity**: Minor
**Description**: Architecture does not trace forward to detailed design (SWE.3 not yet created).

**Recommendation**:
Acceptable for current phase. Revisit in detailed design review.

**Action Item**:
- **Owner**: Bob Johnson
- **Due Date**: N/A (Deferred to SWE.3 review)
- **Status**: Deferred

---

## 6. Review Metrics

| Metric | Value |
|--------|-------|
| Preparation Time (per reviewer) | 1.5 hours |
| Review Meeting Duration | 2 hours |
| Pages Reviewed | 32 pages |
| Review Rate | 16 pages/hour |
| Defects Found | 5 (1 Critical, 2 Major, 2 Minor) |
| Defect Density | 0.16 defects/page |

**Benchmark**: Typical defect density for architecture reviews: 0.1-0.3 defects/page [PASS]
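The metrics in this table can be computed directly from the review record rather than filled in by hand. A minimal Python sketch (function names and the benchmark bounds are illustrative, taken from the figures above):

```python
# Compute review-rate and defect-density metrics from raw review data.
# Values mirror the table above; the benchmark range is the one cited there.

def review_metrics(pages, meeting_hours, defects_found):
    """Return review rate (pages/hour) and defect density (defects/page)."""
    rate = pages / meeting_hours
    density = round(defects_found / pages, 2)
    return rate, density

def density_in_benchmark(density, low=0.1, high=0.3):
    """Typical architecture-review defect densities fall in [low, high]."""
    return low <= density <= high

rate, density = review_metrics(pages=32, meeting_hours=2, defects_found=5)
print(f"Review rate: {rate:.0f} pages/hour")        # 16 pages/hour
print(f"Defect density: {density} defects/page")    # 0.16
print(f"Within benchmark: {density_in_benchmark(density)}")  # True
```

A density far below the benchmark often means a superficial review, not a flawless document, which is why the record captures preparation time as well.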

---

## 7. Review Decision

**Decision**: [WARN] **Conditional Approval**

**Rationale**:
The software architecture is fundamentally sound, but critical safety-related gaps
(FINDING-004: FFI analysis) must be addressed before final approval.

**Conditions for Approval**:
1. FINDING-004 (Critical) resolved with FFI analysis added
2. FINDING-001, FINDING-003 (Major) resolved
3. Updated document v1.6 circulated for verification (no re-review needed)

**Next Steps**:
- Author addresses findings and updates document to v1.6
- Safety Engineer completes FFI analysis
- Moderator verifies closure of critical/major findings
- Final approval via email (no additional meeting required)

**Approval Signatures**:

| Role | Name | Decision | Date | Signature |
|------|------|----------|------|-----------|
| Moderator | Alice Smith | Conditional Approve | 2025-12-17 | __________ |
| Safety Reviewer | Carol Davis | Conditional (FFI req) | 2025-12-17 | __________ |
| QA Reviewer | Dave Wilson | Conditional Approve | 2025-12-17 | __________ |

---

## 8. Action Item Summary

| ID | Description | Owner | Due Date | Status |
|----|-------------|-------|----------|--------|
| AI-001 | Add Flash budget breakdown | Bob Johnson | 2025-12-20 | Open |
| AI-002 | Document architecture alternatives | Bob Johnson | 2025-12-22 | Open |
| AI-003 | Specify interface timeout behavior | Bob Johnson | 2025-12-20 | Open |
| AI-004 | Complete FFI analysis | Carol Davis | 2025-12-27 | Open (Blocking) |
| AI-005 | Update traceability in SWE.3 review | Bob Johnson | TBD | Deferred |

---

## 9. Appendix: Review Artifacts

**Artifacts Reviewed**:
- [SWAD-001] Software Architecture Document v1.5 (32 pages)
- [SWRS-001] Software Requirements Specification v1.0 (baseline)
- [SRS-001] System Requirements Specification v2.0 (reference)

**Supporting Materials**:
- PlantUML state machine diagrams
- AUTOSAR SWC description files (.arxml)
- Traceability matrix (requirements_architecture.csv)

**Review Recording**: Video recording available at `\\fileserver\reviews\DR-DOORLOCK-2025-12-17.mp4`

Code Review Checklist Template (SWE.3 BP7)

Lightweight Code Review Checklist

# Code Review Checklist
## Pull Request: #{{PR_NUMBER}} - {{PR_TITLE}}

**Author**: {{AUTHOR}}
**Reviewer**: {{REVIEWER}}
**Date**: {{DATE}}
**Files Changed**: {{FILE_COUNT}} files, +{{LINES_ADDED}} / -{{LINES_DELETED}} lines

---

## 1. Pre-Review Checks (Automated)

| Check | Status | Tool |
|-------|--------|------|
| Build passes | [PASS] | GitHub Actions |
| Unit tests pass (100%) | [PASS] | pytest/Unity |
| Code coverage ≥80% | [PASS] | lcov/Codecov |
| MISRA C compliance | [WARN] 2 warnings | Cppcheck |
| Static analysis clean | [PASS] | SonarQube |
| No security vulnerabilities | [PASS] | Trivy |
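A pre-review gate like this one can be enforced by a small script that aggregates tool results and blocks the merge on any hard failure. A sketch, assuming each tool's outcome has already been reduced to a PASS/WARN/FAIL status (how each status is obtained from GitHub Actions, Cppcheck, etc. is tool-specific and omitted):

```python
# Aggregate automated pre-review checks into a single PR gate decision.
# Check names mirror the table above; statuses are assumed inputs.

HARD_FAILS = {"FAIL"}   # block the merge
SOFT_FLAGS = {"WARN"}   # allow merge, but require reviewer attention

def gate_decision(checks):
    """checks: dict of check name -> 'PASS' | 'WARN' | 'FAIL'."""
    fails = [n for n, s in checks.items() if s in HARD_FAILS]
    warns = [n for n, s in checks.items() if s in SOFT_FLAGS]
    if fails:
        return "BLOCKED", fails
    return ("NEEDS_ATTENTION" if warns else "CLEAN"), warns

checks = {
    "build": "PASS",
    "unit_tests": "PASS",
    "coverage_80pct": "PASS",
    "misra": "WARN",            # 2 warnings, as in the table
    "static_analysis": "PASS",
    "security_scan": "PASS",
}
status, flagged = gate_decision(checks)
print(status, flagged)  # NEEDS_ATTENTION ['misra']
```

Keeping WARN distinct from FAIL matters here: MISRA warnings with documented deviations should reach the human reviewer, not silently block the pipeline.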

---

## 2. Functional Review

| Criterion | Pass? | Comments |
|-----------|-------|----------|
| **Code implements requirements correctly** | [PASS] | Implements [SWE-042] as specified |
| **Edge cases handled** | [PASS] | Null pointer checks added |
| **Error handling present** | [PASS] | All RTE calls checked |
| **No unnecessary complexity** | [PASS] | Logic clear and straightforward |

---

## 3. Code Quality

| Criterion | Pass? | Comments |
|-----------|-------|----------|
| **Naming conventions followed** | [PASS] | CamelCase for functions, snake_case for variables |
| **Comments where needed** | [WARN] | Add comment for CRC algorithm choice |
| **No dead code** | [PASS] | - |
| **No code duplication** | [PASS] | - |
| **Function size reasonable (<50 lines)** | [PASS] | Longest function: 42 lines |

---

## 4. MISRA C / Coding Standards

| Criterion | Pass? | Comments |
|-----------|-------|----------|
| **MISRA mandatory rules: 0 violations** | [PASS] | - |
| **MISRA required rules: deviations justified** | [WARN] | Rule 8.7 deviation documented |
| **Pointer usage safe** | [PASS] | Null checks before dereference |
| **Magic numbers avoided** | [PASS] | Constants defined (#define) |

---

## 5. Safety (ASIL-B Specific)

| Criterion | Pass? | Comments |
|-----------|-------|----------|
| **Safety-critical functions have defensive checks** | [PASS] | Speed check before unlock |
| **Watchdog triggering preserved** | [PASS] | - |
| **No unbounded loops** | [PASS] | All loops have exit conditions |
| **Stack usage acceptable** | [PASS] | No recursion, local vars minimal |

---

## 6. Testing

| Criterion | Pass? | Comments |
|-----------|-------|----------|
| **Unit tests added for new code** | [PASS] | 5 new test cases (TC-042-006 to TC-042-010) |
| **Tests cover happy path + edge cases** | [PASS] | Null input, invalid checksum, timeout tested |
| **Test names descriptive** | [PASS] | - |
| **No flaky tests** | [PASS] | All tests deterministic |

---

## 7. Review Decision

**Decision**: [PASS] **APPROVED with Minor Comments**

**Summary**:
Code is well-structured and implements requirements correctly. Minor comment
request (CRC algorithm) should be addressed before merge.

**Action Items**:
- Add comment explaining CRC-8 polynomial selection (line 87)
- Resolve 2 MISRA warnings (or justify deviation)

**Reviewer Signature**: {{REVIEWER_NAME}}, {{DATE}}

Audit Report Template (SUP.1 BP8)

Quality Audit Report

---
title: "Quality Audit Report"
audit_id: "QA-{{YYYY-MM}}-{{NUMBER}}"
project: "{{PROJECT_NAME}}"
audit_type: "Process Audit | Product Audit | ASPICE Pre-Assessment"
date: {{DATE}}
aspice_process: "SUP.1 Quality Assurance"
---

# Quality Audit Report
## {{PROJECT_NAME}} - {{AUDIT_TYPE}}

## 1. Audit Information

| Field | Value |
|-------|-------|
| **Audit ID** | QA-2025-12-001 |
| **Audit Type** | Process Audit (ASPICE CL2 Pre-Assessment) |
| **Audit Date** | 2025-12-15 to 2025-12-17 (3 days) |
| **Audited Processes** | SWE.1, SWE.2, SWE.3, SWE.4, SUP.8, MAN.3 |
| **Audit Team** | Jane Auditor (Lead), Mark Inspector |
| **Auditees** | Door Lock Controller Project Team (8 members) |

---

## 2. Audit Scope

**In-Scope Processes**:
- SWE.1 Software Requirements Analysis
- SWE.2 Software Architectural Design
- SWE.3 Software Detailed Design and Unit Construction
- SWE.4 Software Unit Verification
- SUP.8 Configuration Management
- MAN.3 Project Management

**Project Phase**: Development Phase (60% complete)

**Sample Size**:
- 20 software requirements reviewed
- 3 software components (DoorLockCtrl, SpeedMonitor, Diagnostics)
- 15 unit test specifications
- 10 Git commits (traceability check)

---

## 3. Audit Findings by Process

### 3.1 SWE.1 Software Requirements Analysis

**Capability Level Assessment**: **CL2 (Managed)** [PASS]

| Base Practice | Rating | Evidence | Gaps |
|---------------|--------|----------|------|
| BP1: Specify SW requirements | **Largely** | SWRS-001 v1.0 with 45 requirements | 3 requirements lack acceptance criteria |
| BP2: Structure requirements | **Fully** | Requirements categorized (Functional, Performance, Interface) | None |
| BP3: Analyze requirements | **Fully** | Feasibility analysis in sprint planning | None |
| BP5: Establish traceability | **Largely** | Traceability matrix present | 2 requirements not traced to SYS |
| BP6: Ensure consistency | **Fully** | Automated consistency checks via validator script | None |
| BP7: Communicate requirements | **Fully** | SWRS published to team + customer | None |

**Strengths**:
- Automated requirement validation script (checks for duplicates, missing IDs)
- Good use of YAML-structured requirements for AI augmentation

**Weaknesses**:
- **FINDING-101**: 3 requirements (SWE-061, SWE-062, SWE-063) lack verification criteria
- **FINDING-102**: 2 requirements not traced back to system requirements

**Recommendation**: Add acceptance criteria to all requirements. Update traceability matrix.
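The automated validation the audit credits under Strengths can be sketched in a few lines. This is an illustrative reconstruction, not the project's actual script: the field names (`id`, `acceptance_criteria`) are assumptions about the YAML requirement structure.

```python
# Sketch of a requirement validator: flags duplicate IDs, missing IDs,
# and requirements without acceptance criteria (cf. FINDING-101).
# Field names are assumed; adapt to the project's YAML schema.

def validate_requirements(reqs):
    """reqs: list of requirement dicts. Returns a list of finding strings."""
    findings = []
    seen = set()
    for r in reqs:
        rid = r.get("id")
        if not rid:
            findings.append("requirement with missing ID")
            continue
        if rid in seen:
            findings.append(f"duplicate ID: {rid}")
        seen.add(rid)
        if not r.get("acceptance_criteria"):
            findings.append(f"{rid}: no acceptance criteria (cf. FINDING-101)")
    return findings

reqs = [
    {"id": "SWE-061", "text": "..."},                        # no criteria
    {"id": "SWE-042", "acceptance_criteria": "unlock <50ms"},
    {"id": "SWE-042", "acceptance_criteria": "duplicate"},   # duplicate ID
]
for f in validate_requirements(reqs):
    print(f)
```

Running such a check in CI turns FINDING-101-style gaps into build failures instead of audit findings.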

---

### 3.2 SWE.2 Software Architectural Design

**Capability Level Assessment**: **CL2 (Managed)** [PASS]

| Base Practice | Rating | Evidence | Gaps |
|---------------|--------|----------|------|
| BP1: Develop architecture | **Fully** | SWAD-001 v1.6 (post-review) | None |
| BP2: Allocate requirements | **Fully** | Allocation matrix complete | None |
| BP3: Define interfaces | **Fully** | RTE interfaces specified | None |
| BP4: Describe dynamic behavior | **Fully** | State machines + sequence diagrams | None |
| BP5: Define resource consumption | **Largely** | RAM/Flash budgets defined | CPU load not measured yet |
| BP6: Evaluate alternatives | **Partially** | Only 1 alternative documented | Insufficient analysis |
| BP7: Establish traceability | **Fully** | Architecture → Requirements matrix | None |

**Strengths**:
- Excellent use of PlantUML for visual documentation
- AUTOSAR architecture well-defined

**Weaknesses**:
- **FINDING-201**: BP6 rating limited by lack of architecture alternative evaluation
- **FINDING-202**: CPU load budget theoretical (not measured on target hardware)

**Recommendation**: Document architectural alternatives (e.g., bare-metal vs AUTOSAR). Measure CPU load on HIL test bench.

---

### 3.3 SWE.3 Software Detailed Design and Unit Construction

**Capability Level Assessment**: **CL1 (Performed)** [WARN] (CL2 at risk)

| Base Practice | Rating | Evidence | Gaps |
|---------------|--------|----------|------|
| BP1: Develop detailed design | **Partially** | Design docs inconsistent across components | 1 component missing design doc |
| BP2: Establish traceability | **Largely** | Code-to-req traceability via Git commit messages | Not all units traced |
| BP3: Define interfaces | **Fully** | Function prototypes in headers | None |
| BP5: Ensure consistency | **Fully** | Automated MISRA checks in CI | None |
| BP6: Develop unit | **Fully** | Source code complete | None |
| BP7: Ensure consistency (code-design) | **Partially** | Design docs not always updated with code | Drift detected |

**Strengths**:
- MISRA C compliance enforced via CI
- Good Git commit message discipline (requirement IDs present)

**Weaknesses**:
- **FINDING-301** [MAJOR]: Diagnostics_SWC lacks detailed design document
- **FINDING-302** [MINOR]: Design documents not always updated when code changes

**Recommendation**:
1. **CRITICAL**: Create detailed design doc for Diagnostics_SWC before CL2 assessment
2. Implement design review as mandatory step before code changes
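The commit-message discipline credited under Strengths (requirement IDs present) can also be checked mechanically. A sketch that flags commits without a requirement ID; the `SWE-nnn` pattern follows this project's ID convention, and feeding in messages from `git log --format=%s` is an assumption about the local setup:

```python
import re

# Flag commit messages that carry no requirement ID (SWE.3 BP2 traceability).
# The SWE-\d+ pattern matches this project's requirement IDs; message
# extraction (e.g. via `git log --format=%s`) is assumed and omitted here.

REQ_ID = re.compile(r"\bSWE-\d+\b")

def untraced_commits(messages):
    """Return the subset of commit messages with no requirement ID."""
    return [m for m in messages if not REQ_ID.search(m)]

messages = [
    "Implement speed check before unlock [SWE-042]",
    "Fix MISRA 8.7 deviation",                       # untraced
    "Add CRC-8 checksum to CAN frame [SWE-017]",
]
print(untraced_commits(messages))  # ['Fix MISRA 8.7 deviation']
```

Wired into a server-side hook or CI job, this turns the "not all units traced" gap into an immediate, per-commit signal.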

---

### 3.4 SWE.4 Software Unit Verification

**Capability Level Assessment**: **CL2 (Managed)** [PASS]

| Base Practice | Rating | Evidence | Gaps |
|---------------|--------|----------|------|
| BP1: Develop unit test strategy | **Fully** | UTS-001 v1.0 defines strategy (Unity + CMock) | None |
| BP2: Develop unit test specs | **Fully** | Test cases documented with traceability | None |
| BP3: Test software units | **Fully** | All units tested, CI enforces | None |
| BP4: Achieve test coverage | **Fully** | 92% MC/DC (target: 80%+) | None |
| BP5: Establish traceability | **Fully** | Test-to-requirement matrix complete | None |

**Strengths**:
- Excellent coverage (92% exceeds target)
- Automated test execution in CI/CD
- Clear test case structure (Given/When/Then)

**Weaknesses**: None identified
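The BP4 coverage target is easiest to defend when it is enforced, not just measured. A sketch of a CI coverage gate, assuming per-unit percentages have already been extracted from the coverage tool's report (the unit names and figures below are illustrative; the 80% threshold is the one stated above):

```python
# Enforce the unit-test coverage target (SWE.4 BP4) as a CI gate.
# Per-unit figures would come from the coverage tool; these are illustrative.

def coverage_gate(per_unit, threshold=80.0):
    """Return (passed, sorted list of units below threshold)."""
    below = sorted(u for u, cov in per_unit.items() if cov < threshold)
    return not below, below

per_unit = {
    "DoorLockCtrl": 100.0,
    "SpeedMonitor": 94.5,
    "Diagnostics": 88.0,
}
ok, below = coverage_gate(per_unit)
print("PASS" if ok else f"FAIL: {below}")  # PASS
```

Gating per unit, rather than on the aggregate, prevents one well-tested component from masking an untested one.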

---

### 3.5 SUP.8 Configuration Management

**Capability Level Assessment**: **CL2 (Managed)** [PASS]

| Base Practice | Rating | Evidence | Gaps |
|---------------|--------|----------|------|
| BP1: Develop CM strategy | **Fully** | Git branching strategy documented | None |
| BP2: Identify config items | **Fully** | All work products in Git | None |
| BP3: Control modifications | **Fully** | PR reviews mandatory (2 approvers) | None |
| BP4: Track status | **Fully** | GitHub releases + tags | None |
| BP5: Establish baselines | **Fully** | Semantic versioning applied | None |

**Strengths**:
- Mature Git workflow (main/develop/feature branches)
- Branch protection rules enforced

**Weaknesses**: None identified

---

### 3.6 MAN.3 Project Management

**Capability Level Assessment**: **CL1 (Performed)** [WARN]

| Base Practice | Rating | Evidence | Gaps |
|---------------|--------|----------|------|
| BP1: Define project scope | **Fully** | Project plan v1.0 | None |
| BP2: Define lifecycle model | **Fully** | Agile V-Model documented | None |
| BP3: Evaluate feasibility | **Fully** | Initial feasibility study | None |
| BP5: Define project activities | **Largely** | Jira epics/stories define work | Some epics lack estimates |
| BP6: Estimate effort | **Partially** | Story points used, but no velocity tracking | Inconsistent estimation |
| BP10: Review project progress | **Fully** | Sprint reviews + retrospectives | None |

**Strengths**:
- Agile ceremonies well-established
- Jira used effectively for task tracking

**Weaknesses**:
- **FINDING-601**: Effort estimation inconsistent (no velocity tracking)
- **FINDING-602**: Some epics lack acceptance criteria

**Recommendation**: Implement velocity tracking in Jira. Add acceptance criteria to all epics.

---

## 4. Overall Assessment Summary

| Process | Target CL | Assessed CL | Gap Analysis |
|---------|-----------|-------------|--------------|
| SWE.1 | CL2 | **CL2** [PASS] | Minor gaps in BP1, BP5 |
| SWE.2 | CL2 | **CL2** [PASS] | BP6 needs improvement |
| SWE.3 | CL2 | **CL1** [WARN] | Missing design doc (critical) |
| SWE.4 | CL2 | **CL2** [PASS] | No gaps |
| SUP.8 | CL2 | **CL2** [PASS] | No gaps |
| MAN.3 | CL2 | **CL1** [WARN] | Estimation process weak |

**Overall Project CL**: **CL1** (limited by SWE.3, MAN.3)

**CL2 Readiness**: **67%** (4 out of 6 processes at CL2)

**Estimated Time to CL2**: 4-6 weeks (if findings addressed promptly)

---

## 5. Critical Findings (Blocking CL2)

### FINDING-301: Missing Detailed Design Document [CRITICAL]

**Process**: SWE.3 BP1
**Impact**: CL2 achievement blocked for SWE.3
**Description**: Diagnostics_SWC has source code but no detailed design document.

**Required Action**:
Create detailed design document covering:
- Component internal structure
- Function specifications
- Data structures
- Algorithm descriptions

**Owner**: Development Team
**Due Date**: 2026-01-15
**Priority**: **CRITICAL (Blocks CL2)**

---

### FINDING-601: No Velocity Tracking [MAJOR]

**Process**: MAN.3 BP6
**Impact**: CL2 achievement at risk for MAN.3
**Description**: Agile velocity not tracked, making effort estimation unreliable.

**Required Action**:
- Configure Jira to track velocity (story points completed per sprint)
- Use historical velocity for future sprint planning
- Review and adjust estimates quarterly

**Owner**: Project Manager
**Due Date**: 2026-01-20
**Priority**: MAJOR
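Reduced to its core, the velocity tracking this finding asks for is a rolling average of story points completed per sprint, used to size the next sprint. A sketch with illustrative sprint history:

```python
# Core of the velocity tracking requested by FINDING-601: keep story points
# completed per sprint and plan from a rolling average. History values are
# illustrative, oldest first.

def rolling_velocity(history, window=3):
    """Average story points over the last `window` completed sprints."""
    recent = history[-window:]
    return sum(recent) / len(recent)

history = [35, 38, 40, 42]          # points completed per sprint
velocity = rolling_velocity(history)
print(f"3-sprint velocity: {velocity:.1f}")      # 40.0
print(f"Commit next sprint to ~{int(velocity)} points")
```

Whether this lives in Jira's built-in velocity chart or a spreadsheet matters less than that the historical numbers, not gut feel, drive the next estimate.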

---

## 6. Positive Observations

1. **Excellent Test Coverage**: 92% MC/DC coverage exceeds industry standards
2. **Strong CM Process**: Git workflow mature, branch protection enforced
3. **AI Augmentation**: Innovative use of AI for requirement generation and validation
4. **Automation**: CI/CD pipeline comprehensive (builds, tests, MISRA checks)
5. **Team Engagement**: High participation in sprint ceremonies and reviews

---

## 7. Recommendations

### Short-Term (1-2 weeks)
1. Address FINDING-301 (critical design doc)
2. Fix traceability gaps (FINDING-102)
3. Add acceptance criteria to requirements (FINDING-101)

### Medium-Term (4-6 weeks)
4. Implement velocity tracking (FINDING-601)
5. Document architecture alternatives (FINDING-201)
6. Measure CPU load on target hardware (FINDING-202)

### Long-Term (3 months)
7. Conduct formal ASPICE CL2 assessment (external assessor)
8. Target CL3 for safety-critical processes (SWE.1, SWE.2, SWE.4)

---

## 8. Audit Conclusion

**Overall Rating**: **Satisfactory with Improvements Required**

The Door Lock Controller project demonstrates strong engineering practices,
particularly in testing and configuration management. However, gaps in design
documentation (SWE.3) and project estimation (MAN.3) must be addressed to achieve
ASPICE CL2.

**Confidence in CL2 Achievement**: **High** (if critical findings resolved)

**Next Audit**: 2026-02-15 (Follow-up to verify findings closure)

---

## 9. Audit Team Signatures

| Role | Name | Date | Signature |
|------|------|------|-----------|
| Lead Auditor | Jane Auditor | 2025-12-17 | __________ |
| Co-Auditor | Mark Inspector | 2025-12-17 | __________ |

---

## 10. Auditee Acknowledgment

| Role | Name | Date | Signature |
|------|------|------|-----------|
| Project Manager | {{PM_NAME}} | ______ | __________ |
| Technical Lead | {{TL_NAME}} | ______ | __________ |
| Quality Manager | {{QM_NAME}} | ______ | __________ |

Status Report Template (MAN.3 BP10)

Weekly Project Status Report

# Project Status Report
## Door Lock Controller - Week 50 (2025-12-09 to 2025-12-15)

**Project**: Door Lock Controller ECU
**Reporting Period**: Week 50, 2025
**Report Date**: 2025-12-15
**Reported By**: {{PROJECT_MANAGER}}

---

## 1. Executive Summary

**Overall Status**: [OK] **GREEN** (On Track)

**Summary**:
Project progressing well. Architecture review completed with conditional approval.
Unit testing ahead of schedule (92% coverage). Integration testing starts next week.

**Key Achievements This Week**:
- [PASS] Software architecture review completed (conditional approval)
- [PASS] Unit test coverage reached 92% (target: 80%)
- [PASS] MISRA C compliance achieved (0 mandatory violations)

**Risks/Issues**:
- [WARN] FFI analysis delayed (blocks architecture final approval)
- [WARN] HIL test bench availability limited (sharing with another project)

---

## 2. Progress Against Plan

### 2.1 Milestones

| Milestone | Plan Date | Forecast Date | Status |
|-----------|-----------|---------------|--------|
| M1: Requirements Freeze | 2025-11-30 | 2025-11-30 | [PASS] Complete |
| M2: Architecture Review | 2025-12-15 | 2025-12-15 | [PASS] Complete (Conditional) |
| M3: Alpha Release | 2026-01-15 | 2026-01-18 | [MED] At Risk (+3 days) |
| M4: Beta Release | 2026-02-28 | 2026-02-28 | [OK] On Track |
| M5: Production Release | 2026-04-30 | 2026-04-30 | [OK] On Track |

**Milestone Health**:
- [PASS] 2 complete
- [MED] 1 at risk (M3: FFI analysis dependency)
- [OK] 2 on track

---

### 2.2 Work Package Status

| Work Package | Progress | This Week | Plan | Variance |
|--------------|----------|-----------|------|----------|
| SWE.1 Requirements | 100% | - | 100% | [OK] 0% |
| SWE.2 Architecture | 95% | Design review, FFI pending | 100% | [MED] -5% |
| SWE.3 Detailed Design | 80% | Diagnostics SWC design started | 75% | [OK] +5% |
| SWE.4 Unit Testing | 100% | All tests passing, 92% coverage | 90% | [OK] +10% |
| SWE.5 Integration Test | 10% | Test specs drafted | 10% | [OK] 0% |
| SWE.6 Qualification Test | 0% | - | 0% | [OK] 0% |

**Overall Progress**: **70%** (planned: 68%) - [OK] **Ahead of Schedule**
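An overall figure like this is typically a weighted sum of work-package completion. The report does not state its weighting, so the weights below are illustrative assumptions chosen to show the mechanism, not to reproduce the 70% exactly:

```python
# Overall progress as a weighted average of work-package completion.
# Package percentages come from the table above; the weights are assumed.

def overall_progress(packages):
    """packages: list of (name, percent_complete, weight)."""
    total_w = sum(w for _, _, w in packages)
    return sum(p * w for _, p, w in packages) / total_w

packages = [
    ("SWE.1 Requirements",  100, 15),
    ("SWE.2 Architecture",   95, 15),
    ("SWE.3 Design",         80, 25),
    ("SWE.4 Unit Test",     100, 20),
    ("SWE.5 Integration",    10, 15),
    ("SWE.6 Qualification",   0, 10),
]
print(f"Overall: {overall_progress(packages):.0f}%")
```

Stating the weights in the project plan keeps this number auditable; an unweighted average of the same column would read several points lower.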

---

## 3. Metrics

### 3.1 Quality Metrics

| Metric | This Week | Last Week | Target | Trend |
|--------|-----------|-----------|--------|-------|
| Unit Test Coverage | 92% | 87% | 80% | ↑ Improving |
| MISRA Mandatory Violations | 0 | 2 | 0 | [PASS] Target Met |
| MISRA Required Violations | 3 | 5 | <10 | ↑ Improving |
| Critical Bugs | 0 | 1 | 0 | [PASS] Fixed |
| Code Review Turnaround | 18h avg | 24h avg | <24h | ↑ Improving |

### 3.2 Productivity Metrics

| Metric | This Week | Last Week | Trend |
|--------|-----------|-----------|-------|
| Story Points Completed | 42 | 38 | ↑ |
| Velocity (3-sprint avg) | 40 | 37 | ↑ |
| PR Merge Rate | 95% | 92% | ↑ |
| Build Success Rate | 98% | 96% | ↑ |

### 3.3 Risk Metrics

| Risk Category | Open Risks | Trend |
|---------------|------------|-------|
| Technical | 2 | → |
| Schedule | 1 | ↑ New |
| Resource | 1 | → |
| External Dependency | 0 | [PASS] Resolved |

---

## 4. Key Activities This Week

### Completed
- [x] Software architecture design review (DR-DOORLOCK-2025-12-17)
- [x] Unit tests for DoorLockCtrl_SWC (100% coverage)
- [x] MISRA C compliance fixes (2 mandatory violations resolved)
- [x] Sprint 12 planning and retrospective

### In Progress
- [ ] Freedom From Interference (FFI) analysis (Safety Engineer)
- [ ] Diagnostics_SWC detailed design
- [ ] Integration test environment setup (HIL)

### Blocked
- [ ] Architecture final approval (waiting for FFI analysis)
- [ ] HIL integration testing (test bench availability)

---

## 5. Risks and Issues

### Active Risks

#### RISK-003: FFI Analysis Delayed [HIGH]

**Status**: [HIGH] Active
**Impact**: Blocks architecture final approval (M2), may delay M3 (Alpha)
**Probability**: 60%
**Mitigation**:
- Safety engineer allocated full-time to FFI analysis starting 2025-12-18
- External consultant on standby if needed by 2025-12-22
**Owner**: Safety Manager
**Review Date**: 2025-12-20

---

#### RISK-005: HIL Test Bench Availability [MEDIUM]

**Status**: [MED] Active
**Impact**: Integration testing may be delayed by 1 week
**Probability**: 40%
**Mitigation**:
- Negotiated time slots with other project (Tue/Thu afternoons)
- Fallback: Use software-in-the-loop (SIL) for initial integration tests
**Owner**: Test Manager
**Review Date**: 2025-12-22

---

### Closed Risks (This Week)

#### RISK-001: CAN DBC File Format Incompatibility [RESOLVED]

**Closed Date**: 2025-12-12
**Resolution**: Customer provided updated DBC file v2.3, compatible with CANoe

---

## 6. Resource Status

| Resource | Allocation | Availability | Notes |
|----------|------------|--------------|-------|
| SW Developers (6) | 100% | 100% | - |
| Safety Engineer (1) | 100% | 100% | Focused on FFI analysis |
| Test Engineer (1) | 100% | 80% | Shared with Project X |
| HIL Test Bench | 50% | 40% | Sharing with Project Y |

**Resource Issues**: Test bench contention with Project Y. Negotiated shared schedule.

---

## 7. Dependencies

| Dependency | Status | Expected Date | Impact if Delayed |
|------------|--------|---------------|-------------------|
| CAN DBC from Customer | [PASS] Received | 2025-12-10 | [PASS] None (resolved) |
| AUTOSAR BSW from Vendor | [OK] On Track | 2026-01-05 | Low (integration buffer) |
| Safety Certification Docs | [MED] In Progress | 2026-01-20 | Medium (affects M3) |

---

## 8. Forecast and Look-Ahead

### Next Week (Week 51: 2025-12-16 to 2025-12-22)

**Planned Activities**:
1. Complete FFI analysis (Safety Engineer)
2. Finalize architecture document v1.7 (post-FFI)
3. Start integration test execution (DoorLockCtrl ↔ SpeedMonitor)
4. Continue detailed design for Diagnostics_SWC
5. Sprint 13 planning

**Expected Completions**:
- Architecture final approval (M2)
- Integration test environment ready
- 5% progress on integration testing

**Anticipated Issues**:
- Holiday season (team availability reduced 50% Dec 23 - Jan 2)

---

## 9. Action Items

| Action | Owner | Due Date | Status |
|--------|-------|----------|--------|
| Complete FFI analysis | Safety Eng | 2025-12-20 | Open |
| Update architecture doc to v1.7 | SW Architect | 2025-12-22 | Open |
| Reserve HIL test bench time slots | Test Eng | 2025-12-16 | Open |
| Create Diagnostics_SWC design doc | Developer B | 2025-12-27 | Open |

---

## 10. Approvals

| Role | Name | Date | Signature |
|------|------|------|-----------|
| Project Manager | {{PM_NAME}} | 2025-12-15 | __________ |
| Technical Lead | {{TL_NAME}} | 2025-12-15 | __________ |

**Distribution List**: Project Team, Steering Committee, Customer (monthly summary)

Summary

Review and Report Templates for ASPICE Compliance:

  • Design Review Records (SUP.1 BP4): Formal review with checklist, findings, action items
  • Code Review Checklist (SWE.3 BP7): Lightweight PR review for quality gates
  • Audit Reports (SUP.1 BP8): ASPICE process/product audits with capability level assessment
  • Status Reports (MAN.3 BP10): Weekly progress tracking with metrics and risk management

Key Features:

  1. Structured Templates: Consistent format across all review/report types
  2. Actionable Findings: Clear severity, owner, due date for all issues
  3. Metrics-Driven: Quantitative data (coverage, defect density, velocity)
  4. Traceability: Findings linked to ASPICE base practices
  5. Decision Records: Explicit approval/rejection with rationale