# 4.1: Requirements Analysis Prompt Templates

## SWE.1 Requirements Analysis Prompts

### Purpose

**Use Cases**:
- Extract software requirements from system specifications
- Detect ambiguities, incompleteness, and inconsistencies
- Generate traceability matrices
- Create requirements in DOORS/ReqIF format
### Template Variables

The following placeholders are used throughout these templates:
| Variable | Description | Example |
|---|---|---|
| `{PROJECT_NAME}` | ECU/project identifier | "ACC", "LKA", "AEB" |
| `{SAFETY_CLASS}` | ISO 26262 ASIL level | "ASIL-B", "QM" |
| `{SYSTEM_REQUIREMENTS_TEXT}` | Copy-paste of system specification excerpt | SYS-045 text |
| `{REQUIREMENT_TEXT}` | Single requirement to analyze | Natural language requirement |
| `{SOFTWARE_REQUIREMENTS_LIST}` | List of SWE requirements | Multiple SWE-XXX entries |
| `{SYSTEM_REQUIREMENTS_LIST}` | List of SYS requirements | Multiple SYS-XXX entries |
| `{PLAIN_TEXT_REQUIREMENTS}` | Unformatted requirements text | Informal specification |
**ID Format**: `[Prefix]-[System_Req_Number]-[Sub_Req_Index]`
- **Prefix**: SWE, SYS, HWE, etc. (from ASPICE process areas)
- **System_Req_Number**: links to the parent system requirement (e.g., 045 from SYS-045)
- **Sub_Req_Index**: ordinal number (1, 2, 3, ...) for sub-requirements derived from the same parent
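This ID scheme can be enforced mechanically when batches of generated requirements come back from an LLM. A minimal sketch in Python (the prefix set and the zero-padded three-digit parent number are assumptions drawn from the examples in this section):

```python
import re

# [Prefix]-[System_Req_Number]-[Sub_Req_Index], e.g. SWE-045-1.
# Prefix set and 3-digit zero-padded parent number are assumptions
# based on the examples shown in this section.
REQ_ID = re.compile(r"^(SWE|SYS|HWE)-(\d{3})-(\d+)$")

def parse_req_id(req_id: str):
    """Return (prefix, parent_number, sub_index), or None if malformed."""
    m = REQ_ID.match(req_id)
    if not m:
        return None
    prefix, parent, sub = m.groups()
    return prefix, int(parent), int(sub)

print(parse_req_id("SWE-045-1"))  # ('SWE', 45, 1)
print(parse_req_id("SWE-45"))     # None (missing sub-index)
```

A check like this is useful as a post-processing gate: any generated ID that fails to parse is rejected before it reaches the requirements database.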
## Template 1: Extract Software Requirements

### Extract SWE Requirements from SYS Spec

**Prompt**:
You are an AI requirements engineer specialized in automotive embedded systems (ISO 26262, ASPICE).
Context: I'm developing an {PROJECT_NAME} ECU for automotive {SAFETY_CLASS} (ISO 26262).
Input: System requirements specification (excerpt):
{SYSTEM_REQUIREMENTS_TEXT}
Task: Derive software requirements (SWE) from system requirements (SYS):
1. Decompose system requirements into software requirements (one SYS may generate multiple SWE)
2. Add implementation details (data types, ranges, units, latency)
3. Include traceability (@implements [SYS-XXX])
4. Specify safety class (inherit from system requirement)
Output Format:
```markdown
[SWE-XXX-Y] Requirement Title
Description: {detailed description}
- Input: {data type, range, units}
- Output: {data type, range, units}
- Latency: ≤ {value} ms
- Safety Class: {ASIL level}
- Implements: [SYS-XXX]
- Verification Method: {unit test / integration test / HIL test}
```

Constraints:
- All numeric values must have units (m, s, kg, °C, etc.)
- Requirements must be testable (measurable criteria)
- Follow ASPICE SWE.1 best practices
**Example Usage**:
```
{PROJECT_NAME} = "ACC"
{SAFETY_CLASS} = "ASIL-B"
{SYSTEM_REQUIREMENTS_TEXT} = "
[SYS-045] Emergency Braking Function
The vehicle shall activate emergency braking if an obstacle is detected at distance < 5 m and closing speed > 10 km/h.
Acceptance Criteria:
- Detection latency ≤ 50 ms
- Braking force applied within 100 ms
Safety Class: ASIL-B
"
```
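Filling the placeholders before sending the prompt is easy to automate; a sketch using Python's built-in `str.format` (the abbreviated template text and the helper name are illustrative only, not the full prompt):

```python
# Hypothetical helper: fills {VARIABLE} placeholders in a prompt template.
# The template text below is abbreviated for illustration.
TEMPLATE = (
    "Context: I'm developing an {PROJECT_NAME} ECU for automotive "
    "{SAFETY_CLASS} (ISO 26262).\n"
    "Input: System requirements specification (excerpt):\n"
    "{SYSTEM_REQUIREMENTS_TEXT}"
)

def fill_template(template: str, **variables: str) -> str:
    """Substitute {VAR} placeholders; raises KeyError if one is missing."""
    return template.format(**variables)

prompt = fill_template(
    TEMPLATE,
    PROJECT_NAME="ACC",
    SAFETY_CLASS="ASIL-B",
    SYSTEM_REQUIREMENTS_TEXT="[SYS-045] Emergency Braking Function ...",
)
print(prompt.splitlines()[0])
# Context: I'm developing an ACC ECU for automotive ASIL-B (ISO 26262).
```

Because `str.format` raises `KeyError` on a missing variable, an incompletely filled template fails loudly instead of reaching the LLM with a literal `{PLACEHOLDER}` in it.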
---
## Template 2: Detect Ambiguities
### Find Vague or Incomplete Requirements
**Prompt**:
You are an AI requirements analyst specialized in detecting ambiguities in safety-critical specifications (ISO 26262, IEC 62304).
Requirement Text:
{REQUIREMENT_TEXT}
Task: Analyze requirement for ambiguities, incompleteness, inconsistencies:
- Vague Terms: Flag undefined terms ("quickly", "robust", "user-friendly")
- Missing Units: Identify numeric values without units ("distance < 5" → should be "distance < 5 meters")
- Untestable: Determine if requirement has measurable pass/fail criteria
- Conflicts: Check for contradictions with other requirements (if provided)
Output Format:
## Ambiguity Report
| Issue Type | Location | Description | Suggested Fix |
|------------|----------|-------------|---------------|
| Vague Term | {line/phrase} | "quickly" not quantified | Specify "≤ 100 ms" |
| Missing Unit | {line/phrase} | "distance < 5" | Add unit: "distance < 5 meters" |
| Untestable | Requirement | No pass/fail criteria | Define measurable acceptance criteria |
## Recommendation
PRIORITY: [LOW | MEDIUM | HIGH | CRITICAL]
ACTION: [Clarify with stakeholder | Update requirement | Request clarification]
Constraints:
- Flag all issues (even minor ones)
- Provide actionable suggestions
- Prioritize by safety impact
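Several of these checks can be pre-screened mechanically before the LLM pass; a heuristic sketch (the vague-term list and the unit regex are starter assumptions, not an exhaustive rule set):

```python
import re

# Assumed starter lists -- extend per project glossary.
VAGUE_TERMS = ["quickly", "robust", "user-friendly", "adequate", "fast"]
# Numeric comparison with no trailing unit, e.g. "distance < 5".
MISSING_UNIT = re.compile(r"[<>≤≥=]\s*\d+(?:\.\d+)?(?!\s*(ms|s|m|km/h|kg|°C|%))\b")

def prescreen(requirement: str) -> list[str]:
    """Return a list of heuristic findings for one requirement."""
    findings = []
    low = requirement.lower()
    for term in VAGUE_TERMS:
        if term in low:
            findings.append(f"Vague term: '{term}' not quantified")
    if MISSING_UNIT.search(requirement):
        findings.append("Missing unit: numeric comparison without unit")
    return findings

print(prescreen("The system shall respond quickly if distance < 5."))
```

A pre-screen like this cannot replace the LLM analysis (it misses untestability and cross-requirement conflicts), but it catches the cheap cases deterministically.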
**Example Usage**:
{REQUIREMENT_TEXT} = "The system shall respond quickly to user input and provide adequate feedback."
**Expected Output**:
```markdown
## Ambiguity Report
| Issue Type | Location | Description | Suggested Fix |
|------------|----------|-------------|---------------|
| Vague Term | "respond quickly" | No latency specified | Specify "≤ 100 ms response time" |
| Vague Term | "adequate feedback" | Undefined quality | Define feedback type (visual, audible, haptic) |
| Untestable | Requirement | No measurable criteria | Add: "Display confirmation message within 100 ms" |
## Recommendation
PRIORITY: HIGH
ACTION: Clarify with stakeholder (requirements engineer); update spec with quantified values
```

---

## Template 3: Generate Traceability Matrix

### Create SYS → SWE Traceability

**Prompt**:
You are an AI traceability specialist for ASPICE-compliant projects.
Input:
1. System Requirements:
{SYSTEM_REQUIREMENTS_LIST}
2. Software Requirements:
{SOFTWARE_REQUIREMENTS_LIST}
Task: Generate bidirectional traceability matrix (SYS ↔ SWE):
1. Forward traceability: SYS → SWE (each system requirement links to software requirements)
2. Backward traceability: SWE → SYS (each software requirement links to system requirement)
3. Coverage analysis: Identify orphan requirements (SYS with no SWE, or vice versa)
Output Format:
```markdown
| System Req | Software Req | Rationale | Status |
|------------|--------------|-----------|--------|
| SYS-045 | SWE-045-1 | Distance calculation | [PASS] Verified |
| SYS-045 | SWE-045-2 | Speed calculation | [PASS] Verified |
| SYS-045 | SWE-045-3 | Brake trigger logic | [PASS] Verified |
## Coverage Analysis
- **Forward Coverage**: 100% (12/12 system requirements have software implementations)
- **Backward Coverage**: 100% (34/34 software requirements trace to system requirements)
- **Orphan Requirements**: None [PASS]
## Issues
- {List any orphan requirements or traceability gaps}
```
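The coverage analysis above reduces to simple set operations over the trace links; a minimal sketch, assuming links are available as (SYS, SWE) pairs (the sample data is illustrative):

```python
# Trace links as (SYS id, SWE id) pairs -- an assumed representation.
links = [("SYS-045", "SWE-045-1"), ("SYS-045", "SWE-045-2"),
         ("SYS-045", "SWE-045-3"), ("SYS-046", "SWE-046-1")]
sys_reqs = {"SYS-045", "SYS-046", "SYS-047"}
swe_reqs = {"SWE-045-1", "SWE-045-2", "SWE-045-3", "SWE-046-1", "SWE-099-1"}

traced_sys = {s for s, _ in links}
traced_swe = {w for _, w in links}

orphan_sys = sys_reqs - traced_sys   # SYS with no SWE implementation
orphan_swe = swe_reqs - traced_swe   # SWE with no SYS parent

forward = 100 * len(sys_reqs & traced_sys) / len(sys_reqs)
backward = 100 * len(swe_reqs & traced_swe) / len(swe_reqs)
print(f"Forward coverage: {forward:.0f}%, orphans: {sorted(orphan_sys)}")
print(f"Backward coverage: {backward:.0f}%, orphans: {sorted(orphan_swe)}")
```

Running a deterministic check like this against the LLM-generated matrix verifies that the reported coverage percentages and orphan lists are arithmetically consistent with the links it emitted.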
---
## Template 4: Requirements Quality Check
### Verify ASPICE SWE.1 Compliance
**ASPICE SWE.1 Base Practices (BP1-6)**:
| Practice | Name | Description |
|----------|------|-------------|
| BP1 | Software requirements defined | Specify software functional and non-functional requirements |
| BP2 | Consistency with system requirements | Ensure SWE requirements align with SYS requirements |
| BP3 | Interfaces specified | Define software interfaces (APIs, protocols, messages) |
| BP4 | Feasibility analyzed | Verify requirements can be implemented within constraints |
| BP5 | Bidirectional traceability | Establish SYS ↔ SWE traceability links |
| BP6 | Verifiability | Ensure requirements have testable, measurable criteria |
**Prompt**:
You are an AI ASPICE assessor reviewing software requirements for SWE.1 compliance (Capability Level 3).
Requirements:
{SOFTWARE_REQUIREMENTS_LIST}
Task: Assess requirements quality against ASPICE SWE.1 base practices:
- BP1: Software requirements defined (completeness)
- BP2: Consistency with system requirements (no conflicts)
- BP3: Interfaces specified (APIs, messages, protocols)
- BP4: Feasibility analyzed (can be implemented within constraints)
- BP5: Bidirectional traceability (SYS ↔ SWE)
- BP6: Verifiability (testable, measurable criteria)
Output Format:
## ASPICE SWE.1 Assessment
| Base Practice | Status | Score | Findings |
|---------------|--------|-------|----------|
| BP1 (Defined) | PASS | 100% | All 34 requirements well-defined |
| BP2 (Consistency) | PARTIAL | 85% | 3 conflicts detected (see below) |
| BP3 (Interfaces) | PASS | 100% | CAN, API interfaces specified |
| BP4 (Feasibility) | PASS | 100% | All requirements feasible |
| BP5 (Traceability) | PASS | 100% | 100% bidirectional traceability |
| BP6 (Verifiability) | PARTIAL | 90% | 3 requirements lack test criteria |
Overall: **Level 2 (Managed)** - 90% compliant
## Non-Conformances
1. **BP2 Violation**: Requirement [SWE-045] conflicts with [SWE-089] (latency: 50 ms vs 30 ms)
2. **BP6 Violation**: Requirement [SWE-123] not testable ("shall be user-friendly")
## Recommendations
1. Resolve conflicts (clarify latency budget allocation)
2. Add test criteria for [SWE-123] (e.g., "User completes task in ≤ 5 clicks")
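The per-practice percentages in an assessment like the one above are conventionally mapped onto the ISO/IEC 33020 NPLF rating scale (N ≤ 15% < P ≤ 50% < L ≤ 85% < F); a sketch of that mapping:

```python
def rate(score_pct: float) -> str:
    """Map an achievement percentage to an ISO/IEC 33020 rating."""
    if score_pct > 85:
        return "F (Fully achieved)"
    if score_pct > 50:
        return "L (Largely achieved)"
    if score_pct > 15:
        return "P (Partially achieved)"
    return "N (Not achieved)"

# Sample scores taken from the assessment table above.
for bp, score in {"BP1": 100, "BP2": 85, "BP6": 90}.items():
    print(bp, rate(score))
```

Note that BP2 at exactly 85% rates L, not F; threshold behavior at the boundaries is worth pinning down explicitly when prompting the assessor.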
---
## Template 5: Requirements Conversion (Natural Language → DOORS)
### Convert Text to DOORS/ReqIF Format
**Prompt**:
You are an AI requirements formatter for IBM DOORS Next Generation.
Input: Natural language requirements:
{PLAIN_TEXT_REQUIREMENTS}
Task: Convert to DOORS-compatible format (ReqIF XML) with attributes:
- ID: SWE-XXX-Y
- Title: Brief summary
- Description: Detailed requirement text
- Type: Functional / Non-functional / Interface
- Priority: High / Medium / Low
- Safety Class: ASIL-A/B/C/D / QM
- Verification Method: Test / Inspection / Analysis / Demo
- Traceability: Links to parent (SYS) and child (code, tests)
Output Format:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<REQ-IF xmlns="http://www.omg.org/spec/ReqIF/20110401/reqif.xsd">
  <SPEC-OBJECT>
    <IDENTIFIER>SWE-045-1</IDENTIFIER>
    <TYPE>Functional Requirement</TYPE>
    <VALUES>
      <ATTRIBUTE-VALUE-STRING>
        <DEFINITION>Title</DEFINITION>
        <THE-VALUE>Obstacle Distance Calculation</THE-VALUE>
      </ATTRIBUTE-VALUE-STRING>
      <ATTRIBUTE-VALUE-XHTML>
        <DEFINITION>Description</DEFINITION>
        <THE-VALUE>The ACC software shall calculate obstacle distance using radar sensor data...</THE-VALUE>
      </ATTRIBUTE-VALUE-XHTML>
      <ATTRIBUTE-VALUE-ENUMERATION>
        <DEFINITION>Safety Class</DEFINITION>
        <VALUES>
          <ENUM-VALUE>ASIL-B</ENUM-VALUE>
        </VALUES>
      </ATTRIBUTE-VALUE-ENUMERATION>
    </VALUES>
  </SPEC-OBJECT>
</REQ-IF>
```
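A fragment like the one above can also be emitted with Python's standard-library `xml.etree.ElementTree`. This is a sketch only: a DOORS-importable ReqIF file additionally requires the REQ-IF-HEADER and CORE-CONTENT structure plus datatype definitions, which are omitted here, and the enumeration value is simplified to a plain THE-VALUE element:

```python
import xml.etree.ElementTree as ET

# Sketch: emits a SPEC-OBJECT fragment mirroring the output format above.
# A real ReqIF file also needs REQ-IF-HEADER, CORE-CONTENT, SPEC-TYPES, etc.
def spec_object(req_id: str, title: str, description: str, asil: str) -> str:
    root = ET.Element("SPEC-OBJECT")
    ET.SubElement(root, "IDENTIFIER").text = req_id
    ET.SubElement(root, "TYPE").text = "Functional Requirement"
    values = ET.SubElement(root, "VALUES")
    for tag, defn, value in [
        ("ATTRIBUTE-VALUE-STRING", "Title", title),
        ("ATTRIBUTE-VALUE-XHTML", "Description", description),
        ("ATTRIBUTE-VALUE-ENUMERATION", "Safety Class", asil),
    ]:
        attr = ET.SubElement(values, tag)
        ET.SubElement(attr, "DEFINITION").text = defn
        ET.SubElement(attr, "THE-VALUE").text = value
    return ET.tostring(root, encoding="unicode")

fragment = spec_object("SWE-045-1", "Obstacle Distance Calculation",
                       "The ACC software shall calculate obstacle distance "
                       "using radar sensor data.", "ASIL-B")
print(fragment)
```

Generating the skeleton in code and letting the LLM fill only the free-text attributes keeps the XML well-formed by construction.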
---
## Summary
**Requirements Prompts Covered**:
1. **Extract Requirements**: Derive SWE from SYS (decomposition, traceability)
2. **Detect Ambiguities**: Find vague terms, missing units, untestable criteria
3. **Generate Traceability**: Create SYS ↔ SWE matrix, coverage analysis
4. **Quality Check**: Assess ASPICE SWE.1 compliance (BP1-6)
5. **Format Conversion**: Convert text → DOORS/ReqIF
**Success Metrics**: 70-80% extraction accuracy, 85-90% ambiguity detection, and 95-100% traceability generation
**Next**: Code Generation Prompts (32.02) - Generate C functions, MISRA-compliant code
---
**Navigation**: [← 32.00 Prompt Templates](32.00_Prompt_Templates.md) | [Contents](../00_Front_Matter/00.06_Table_of_Contents.md) | [32.2 Code Generation Prompts →](32.02_Code_Generation_Prompts.md)