2.1: Process Reference Model (PRM)
What You'll Learn
By the end of this section, you will be able to:
- Explain what the PRM is and why it matters
- Describe the components that make up a process definition
- Understand how outcomes prove a process's purpose has been achieved
- Use PRM concepts to assess your own process implementation
- Navigate the full ASPICE 4.0 process landscape across all categories and groups
- Distinguish between the PRM and the PAM and know when to use each
- Tailor the PRM for specific project contexts
- Identify how AI changes the practical implementation of each process
So What Is the PRM?
The Process Reference Model (PRM) answers a simple but essential question: What processes should exist, and what should they accomplish?
For each process, the PRM gives you:
- Process ID: Unique identifier (like SWE.1)
- Process Name: What we call it
- Process Purpose: What the process is meant to achieve
- Process Outcomes: Specific results that prove the process has been completed correctly
The PRM is completely technology-agnostic. It defines what to achieve, not how to get there. The implementation approach is left to the organization.
PRM Structure
Let's visualize how this works:
The PRM is organized as a hierarchy of four layers:
| Layer | Contains | Example |
|---|---|---|
| Process Categories | Groupings by organizational function | Primary Life Cycle, Support, Management, Organizational |
| Process Groups | Related processes within a category | SYS, SWE, HWE, MLE, SUP, SEC, MAN, ACQ, SPL |
| Individual Processes | A named process with an ID | SWE.1 Software Requirements Analysis |
| Process Outcomes | Observable results of a process | "Software requirements are specified" |
Each layer narrows the focus. Categories give you the big picture, groups organize the discipline, individual processes define the work, and outcomes tell you what success looks like.
Key Insight: The PRM is intentionally abstract. It defines what must be achieved but never prescribes tools, templates, or methodologies. This abstraction is what makes it possible to integrate AI into any process without violating the model.
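As an illustration of that abstraction, here is a minimal Python sketch of how the layers nest. The class and field names are our own invention, not part of the standard; the data comes from the tables in this section:

```python
from dataclasses import dataclass, field

@dataclass
class Process:
    """One PRM process: ID, name, purpose, and the outcomes that prove the purpose."""
    pid: str
    name: str
    purpose: str
    outcomes: list[str] = field(default_factory=list)

@dataclass
class ProcessGroup:
    """Related processes within a category (e.g. SWE in the Primary Life Cycle)."""
    gid: str
    focus: str
    processes: list[Process] = field(default_factory=list)

# A fragment of the hierarchy, populated with SWE.1
swe = ProcessGroup("SWE", "Software engineering", [
    Process(
        pid="SWE.1",
        name="Software Requirements Analysis",
        purpose="To establish a structured and analyzed set of software requirements",
        outcomes=[
            "Software requirements are defined and documented",
            "Software requirements are allocated from system requirements",
        ],
    ),
])

# Nothing here says *how* requirements are written -- the model stays technology-agnostic
print(swe.processes[0].pid, "->", len(swe.processes[0].outcomes), "outcomes")
```

Notice that the structure carries only identifiers, purposes, and outcomes; tools, templates, and methods live entirely outside it, which is exactly the abstraction the PRM enforces.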
Process Categories
ASPICE 4.0 organizes all processes into four categories. Understanding these categories helps you see the forest before getting lost in the trees.
Primary Life Cycle Processes
These are the engineering processes that directly produce the product. They follow the V-Model and include the decomposition (left side) and verification (right side) activities.
| Group | Focus | Process Count |
|---|---|---|
| SYS | System engineering | 5 |
| SWE | Software engineering | 6 |
| HWE | Hardware engineering | 4 |
| MLE | Machine learning engineering | 5 |
What This Means: Primary processes are where the actual product takes shape. Every requirement, architecture decision, line of code, and test case lives within these processes. If you are building a product, you are executing primary processes.
Support Processes
Support processes operate horizontally across all engineering activities. They do not produce product artifacts directly but ensure that the artifacts produced by primary processes are correct, consistent, and controlled.
| Group | Focus | Process Count |
|---|---|---|
| SUP | Quality assurance, configuration management, problem/change management, ML data management | 5 |
| SEC | Cybersecurity requirements, implementation, verification | 3 |
Management Processes
Management processes provide the project-level planning, risk oversight, and measurement that keep engineering activities on track.
| Group | Focus | Process Count |
|---|---|---|
| MAN | Project management, risk management, measurement | 3 |
Organizational Life Cycle Processes
These processes operate at the organizational boundary, managing the relationship between the organization and its suppliers or customers.
| Group | Focus | Process Count |
|---|---|---|
| ACQ | Supplier monitoring | 1 |
| SPL | Product release | 1 |
Process Groups
SYS - System Engineering Group
The SYS group covers the complete system lifecycle, starting from stakeholder needs and ending with system-level verification.
| ID | Name | Purpose Summary |
|---|---|---|
| SYS.1 | Requirements Elicitation | Gather and capture stakeholder requirements |
| SYS.2 | System Requirements Analysis | Transform stakeholder requirements into system requirements |
| SYS.3 | System Architectural Design | Establish system architecture and allocate requirements to elements |
| SYS.4 | System Integration and Integration Verification | Integrate system elements and verify interactions |
| SYS.5 | System Verification | Confirm system satisfies system requirements |
V-Model Mapping: SYS.1-SYS.3 form the left (decomposition) side. SYS.4-SYS.5 form the right (verification) side. SYS.3 is the pivot point where the system is decomposed into software, hardware, and mechanical elements.
SWE - Software Engineering Group
The SWE group mirrors the V-Model specifically for software, from requirements through to qualification testing.
| ID | Name | Purpose Summary |
|---|---|---|
| SWE.1 | Software Requirements Analysis | Establish structured, analyzed software requirements |
| SWE.2 | Software Architectural Design | Establish software architecture consistent with requirements |
| SWE.3 | Software Detailed Design and Unit Construction | Provide detailed design and develop software units |
| SWE.4 | Software Unit Verification | Verify units meet detailed design |
| SWE.5 | Software Integration and Integration Verification | Integrate units and verify interactions |
| SWE.6 | Software Qualification Test | Confirm software satisfies software requirements |
HWE - Hardware Engineering Group
The HWE group parallels SWE but for hardware development. It was restructured in ASPICE 4.0.
| ID | Name | Purpose Summary |
|---|---|---|
| HWE.1 | Hardware Requirements Analysis | Establish hardware requirements from system allocation |
| HWE.2 | Hardware Design | Design hardware elements |
| HWE.3 | Verification against Hardware Design | Verify hardware against design specifications |
| HWE.4 | Verification against Hardware Requirements | Verify hardware meets requirements |
MLE - Machine Learning Engineering Group
The MLE group is entirely new in ASPICE 4.0 and addresses the unique lifecycle of ML components.
| ID | Name | Purpose Summary |
|---|---|---|
| MLE.1 | ML Requirements Analysis | Establish ML-specific requirements |
| MLE.2 | ML Architecture | Define ML model architecture |
| MLE.3 | ML Training and Learning | Train and optimize ML models |
| MLE.4 | ML Model Testing | Verify ML model behavior under specified conditions |
| MLE.5 | ML Model Deployment | Deploy ML models to target environment |
SUP - Support Process Group
| ID | Name | Purpose Summary |
|---|---|---|
| SUP.1 | Quality Assurance | Ensure work products and processes comply with plans |
| SUP.8 | Configuration Management | Establish and maintain integrity of all work products |
| SUP.9 | Problem Resolution Management | Identify, analyze, manage, and resolve problems |
| SUP.10 | Change Request Management | Manage and control change requests |
| SUP.11 | Machine Learning Data Management | Manage ML data throughout the lifecycle |
SEC - Cybersecurity Engineering Group
| ID | Name | Purpose Summary |
|---|---|---|
| SEC.1 | Cybersecurity Requirements | Define cybersecurity requirements from threat analysis |
| SEC.2 | Cybersecurity Implementation | Implement cybersecurity measures |
| SEC.3 | Cybersecurity Verification | Verify cybersecurity implementation |
MAN - Management Process Group
| ID | Name | Purpose Summary |
|---|---|---|
| MAN.3 | Project Management | Plan and control project activities |
| MAN.5 | Risk Management | Identify and manage project risks |
| MAN.6 | Measurement | Collect and analyze process and product data |
Process Attributes
Each process is defined through three interconnected elements: its purpose and outcomes, which live in the PRM, and the base practices that achieve those outcomes, which live in the PAM.
Purpose
A process purpose is a single statement describing the high-level objective. It always uses active voice and begins with "To...":
| Pattern | Example |
|---|---|
| "To establish..." | SWE.1: "To establish a structured and analyzed set of software requirements..." |
| "To define..." | MLE.2: "To define ML model architecture suitable for requirements" |
| "To verify..." | SWE.4: "To verify that software units meet the software detailed design" |
| "To ensure..." | SUP.1: "To ensure that work products and processes comply with plans and requirements" |
| "To confirm..." | SYS.5: "To confirm that the system satisfies stakeholder requirements" |
Tip: Purpose statements are your compass. If you ever wonder whether an activity belongs in a particular process, check whether it contributes to the purpose. If it does not, the activity belongs somewhere else.
Outcomes
Outcomes are observable, measurable results that collectively prove the purpose has been achieved. Each process typically defines 4 to 8 outcomes.
| Characteristic | Description |
|---|---|
| Observable | You can collect evidence for it |
| Measurable | You can assess how well it is achieved |
| Technology-agnostic | Works with any implementation approach |
| Complete set | Together, they fully achieve the purpose |
Base Practices
Base practices are the fundamental activities that implement a process. They exist in the PAM (Process Assessment Model) rather than the PRM, but they directly map to PRM outcomes. Each base practice targets one or more outcomes.
| BP Characteristic | What It Means |
|---|---|
| Specific | Describes a concrete, identifiable activity |
| Assessable | Evidence can be collected to verify execution |
| Comprehensive | Together, all BPs achieve all outcomes |
| Process-specific | Each process has its own unique set of BPs |
PRM vs PAM: The PRM defines the "what" (purpose and outcomes). The PAM adds the "how to assess" (base practices, work products, assessment indicators). This separation is deliberate. You can change your implementation approach without changing what you need to achieve.
Purpose Statements: What You're Trying to Achieve
Every process has a single purpose statement: an active-voice sentence ("To establish...", "To define...") that captures the overall intent, can be measured through the outcomes, and holds regardless of the technology or methodology you use.
Example Purpose Statements
| Process | Purpose |
|---|---|
| SYS.2 | To transform stakeholder requirements into a set of system requirements |
| SWE.2 | To establish a software architecture consistent with software requirements |
| SWE.3 | To provide a software detailed design and to develop software units |
| SWE.4 | To verify that software units meet the software detailed design |
| SUP.8 | To establish and maintain the integrity of all work products |
Process Outcomes: How You Know You've Succeeded
Outcomes are the evidence of successful process implementation. Each process typically has 4 to 8 of them, and, as described under Process Attributes, a good outcome is observable, measurable, technology-agnostic, and part of a set that collectively achieves the purpose. The SWE.1 example shows what this looks like in practice.
SWE.1 Outcomes Example
| Outcome | What It Means | Evidence Examples |
|---|---|---|
| O1 | Software requirements are defined and documented | SRS document, requirements database |
| O2 | Software requirements are allocated from system requirements | Traceability matrix, allocation records |
| O3 | Software requirements are analyzed for correctness and verifiability | Review records, analysis reports |
| O4 | Impact of requirements on operating environment is identified | Interface specifications, resource estimates |
| O5 | Prioritization of requirements is established | Priority attributes, release plans |
| O6 | Requirements are agreed with stakeholders | Approval records, meeting minutes |
| O7 | Software requirements are communicated to all affected parties | Communication logs, distribution records |
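Outcome tracking like this can be made mechanical. The sketch below maps the SWE.1 outcome IDs from the table to collected evidence (the evidence item names are invented project data) and flags outcomes that still lack any:

```python
# Map each SWE.1 outcome ID to the evidence items collected so far.
# Outcome comments abbreviate the table above; evidence names are illustrative.
swe1_evidence = {
    "O1": ["SRS v0.9"],                     # requirements defined and documented
    "O2": ["traceability matrix"],          # allocated from system requirements
    "O3": [],                               # analysis review not yet held
    "O4": ["interface spec draft"],         # impact on operating environment
    "O5": ["priority attributes in DB"],    # prioritization established
    "O6": [],                               # stakeholder agreement pending
    "O7": ["distribution record"],          # communicated to affected parties
}

def uncovered_outcomes(evidence: dict[str, list[str]]) -> list[str]:
    """Return outcome IDs that have no supporting evidence yet."""
    return [oid for oid, items in evidence.items() if not items]

print(uncovered_outcomes(swe1_evidence))  # ['O3', 'O6']
```

An assessor works the same way in reverse: for each outcome, they ask what evidence demonstrates it, so an empty list here is a finding waiting to happen.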
The Complete ASPICE 4.0 PRM
SYS Process Purposes
| Process | Purpose Statement |
|---|---|
| SYS.1 | To gather and capture stakeholder requirements to provide a basis for system requirements |
| SYS.2 | To transform stakeholder requirements into a set of system requirements |
| SYS.3 | To establish a system architecture and allocate requirements to system elements |
| SYS.4 | To integrate system elements and verify that they interact correctly |
| SYS.5 | To confirm that the system satisfies the system requirements |
SWE Process Purposes
| Process | Purpose Statement |
|---|---|
| SWE.1 | To establish a structured and analyzed set of software requirements consistent with system requirements and architecture |
| SWE.2 | To establish a software architecture consistent with software requirements |
| SWE.3 | To provide a software detailed design and develop software units |
| SWE.4 | To verify that software units meet the software detailed design |
| SWE.5 | To integrate software units and verify their interaction |
| SWE.6 | To confirm that software satisfies software requirements |
MLE Process Purposes (New in ASPICE 4.0)
| Process | Purpose Statement |
|---|---|
| MLE.1 | To establish ML requirements addressing ML-specific aspects |
| MLE.2 | To define ML model architecture suitable for requirements |
| MLE.3 | To optimize ML models to meet the defined ML requirements |
| MLE.4 | To verify ML model behavior under specified conditions |
| MLE.5 | To deploy ML models to target environment |
SUP Process Purposes
| Process | Purpose Statement |
|---|---|
| SUP.1 | To ensure that work products and processes comply with plans and requirements |
| SUP.8 | To establish and maintain the integrity of all work products |
| SUP.9 | To ensure that problems are identified, analyzed, managed, and resolved |
| SUP.10 | To ensure that change requests are managed and controlled |
| SUP.11 | To manage machine learning data throughout the lifecycle |
ASPICE 4.0 Changes from 3.1
ASPICE 4.0 introduced substantial structural changes to the PRM. Understanding these changes is critical for teams migrating from 3.1 or referencing older documentation.
New Process Groups
| Change | Details |
|---|---|
| MLE group added | Five entirely new processes (MLE.1-MLE.5) addressing machine learning engineering |
| SEC group added | Three cybersecurity processes (SEC.1-SEC.3) aligned with ISO/SAE 21434 |
| SUP.11 added | Machine Learning Data Management to handle training data lifecycle |
Removed or Restructured Processes
| Change | Details |
|---|---|
| SUP.2 removed | Verification was removed as a standalone support process; verification activities are now embedded within each engineering process (SWE.4-6, SYS.4-5, HWE.3-4) |
| SUP.4, SUP.7 removed | Joint Review (SUP.4) and Documentation (SUP.7) were eliminated as separate processes; their intents are absorbed into other processes or considered implicit. (SUP.3, SUP.5, and SUP.6 were never part of Automotive SPICE; the gaps in the numbering are inherited from the underlying ISO/IEC 12207 scheme.) |
| HWE restructured | Hardware processes were reorganized into four processes (HWE.1-HWE.4) with clearer verification separation |
Renaming and Scope Changes
| Process | ASPICE 3.1 Name | ASPICE 4.0 Name | Impact |
|---|---|---|---|
| SWE.5 | Software Integration and Integration Test | Software Integration and Integration Verification | Emphasis on "verification" rather than "test" |
| SWE.6 | Software Qualification Test | Software Qualification Test | Retained but scope clarified |
| SYS.4 | System Integration and Integration Test | System Integration and Integration Verification | Aligned with SWE.5 naming |
Migration Tip: If your organization is assessed against ASPICE 3.1, the transition to 4.0 requires mapping your existing process implementations to the new structure. The most impactful change is the removal of SUP.2, SUP.4, and SUP.7. You must verify that the verification, joint review, and documentation activities previously housed in those SUP processes are now demonstrably covered within your engineering and remaining support processes.
Process Relationships
Processes do not operate in isolation. The PRM implies a network of dependencies and interactions that determine execution order and information flow.
Vertical Dependencies (V-Model Flow)
The V-Model creates a strict decomposition-verification chain:
| Decomposition (Left Side) | Verification (Right Side) | Relationship |
|---|---|---|
| SYS.1 Requirements Elicitation | - | Feeds SYS.2; stakeholder needs have no dedicated right-side process |
| SYS.2 System Requirements | SYS.5 System Verification | SYS.5 verifies against SYS.2 outputs |
| SYS.3 System Architecture | SYS.4 System Integration | SYS.4 verifies integration against SYS.3; pivot: allocates to SWE, HWE, MLE |
| SWE.1 SW Requirements | SWE.6 SW Qualification | SWE.6 verifies against SWE.1 outputs |
| SWE.2 SW Architecture | SWE.5 SW Integration | SWE.5 verifies against SWE.2 outputs |
| SWE.3 Detailed Design/Construction | SWE.4 Unit Verification | SWE.4 verifies against SWE.3 outputs |
Horizontal Dependencies (Cross-Group)
| Source Process | Target Process | Information Flow |
|---|---|---|
| SYS.3 | SWE.1, HWE.1, MLE.1 | System architecture allocates requirements to disciplines |
| SWE.6, HWE.4, MLE.4 | SYS.4 | Verified components feed into system integration |
| MAN.3 | All processes | Project plan governs schedule, resources, and milestones |
| SUP.8 | All processes | Configuration management controls all work products |
| SUP.1 | All processes | Quality assurance audits compliance across all processes |
| SUP.10 | All processes | Change requests can affect any process work product |
| SEC.1 | SYS.2, SWE.1 | Security requirements feed into system and software requirements |
Feedback Loops
| Loop | Description |
|---|---|
| Verification findings to requirements | Defects found in SWE.4-6 may trigger requirement changes in SWE.1 |
| Problem resolution to any process | SUP.9 findings can trigger rework in any engineering process |
| Risk to project management | MAN.5 risk events feed back into MAN.3 project replanning |
| Measurement to improvement | MAN.6 metrics inform process improvement decisions |
Practical Implication: When planning a project, map these dependencies explicitly. A delay in SYS.3 (system architecture) will cascade to SWE.1, HWE.1, and MLE.1 simultaneously. AI-assisted dependency tracking can help identify these cascading impacts early.
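The cascading impact just described can be sketched as a short graph traversal. The edges below are taken from the dependency tables in this section; the code itself is an illustrative sketch, not a tool defined by ASPICE:

```python
from collections import deque

# Downstream edges: which processes consume each process's outputs
DOWNSTREAM = {
    "SYS.2": ["SYS.3", "SYS.5"],
    "SYS.3": ["SYS.4", "SWE.1", "HWE.1", "MLE.1"],  # pivot: allocation to disciplines
    "SWE.1": ["SWE.2", "SWE.6"],
    "SWE.2": ["SWE.3", "SWE.5"],
    "SWE.3": ["SWE.4"],
}

def impacted(process: str) -> set[str]:
    """Breadth-first walk: every process downstream of a delayed one."""
    seen, queue = set(), deque([process])
    while queue:
        for nxt in DOWNSTREAM.get(queue.popleft(), ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# A delay in the system architecture ripples through every discipline
print(sorted(impacted("SYS.3")))
```

This is the mechanism behind AI-assisted dependency tracking: given a richer edge set (work products, change requests), the same traversal surfaces cascading impacts as soon as a delay is reported.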
PRM vs PAM
The Process Reference Model and the Process Assessment Model are complementary but serve different purposes. Confusing them is a common mistake.
| Aspect | PRM | PAM |
|---|---|---|
| Question answered | What processes should exist and what should they achieve? | How do we assess whether a process is implemented effectively? |
| Contents | Process ID, name, purpose, outcomes | Base practices, work products, assessment indicators, generic practices |
| Abstraction level | High (what, not how) | Detailed (specific activities and artifacts) |
| Audience | Process designers, project managers, anyone planning | Assessors, quality managers, anyone evaluating |
| Standard basis | ISO/IEC 33004 (process model requirements) | ISO/IEC 33002 (assessment requirements) |
| Stability | Rarely changes within a version | May include interpretive guidance that evolves |
Analogy: The PRM is the exam syllabus—it tells you what topics will be covered and what you need to know. The PAM is the marking scheme—it tells the examiner exactly what to look for in your answers and how to grade them.
When to Use Which
| Situation | Use |
|---|---|
| Designing a new process from scratch | Start with the PRM to understand purpose and outcomes |
| Preparing for an assessment | Use the PAM to identify required base practices and work products |
| Improving an existing process | Use the PRM to confirm you are targeting the right outcomes, then use the PAM to identify specific practice gaps |
| Training new team members | Start with the PRM for conceptual understanding, then introduce the PAM for practical guidance |
Tailoring the PRM
The PRM is a reference model, not a mandate. Organizations must tailor it to their specific project context while maintaining the integrity of process purposes and outcomes.
Tailoring Dimensions
| Dimension | Tailoring Options | Example |
|---|---|---|
| Scope | Include or exclude entire process groups | Software-only project excludes HWE |
| Depth | Adjust the rigor applied to each process | Low-risk subsystem may apply SWE.4 with reduced coverage |
| Sequence | Modify the execution order of processes | Agile iterations may interleave SWE.1-SWE.4 |
| Work products | Merge or split deliverables | Combine SRS and architecture into a single document for small projects |
| Roles | Adjust staffing and responsibility assignments | One engineer may cover SWE.1 and SWE.2 on a small team |
Common Tailoring Scenarios
| Project Type | Typical Tailoring |
|---|---|
| Software-only (no HW) | Exclude HWE group; SYS.3 allocates only to SWE |
| ML-enabled product | Include MLE group; add SUP.11 for data management |
| Cybersecurity-critical | Include SEC group; feed SEC.1 outputs into SYS.2 and SWE.1 |
| Agile/iterative | Apply all processes but within sprint cadence; incremental outcomes |
| Maintenance/update project | Focus on SWE.1 (change impact), SWE.3-4 (implementation/verification), SUP.10 |
| Safety-critical (ASIL C/D) | Full scope with enhanced rigor; formal verification in SWE.4 |
Important: Tailoring must be justified and documented. An assessor will look for a tailoring rationale that explains why certain processes were scoped out and confirms that the remaining processes still achieve their purposes.
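Keeping the tailoring record as structured data makes the missing-rationale check trivial. A minimal sketch; the record fields and example decisions are our own, not prescribed by ASPICE:

```python
from dataclasses import dataclass

@dataclass
class TailoringDecision:
    """One documented tailoring decision, as an assessor would expect to find it."""
    process_id: str
    decision: str    # e.g. "excluded", "reduced rigor", "merged work products"
    rationale: str

decisions = [
    TailoringDecision("HWE.1", "excluded", "Software-only project; no hardware in scope"),
    TailoringDecision("MLE.1", "excluded", "No ML components in the product"),
    TailoringDecision("SWE.4", "reduced rigor", ""),  # justification missing!
]

def undocumented(ds: list[TailoringDecision]) -> list[str]:
    """Flag decisions an assessor would challenge: no written justification."""
    return [d.process_id for d in ds if not d.rationale.strip()]

print(undocumented(decisions))  # ['SWE.4']
```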
AI Impact on PRM
AI does not change the PRM itself—the purposes and outcomes remain the same. What changes is how those outcomes are achieved in practice. The table below maps each process group to the AI capabilities most relevant to its outcomes.
Engineering Processes
| Process | AI Impact Area | Practical Effect |
|---|---|---|
| SYS.1 | NLP analysis of stakeholder documents | Automated extraction of requirements from meeting transcripts, emails, and legacy documents |
| SYS.2 | Requirement generation and consistency checking | AI suggests derived requirements and flags contradictions |
| SYS.3 | Architecture pattern recommendation | AI proposes allocation strategies based on historical projects |
| SWE.1 | Requirements drafting and analysis | AI generates initial SRS content from system requirements; detects ambiguity and incompleteness |
| SWE.2 | Architecture validation | AI evaluates architecture against quality attributes and known anti-patterns |
| SWE.3 | Code generation and detailed design | AI generates code from design specifications; produces design documentation from code |
| SWE.4 | Test generation and static analysis | AI generates unit tests targeting coverage goals; augments static analysis |
| SWE.5 | Integration test generation | AI identifies integration scenarios from architecture interfaces |
| SWE.6 | Qualification test case derivation | AI maps requirements to test cases and identifies gaps |
| MLE.1 | ML requirement elicitation | AI assists in defining performance metrics, data requirements, and operational constraints |
| MLE.3 | Hyperparameter optimization | AI automates model training pipelines and experiment tracking |
Support and Management Processes
| Process | AI Impact Area | Practical Effect |
|---|---|---|
| SUP.1 | Automated compliance checking | AI scans work products against process rules and flags deviations |
| SUP.8 | Intelligent configuration management | AI detects configuration drift and suggests baseline candidates |
| SUP.9 | Problem triage and root cause analysis | AI classifies defects, suggests root causes, and recommends fixes |
| SUP.10 | Change impact analysis | AI traces change requests through the dependency graph and estimates impact |
| MAN.3 | Schedule and resource optimization | AI predicts bottlenecks and recommends resource allocation |
| MAN.5 | Risk identification and assessment | AI monitors project signals and flags emerging risks |
| MAN.6 | Metric collection and trend analysis | AI automates data collection and generates trend dashboards |
Non-Negotiable Principle: AI can support outcome achievement, but the outcomes must still be achieved regardless of automation level. The PRM does not care how you get there — just that you arrive. Human accountability for process outcomes remains mandatory under ASPICE.
Putting the PRM to Work
For Implementing Processes
- Read the process purpose to understand the intent
- Review each outcome to identify what you need to achieve
- Design activities that produce those outcomes
- Identify work products that evidence the outcomes
For Assessing Processes
- Collect evidence of each outcome
- Rate achievement level for each outcome
- Aggregate to determine overall process achievement
- Use outcome gaps to identify improvements
For Improving Processes
- Identify outcomes that are not fully achieved
- Analyze root causes of the gaps
- Design process changes to close the gaps
- Verify improvements through outcome measurement
Practical Application: PRM as a Project Planning Tool
The PRM can serve as the backbone of your project plan when combined with AI assistance. Here is a step-by-step approach.
Step 1: Scope Determination
Start by selecting the applicable process groups based on your project type.
| Question | If Yes | If No |
|---|---|---|
| Does the project include hardware? | Include HWE.1-HWE.4 | Exclude HWE |
| Does the project include ML components? | Include MLE.1-MLE.5 and SUP.11 | Exclude MLE and SUP.11 |
| Is the product network-connected? | Include SEC.1-SEC.3 | Evaluate SEC need case-by-case |
| Is there a supplier relationship? | Include ACQ.4 | Exclude ACQ |
| Is this a product release? | Include SPL.2 | Exclude SPL |
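The decision table above can be turned into a small helper. The always-included groups and the yes/no mapping follow this section's tables; the function itself is only a sketch:

```python
def select_scope(has_hw: bool, has_ml: bool, networked: bool,
                 has_supplier: bool, is_release: bool) -> set[str]:
    """Apply the scope-determination questions; returns selected groups/processes."""
    scope = {"SYS", "SWE", "SUP", "MAN"}  # applicable to every project
    if has_hw:
        scope.add("HWE")
    if has_ml:
        scope.update({"MLE", "SUP.11"})   # ML lifecycle plus data management
    if networked:
        scope.add("SEC")                  # otherwise: evaluate case-by-case
    if has_supplier:
        scope.add("ACQ.4")
    if is_release:
        scope.add("SPL.2")
    return scope

# Software-only, ML-enabled, connected product with a supplier relationship:
print(sorted(select_scope(False, True, True, True, False)))
```

The "no" branches that require judgment (e.g. evaluating SEC case-by-case for offline products) still need a documented tailoring rationale; the helper only captures the mechanical part.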
Step 2: Outcome Mapping
For each selected process, list every outcome and map it to a project deliverable.
| Process | Outcome | Deliverable | Owner | AI Assist? |
|---|---|---|---|---|
| SWE.1 O1 | SW requirements specified | SRS document | SW Lead | Yes — draft generation |
| SWE.1 O3 | Requirements analyzed | Review records | Reviewer | Yes — consistency check |
| SWE.2 O1 | Architecture established | SAD document | Architect | Yes — pattern suggestion |
| SWE.4 O1 | Units verified | Test results | Test Engineer | Yes — test generation |
Step 3: Dependency Sequencing
Use the process relationships (vertical and horizontal) to sequence activities. AI can assist by automatically detecting dependency chains and flagging scheduling conflicts.
Step 4: AI Integration Planning
For each process where AI will assist, document:
- The specific AI capability being used
- The human review checkpoint
- The acceptance criteria for AI-generated output
- The fallback procedure if AI output is inadequate
Best Practice: Create a PRM coverage matrix at project start. Review it at each milestone to confirm all outcomes remain on track. AI-powered dashboards can maintain this matrix automatically, flagging outcomes that are falling behind.
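A coverage matrix of this kind is easy to keep as structured data, which is what lets a dashboard flag slipping outcomes automatically. A minimal sketch with invented project data:

```python
# One row per (process, outcome) pair in the tailored scope.
# Deliverables, statuses, and owners are illustrative, not from the standard.
matrix = [
    {"process": "SWE.1", "outcome": "O1", "deliverable": "SRS", "status": "done"},
    {"process": "SWE.1", "outcome": "O3", "deliverable": "Review records", "status": "at-risk"},
    {"process": "SWE.2", "outcome": "O1", "deliverable": "SAD", "status": "in-progress"},
    {"process": "SWE.4", "outcome": "O1", "deliverable": "Test results", "status": "at-risk"},
]

def milestone_flags(rows: list[dict]) -> list[str]:
    """What a milestone review (or an AI dashboard) would surface: outcomes off track."""
    return [f'{r["process"]} {r["outcome"]}' for r in rows if r["status"] == "at-risk"]

print(milestone_flags(matrix))  # ['SWE.1 O3', 'SWE.4 O1']
```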
Implementation Checklist
Use this checklist when applying the PRM to a new project.
| Step | Action | Status |
|---|---|---|
| 1 | Identify project type and applicable domains (SW, HW, ML, SEC) | |
| 2 | Select applicable process groups from the PRM | |
| 3 | Document tailoring rationale for excluded processes | |
| 4 | For each included process, list all outcomes | |
| 5 | Map outcomes to project deliverables and work products | |
| 6 | Assign process owners and responsible roles | |
| 7 | Identify AI-assisted activities and define HITL checkpoints | |
| 8 | Establish traceability links between process outcomes and deliverables | |
| 9 | Define evidence collection approach for each outcome | |
| 10 | Create a PRM coverage matrix and review schedule | |
| 11 | Align with PAM base practices for assessment readiness | |
| 12 | Conduct initial gap analysis against target capability level | |
Summary
The Process Reference Model (PRM):
- Defines what processes exist and their purposes
- Specifies outcomes that indicate successful implementation
- Remains technology-agnostic and implementation-neutral
- Provides the foundation for assessment and improvement
- Organizes processes into categories (Primary, Support, Management, Organizational) and groups (SYS, SWE, HWE, MLE, SUP, SEC, MAN, ACQ, SPL)
- Changed significantly from ASPICE 3.1 to 4.0 with new MLE and SEC groups and removal of SUP.2, SUP.4, and SUP.7
- Serves as a project planning tool when combined with outcome mapping and AI integration
The PRM answers "What should we achieve?" Next up, the PAM answers "How do we assess achievement?"