1.1: The Systems Engineering Mindset

Thinking Patterns of Successful Systems Engineers

Pattern 1: Start with "Why?"

A Systems Engineer Always Asks:

  • Why is this feature needed? (Stakeholder need)
  • Why this architecture? (Trade-off rationale)
  • Why this requirement? (Business value)

Example Dialogue:

Software Engineer: "We should add a config file for PID parameters."
Systems Engineer: "Why do we need runtime configuration?"
Software Engineer: "So we can tune the controller without recompiling."
Systems Engineer: "Is that a customer requirement? Or engineering preference?"
Software Engineer: "Engineering preference... makes testing easier."
Systems Engineer: "But it adds complexity—file I/O, parsing, error handling—
                    for ASIL-B, we need to verify all configurations. Let's check
                    if the customer actually needs this before adding it."

Lesson: Question every feature. If no requirement justifies it, defer it.

Outcome of the Dialogue Above: After this discussion, the team reviewed the customer requirements and found no runtime tuning requirement. The PID parameters were hard-coded and validated during the calibration phase, simplifying the design and reducing the ASIL-B verification scope. This saved approximately two weeks of development effort.


Pattern 2: Think in Scenarios, Not Features

Bad Thinking (feature-first):

"We need an ACC system with:
- Radar sensor
- Brake control
- Speed display"

Good Thinking (scenario-first):

Scenario 1: Highway cruising (75 mph, clear road)
  → System shall maintain set speed ±2 mph
  → System shall detect vehicles >200m ahead

Scenario 2: Traffic jam (15 mph, stop-and-go)
  → System shall maintain 2-second following distance
  → System shall brake smoothly (deceleration ≤3 m/s²)

Scenario 3: Emergency (obstacle <5m, closing speed >10 mph)
  → System shall apply emergency brake
  → System shall alert driver (visual + audible)

Benefit: Scenarios reveal interactions between features that feature-list thinking misses.
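The numeric thresholds in these scenarios translate directly into checkable logic. A minimal sketch of the traffic-jam scenario's two criteria (all names are illustrative, not a production API):

```c
#include <stdbool.h>

/* Scenario 2 thresholds: 2-second time headway, braking limited
 * to 3 m/s^2 for smoothness. */

/* Required gap in metres for a 2-second time headway at the given speed. */
static double required_gap_m(double speed_mps) {
    return 2.0 * speed_mps;
}

/* Clamp a requested deceleration to the 3 m/s^2 comfort limit. */
static double limit_decel(double requested_mps2) {
    return (requested_mps2 > 3.0) ? 3.0 : requested_mps2;
}

/* True if the current gap satisfies the 2-second headway requirement. */
static bool gap_ok(double gap_m, double speed_mps) {
    return gap_m >= required_gap_m(speed_mps);
}
```

Because the scenario states its thresholds numerically, each function maps one-to-one onto a requirement clause and can be unit-tested in isolation.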


Pattern 3: Assume Nothing Works

Defensive Systems Thinking:

  • "What if the sensor fails?" → Fail-safe behavior
  • "What if CAN times out?" → Error handling
  • "What if both radar and camera fail?" → Redundancy strategy

Example: ACC Sensor Failure Analysis

Scenario: Radar sensor fails (CAN timeout)

Naive Response:

if (CAN_timeout) {
    return -1;  // Error
}

[FAIL] Problem: What happens to the vehicle? Does ACC disable? Does it brake? Undefined behavior.

Systems Engineer Response:

Requirement [SWE-089-1]: If radar sensor fails (CAN timeout >100ms):
1. System shall transition to safe state (disable ACC, release brake/throttle)
2. System shall alert driver (HMI: "ACC UNAVAILABLE - SENSOR FAULT")
3. System shall log fault (diagnostic code: DTC_RADAR_TIMEOUT)
4. System shall NOT attempt restart until ignition cycle

Rationale: ISO 26262 Part 6 requires safe state transition on sensor fault.

[PASS] Correct: Explicit fail-safe behavior, meets safety requirements
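The four numbered steps of [SWE-089-1] can be sketched as a small state transition. This is an illustrative sketch only; the type names, diagnostic code, and HMI string are hypothetical, not a real AUTOSAR API:

```c
#include <stdbool.h>
#include <stdint.h>

#define RADAR_TIMEOUT_MS  100u
#define DTC_RADAR_TIMEOUT 0x4201u  /* hypothetical diagnostic code */

typedef enum { ACC_ACTIVE, ACC_SAFE_STATE } acc_state_t;

typedef struct {
    acc_state_t state;
    bool        restart_locked;  /* cleared only by an ignition cycle */
    uint16_t    logged_dtc;
    const char *hmi_message;
} acc_t;

/* Called each cycle with the time since the last valid radar frame. */
void acc_check_radar(acc_t *acc, uint32_t ms_since_last_frame) {
    if (acc->state == ACC_ACTIVE && ms_since_last_frame > RADAR_TIMEOUT_MS) {
        acc->state = ACC_SAFE_STATE;                          /* 1. safe state */
        acc->hmi_message = "ACC UNAVAILABLE - SENSOR FAULT";  /* 2. alert      */
        acc->logged_dtc = DTC_RADAR_TIMEOUT;                  /* 3. log fault  */
        acc->restart_locked = true;                           /* 4. no restart */
    }
}
```

Unlike the naive `return -1`, every effect of the fault is explicit in the state struct, so each clause of the requirement has a testable observable.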


Pattern 4: Design for Verification

Principle: If you can't test it, don't build it

Bad Requirement (untestable):

[SYS-045] The system shall respond quickly to obstacles.

[FAIL] Problem: "Quickly" is vague, not measurable

Good Requirement (testable):

[SYS-045] The system shall detect obstacles <5m within 50ms (95th percentile).

Verification Method: HIL test with simulated radar data.
Test Procedure: Inject obstacle at 5m, measure detection latency over 1000 trials.
Pass Criteria: ≥95% of trials have latency ≤50ms.

[PASS] Correct: Quantified, measurable, testable
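The pass criterion itself is mechanically checkable. A minimal sketch (function name hypothetical) of the "≥95% of trials have latency ≤50ms" check that a HIL test harness could apply to its measurements:

```c
#include <stdbool.h>
#include <stddef.h>

/* Sketch of the [SYS-045] pass criterion: at least 95% of measured
 * detection latencies must be <= 50 ms. */
bool latency_trials_pass(const double *latency_ms, size_t n) {
    if (n == 0) return false;  /* no data: cannot claim a pass */
    size_t within = 0;
    for (size_t i = 0; i < n; ++i) {
        if (latency_ms[i] <= 50.0) ++within;
    }
    return (double)within / (double)n >= 0.95;
}
```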

Design-for-Test Checklist:

  • ☐ Requirement has numeric criteria? (latency, accuracy, range)
  • ☐ Verification method defined? (unit test, HIL, proving ground)
  • ☐ Test environment feasible? (can we simulate this?)
  • ☐ Pass/fail criteria clear? (no subjective judgment)

Pattern 5: Document Decisions, Not Just Requirements

What to Document:

  1. Requirements: What the system must do
  2. Architecture Decisions: Why we chose this design (ADRs)
  3. Trade-offs: What we gave up to get this benefit

Example: ADR for AUTOSAR Classic vs Adaptive

Context: Need to choose OS/middleware for ACC ECU

Decision: Use AUTOSAR Classic R4.4 (not Adaptive R21-11)

Rationale:

  • Pro Classic: Proven (100M vehicles), lower cost (€50k tools), mature ecosystem
  • Pro Adaptive: Modern, flexible, OTA updates, better for future ML
  • Constraint: Customer does not require OTA (not in contract)
  • Trade-off: Accept lack of OTA updates to save €150k in tooling and training

Consequences:

  • Positive: Lower cost, faster development, less risk
  • Negative: No OTA (must reflash ECU for updates)
  • Mitigation: If OTA needed in next generation, migrate then

Benefit of ADR: Future engineers understand why Classic was chosen, not just what was decided. See Section 33.03 for ADR details.

Mindset Self-Assessment: Ask yourself these questions regularly:

  1. Can I trace this decision to a requirement?
  2. Have I considered all failure modes?
  3. Am I solving the right problem?
  4. What are the trade-offs?

If you struggle to answer, apply the thinking patterns from this section.


Common Pitfalls and How to Avoid Them

Pitfall 1: Gold Plating

Problem: Adding features not in requirements ("It would be nice if...")

Example:

Engineer: "Let's add a mobile app to control ACC from your phone!"
Manager: "Is that in the customer requirements?"
Engineer: "No, but it's cool and easy to add."
Manager: "No. We don't add features not in requirements. It adds:
          - Testing burden (app + ECU integration tests)
          - Security risk (wireless attack surface)
          - Cost (app development, maintenance)
          - Schedule risk (scope creep)
         If the customer wants it next year, we'll quote it separately."

Rule: Never add features not explicitly required. Every feature has cost (testing, verification, maintenance).


Pitfall 2: Premature Optimization

Problem: Optimizing before requirements are stable

Example:

Engineer: "I optimized the Kalman filter to run in 5ms (was 20ms)."
Systems Engineer: "Great, but the requirement is ≤50ms. You spent 2 weeks
                    optimizing for 45ms margin. Was that the best use of time?"
Engineer: "Uh... I guess we could have used that time to implement [SWE-089]
           which is still pending..."

Rule: Meet requirements first, optimize only if needed (or if risk mitigation for safety-critical code).


Pitfall 3: Solutioning Without Requirements

Problem: Jumping to solutions before understanding the problem

Bad Sequence:

1. Engineer proposes solution: "Let's use a Kalman filter!"
2. Manager asks: "For what?"
3. Engineer realizes: "Uh... I don't actually know the requirement..."

Good Sequence:

1. Understand requirement: "Fuse radar + camera to achieve 95% accuracy"
2. Research options: Simple averaging (85%), Kalman (95%), ML (98%)
3. Evaluate trade-offs: Cost, complexity, accuracy, verification effort
4. Propose solution: "Kalman filter meets requirement at lowest cost/risk"
5. Document decision: ADR-007

Pitfall 4: Ignoring Non-Functional Requirements

Problem: Focusing only on features, ignoring performance, safety, usability

Incomplete Requirements:

[SYS-045] The system shall detect obstacles.

[FAIL] Missing:

  • Performance: Detection latency? (≤50ms?)
  • Accuracy: Detection rate? (≥95%?)
  • Range: Obstacle distance? (0-200m?)
  • Safety: What if sensor fails? (Fail-safe behavior?)
  • Environment: Operational conditions? (Day/night, rain/snow?)

Complete Requirements:

[SYS-045] Obstacle Detection
The system shall detect obstacles in the vehicle's path under the following conditions:
- Range: 0-200 meters
- Accuracy: ≥95% detection rate (no false negatives for objects >0.5m² cross-section)
- Latency: Detection within 50ms of object entering sensor field of view
- Environment: Day/night, dry/rain conditions
  - Degraded mode (snow): 80% accuracy acceptable
- Fail-Safe: If sensor fails, system transitions to safe state (disable ACC, alert driver)
- Safety Class: ASIL-B (ISO 26262)

[PASS] Complete: Functional + non-functional requirements


Practical Exercises

Exercise 1: Requirements Quality Check

Given Requirement:

[SYS-123] The infusion pump shall deliver medication safely.

Your Task: Identify issues, rewrite requirement to be testable.

Solution:

Issues:
- "Safely" is vague (not quantified)
- No delivery accuracy specified
- No failure behavior defined

Rewritten:
[SYS-123] Medication Delivery Accuracy
The infusion pump shall deliver medication with accuracy ±5% of the prescribed dose
over a 1-hour infusion period.

Verification: Bench test with calibrated flow meter, measure delivered volume
vs prescribed volume over 100 trials.

[SYS-123-1] Occlusion Detection
If tubing occlusion detected (pressure >300 mmHg), pump shall:
1. Stop delivery within 1 second
2. Sound alarm (85 dB at 1 meter)
3. Display "OCCLUSION DETECTED" on screen

Verification: Bench test with clamped tubing, measure alarm latency.
Safety Class: Class C (IEC 62304)

Exercise 2: Trade-Off Analysis

Scenario: Choose sensor for ACC system

Options:

  • Option A: Radar only (€100, 90% accuracy, works in rain/fog)
  • Option B: Camera only (€80, 95% accuracy day, 60% night)
  • Option C: Radar + Camera fusion (€180, 98% accuracy, all conditions)

Requirements:

  • Accuracy: ≥95%
  • Budget: €150
  • Environment: Day/night, rain/fog

Your Task: Which option? Justify.

Solution:

Analysis:
- Option A: Does not meet accuracy requirement (90% < 95%) [FAIL]
- Option B: Does not meet night accuracy (60% < 95%) [FAIL]
- Option C: Meets all requirements (98% ≥ 95%) but €180 exceeds €150 budget [WARN]

Trade-off Decision:
Option C exceeds budget by €30. Options:
1. Negotiate budget increase (justify: safety-critical, customer satisfaction)
2. Request relaxed accuracy requirement (negotiate with customer: 90% acceptable?)
3. Choose Option B, add IR illumination for night (cost: €120 total, meets 95%)

Recommendation: Option 3 (Camera + IR) - Meets requirements within budget.
Document in ADR-008.
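The screening step of this analysis (before the budget negotiation) can be expressed as a simple requirements check. A sketch with hypothetical names, treating accuracy as a fraction:

```c
#include <stdbool.h>

typedef struct {
    double cost_eur;
    double day_accuracy;    /* fraction, e.g. 0.95 */
    double night_accuracy;
} sensor_option_t;

/* An option passes only if it meets the accuracy floor in every
 * required condition AND fits within the budget. */
bool meets_requirements(sensor_option_t opt, double min_accuracy,
                        double budget_eur) {
    bool accurate = opt.day_accuracy >= min_accuracy &&
                    opt.night_accuracy >= min_accuracy;
    return accurate && opt.cost_eur <= budget_eur;
}
```

Running the three options through this check reproduces the analysis above: radar fails on accuracy, fusion fails on budget, and only the camera-plus-IR variant satisfies both constraints.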

Summary

Systems Engineering Mindset Patterns:

  1. Start with "Why?": Question every feature, trace to stakeholder need
  2. Think in Scenarios: Use cases reveal interactions missed by feature lists
  3. Assume Nothing Works: Design fail-safe behavior for all failure modes
  4. Design for Verification: If untestable, don't build it
  5. Document Decisions: Capture rationale (ADRs), not just requirements

Common Pitfalls: Gold plating, premature optimization, proposing solutions without requirements, ignoring non-functional requirements

Next: Requirements Engineering Practice (33.02) — Hands-on techniques for eliciting, analyzing, and managing requirements


Self-Assessment Quiz

Test your understanding of the systems engineering mindset. Answers are at the bottom.

Question 1: A developer proposes adding Bluetooth connectivity to an ECU that currently only requires CAN communication. What should be your first response?

  • A) Approve it—Bluetooth adds future flexibility
  • B) Ask "Why?"—Is this traced to a stakeholder requirement?
  • C) Reject it—Bluetooth is a security risk
  • D) Defer to the project manager

Question 2: Which of the following is a testable requirement?

  • A) "The system shall respond quickly to user input"
  • B) "The system shall be user-friendly"
  • C) "The system shall display sensor values within 100ms of acquisition"
  • D) "The system shall perform well under load"

Question 3: A radar sensor fails (CAN timeout). What is the correct systems engineering response?

  • A) Return an error code and let the calling function handle it
  • B) Retry the sensor 3 times before failing
  • C) Define explicit safe state transition, driver alert, and fault logging
  • D) Ignore the timeout if it's less than 500ms

Question 4: You've optimized a function from 50ms to 5ms execution time. The requirement specifies ≤100ms. What should you consider?

  • A) Great work! Faster is always better
  • B) Was this the best use of development time vs other pending requirements?
  • C) Document the optimization for future performance credits
  • D) Request a more stringent requirement to match the new performance

Question 5: What is the primary purpose of an Architecture Decision Record (ADR)?

  • A) To document the final architecture design
  • B) To capture why a design choice was made, including trade-offs
  • C) To satisfy ASPICE documentation requirements
  • D) To provide training material for new team members

Quiz Answers

  1. B - Always trace features to requirements before approving. "Why?" reveals if it's a real need or gold plating.

  2. C - Contains a specific, measurable criterion (100ms). Options A, B, and D use vague terms that cannot be objectively verified.

  3. C - Systems engineers define complete failure behavior including safe state, user notification, and diagnostic logging. Option A leaves behavior undefined.

  4. B - Premature optimization consumes time that could address pending requirements. Meeting requirements is the goal, not exceeding them unnecessarily.

  5. B - ADRs document the rationale and trade-offs, enabling future engineers to understand why decisions were made, not just what was decided.

Score Interpretation:

  • 5/5: Excellent systems engineering mindset
  • 3-4/5: Good foundation, review the patterns for missed questions
  • 1-2/5: Re-read the chapter, focusing on the five thinking patterns