2.5: SWE.5 Software Component Verification and Integration Verification


Process Definition

Purpose

SWE.5 Purpose: To verify the integration of the software elements and to verify the software components.

Outcomes

| Outcome | Description |
|---------|-------------|
| O1 | Verification measures are specified for software integration verification |
| O2 | Verification measures for software components are specified |
| O3 | Software elements are integrated up to a complete integrated software |
| O4 | Verification measures are selected according to the release scope |
| O5 | Software components are verified and results recorded |
| O6 | Integrated software elements are verified and results recorded |
| O7 | Consistency and bidirectional traceability are established |
| O8 | Results are summarized and communicated to all affected parties |

Base Practices with AI Integration

| BP | Base Practice | AI Level | AI Application |
|-----|---------------|----------|----------------|
| BP1 | Specify software integration verification measures | L1-L2 | Integration test specification from architecture |
| BP2 | Specify verification measures for software component behavior | L1-L2 | Component test specification |
| BP3 | Select verification measures | L2 | Coverage optimization, test prioritization |
| BP4 | Integrate software elements and perform integration verification | L2-L3 | Automated integration and test execution |
| BP5 | Perform software component verification | L2-L3 | Automated component verification |
| BP6 | Ensure consistency and establish bidirectional traceability | L2 | Trace generation, consistency checking |
| BP7 | Summarize and communicate results | L2 | Test report generation |

Integration Test Levels

Test Environment Options

The following diagram compares the three primary integration test environments (MIL, SIL, PIL), showing their scope, fidelity level, and typical use cases within the V-Model verification flow.

Note: MIL, SIL, and PIL are test environment options, not mandatory stages. Projects select environments based on verification needs. Typical progression is MIL→SIL→PIL, but order and selection vary by project.

Test Environments

| Environment | Focus | When to Use |
|-------------|-------|-------------|
| MIL | Algorithm correctness | Early validation, control algorithms |
| SIL | Interface, logic flow | Most integration verification |
| PIL | Timing, target-specific | Performance-critical code |
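In practice, switching between SIL and PIL often reduces to swapping the hardware-access layer at build time while the code under test stays identical. The sketch below is illustrative only: `TEST_ENV_SIL`, `gpio_write`/`gpio_read`, and the register address are assumed names, not the project's actual harness API.

```c
/* Hedged sketch: one GPIO API, two backends selected at build time.
 * TEST_ENV_SIL and all names/addresses here are illustrative assumptions. */
#ifndef TEST_ENV_SIL
#define TEST_ENV_SIL 1   /* would normally come from the build system */
#endif

#if TEST_ENV_SIL
/* SIL backend: keep the pin state in host memory so tests can capture it */
static int sim_pin_state;
static void gpio_write(int state) { sim_pin_state = state; }
static int  gpio_read(void)       { return sim_pin_state; }
#else
/* PIL backend: the same API would access a target memory-mapped register
 * (address below is purely illustrative) */
#define GPIO_OUT_REG (*(volatile unsigned *)0x40020014u)
static void gpio_write(int state) { GPIO_OUT_REG = (unsigned)state; }
static int  gpio_read(void)       { return (int)GPIO_OUT_REG; }
#endif
```

Because the integration tests call only `gpio_write`/`gpio_read`, the same test suite can run in both environments; only the backend behind the interface changes.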

Integration Strategy

Integration Sequence

The diagram below shows the step-by-step integration sequence, illustrating how software components are progressively combined and verified at each integration stage.

Integration Steps
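The bottom-up sequence can be sketched in code: each layer is integrated in dependency order, and a step succeeds only if the layer below it has already been integrated and verified. The step functions below are illustrative stand-ins for the real GPIO_Driver, ActuatorService, and DoorLockControl components.

```c
/* Hedged sketch of bottom-up integration ordering; all names are
 * illustrative stand-ins, not the project's actual init API. */
static int gpio_ready, service_ready, app_ready;

/* Step 1: integrate and verify the driver layer in isolation */
static int Step1_IntegrateDriver(void)
{
    gpio_ready = 1;
    return gpio_ready;
}

/* Step 2: stack the service layer on the verified driver */
static int Step2_AddService(void)
{
    if (!gpio_ready) { return 0; }   /* precondition: driver integrated */
    service_ready = 1;
    return service_ready;
}

/* Step 3: add the application layer, completing the stack */
static int Step3_AddApplication(void)
{
    if (!service_ready) { return 0; } /* precondition: service integrated */
    app_ready = 1;
    return app_ready;
}

/* Runs the steps in order; returns the number of integrated layers */
static int IntegrateStack(void)
{
    int layers = 0;
    layers += Step1_IntegrateDriver();
    layers += Step2_AddService();
    layers += Step3_AddApplication();
    return layers;
}
```

The precondition checks mirror the point of the sequence: a step is only meaningful once the layer beneath it has passed its own verification.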


Integration Test Specification

Test Case Example

---
ID: SWE-IT-050
Title: DoorLock Service Integration
Type: Integration Test
Priority: High
Layer: Application to Service
Components: DoorLockControl, ActuatorService, GPIO_Driver
Architecture: SWE.2 BCM Layered Architecture
---

## Objective

Verify correct integration between DoorLockControl application
component and ActuatorService/GPIO_Driver stack.

## Test Configuration

- Platform: SIL (host simulation)
- GPIO: Simulated with state capture
- Timing: Host timer (scaled for accuracy)

## Test Procedure

### Scenario 1: Lock Command Flow

| Step | Action | Verification Point |
|------|--------|-------------------|
| 1 | Initialize all components | Init returns OK for all |
| 2 | Call DoorLockControl_SetAllDoors(LOCK) | Function returns E_OK |
| 3 | Capture GPIO states | All 4 lock GPIOs = HIGH |
| 4 | Verify call sequence | DoorLock → Actuator → GPIO |
| 5 | Measure timing | < 10ms end-to-end |

### Scenario 2: Error Propagation

| Step | Action | Verification Point |
|------|--------|-------------------|
| 1 | Configure GPIO to fail | Inject DOOR_FL GPIO error |
| 2 | Call DoorLockControl_SetAllDoors(LOCK) | Function returns E_NOT_OK |
| 3 | Check error propagation | Error propagates up stack |
| 4 | Verify DTC | DEM event recorded |

## Expected Results

- Interface contracts honored between layers
- Timing within budget
- Error propagation correct

## Coverage

- Architecture interfaces: SWE.2 Interface I-001, I-002, I-003
- Integration points: 12 API calls verified

## Traceability

- Architecture: SWE.2 Layered Architecture
- Requirements: SWE-BCM-103, SWE-BCM-105

AI-Assisted Integration Testing

L2: Test Case Generation

integration_test_generation:
  input:
    architecture: "swe2_bcm_architecture.xml"
    interfaces:
      - "DoorLockControl → ActuatorService"
      - "ActuatorService → GPIO_Driver"
      - "GPIO_Driver → DIO_Module"

  ai_analysis:
    - Extract interface definitions
    - Identify data flow paths
    - Generate call sequence tests
    - Create error injection scenarios
    - Calculate coverage matrix

  generated_tests:
    normal_flow:
      - "test_LockCommand_FullStack_Success"
      - "test_UnlockCommand_FullStack_Success"
      - "test_GetState_AllLayers_Consistent"

    error_scenarios:
      - "test_DriverFailure_PropagatesUp"
      - "test_ServiceTimeout_HandledCorrectly"
      - "test_InvalidParameter_RejectedAtBoundary"

    boundary_conditions:
      - "test_MaxConcurrentCommands"
      - "test_RapidLockUnlockSequence"
      - "test_TimingBoundary_10ms"

  human_review:
    - Verify scenario completeness
    - Confirm error injection validity
    - Approve timing constraints

Generated SIL Test

/**
 * @file test_DoorLock_Integration.c
 * @brief Integration tests for DoorLock stack
 * @trace SWE-IT-050, SWE-IT-051
 * @level SIL
 */

#include "unity.h"
#include "DoorLockControl.h"
#include "ActuatorService.h"
#include "GPIO_Driver.h"
#include "test_harness_sil.h"
#include <string.h>  /* for memset() in setUp() */

/* GPIO state capture */
static uint8 CapturedGpioStates[4];
static uint32 GpioCallSequence[4];
static uint32 CallSequenceIndex;

/* Forward declaration: the hook is installed in setUp() before the
 * callback definition appears below */
static void GpioCaptureCallback(uint8 pin, uint8 state);

/*===========================================================================*/
/* TEST HARNESS SETUP                                                         */
/*===========================================================================*/

void setUp(void)
{
    /* Initialize all stack components */
    GPIO_Driver_Init();
    ActuatorService_Init();
    DoorLockControl_Init();

    /* Reset capture buffers */
    memset(CapturedGpioStates, 0, sizeof(CapturedGpioStates));
    memset(GpioCallSequence, 0, sizeof(GpioCallSequence));
    CallSequenceIndex = 0U;

    /* Install GPIO capture hook */
    TestHarness_InstallGpioHook(GpioCaptureCallback);
}

void tearDown(void)
{
    TestHarness_RemoveGpioHook();
}

/* GPIO capture callback for verifying call sequence */
static void GpioCaptureCallback(uint8 pin, uint8 state)
{
    /* Guard both the sequence buffer and the pin-indexed state array */
    if ((CallSequenceIndex < 4U) && (pin < 4U))
    {
        GpioCallSequence[CallSequenceIndex] = pin;
        CapturedGpioStates[pin] = state;
        CallSequenceIndex++;
    }
}

/*===========================================================================*/
/* INTEGRATION TESTS                                                          */
/*===========================================================================*/

/**
 * @test SWE-IT-050-001
 * @brief Verify lock command flows through entire stack
 * @trace SWE-BCM-103
 */
void test_LockCommand_FullStack_InterfaceFlow(void)
{
    Std_ReturnType result;
    uint32 startTime, endTime;

    /* Act - Lock all doors */
    startTime = TestHarness_GetTime_us();
    result = DoorLockControl_SetAllDoors(LOCK_CMD_LOCK);
    endTime = TestHarness_GetTime_us();

    /* Assert - Function succeeds */
    TEST_ASSERT_EQUAL(E_OK, result);

    /* Assert - All GPIOs set to HIGH (lock active) */
    TEST_ASSERT_EQUAL(GPIO_STATE_HIGH, CapturedGpioStates[GPIO_PIN_DOOR_FL]);
    TEST_ASSERT_EQUAL(GPIO_STATE_HIGH, CapturedGpioStates[GPIO_PIN_DOOR_FR]);
    TEST_ASSERT_EQUAL(GPIO_STATE_HIGH, CapturedGpioStates[GPIO_PIN_DOOR_RL]);
    TEST_ASSERT_EQUAL(GPIO_STATE_HIGH, CapturedGpioStates[GPIO_PIN_DOOR_RR]);

    /* Assert - Correct sequence (FL, FR, RL, RR) */
    TEST_ASSERT_EQUAL(GPIO_PIN_DOOR_FL, GpioCallSequence[0]);
    TEST_ASSERT_EQUAL(GPIO_PIN_DOOR_FR, GpioCallSequence[1]);
    TEST_ASSERT_EQUAL(GPIO_PIN_DOOR_RL, GpioCallSequence[2]);
    TEST_ASSERT_EQUAL(GPIO_PIN_DOOR_RR, GpioCallSequence[3]);

    /* Assert - Timing within budget */
    TEST_ASSERT_LESS_THAN(10000U, endTime - startTime);  /* < 10ms */
}

/**
 * @test SWE-IT-050-002
 * @brief Verify error propagates from driver to application
 * @trace SWE-BCM-105
 */
void test_DriverError_PropagatesThrough_AllLayers(void)
{
    Std_ReturnType result;
    Dem_EventStatusType dtcStatus;

    /* Arrange - Inject GPIO driver failure */
    TestHarness_InjectGpioError(GPIO_PIN_DOOR_FL, GPIO_ERROR_WRITE_FAIL);

    /* Act */
    result = DoorLockControl_SetAllDoors(LOCK_CMD_LOCK);

    /* Assert - Error returned to application */
    TEST_ASSERT_EQUAL(E_NOT_OK, result);

    /* Assert - DTC recorded */
    dtcStatus = Dem_GetEventStatus(DEM_EVENT_DOORLOCK_ACTUATOR);
    TEST_ASSERT_EQUAL(DEM_EVENT_STATUS_FAILED, dtcStatus);

    /* Cleanup */
    TestHarness_ClearGpioError(GPIO_PIN_DOOR_FL);
}

/**
 * @test SWE-IT-050-003
 * @brief Verify layer boundary parameter validation
 */
void test_InvalidParameter_RejectedAtLayerBoundary(void)
{
    Std_ReturnType result;
    DoorLockControl_State_t state;

    /* Test Application → Service boundary */
    result = DoorLockControl_GetState(DOOR_COUNT, &state);  /* Invalid door ID */
    TEST_ASSERT_EQUAL(E_NOT_OK, result);

    /* Verify no calls to lower layers */
    TEST_ASSERT_EQUAL(0U, CallSequenceIndex);  /* No GPIO calls made */
}

/**
 * @test SWE-IT-050-004
 * @brief Verify concurrent access handling
 * @note TestHarness_ScheduleCall API is illustrative; real concurrent testing
 *       requires proper task scheduling simulation or RTOS-specific handling.
 */
void test_ConcurrentAccess_HandledCorrectly(void)
{
    /* This test simulates concurrent access from multiple tasks */
    /* In SIL environment, use test scheduler to interleave calls */

    Std_ReturnType result1, result2;

    /* Schedule overlapping lock/unlock requests */
    TestHarness_ScheduleCall(0,    DoorLockControl_SetAllDoors, LOCK_CMD_LOCK, &result1);
    TestHarness_ScheduleCall(5000, DoorLockControl_SetAllDoors, LOCK_CMD_UNLOCK, &result2);

    /* Execute scheduled calls */
    TestHarness_ExecuteSchedule();

    /* Both should complete (with resource protection) */
    TEST_ASSERT_EQUAL(E_OK, result1);
    TEST_ASSERT_EQUAL(E_OK, result2);
}

Coverage Analysis

Interface Coverage Matrix

The following diagram presents the interface coverage matrix, mapping each software interface to its corresponding integration tests and showing overall coverage status.
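As an illustration of how such a matrix might be held in a test harness, the sketch below maps interface IDs to the number of integration tests exercising them. The struct layout, helper name, and counts are assumptions for illustration, not data from the project.

```c
/* Hedged sketch of an interface coverage matrix: each SWE.2 interface
 * paired with the number of integration tests that exercise it.
 * Layout and names are illustrative assumptions. */
typedef struct {
    const char *interface_id;   /* e.g. "I-001" from the architecture */
    int tests_covering;         /* integration tests hitting this interface */
} InterfaceCoverage;

/* Returns how many interfaces are exercised by at least one test;
 * any entry with a zero count is a coverage gap to close */
static int CountCovered(const InterfaceCoverage *matrix, int total)
{
    int covered = 0;
    for (int i = 0; i < total; i++) {
        if (matrix[i].tests_covering > 0) {
            covered++;
        }
    }
    return covered;
}
```

A report generator can then flag any interface whose count is zero and compute the covered/total ratio for the verification summary.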

AI Integration Testing


Work Products

Note: Work Product IDs follow ASPICE 4.0 standard numbering.

| WP ID | Work Product | AI Role |
|-------|--------------|---------|
| 08-60 | Verification Measure | Test specification generation |
| 03-50 | Verification Measure Data | Result analysis |
| 15-52 | Verification Results | Report generation |
| 13-51 | Consistency Evidence | Traceability checking |

Summary

SWE.5 Software Component Verification and Integration Verification:

  • AI Level: L2 (automated test generation and execution)
  • Primary AI Value: Component/interface verification, coverage analysis
  • Human Essential: Integration strategy, sequence decisions
  • Key Environments: MIL, SIL, PIL
  • Focus: Component and inter-layer interface verification