5.0: CI/CD Integration

What You'll Learn

By the end of this chapter, you will be able to:

  • Understand CI/CD principles and their application to embedded systems development
  • Design multi-stage pipelines for cross-compilation and hardware testing
  • Integrate AI-powered optimization into build and test workflows
  • Configure quality gates for ASPICE compliance
  • Implement release automation with proper change control

Key Terms

Term | Definition
--- | ---
CI | Continuous Integration—automated building and testing on every commit
CD | Continuous Delivery/Deployment—automated release to staging or production
Pipeline | Defined sequence of stages that code passes through before release
Quality Gate | Checkpoint that blocks progression if quality criteria are not met
Artifact | Build output (binary, documentation, reports) stored for downstream use
Self-Hosted Runner | CI agent running on your own infrastructure (required for HIL)

Chapter Overview

Cross-Reference: For ASPICE processes related to integration and continuous improvement, see Part II ASPICE Processes:

  • SWE.5: Software Integration and Integration Testing
  • MAN.3: Project Management
  • SUP.8: Configuration Management
  • SUP.10: Change Request Management

CI/CD transforms embedded development from manual, error-prone processes into automated, repeatable workflows. This chapter covers the unique challenges of CI/CD for embedded systems and provides practical implementation guidance.


CI/CD for Embedded Systems

Embedded CI/CD differs from web application CI/CD in several key ways:

Aspect | Web Application | Embedded Systems
--- | --- | ---
Build | Native compilation | Cross-compilation for target MCU
Testing | Run on build server | Requires simulators, emulators, or hardware
Deployment | Push to servers | Flash to target devices
Environment | Containers/VMs | Hardware-in-the-Loop (HIL) rigs
Artifacts | Docker images | ELF, HEX, BIN, S-Record files
Qualification | Optional | Required for ISO 26262, IEC 61508

Pipeline Architecture

Multi-Stage Embedded Pipeline

The multi-stage CI/CD pipeline for embedded systems moves code through build, static analysis, unit test, integration test, and deployment stages, with quality gates between them.

[Figure: Embedded CI/CD Pipeline]
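The gate-between-stages flow can be sketched in a few lines of Python. The stage names and gate thresholds below are illustrative assumptions, not canonical values:

```python
# Minimal model of a staged pipeline with quality gates between stages.
# A gate is a predicate over collected metrics; a failed gate halts progression.

STAGES = ["build", "static-analysis", "unit-test", "integration-test", "deploy"]

# Hypothetical gate criteria, keyed by the stage they guard entry to
GATES = {
    "unit-test": lambda m: m.get("warnings", 0) <= 10,
    "integration-test": lambda m: m.get("coverage", 0) >= 80,
    "deploy": lambda m: m.get("hil_pass_rate", 0) == 100,
}

def run_pipeline(metrics: dict) -> list:
    """Return the stages reached before a gate blocked progression."""
    reached = []
    for stage in STAGES:
        gate = GATES.get(stage)
        if gate and not gate(metrics):
            break  # quality gate failed: halt the pipeline here
        reached.append(stage)
    return reached

print(run_pipeline({"warnings": 2, "coverage": 60}))  # halts before integration-test
```

The point of the model: gates are ordered and fail-closed, so later (more expensive) stages such as HIL testing never run on code that failed an earlier, cheaper check.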


Platform Comparison

Platform | Self-Hosted | Docker Support | HIL Integration | Cost
--- | --- | --- | --- | ---
GitHub Actions | Yes | Excellent | Via runners | Free tier + $
GitLab CI | Yes | Excellent | Via runners | Free tier + $$
Jenkins | Only (self-hosted only) | Plugin-based | Excellent | Free (OSS)
Azure DevOps | Yes | Good | Via agents | Free tier + $
CircleCI | Yes | Excellent | Via runners | Free tier + $$
Buildkite | Yes | Excellent | Excellent | $$

Recommendation for Embedded:

  • GitLab CI: Best all-around for embedded teams
  • Jenkins: Most flexibility for complex HIL setups
  • GitHub Actions: Best for open-source or GitHub-centric teams

GitLab CI Configuration

Complete Embedded Pipeline

# .gitlab-ci.yml
stages:
  - lint
  - build
  - test
  - analyze
  - integration
  - release

variables:
  TARGET: stm32f407vg
  TOOLCHAIN: arm-none-eabi
  GCC_VERSION: "12.2"

# Default settings for all jobs
default:
  image: registry.gitlab.com/embedded/toolchain:${GCC_VERSION}
  cache:
    key: ${CI_COMMIT_REF_SLUG}
    paths:
      - build/

# ====================
# STAGE: Lint
# ====================
lint:format:
  stage: lint
  script:
    - clang-format --dry-run --Werror src/*.c include/*.h
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"

lint:commit:
  stage: lint
  script:
    # commitlint is an npm tool; gitlint is a pip-installable alternative
    - pip install gitlint
    - gitlint --commits ${CI_MERGE_REQUEST_DIFF_BASE_SHA}..HEAD
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"

# ====================
# STAGE: Build
# ====================
build:debug:
  stage: build
  script:
    - mkdir -p build/debug
    - cmake -B build/debug -DCMAKE_BUILD_TYPE=Debug
    - cmake --build build/debug -j$(nproc)
    - ${TOOLCHAIN}-size build/debug/*.elf
  artifacts:
    paths:
      - build/debug/*.elf
      - build/debug/*.map
    expire_in: 1 week

build:release:
  stage: build
  script:
    - mkdir -p build/release
    - cmake -B build/release -DCMAKE_BUILD_TYPE=Release
    - cmake --build build/release -j$(nproc)
    - ${TOOLCHAIN}-objcopy -O ihex build/release/firmware.elf build/release/firmware.hex
    - ${TOOLCHAIN}-objcopy -O binary build/release/firmware.elf build/release/firmware.bin
  artifacts:
    paths:
      - build/release/*.elf
      - build/release/*.hex
      - build/release/*.bin
      - build/release/*.map
    expire_in: 1 month

# ====================
# STAGE: Test
# ====================
test:unit:
  stage: test
  script:
    - cmake -B build/test -DBUILD_TESTS=ON -DCOVERAGE=ON
    - cmake --build build/test
    - cd build/test && ctest --output-on-failure --output-junit results.xml  # --output-junit requires CMake >= 3.21
    - gcovr --xml-pretty --print-summary -r ../.. -o coverage.xml
  coverage: '/^\s*lines:\s*\d+\.\d+%/'
  artifacts:
    reports:
      coverage_report:
        coverage_format: cobertura
        path: build/test/coverage.xml
      junit: build/test/results.xml

test:sil:
  stage: test
  image: renode/renode:latest
  dependencies:
    - build:debug
  script:
    - renode --disable-xwt -e "include @tests/sil/run_tests.resc; quit"
  artifacts:
    reports:
      junit: test-results/sil-results.xml

# ====================
# STAGE: Analyze
# ====================
analyze:cppcheck:
  stage: analyze
  script:
    - cppcheck --enable=all --xml --xml-version=2 src/ 2> cppcheck.xml
    - cppcheck-htmlreport --file=cppcheck.xml --report-dir=cppcheck-report
  artifacts:
    # GitLab's codequality report expects Code Climate JSON, not cppcheck XML,
    # so the raw XML is stored as a plain artifact instead
    paths:
      - cppcheck.xml
      - cppcheck-report/

analyze:misra:
  stage: analyze
  script:
    - cppcheck --addon=misra.py --xml src/ 2> misra.xml
  artifacts:
    paths:
      - misra.xml  # codequality reports require Code Climate JSON; keep raw XML
  allow_failure: true  # Advisory in MR, blocking in release

# ====================
# STAGE: Integration (after merge)
# ====================
integration:hil:
  stage: integration
  tags:
    - hil-runner  # Runs on self-hosted runner with hardware
  dependencies:
    - build:release
  script:
    - python3 hil/flash_and_test.py build/release/firmware.hex
    - python3 hil/run_integration_suite.py
  artifacts:
    reports:
      junit: hil/results.xml
    paths:
      - hil/logs/
  rules:
    - if: $CI_COMMIT_BRANCH == "main"
    - if: $CI_COMMIT_BRANCH == "develop"
    - if: $CI_COMMIT_TAG  # run in tag pipelines so release:sign can depend on this job
  resource_group: hil-rig-1  # Exclusive access to hardware

integration:coverity:
  stage: integration
  image: coverity/analysis:latest
  script:
    - cov-build --dir cov-int cmake --build build/release
    - cov-analyze --dir cov-int --all --security
    - cov-format-errors --dir cov-int --json-output-v7 coverity-results.json
  artifacts:
    paths:
      - coverity-results.json
  rules:
    - if: $CI_COMMIT_BRANCH == "main"
    - if: $CI_COMMIT_TAG  # run in tag pipelines so release:sign can depend on this job

# ====================
# STAGE: Release
# ====================
release:sign:
  stage: release
  dependencies:
    - build:release
    - integration:hil
    - integration:coverity
  script:
    - python3 scripts/sign_firmware.py build/release/firmware.bin
    - sha256sum build/release/firmware.bin > build/release/firmware.sha256
  artifacts:
    paths:
      - build/release/firmware.bin.signed
      - build/release/firmware.sha256
  rules:
    - if: $CI_COMMIT_TAG

release:create:
  stage: release
  image: registry.gitlab.com/gitlab-org/release-cli:latest
  dependencies:
    - release:sign
  script:
    - echo "Creating release ${CI_COMMIT_TAG}"
  release:
    tag_name: ${CI_COMMIT_TAG}
    description: "Release ${CI_COMMIT_TAG}"
    assets:
      links:
        # Link to artifacts by ref and job name; ${CI_JOB_ID} would point at
        # this release job, which has no artifacts of its own
        - name: "Firmware Binary"
          url: "${CI_PROJECT_URL}/-/jobs/artifacts/${CI_COMMIT_TAG}/raw/build/release/firmware.bin.signed?job=release:sign"
        - name: "Firmware HEX"
          url: "${CI_PROJECT_URL}/-/jobs/artifacts/${CI_COMMIT_TAG}/raw/build/release/firmware.hex?job=build:release"
  rules:
    - if: $CI_COMMIT_TAG
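The release:sign job invokes scripts/sign_firmware.py. A minimal stdlib-only sketch of what such a script might contain is shown below, using HMAC-SHA256 as a stand-in; a production release flow would use asymmetric signatures (e.g., ECDSA via a proper crypto library), and the environment variable name here is an assumption:

```python
"""Append an HMAC-SHA256 tag to a firmware image (illustrative sketch only)."""
import hashlib
import hmac
import os
import sys

TAG_LEN = 32  # bytes in a SHA-256 digest

def sign_firmware(image_path: str, key: bytes) -> str:
    """Write <image>.signed = image bytes + HMAC tag; return the output path."""
    with open(image_path, "rb") as f:
        image = f.read()
    tag = hmac.new(key, image, hashlib.sha256).digest()
    out_path = image_path + ".signed"
    with open(out_path, "wb") as f:
        f.write(image + tag)
    return out_path

def verify_firmware(signed_path: str, key: bytes) -> bool:
    """Check the trailing HMAC tag of a signed image."""
    with open(signed_path, "rb") as f:
        blob = f.read()
    image, tag = blob[:-TAG_LEN], blob[-TAG_LEN:]
    expected = hmac.new(key, image, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)

if __name__ == "__main__" and len(sys.argv) > 1:
    # The key must come from a protected CI variable, never from the repository
    key = bytes.fromhex(os.environ["FW_SIGNING_KEY_HEX"])
    print(sign_firmware(sys.argv[1], key))
```

The bootloader on the target would run the equivalent of verify_firmware before accepting an update.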

GitHub Actions Configuration

# .github/workflows/embedded-ci.yml
name: Embedded CI/CD

on:
  push:
    branches: [main, develop]
  pull_request:
    branches: [main, develop]
  release:
    types: [created]

env:
  TARGET: stm32f407vg
  ARM_GCC_VERSION: "12.2.rel1"

jobs:
  build:
    runs-on: ubuntu-latest
    container:
      image: ghcr.io/embedded/arm-toolchain:latest
    
    steps:
      - uses: actions/checkout@v4
      
      - name: Configure
        run: cmake -B build -DCMAKE_BUILD_TYPE=Release
      
      - name: Build
        run: cmake --build build -j$(nproc)
      
      - name: Size Report
        run: arm-none-eabi-size build/firmware.elf
      
      - name: Upload Artifacts
        uses: actions/upload-artifact@v4
        with:
          name: firmware
          path: |
            build/*.elf
            build/*.hex
            build/*.bin

  test-unit:
    runs-on: ubuntu-latest
    needs: build
    
    steps:
      - uses: actions/checkout@v4
      
      - name: Build Tests
        run: |
          cmake -B build-test -DBUILD_TESTS=ON -DCOVERAGE=ON
          cmake --build build-test
      
      - name: Run Tests
        run: ctest --test-dir build-test --output-on-failure
      
      - name: Generate Coverage
        run: |
          gcovr --xml-pretty -r . -o coverage.xml
      
      - name: Upload Coverage
        uses: codecov/codecov-action@v4
        with:
          files: coverage.xml

  test-sil:
    runs-on: ubuntu-latest
    needs: build
    container:
      image: antmicro/renode:latest
    
    steps:
      - uses: actions/checkout@v4
      
      - uses: actions/download-artifact@v4
        with:
          name: firmware
          path: build/
      
      - name: Run SIL Tests
        run: |
          renode --disable-xwt tests/sil/run_all.resc

  test-hil:
    runs-on: [self-hosted, hil-rig]
    needs: [test-unit, test-sil]
    if: github.event_name == 'push' && github.ref == 'refs/heads/main'
    
    steps:
      - uses: actions/checkout@v4
      
      - uses: actions/download-artifact@v4
        with:
          name: firmware
          path: build/
      
      - name: Flash and Test
        run: |
          python3 hil/flash_and_test.py build/firmware.hex
          python3 hil/run_integration_suite.py

  analyze:
    runs-on: ubuntu-latest
    needs: build
    
    steps:
      - uses: actions/checkout@v4
      
      - name: Cppcheck
        run: |
          cppcheck --enable=all --xml src/ 2> cppcheck.xml
      
      - name: MISRA Check
        run: |
          cppcheck --addon=misra.py src/ 2> misra.xml
        continue-on-error: true
      
      - name: Upload Analysis Results
        # cppcheck's XML output is not SARIF; upload it as a build artifact
        # (convert to SARIF first if code-scanning integration is needed)
        uses: actions/upload-artifact@v4
        with:
          name: static-analysis
          path: |
            cppcheck.xml
            misra.xml

  release:
    runs-on: ubuntu-latest
    needs: [test-hil, analyze]
    if: github.event_name == 'release'
    
    steps:
      - uses: actions/checkout@v4  # needed for scripts/sign_firmware.py

      - uses: actions/download-artifact@v4
        with:
          name: firmware
          path: build/
      
      - name: Sign Firmware
        run: |
          python3 scripts/sign_firmware.py build/firmware.bin
      
      - name: Upload Release Assets
        uses: softprops/action-gh-release@v1
        with:
          files: |
            build/firmware.bin.signed
            build/firmware.hex

Jenkins Pipeline

// Jenkinsfile
pipeline {
    agent none
    
    environment {
        TARGET = 'stm32f407vg'
        TOOLCHAIN = 'arm-none-eabi'
    }
    
    stages {
        stage('Build') {
            agent {
                docker {
                    image 'arm-toolchain:latest'
                    args '-v $HOME/.ccache:/ccache'
                }
            }
            steps {
                sh '''
                    cmake -B build -DCMAKE_BUILD_TYPE=Release
                    cmake --build build -j$(nproc)
                '''
            }
            post {
                success {
                    archiveArtifacts artifacts: 'build/*.elf, build/*.hex, build/*.bin'
                    stash includes: 'build/**', name: 'firmware'
                }
            }
        }
        
        stage('Test') {
            parallel {
                stage('Unit Tests') {
                    agent {
                        docker { image 'arm-toolchain:latest' }
                    }
                    steps {
                        sh '''
                            cmake -B build-test -DBUILD_TESTS=ON
                            cmake --build build-test
                            cd build-test && ctest --output-on-failure
                        '''
                    }
                    post {
                        always {
                            junit 'build-test/results.xml'
                        }
                    }
                }
                
                stage('SIL Tests') {
                    agent {
                        docker { image 'renode/renode:latest' }
                    }
                    steps {
                        unstash 'firmware'
                        sh 'renode --disable-xwt tests/sil/run_all.resc'
                    }
                }
            }
        }
        
        stage('HIL Tests') {
            agent { label 'hil-rig-1' }
            when { branch 'main' }
            options {
                lock(resource: 'hil-rig-1')
            }
            steps {
                unstash 'firmware'
                sh '''
                    python3 hil/flash_and_test.py build/firmware.hex
                    python3 hil/run_integration_suite.py
                '''
            }
            post {
                always {
                    junit 'hil/results.xml'
                    archiveArtifacts artifacts: 'hil/logs/**'
                }
            }
        }
        
        stage('Release') {
            agent { docker { image 'release-tools:latest' } }
            when { tag pattern: "v\\d+\\.\\d+\\.\\d+", comparator: "REGEXP" }
            steps {
                unstash 'firmware'
                sh '''
                    python3 scripts/sign_firmware.py build/firmware.bin
                    python3 scripts/create_release_package.py
                '''
            }
            post {
                success {
                    archiveArtifacts artifacts: 'release/**'
                }
            }
        }
    }
    
    post {
        failure {
            emailext subject: "Build Failed: ${env.JOB_NAME} #${env.BUILD_NUMBER}",
                     body: "Check console output at ${env.BUILD_URL}",
                     recipientProviders: [culprits(), developers()]
        }
    }
}
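GitLab's resource_group and the Jenkins lock() step serve the same purpose: only one job may hold the HIL rig at a time. Outside the CI system, for example when local debug sessions share the same rig, the same exclusion can be approximated with an OS-level advisory file lock. This POSIX-only sketch uses fcntl; the lock-file path is an assumption:

```python
"""Exclusive access to a shared HIL rig via an advisory file lock (POSIX only)."""
import fcntl
from contextlib import contextmanager

@contextmanager
def hil_rig(lock_path: str = "/tmp/hil-rig-1.lock"):
    """Block until the rig lock is acquired; release it on exit."""
    with open(lock_path, "w") as lock_file:
        fcntl.flock(lock_file, fcntl.LOCK_EX)  # blocks until the rig is free
        try:
            yield
        finally:
            fcntl.flock(lock_file, fcntl.LOCK_UN)

# Usage (flash_and_test is whatever drives your rig):
# with hil_rig():
#     flash_and_test("build/firmware.hex")
```

Advisory locks only exclude cooperating processes, so every tool that touches the rig must go through the same lock file.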

AI-Powered Pipeline Optimization

Intelligent Test Selection

Use ML to select only relevant tests based on code changes:

# scripts/ai_test_selector.py
"""
AI-powered test selection based on code changes.
Uses historical test results and code coverage to predict which tests to run.
"""

import json
from pathlib import Path
from anthropic import Anthropic

def get_changed_files() -> list:
    """Get files changed in the current commit/MR."""
    import subprocess
    result = subprocess.run(
        ['git', 'diff', '--name-only', 'HEAD~1'],
        capture_output=True, text=True
    )
    # Drop the empty string produced when there are no changes
    return [f for f in result.stdout.strip().split('\n') if f]

def load_test_coverage_map() -> dict:
    """Load test-to-file coverage mapping."""
    coverage_path = Path('.test-coverage-map.json')
    if coverage_path.exists():
        return json.loads(coverage_path.read_text())
    return {}

def select_tests_with_ai(changed_files: list, all_tests: list) -> list:
    """Use LLM to intelligently select tests."""
    client = Anthropic()
    
    prompt = f"""Given the following changed files in an embedded C project:
{json.dumps(changed_files, indent=2)}

And the available test suites:
{json.dumps(all_tests, indent=2)}

Select the minimum set of tests that should run to verify these changes.
Consider:
1. Direct unit tests for changed modules
2. Integration tests for affected interfaces
3. Regression tests for related functionality

Return a JSON array of test names to run, prioritized by importance.
Only return the JSON array, no explanation."""

    response = client.messages.create(
        model="claude-sonnet-4-6",
        max_tokens=1024,
        messages=[{"role": "user", "content": prompt}]
    )

    try:
        return json.loads(response.content[0].text)
    except json.JSONDecodeError:
        # Fail safe: run the full suite if the model reply is not valid JSON
        return all_tests

def main():
    changed_files = get_changed_files()
    coverage_map = load_test_coverage_map()
    
    # Get all available tests
    all_tests = list(Path('tests').glob('test_*.c'))
    all_tests = [t.stem for t in all_tests]
    
    # Rule-based selection for critical files
    critical_tests = []
    if any('safety' in f for f in changed_files):
        critical_tests.append('test_safety_critical')
    if any('can_' in f for f in changed_files):
        critical_tests.append('test_can_integration')

    # Tests the coverage map links directly to the changed files
    for f in changed_files:
        critical_tests.extend(coverage_map.get(f, []))
    
    # AI-powered selection for remaining
    ai_selected = select_tests_with_ai(changed_files, all_tests)
    
    # Combine and deduplicate
    selected = list(set(critical_tests + ai_selected))
    
    # Output for CI
    print(json.dumps(selected))

if __name__ == '__main__':
    main()
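The .test-coverage-map.json consulted above maps each source file to the tests that exercise it. Its exact format is an assumption; one way to produce it is to invert per-test coverage data (e.g., gathered from instrumented test runs), as this sketch shows:

```python
"""Build a file -> tests index from per-test coverage data (illustrative)."""
import json

def build_coverage_map(test_coverage: dict) -> dict:
    """Invert {test: [covered files]} into {file: [tests that cover it]}."""
    file_to_tests: dict = {}
    for test, files in test_coverage.items():
        for f in files:
            file_to_tests.setdefault(f, []).append(test)
    return file_to_tests

# Hypothetical per-test coverage, e.g. extracted from gcov runs
per_test = {
    "test_can_driver": ["src/can_driver.c", "src/ringbuf.c"],
    "test_ringbuf": ["src/ringbuf.c"],
}
coverage_map = build_coverage_map(per_test)
print(json.dumps(coverage_map, indent=2))
```

Regenerating the map on a nightly schedule, rather than on every commit, keeps the selector cheap while staying reasonably fresh.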

Pipeline Analytics Dashboard

# scripts/pipeline_analytics.py
"""
Collect and analyze CI/CD pipeline metrics.
"""

import pandas as pd
from datetime import datetime, timedelta
import requests

class PipelineAnalytics:
    def __init__(self, gitlab_url: str, project_id: int, token: str):
        self.base_url = f"{gitlab_url}/api/v4/projects/{project_id}"
        self.headers = {"PRIVATE-TOKEN": token}
    
    def get_pipeline_metrics(self, days: int = 30) -> pd.DataFrame:
        """Fetch pipeline metrics for the last N days."""
        since = (datetime.now() - timedelta(days=days)).isoformat()
        
        response = requests.get(
            f"{self.base_url}/pipelines",
            headers=self.headers,
            params={"updated_after": since, "per_page": 100}
        )
        
        pipelines = response.json()
        
        data = []
        for p in pipelines:
            data.append({
                'id': p['id'],
                'status': p['status'],
                'duration': p.get('duration', 0),
                'created_at': p['created_at'],
                'ref': p['ref']
            })
        
        return pd.DataFrame(data)
    
    def calculate_metrics(self, df: pd.DataFrame) -> dict:
        """Calculate key pipeline metrics."""
        return {
            'total_pipelines': len(df),
            'success_rate': (df['status'] == 'success').mean() * 100,
            'avg_duration_minutes': df['duration'].mean() / 60,
            'p95_duration_minutes': df['duration'].quantile(0.95) / 60,
            'failure_rate': (df['status'] == 'failed').mean() * 100,
            'main_branch_success': (
                df[df['ref'] == 'main']['status'] == 'success'
            ).mean() * 100
        }
    
    def identify_slow_stages(self) -> list:
        """Identify consistently slow pipeline stages."""
        # Implementation would analyze job-level data
        pass
    
    def predict_failure(self, commit_data: dict) -> float:
        """Predict likelihood of pipeline failure based on commit characteristics."""
        # ML model would analyze:
        # - Files changed
        # - Lines added/removed
        # - Author's historical success rate
        # - Time of day
        # - Recent pipeline history
        pass
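The predict_failure method is left as a stub above. Before reaching for a trained model, a weighted heuristic over the same commit features often makes a serviceable baseline; the feature names, saturation points, and weights below are illustrative assumptions:

```python
def failure_risk(commit: dict) -> float:
    """Crude pipeline-failure risk score in [0, 1] (baseline heuristic)."""
    risk = 0.0
    # Large diffs correlate with failures; saturate at 500 changed lines
    risk += 0.4 * min(commit.get("lines_changed", 0) / 500, 1.0)
    # Touching many files spreads risk across subsystems; saturate at 20
    risk += 0.3 * min(commit.get("files_changed", 0) / 20, 1.0)
    # Author's historical failure rate on this pipeline, already in [0, 1]
    risk += 0.3 * commit.get("author_failure_rate", 0.0)
    return min(risk, 1.0)

print(failure_risk({"lines_changed": 50, "files_changed": 2,
                    "author_failure_rate": 0.1}))
```

A score like this can gate expensive stages (run HIL eagerly for high-risk commits, batch low-risk ones), and it doubles as a sanity check for any ML model that later replaces it.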

Quality Gates

ASPICE-Compliant Quality Gate

# quality-gates.yaml
gates:
  pr_merge:
    name: "PR Merge Gate"
    description: "Required checks before merging to main"
    rules:
      - metric: unit_test_pass_rate
        operator: eq
        value: 100
        description: "All unit tests must pass"
      
      - metric: code_coverage
        operator: gte
        value: 80
        description: "Code coverage >= 80%"
      
      - metric: static_analysis_critical
        operator: eq
        value: 0
        description: "No critical static analysis findings"
      
      - metric: build_warnings
        operator: lte
        value: 10
        description: "Maximum 10 build warnings"

  release:
    name: "Release Gate"
    description: "Required checks before release"
    rules:
      - metric: hil_test_pass_rate
        operator: eq
        value: 100
        description: "All HIL tests must pass"
      
      - metric: misra_mandatory_violations
        operator: eq
        value: 0
        description: "No MISRA mandatory violations"
      
      - metric: mcdc_coverage
        operator: gte
        value: 90
        description: "MC/DC coverage >= 90% for safety-critical code"
      
      - metric: all_requirements_traced
        operator: eq
        value: true
        description: "All requirements have linked test cases"
      
      - metric: coverity_defects_high
        operator: eq
        value: 0
        description: "No high-severity Coverity defects"
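A gate file like the one above is straightforward to enforce in a CI job. This stdlib-only evaluator operates on the already-parsed rule dictionaries (YAML loading, e.g. via PyYAML, is left out); the operator set mirrors the file:

```python
"""Evaluate quality-gate rules against measured metrics (illustrative)."""
import operator

OPERATORS = {
    "eq": operator.eq,
    "gte": operator.ge,
    "lte": operator.le,
}

def evaluate_gate(rules: list, metrics: dict) -> list:
    """Return descriptions of all failed rules; an empty list means the gate passes.

    A metric that was never measured counts as a failure (fail-closed).
    """
    failures = []
    for rule in rules:
        measured = metrics.get(rule["metric"])
        op = OPERATORS[rule["operator"]]
        if measured is None or not op(measured, rule["value"]):
            failures.append(rule["description"])
    return failures

# Two of the pr_merge rules from the gate file above
pr_rules = [
    {"metric": "unit_test_pass_rate", "operator": "eq", "value": 100,
     "description": "All unit tests must pass"},
    {"metric": "code_coverage", "operator": "gte", "value": 80,
     "description": "Code coverage >= 80%"},
]
failed = evaluate_gate(pr_rules, {"unit_test_pass_rate": 100, "code_coverage": 72})
print(failed)  # only the coverage rule fails
```

Treating a missing metric as a failure is deliberate: a gate that silently passes because a report was never generated is worse than a false alarm.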

Summary

CI/CD transforms embedded development by automating builds, tests, and deployments:

Stage | Purpose | Tools | Trigger
--- | --- | --- | ---
Lint | Code style, commit format | clang-format, commitlint | Every commit
Build | Cross-compilation | CMake, Make, IAR, Keil | Every commit
Unit Test | Component verification | Unity, gtest | Every commit
SIL Test | Simulated integration | Renode, QEMU | Every commit
Analyze | Static analysis | Cppcheck, MISRA | Every commit
HIL Test | Hardware verification | dSPACE, NI, custom | Main branch
Coverity | Deep analysis | Coverity Scan | Main branch
Release | Sign, package, publish | Scripts, release-cli | Tags

Key Success Factors:

  1. Self-hosted runners for HIL testing with exclusive resource locks
  2. Artifact management for firmware binaries across stages
  3. Quality gates blocking unqualified code from release
  4. AI optimization for test selection and failure prediction
  5. Traceability linking commits to requirements and test results
  6. Reproducible builds using containers and version-locked toolchains

The following chapters provide detailed configuration for specific CI/CD platforms.


Chapters in This Section

Chapter | Title | Key Topics
--- | --- | ---
16.01 | Jenkins Configuration | Pipeline as code, plugins, HIL integration
16.02 | GitHub Actions | Workflows, reusable actions, matrix builds
16.03 | GitLab CI | Runners, artifacts, environments
16.04 | Bitbucket Pipelines | Pipeline configuration, deployment
16.05 | Container Security with AI | Image scanning, runtime security, SBOM
16.06 | Pipeline Patterns for Embedded | Cross-compilation, HIL triggers, artifact management

References

  • Humble, J., & Farley, D. (2010). Continuous Delivery: Reliable Software Releases through Build, Test, and Deployment Automation. Addison-Wesley.
  • Kim, G., Humble, J., Debois, P., & Willis, J. (2016). The DevOps Handbook. IT Revolution Press.
  • GitLab CI/CD Documentation: https://docs.gitlab.com/ee/ci/
  • GitHub Actions Documentation: https://docs.github.com/en/actions