Container Image Scanning in Pipelines

Implement container image scanning in Azure DevOps pipelines with Trivy, SBOM generation, image signing, and automated vulnerability gating

Overview

Container image scanning is the process of analyzing Docker images for known vulnerabilities, misconfigurations, and license compliance issues before they reach production. Every container you deploy carries an operating system, system libraries, and application dependencies — any of which can harbor critical CVEs that attackers actively exploit. Integrating scanning directly into your Azure DevOps pipeline turns security from a manual afterthought into an automated gate that stops vulnerable images before they ever touch a registry.

Prerequisites

  • Azure DevOps project with a pipeline configured
  • Docker installed on the build agent (or a self-hosted agent with Docker)
  • Basic familiarity with Docker multi-stage builds
  • An Azure Container Registry (ACR) or other OCI-compliant registry
  • Node.js 18+ for the scan report aggregator example
  • Cosign installed for image signing (optional but recommended)

Why Container Scanning Matters

The numbers tell the story. In 2024, the National Vulnerability Database recorded over 28,000 new CVEs. A significant portion of those affect packages commonly found in container base images. Pull down a standard node:18 image and run a scan: you will find hundreds of vulnerabilities before you write a single line of application code.

The problem compounds because containers inherit everything from their base image. If you build on node:18-bullseye, you inherit every Debian Bullseye package and every vulnerability in those packages. Your application layer adds Node.js dependencies on top. The combination creates an attack surface that is far larger than most teams realize.

Here is what a typical scan of an unoptimized Node.js image reveals:

node:18 (debian 11.8)
Total: 387 (UNKNOWN: 2, LOW: 241, MEDIUM: 98, HIGH: 38, CRITICAL: 8)

Node.js (node_modules)
Total: 12 (LOW: 4, MEDIUM: 5, HIGH: 2, CRITICAL: 1)

Eight critical vulnerabilities in the OS layer alone. That is eight potential entry points an attacker can use to break out of your container or escalate privileges. Automated scanning catches these before deployment.
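You can reproduce this kind of result locally with a single command (the exact counts shift as the vulnerability database updates):

trivy image --severity HIGH,CRITICAL node:18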

Trivy Integration in Azure Pipelines

Trivy is the most widely adopted open-source container scanner. It is fast, requires no database server, and covers OS packages, language-specific dependencies, IaC misconfigurations, and secrets. Integrating Trivy into an Azure Pipeline is straightforward.

Installing Trivy on the Build Agent

# azure-pipelines.yml
steps:
  - script: |
      curl -sfL https://raw.githubusercontent.com/aquasecurity/trivy/main/contrib/install.sh | sh -s -- -b /usr/local/bin v0.50.1
      trivy --version
    displayName: 'Install Trivy'

Pin the version. Using latest in a pipeline means your build behavior changes without a code change — that is a recipe for mysterious failures on a Tuesday morning.

Scanning a Built Image

  - script: |
      docker build -t myapp:$(Build.BuildId) .
    displayName: 'Build Docker Image'

  - script: |
      trivy image \
        --exit-code 1 \
        --severity CRITICAL,HIGH \
        --no-progress \
        --format table \
        myapp:$(Build.BuildId)
    displayName: 'Scan Image with Trivy'
    continueOnError: false

The --exit-code 1 flag causes Trivy to return a non-zero exit code when vulnerabilities at the specified severity are found. Combined with continueOnError: false, this fails the pipeline. That is the entire point — you want the build to break when critical vulnerabilities exist.

JSON Output for Downstream Processing

  - script: |
      trivy image \
        --exit-code 0 \
        --severity CRITICAL,HIGH,MEDIUM \
        --format json \
        --output $(Build.ArtifactStagingDirectory)/trivy-report.json \
        myapp:$(Build.BuildId)
    displayName: 'Generate Trivy JSON Report'

  - task: PublishBuildArtifacts@1
    inputs:
      pathToPublish: '$(Build.ArtifactStagingDirectory)/trivy-report.json'
      artifactName: 'security-reports'
    displayName: 'Publish Scan Results'

Notice the --exit-code 0 here. We generate the report regardless of findings, then handle the pass/fail logic separately. This lets you publish the full report even when vulnerabilities are found.

Alternative Scanners: Grype and Aqua

Trivy is not the only option. Grype (from Anchore) and Aqua Security offer different trade-offs.

Grype

  - script: |
      curl -sSfL https://raw.githubusercontent.com/anchore/grype/main/install.sh | sh -s -- -b /usr/local/bin
      grype myapp:$(Build.BuildId) \
        --fail-on critical \
        --output table
    displayName: 'Scan with Grype'

Grype is fast and focused purely on vulnerability matching. It pairs well with Syft for SBOM generation, since both come from Anchore. If you want a scanner that does one thing well, Grype is a solid choice.
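Because both tools come from Anchore and speak the same formats, a common pattern is to generate the SBOM once with Syft and have Grype scan that SBOM rather than re-analyzing the image. A minimal sketch:

  - script: |
      # Generate the SBOM once, then feed it to Grype for vulnerability matching
      syft myapp:$(Build.BuildId) -o syft-json --file sbom.syft.json
      grype sbom:./sbom.syft.json --fail-on critical --output table
    displayName: 'Scan SBOM with Grype'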

Aqua Security

Aqua offers a commercial scanner with a free tier for open-source projects. It provides deeper analysis including runtime profiling and drift detection. The pipeline integration uses the aqua CLI:

  - script: |
      curl -sfL https://get.aquasec.com/scanner | sh
      ./scanner scan \
        --host $(AQUA_HOST) \
        --token $(AQUA_TOKEN) \
        --registry "Azure" \
        myapp:$(Build.BuildId)
    displayName: 'Scan with Aqua'

For most teams starting out, Trivy gives you 90% of the value at zero cost. Move to a commercial scanner when you need centralized policy management across dozens of pipelines.

Scanning Base Images vs Application Layers

A container image is a stack of layers. Understanding which layer introduced a vulnerability determines who needs to fix it.

# Base image layer - OS vulnerabilities live here
FROM node:18-alpine

# Application layer - npm vulnerabilities live here
WORKDIR /app
COPY package*.json ./
RUN npm ci --only=production
COPY . .

CMD ["node", "server.js"]

Trivy can scan specific layers, but the more practical approach is scanning at two stages:

  # Scan the base image before building
  - script: |
      trivy image --severity CRITICAL,HIGH node:18-alpine
    displayName: 'Scan Base Image'

  # Scan the final built image
  - script: |
      trivy image --severity CRITICAL,HIGH myapp:$(Build.BuildId)
    displayName: 'Scan Application Image'

This separation matters because base image vulnerabilities require updating the FROM line or waiting for an upstream fix. Application layer vulnerabilities require running npm audit fix or updating specific packages. Different vulnerabilities, different owners, different fixes.
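Trivy can also report the two classes separately, which makes routing findings to the right owner easier. A sketch using the --vuln-type filter (available in the pinned v0.50 release):

  - script: |
      # OS packages: fixed by bumping the FROM line or waiting for upstream
      trivy image --vuln-type os --severity CRITICAL,HIGH myapp:$(Build.BuildId)
      # Language dependencies: fixed by the application team in package.json
      trivy image --vuln-type library --severity CRITICAL,HIGH myapp:$(Build.BuildId)
    displayName: 'Split Findings by Ownership'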

Use Alpine or distroless base images to shrink the OS attack surface:

# Better: Alpine reduces OS vulnerability count by 80%+
FROM node:18-alpine

# Best: Distroless eliminates the OS shell entirely
FROM gcr.io/distroless/nodejs18-debian11

Azure Container Registry Scanning with Defender for Containers

Azure Container Registry has built-in scanning through Microsoft Defender for Containers. When enabled, every image pushed to ACR is scanned automatically (current Defender plans use Microsoft Defender Vulnerability Management; the older Qualys-based scanner has been retired).

  - task: Docker@2
    inputs:
      containerRegistry: 'myACR'
      repository: 'myapp'
      command: 'buildAndPush'
      Dockerfile: '**/Dockerfile'
      tags: '$(Build.BuildId)'
    displayName: 'Build and Push to ACR'

  # Defender scans automatically on push. Results surface in Defender for Cloud
  # and can be queried from a pipeline via Azure Resource Graph (requires the
  # resource-graph CLI extension). This is one indicative query; field names
  # may vary by Defender plan.
  - script: |
      az graph query -q "
        securityresources
        | where type == 'microsoft.security/assessments/subassessments'
        | where properties.status.severity in~ ('High', 'Critical')
        | project id, severity = properties.status.severity, title = properties.displayName
      " --output table
    displayName: 'Check Defender Scan Results'

Defender for Containers provides continuous monitoring — it rescans images as new CVEs are published. A clean image today can become vulnerable tomorrow when a new CVE drops. Pipeline scanning catches issues at build time; Defender catches them afterward.

Custom Scanning Policies and Severity Thresholds

Real-world pipelines need nuance. Not every HIGH severity vulnerability warrants blocking a release. Define policies using a Trivy config file:

# trivy.yaml
severity:
  - CRITICAL
  - HIGH

vulnerability:
  ignore-unfixed: true

ignore:
  # Accepted risk: no fix available, mitigated by network policy
  - id: CVE-2023-44487
    statement: "HTTP/2 rapid reset mitigated by ALB rate limiting"
    expires: "2026-06-01"

  # Low-risk in our context: we don't use this feature
  - id: CVE-2024-21626
    statement: "runc vulnerability not exploitable in our config"
    expires: "2026-04-15"

Reference the config in your pipeline:

  - script: |
      trivy image \
        --config trivy.yaml \
        --exit-code 1 \
        --format table \
        myapp:$(Build.BuildId)
    displayName: 'Scan with Custom Policy'

The ignore-unfixed: true flag is critical. There is no point in failing a build over a vulnerability that has no available fix. Flag it, track it, but do not block deployments over something nobody can resolve yet.

Set expiration dates on every ignore entry. Accepted risk without an expiration date becomes forgotten risk.
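A lightweight guard can surface entries whose date has already passed. A minimal sketch, assuming the trivy.yaml layout shown above with expires: "YYYY-MM-DD" values:

  - script: |
      TODAY="$(date +%F)"
      # Extract every expires date and warn when it is in the past
      grep -oP 'expires:\s*"\K[0-9]{4}-[0-9]{2}-[0-9]{2}' trivy.yaml | while read -r EXPIRY; do
        if [[ "$EXPIRY" < "$TODAY" ]]; then
          echo "##vso[task.logissue type=warning]Vulnerability exception expired on $EXPIRY - revisit trivy.yaml"
        fi
      done
    displayName: 'Check for Expired Exceptions'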

Scanning Multi-Stage Docker Builds

Multi-stage builds introduce complexity because intermediate stages carry build tools and dev dependencies that never reach the final image. Only scan the final stage:

# Stage 1: Build (carries dev dependencies, build tools)
FROM node:18-alpine AS builder
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

# Stage 2: Production (minimal footprint)
FROM node:18-alpine AS production
WORKDIR /app
COPY --from=builder /app/dist ./dist
COPY --from=builder /app/node_modules ./node_modules
COPY --from=builder /app/package.json ./
EXPOSE 3000
USER node
CMD ["node", "dist/server.js"]

Build only the production target, then scan it:

  - script: |
      docker build --target production -t myapp:$(Build.BuildId) .
      trivy image --severity CRITICAL,HIGH myapp:$(Build.BuildId)
    displayName: 'Build and Scan Production Stage'

If you want to audit the builder stage separately for compliance, scan both but only gate on the production stage:

  - script: |
      docker build --target builder -t myapp-builder:$(Build.BuildId) .
      trivy image --exit-code 0 --output builder-report.json --format json myapp-builder:$(Build.BuildId)
    displayName: 'Scan Builder Stage (Audit Only)'

  - script: |
      docker build --target production -t myapp:$(Build.BuildId) .
      trivy image --exit-code 1 --severity CRITICAL,HIGH myapp:$(Build.BuildId)
    displayName: 'Scan Production Stage (Gating)'

SBOM Generation

A Software Bill of Materials (SBOM) is an inventory of every component in your container image. Regulations such as US Executive Order 14028 on Improving the Nation's Cybersecurity increasingly require SBOMs for software sold to government agencies.

Syft

Syft (from Anchore) generates SBOMs in SPDX and CycloneDX formats:

  - script: |
      curl -sSfL https://raw.githubusercontent.com/anchore/syft/main/install.sh | sh -s -- -b /usr/local/bin
      syft myapp:$(Build.BuildId) \
        -o spdx-json \
        --file $(Build.ArtifactStagingDirectory)/sbom-spdx.json
      syft myapp:$(Build.BuildId) \
        -o cyclonedx-json \
        --file $(Build.ArtifactStagingDirectory)/sbom-cyclonedx.json
    displayName: 'Generate SBOM with Syft'

Trivy SBOM

Trivy can also generate SBOMs, keeping your tooling consolidated:

  - script: |
      trivy image \
        --format cyclonedx \
        --output $(Build.ArtifactStagingDirectory)/sbom.cdx.json \
        myapp:$(Build.BuildId)
    displayName: 'Generate SBOM with Trivy'

Docker SBOM (Docker Desktop)

Docker Desktop includes a docker sbom plugin, built on Syft, for ad-hoc SBOM generation:

docker sbom myapp:latest --format spdx-json > sbom.json
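The practical payoff comes when the next high-profile CVE lands: with SBOMs on file, you can check whether a given package ships anywhere. A sketch against the CycloneDX output above (zlib is just an example package name):

# Does this release contain zlib, and at which version?
jq -r '.components[] | select(.name == "zlib") | "\(.name)@\(.version)"' sbom.cdx.json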

Pick one tool and standardize on it. Having three different SBOM formats across your organization helps nobody.

Image Signing and Verification

Signing images ensures that what you scanned is what you deploy. Without signing, an attacker who compromises your registry can swap images after scanning.

Cosign

Cosign (from Sigstore) is the de facto standard for container image signing. It supports both keyless (OIDC-based) signing and traditional key pairs; the example below uses a key pair stored in secret pipeline variables:

  - script: |
      curl -sSfL https://github.com/sigstore/cosign/releases/download/v2.2.3/cosign-linux-amd64 -o /usr/local/bin/cosign
      chmod +x /usr/local/bin/cosign
    displayName: 'Install Cosign'

  - script: |
      cosign sign \
        --key env://COSIGN_PRIVATE_KEY \
        --yes \
        myacr.azurecr.io/myapp:$(Build.BuildId)
    displayName: 'Sign Image with Cosign'
    env:
      COSIGN_PRIVATE_KEY: $(COSIGN_PRIVATE_KEY)
      COSIGN_PASSWORD: $(COSIGN_PASSWORD)
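The step above assumes a key pair already exists. One way to create it is locally with Cosign itself, then store the key material as secret pipeline variables:

cosign generate-key-pair
# Writes cosign.key (password-protected private key) and cosign.pub.
# Store the contents of cosign.key as the secret variable COSIGN_PRIVATE_KEY
# and the chosen password as COSIGN_PASSWORD; keep cosign.pub for verification.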

Attaching Scan Results to the Image

Cosign can attach attestations directly to the image in the registry:

  - script: |
      cosign attest \
        --key env://COSIGN_PRIVATE_KEY \
        --predicate $(Build.ArtifactStagingDirectory)/trivy-report.json \
        --type vuln \
        --yes \
        myacr.azurecr.io/myapp:$(Build.BuildId)
    displayName: 'Attach Scan Attestation'
    env:
      COSIGN_PRIVATE_KEY: $(COSIGN_PRIVATE_KEY)
      COSIGN_PASSWORD: $(COSIGN_PASSWORD)

Notation (Notary Project)

Notation is the CNCF Notary Project signing CLI. Microsoft backs it and ships an Azure Key Vault plugin, which makes it the natural choice when your signing keys live in AKV:

  - script: |
      notation sign \
        --signature-format cose \
        --id $(KEY_ID) \
        --plugin azure-kv \
        --plugin-config "self_signed"=true \
        myacr.azurecr.io/myapp:$(Build.BuildId)
    displayName: 'Sign Image with Notation'

Notation integrates tightly with Azure services. Cosign integrates with everything else. Choose based on your ecosystem.

Verification at Deploy Time

  - script: |
      cosign verify \
        --key env://COSIGN_PUBLIC_KEY \
        myacr.azurecr.io/myapp:$(Build.BuildId)
    displayName: 'Verify Image Signature'
    env:
      COSIGN_PUBLIC_KEY: $(COSIGN_PUBLIC_KEY)
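If you attached the Trivy report as an attestation earlier, that can be checked as well. A sketch using cosign verify-attestation with the same key pair:

  - script: |
      cosign verify-attestation \
        --key env://COSIGN_PUBLIC_KEY \
        --type vuln \
        myacr.azurecr.io/myapp:$(Build.BuildId)
    displayName: 'Verify Scan Attestation'
    env:
      COSIGN_PUBLIC_KEY: $(COSIGN_PUBLIC_KEY)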

Scan Result Publishing to Azure DevOps

Publishing scan results as pipeline test results makes vulnerability trends visible in the Azure DevOps UI.

Converting Trivy Output to JUnit Format

  - script: |
      trivy image \
        --format template \
        --template "@contrib/junit.tpl" \
        --output $(Build.ArtifactStagingDirectory)/trivy-junit.xml \
        myapp:$(Build.BuildId)
    displayName: 'Generate JUnit Report'

  - task: PublishTestResults@2
    inputs:
      testResultsFormat: 'JUnit'
      testResultsFiles: '$(Build.ArtifactStagingDirectory)/trivy-junit.xml'
      testRunTitle: 'Container Security Scan'
    displayName: 'Publish Scan Results to Azure DevOps'

This surfaces scan results in the Tests tab of your pipeline run. Each vulnerability appears as a test case — failed tests are vulnerabilities above your threshold.

Registry Admission Control

Scanning in the pipeline is half the equation. Admission control ensures only scanned, signed images can run in your cluster.

With Azure Policy and AKS:

{
  "mode": "All",
  "policyRule": {
    "if": {
      "allOf": [
        {
          "field": "type",
          "equals": "Microsoft.ContainerService/managedClusters"
        }
      ]
    },
    "then": {
      "effect": "deny",
      "details": {
        "templateRef": "K8sImageSignatureVerification",
        "constraint": {
          "allowedRegistries": [
            "myacr.azurecr.io"
          ],
          "requireSignature": true
        }
      }
    }
  }
}

With OPA Gatekeeper in Kubernetes:

apiVersion: templates.gatekeeper.sh/v1
kind: ConstraintTemplate
metadata:
  name: k8sallowedimages
spec:
  crd:
    spec:
      names:
        kind: K8sAllowedImages
  targets:
    - target: admission.k8s.gatekeeper.sh
      rego: |
        package k8sallowedimages
        violation[{"msg": msg}] {
          container := input.review.object.spec.containers[_]
          not startswith(container.image, "myacr.azurecr.io/")
          msg := sprintf("Image '%v' is not from an allowed registry", [container.image])
        }
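The template only defines the rule; it takes effect once a Constraint of the new kind exists. A minimal example scoped to Pods:

apiVersion: constraints.gatekeeper.sh/v1beta1
kind: K8sAllowedImages
metadata:
  name: require-acr-images
spec:
  match:
    kinds:
      - apiGroups: [""]
        kinds: ["Pod"]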

Continuous Monitoring of Deployed Images

Pipeline scanning catches vulnerabilities at build time. New CVEs appear daily. You need continuous monitoring.

# Scheduled pipeline - runs nightly against deployed images
trigger: none
schedules:
  - cron: '0 2 * * *'
    displayName: 'Nightly Image Scan'
    branches:
      include:
        - master

steps:
  - script: |
      az acr repository list --name myacr --output tsv | while read repo; do
        LATEST_TAG=$(az acr repository show-tags --name myacr --repository $repo --orderby time_desc --top 1 --output tsv)
        echo "Scanning $repo:$LATEST_TAG"
        trivy image --severity CRITICAL,HIGH --format json --output "scan-$repo.json" "myacr.azurecr.io/$repo:$LATEST_TAG"
      done
    displayName: 'Scan All Deployed Images'
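The loop only writes per-repository reports; publishing them keeps a nightly history you can compare over time. A minimal follow-up, assuming the reports land in the default working directory:

  - script: |
      mkdir -p $(Build.ArtifactStagingDirectory)/nightly
      cp scan-*.json $(Build.ArtifactStagingDirectory)/nightly/ || true
    displayName: 'Collect Nightly Reports'

  - task: PublishBuildArtifacts@1
    inputs:
      pathToPublish: '$(Build.ArtifactStagingDirectory)/nightly'
      artifactName: 'nightly-scan-reports'
    displayName: 'Publish Nightly Scan Reports'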

Complete Working Example

Here is a complete Azure Pipeline that builds a Node.js Docker image, scans it with Trivy, generates an SBOM, signs the image with Cosign, and includes a Node.js scan report aggregator.

The Dockerfile

FROM node:18-alpine AS builder
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build
RUN npm prune --production

FROM node:18-alpine AS production
RUN apk --no-cache add dumb-init
WORKDIR /app
COPY --from=builder /app/dist ./dist
COPY --from=builder /app/node_modules ./node_modules
COPY --from=builder /app/package.json ./
USER node
EXPOSE 3000
CMD ["dumb-init", "node", "dist/server.js"]

The Pipeline

trigger:
  branches:
    include:
      - master

pool:
  vmImage: 'ubuntu-latest'

variables:
  REGISTRY: 'myacr.azurecr.io'
  REPOSITORY: 'myapp'
  TAG: '$(Build.BuildId)'
  IMAGE: '$(REGISTRY)/$(REPOSITORY):$(TAG)'
  TRIVY_VERSION: 'v0.50.1'
  COSIGN_VERSION: 'v2.2.3'

stages:
  - stage: Build
    displayName: 'Build and Scan'
    jobs:
      - job: BuildScanSign
        displayName: 'Build, Scan, SBOM, Sign'
        steps:
          # Install tools
          - script: |
              curl -sfL https://raw.githubusercontent.com/aquasecurity/trivy/main/contrib/install.sh | sh -s -- -b /usr/local/bin $(TRIVY_VERSION)
              curl -sSfL https://raw.githubusercontent.com/anchore/syft/main/install.sh | sh -s -- -b /usr/local/bin
              curl -sSfL https://github.com/sigstore/cosign/releases/download/$(COSIGN_VERSION)/cosign-linux-amd64 -o /usr/local/bin/cosign
              chmod +x /usr/local/bin/cosign
            displayName: 'Install Security Tools'

          # Scan base image first
          - script: |
              echo "=== Scanning Base Image ==="
              trivy image --severity CRITICAL,HIGH --exit-code 0 --format table node:18-alpine
            displayName: 'Scan Base Image'

          # Build production image
          - script: |
              docker build --target production -t $(IMAGE) .
            displayName: 'Build Docker Image'

          # Scan with Trivy - table output for logs
          - script: |
              trivy image \
                --severity CRITICAL,HIGH \
                --exit-code 0 \
                --format table \
                $(IMAGE)
            displayName: 'Scan Image (Table Output)'

          # Scan with Trivy - JSON output for processing
          - script: |
              trivy image \
                --severity CRITICAL,HIGH,MEDIUM \
                --format json \
                --output $(Build.ArtifactStagingDirectory)/trivy-report.json \
                $(IMAGE)
            displayName: 'Generate Trivy JSON Report'

          # Scan with Trivy - JUnit output for Azure DevOps
          - script: |
              trivy image \
                --format template \
                --template "@contrib/junit.tpl" \
                --output $(Build.ArtifactStagingDirectory)/trivy-junit.xml \
                $(IMAGE)
            displayName: 'Generate JUnit Report'

          # Publish test results
          - task: PublishTestResults@2
            inputs:
              testResultsFormat: 'JUnit'
              testResultsFiles: '$(Build.ArtifactStagingDirectory)/trivy-junit.xml'
              testRunTitle: 'Container Security Scan - $(REPOSITORY):$(TAG)'
            displayName: 'Publish Scan Results'

          # Generate SBOM
          - script: |
              syft $(IMAGE) -o spdx-json --file $(Build.ArtifactStagingDirectory)/sbom.spdx.json
              syft $(IMAGE) -o cyclonedx-json --file $(Build.ArtifactStagingDirectory)/sbom.cdx.json
              echo "SBOM generated with $(cat $(Build.ArtifactStagingDirectory)/sbom.cdx.json | python3 -c 'import sys,json; d=json.load(sys.stdin); print(len(d.get("components",[])))') components"
            displayName: 'Generate SBOM'

          # Run the Node.js scan aggregator
          - script: |
              node scripts/scan-aggregator.js \
                $(Build.ArtifactStagingDirectory)/trivy-report.json \
                $(Build.ArtifactStagingDirectory)/scan-summary.json
            displayName: 'Aggregate Scan Results'

          # Gate on critical vulnerabilities
          - script: |
              trivy image \
                --severity CRITICAL \
                --ignore-unfixed \
                --exit-code 1 \
                --no-progress \
                $(IMAGE)
            displayName: 'Gate - Fail on Critical Vulnerabilities'

          # Push to ACR (only if scan passes)
          - task: Docker@2
            inputs:
              containerRegistry: 'myACR-connection'
              repository: '$(REPOSITORY)'
              command: 'push'
              tags: '$(TAG)'
            displayName: 'Push to ACR'

          # Sign the image
          - script: |
              cosign sign \
                --key env://COSIGN_PRIVATE_KEY \
                --yes \
                $(IMAGE)
            displayName: 'Sign Image'
            env:
              COSIGN_PRIVATE_KEY: $(COSIGN_PRIVATE_KEY)
              COSIGN_PASSWORD: $(COSIGN_PASSWORD)

          # Attach SBOM attestation
          - script: |
              cosign attest \
                --key env://COSIGN_PRIVATE_KEY \
                --predicate $(Build.ArtifactStagingDirectory)/sbom.cdx.json \
                --type cyclonedx \
                --yes \
                $(IMAGE)
            displayName: 'Attach SBOM Attestation'
            env:
              COSIGN_PRIVATE_KEY: $(COSIGN_PRIVATE_KEY)
              COSIGN_PASSWORD: $(COSIGN_PASSWORD)

          # Publish all artifacts
          - task: PublishBuildArtifacts@1
            inputs:
              pathToPublish: '$(Build.ArtifactStagingDirectory)'
              artifactName: 'security-reports'
            displayName: 'Publish Security Artifacts'

Node.js Scan Report Aggregator

This script parses Trivy JSON output, aggregates findings by severity and package type, and produces a summary report. Drop it in scripts/scan-aggregator.js:

// scripts/scan-aggregator.js
var fs = require("fs");
var path = require("path");

function loadReport(filePath) {
  var raw = fs.readFileSync(filePath, "utf8");
  return JSON.parse(raw);
}

function aggregateVulnerabilities(report) {
  var summary = {
    totalVulnerabilities: 0,
    bySeverity: { CRITICAL: 0, HIGH: 0, MEDIUM: 0, LOW: 0, UNKNOWN: 0 },
    byType: {},
    fixableCount: 0,
    unfixableCount: 0,
    criticalFindings: [],
    topAffectedPackages: {}
  };

  var results = report.Results || [];

  results.forEach(function (result) {
    var targetType = result.Type || "unknown";
    var vulnerabilities = result.Vulnerabilities || [];

    if (!summary.byType[targetType]) {
      summary.byType[targetType] = { total: 0, critical: 0, high: 0 };
    }

    vulnerabilities.forEach(function (vuln) {
      summary.totalVulnerabilities++;
      summary.byType[targetType].total++;

      var severity = vuln.Severity || "UNKNOWN";
      if (summary.bySeverity[severity] !== undefined) {
        summary.bySeverity[severity]++;
      }

      if (severity === "CRITICAL") {
        summary.byType[targetType].critical++;
      }
      if (severity === "HIGH") {
        summary.byType[targetType].high++;
      }

      if (vuln.FixedVersion) {
        summary.fixableCount++;
      } else {
        summary.unfixableCount++;
      }

      if (severity === "CRITICAL") {
        summary.criticalFindings.push({
          id: vuln.VulnerabilityID,
          package: vuln.PkgName,
          installedVersion: vuln.InstalledVersion,
          fixedVersion: vuln.FixedVersion || "none",
          title: vuln.Title || "No title available",
          target: result.Target
        });
      }

      var pkgKey = vuln.PkgName;
      if (!summary.topAffectedPackages[pkgKey]) {
        summary.topAffectedPackages[pkgKey] = 0;
      }
      summary.topAffectedPackages[pkgKey]++;
    });
  });

  return summary;
}

function formatConsoleReport(summary) {
  console.log("\n========================================");
  console.log("  CONTAINER SCAN SUMMARY");
  console.log("========================================\n");
  console.log("Total Vulnerabilities: " + summary.totalVulnerabilities);
  console.log(
    "  CRITICAL: " + summary.bySeverity.CRITICAL +
    "  HIGH: " + summary.bySeverity.HIGH +
    "  MEDIUM: " + summary.bySeverity.MEDIUM +
    "  LOW: " + summary.bySeverity.LOW
  );
  console.log(
    "  Fixable: " + summary.fixableCount +
    "  Unfixable: " + summary.unfixableCount
  );

  console.log("\nBy Target Type:");
  Object.keys(summary.byType).forEach(function (type) {
    var data = summary.byType[type];
    console.log(
      "  " + type + ": " + data.total +
      " total (" + data.critical + " critical, " + data.high + " high)"
    );
  });

  if (summary.criticalFindings.length > 0) {
    console.log("\nCritical Findings:");
    summary.criticalFindings.forEach(function (finding) {
      console.log("  - " + finding.id + " in " + finding.package +
        " (" + finding.installedVersion + ")");
      console.log("    Fix: " + finding.fixedVersion);
      console.log("    " + finding.title);
    });
  }

  // Sort packages by vulnerability count
  var sortedPackages = Object.entries(summary.topAffectedPackages)
    .sort(function (a, b) { return b[1] - a[1]; })
    .slice(0, 10);

  if (sortedPackages.length > 0) {
    console.log("\nTop Affected Packages:");
    sortedPackages.forEach(function (entry) {
      console.log("  " + entry[0] + ": " + entry[1] + " vulnerabilities");
    });
  }

  console.log("\n========================================\n");
}

function generateAzureDevOpsAnnotations(summary) {
  summary.criticalFindings.forEach(function (finding) {
    console.log(
      "##vso[task.logissue type=error]CRITICAL: " +
      finding.id + " in " + finding.package +
      " (" + finding.installedVersion + ") - " + finding.title
    );
  });

  if (summary.bySeverity.CRITICAL > 0) {
    console.log(
      "##vso[task.logissue type=warning]Found " +
      summary.bySeverity.CRITICAL + " CRITICAL vulnerabilities"
    );
  }

  // Set pipeline variables for downstream consumption
  console.log("##vso[task.setvariable variable=VULN_CRITICAL]" + summary.bySeverity.CRITICAL);
  console.log("##vso[task.setvariable variable=VULN_HIGH]" + summary.bySeverity.HIGH);
  console.log("##vso[task.setvariable variable=VULN_TOTAL]" + summary.totalVulnerabilities);
}

function main() {
  var args = process.argv.slice(2);

  if (args.length < 1) {
    console.error("Usage: node scan-aggregator.js <trivy-report.json> [output.json]");
    process.exit(1);
  }

  var inputPath = args[0];
  var outputPath = args[1] || null;

  if (!fs.existsSync(inputPath)) {
    console.error("Report file not found: " + inputPath);
    process.exit(1);
  }

  var report = loadReport(inputPath);
  var summary = aggregateVulnerabilities(report);

  formatConsoleReport(summary);
  generateAzureDevOpsAnnotations(summary);

  if (outputPath) {
    var outputDir = path.dirname(outputPath);
    if (!fs.existsSync(outputDir)) {
      fs.mkdirSync(outputDir, { recursive: true });
    }
    fs.writeFileSync(outputPath, JSON.stringify(summary, null, 2));
    console.log("Summary written to: " + outputPath);
  }

  // Exit with error if critical vulnerabilities are fixable
  var fixableCritical = summary.criticalFindings.filter(function (f) {
    return f.fixedVersion !== "none";
  });

  if (fixableCritical.length > 0) {
    console.error(
      "\nFAILED: " + fixableCritical.length +
      " fixable CRITICAL vulnerabilities found"
    );
    process.exit(1);
  }
}

main();

Sample output when run:

========================================
  CONTAINER SCAN SUMMARY
========================================

Total Vulnerabilities: 47
  CRITICAL: 2  HIGH: 8  MEDIUM: 24  LOW: 13
  Fixable: 31  Unfixable: 16

By Target Type:
  alpine 3.18.4: 29 total (1 critical, 5 high)
  Node.js: 18 total (1 critical, 3 high)

Critical Findings:
  - CVE-2024-21626 in runc (1.1.9)
    Fix: 1.1.12
    runc container breakout through process.cwd
  - CVE-2023-45853 in zlib (1.2.13)
    Fix: 1.3.1
    MiniZip integer overflow and heap-based buffer overflow

Top Affected Packages:
  openssl: 6 vulnerabilities
  libcurl: 4 vulnerabilities
  busybox: 3 vulnerabilities

========================================

##vso[task.logissue type=error]CRITICAL: CVE-2024-21626 in runc (1.1.9) - runc container breakout
##vso[task.logissue type=error]CRITICAL: CVE-2023-45853 in zlib (1.2.13) - MiniZip integer overflow
##vso[task.logissue type=warning]Found 2 CRITICAL vulnerabilities
##vso[task.setvariable variable=VULN_CRITICAL]2
##vso[task.setvariable variable=VULN_HIGH]8
##vso[task.setvariable variable=VULN_TOTAL]47

FAILED: 2 fixable CRITICAL vulnerabilities found

Common Issues and Troubleshooting

1. Trivy Database Download Fails Behind Corporate Proxy

2024-02-15T10:23:44.123Z  FATAL  failed to download vulnerability DB
Error: failed to download artifact: GET https://ghcr.io/v2/aquasecurity/trivy-db/manifests/2: dial tcp: lookup ghcr.io: no such host

Fix: Configure Trivy to use a mirrored database or pre-download it:

  - script: |
      trivy image --download-db-only --db-repository myacr.azurecr.io/trivy-db
      trivy image --skip-db-update --severity CRITICAL,HIGH $(IMAGE)
    displayName: 'Scan with Local DB Mirror'

Or cache the database between pipeline runs:

  # Trivy's default cache is $HOME/.cache/trivy; on hosted agents the build
  # user cannot read /root, so cache a workspace path and point the scan
  # steps at it with --cache-dir.
  - task: Cache@2
    inputs:
      key: 'trivy-db | "$(Agent.OS)"'
      path: '$(Pipeline.Workspace)/.trivycache'
    displayName: 'Cache Trivy DB'

2. Scan Timeout on Large Images

FATAL  scan error: image scan failed: failed to analyze layer: context deadline exceeded

Fix: Increase the timeout and consider using --skip-files to exclude large binaries:

  - script: |
      trivy image \
        --timeout 15m \
        --skip-files "/app/data/*.db,/app/assets/*.wasm" \
        $(IMAGE)
    displayName: 'Scan with Extended Timeout'

3. Cosign Sign Fails with a Key Decryption Error

Error: signing [myacr.azurecr.io/myapp:42]: getting signer: reading key: decrypt: x509: decryption password incorrect

Fix: Ensure the COSIGN_PASSWORD pipeline variable matches the password used when generating the key pair. Also verify the private key variable does not have trailing newlines:

  - script: |
      echo "$COSIGN_PRIVATE_KEY" | head -c -1 > /tmp/cosign.key
      cosign sign --key /tmp/cosign.key --yes $(IMAGE)
      rm -f /tmp/cosign.key
    displayName: 'Sign Image (Trimmed Key)'
    env:
      COSIGN_PRIVATE_KEY: $(COSIGN_PRIVATE_KEY)
      COSIGN_PASSWORD: $(COSIGN_PASSWORD)

4. JUnit Template Not Found

FATAL  template error: template: output:1:1: executing "output" at <.>: error calling toJunit: open contrib/junit.tpl: no such file or directory

Fix: The built-in templates moved in newer Trivy versions. Use the full path or download the template:

  - script: |
      if [ ! -f "contrib/junit.tpl" ]; then
        curl -sfL https://raw.githubusercontent.com/aquasecurity/trivy/main/contrib/junit.tpl -o /tmp/junit.tpl
        TEMPLATE_PATH="/tmp/junit.tpl"
      else
        TEMPLATE_PATH="contrib/junit.tpl"
      fi
      trivy image --format template --template "@${TEMPLATE_PATH}" --output results.xml $(IMAGE)
    displayName: 'Scan with Template Fallback'

5. OS Detection Warnings on Distroless Images

myacr.azurecr.io/myapp:42 (unknown)
Total: 0 (UNKNOWN: 0, LOW: 0, MEDIUM: 0, HIGH: 0, CRITICAL: 0)

usr/local/lib/node_modules (node_modules)
Total: 0

WARN  unable to detect OS. Detected results may be inaccurate.

Fix: Trivy sometimes cannot detect the OS on distroless images. This is expected: the warning means OS-level scanning is limited, but application-level scanning still works. If the warning is noisy, restrict the scan to language packages:

  - script: |
      trivy image \
        --vuln-type library \
        --severity CRITICAL,HIGH \
        $(IMAGE)
    displayName: 'Scan Distroless (Libraries Only)'

Best Practices

  • Pin your scanner version. Using latest means your pipeline behavior changes without a code change. Pin Trivy, Grype, Cosign, and Syft to specific versions and update them deliberately in a tracked PR.

  • Scan before push, not after. Always scan the local image before pushing to your registry. This prevents vulnerable images from ever entering your registry, even briefly. A vulnerable image in ACR is a vulnerable image someone might accidentally deploy.

  • Separate gating from reporting. Generate the full scan report with --exit-code 0, then run a second scan with --exit-code 1 and your threshold for the actual gate. This ensures you always get the report, even when the gate fails.

  • Use --ignore-unfixed for gating decisions. Do not block deployments over vulnerabilities with no available fix. Track them, monitor them, but gating on unfixable issues only teaches teams to ignore the scanner.

  • Sign every image that passes scanning. Image signing closes the gap between "we scanned it" and "we deployed what we scanned." Without signing, there is no cryptographic proof that the deployed image is the same one that passed the scan.

  • Generate SBOMs as a standard practice. Even if nobody asks for them today, having SBOMs for every release gives you instant answers when the next Log4Shell-scale vulnerability drops. You can search your SBOMs to know which deployments are affected in minutes instead of days.

  • Use multi-stage builds with minimal final images. Alpine or distroless base images dramatically reduce your vulnerability surface area. A node:18-alpine image has roughly 80% fewer OS vulnerabilities than node:18-bullseye.

  • Cache the vulnerability database. Trivy downloads a vulnerability database on every run by default. Cache it between pipeline runs to reduce build time and avoid hitting rate limits on the database registry.

  • Set expiration dates on vulnerability exceptions. Every ignored CVE should have an expiration date. An exception without an expiration is a vulnerability you have decided to accept forever. Revisit exceptions quarterly at minimum.

  • Run nightly scans on deployed images. Pipeline scanning catches issues at build time. New CVEs appear daily. A scheduled pipeline that scans your currently deployed images catches vulnerabilities discovered after your last build.
