Pipeline Security Hardening Checklist

A comprehensive security hardening checklist for Azure DevOps pipelines covering permissions, secrets, agents, and runtime controls

Overview

Azure DevOps pipelines are the backbone of your deployment infrastructure, and a compromised pipeline is a direct path to production. This article walks through a systematic hardening checklist that covers every layer of pipeline security, from YAML configuration and service connections to agent pools and runtime controls. If you are running CI/CD in Azure DevOps and have not audited your pipeline security recently, this guide will expose the gaps you did not know you had.

Prerequisites

  • Azure DevOps organization with Project Collection Administrator or Project Administrator access
  • Familiarity with YAML-based Azure Pipelines
  • Understanding of Azure Active Directory (Entra ID) and service principals
  • Node.js 18+ installed for validation scripts
  • Azure CLI installed and authenticated
  • At least one deployed pipeline you can use as a reference

YAML Pipeline Security Fundamentals

The shift from classic (GUI) pipelines to YAML pipelines was a security improvement by design. YAML pipelines live in source control, which means every change is tracked, reviewed, and auditable. But YAML pipelines also introduce new attack surfaces if you do not lock them down.

The first rule: never allow pipeline edits outside of pull requests. Every YAML pipeline should reference a file checked into the repository, and that repository should enforce branch policies.

# azure-pipelines.yml - Baseline secure pipeline structure
trigger:
  branches:
    include:
      - main
  paths:
    exclude:
      - docs/*
      - '*.md'

pr:
  branches:
    include:
      - main

pool:
  name: 'SecureAgentPool'
  demands:
    - Agent.OS -equals Linux

variables:
  - group: production-secrets
  - name: buildConfiguration
    value: 'Release'
  - name: System.Debug
    value: false

stages:
  - stage: Build
    displayName: 'Secure Build'
    jobs:
      - job: BuildJob
        timeoutInMinutes: 30
        cancelTimeoutInMinutes: 5
        steps:
          - checkout: self
            clean: true
            fetchDepth: 1
            persistCredentials: false

There are several things happening here that matter. Setting persistCredentials: false on the checkout step prevents the OAuth token from lingering on disk after the source is fetched. Setting fetchDepth: 1 limits the amount of repository history exposed to the agent. Setting clean: true ensures no artifacts from previous builds contaminate the workspace.

The timeoutInMinutes setting is a security control, not just a convenience. A compromised build that hangs indefinitely can be used for cryptocurrency mining or as a persistent foothold. Always set explicit timeouts.
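
Branch policies are the other half of the "no edits outside pull requests" rule at the top of this section: the YAML only stays reviewable if main is protected. A minimal sketch of enforcing a two-reviewer policy through the Azure CLI's azure-devops extension (the organization URL, project name, and repository id are placeholders for your own values):

// enforce-branch-policy.js
// Sketch: requires two reviewers on main so pipeline YAML cannot change outside a PR.
// Assumes the Azure CLI with the azure-devops extension is installed and signed in;
// the organization URL, project, and repository id below are placeholders.
const { execSync } = require('child_process');

const orgUrl = 'https://dev.azure.com/myorg';   // placeholder
const project = 'myproject';                    // placeholder
const repositoryId = process.argv[2];           // GUID of the repository holding azure-pipelines.yml

if (!repositoryId) {
  console.error('Usage: node enforce-branch-policy.js <repository-id>');
  process.exit(1);
}

const cmd = [
  'az repos policy approver-count create',
  '--blocking true',
  '--enabled true',
  '--branch main',
  '--repository-id ' + repositoryId,
  '--minimum-approver-count 2',
  '--creator-vote-counts false',
  '--allow-downvotes false',
  '--reset-on-source-push true',
  '--organization ' + orgUrl,
  '--project ' + project
].join(' ');

console.log(execSync(cmd, { encoding: 'utf8' }));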

Restricting Pipeline Permissions

Azure DevOps has a layered permission model, and the defaults are too permissive for most organizations. Here is what you need to lock down.

Repository-Level Permissions

Restrict which pipelines can access which repositories. By default, all pipelines in a project can access all repositories in that project. This is dangerous.

Navigate to Project Settings > Repositories > Security and disable "Allow access to all pipelines" for every repository. Then explicitly grant access only to the pipelines that need it.

Pipeline-Specific Permissions

Each pipeline has its own permission set. At minimum, configure these:

Pipeline Permissions Checklist:
├── Disable "Allow access to all pipelines" on all resources
├── Restrict "Edit build pipeline" to pipeline owners only
├── Remove "Queue builds" from Contributors group
├── Set "Manage build queue" to Project Administrators only
├── Disable "Override check gate evaluations"
└── Set "Edit build quality" to Build Administrators only

You can audit current permissions with the Azure DevOps REST API:

// audit-pipeline-permissions.js
var https = require('https');

var org = process.env.AZDO_ORG;
var project = process.env.AZDO_PROJECT;
var pat = process.env.AZDO_PAT;

var auth = Buffer.from(':' + pat).toString('base64');

function getPipelinePermissions(pipelineId, callback) {
  var options = {
    hostname: 'dev.azure.com',
    path: '/' + org + '/' + project + '/_apis/pipelines/pipelinepermissions/build/' + pipelineId + '?api-version=7.1-preview.1',
    method: 'GET',
    headers: {
      'Authorization': 'Basic ' + auth,
      'Content-Type': 'application/json'
    }
  };

  var req = https.request(options, function(res) {
    var body = '';
    res.on('data', function(chunk) { body += chunk; });
    res.on('end', function() {
      var data = JSON.parse(body);
      callback(null, data);
    });
  });

  req.on('error', function(err) { callback(err); });
  req.end();
}

function auditPermissions(pipelineId) {
  getPipelinePermissions(pipelineId, function(err, permissions) {
    if (err) {
      console.error('Failed to fetch permissions:', err.message);
      process.exit(1);
    }

    console.log('Pipeline ID:', pipelineId);
    var allAuthorized = permissions.allPipelines && permissions.allPipelines.authorized;
    console.log('All Pipelines Authorized:', allAuthorized ? 'YES - INSECURE' : 'NO - GOOD');

    if (permissions.pipelines && permissions.pipelines.length > 0) {
      console.log('Authorized Pipelines:');
      permissions.pipelines.forEach(function(p) {
        console.log('  -', p.id, '| Authorized:', p.authorized, '| Authorized By:', p.authorizedBy ? p.authorizedBy.displayName : 'n/a');
      });
    }
  });
}

var pipelineId = process.argv[2];
if (!pipelineId) {
  console.error('Usage: node audit-pipeline-permissions.js <pipeline-id>');
  process.exit(1);
}

auditPermissions(pipelineId);

Running this against your pipelines will quickly reveal which ones have overly broad access.

Securing Service Connections

Service connections are the most dangerous resource in Azure DevOps. A compromised service connection with Contributor access to your Azure subscription can deploy anything, anywhere. Here is how to lock them down.

Principle of Least Privilege

Each service connection should have the minimum permissions required. Never use a single service connection across environments.

Service Connection Strategy:
├── dev-azure-connection      → Reader + limited RBAC on dev RG
├── staging-azure-connection  → Contributor on staging RG only
├── prod-azure-connection     → Contributor on prod RG only (with approval gate)
└── shared-acr-connection     → AcrPush on container registry only
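
The scoping in this table is ultimately enforced by Azure RBAC, not by Azure DevOps itself. A sketch of granting the identity behind the staging connection Contributor on only its resource group (the subscription id, resource group name, and principal id are placeholders):

// scope-service-connection.js
// Sketch: constrains the service principal behind the staging connection to its own
// resource group. Subscription id and resource group name are placeholders; pass the
// service principal's object id as the first argument.
const { execSync } = require('child_process');

const subscriptionId = '00000000-0000-0000-0000-000000000000'; // placeholder
const resourceGroup = 'staging-rg';                             // placeholder
const principalId = process.argv[2];

if (!principalId) {
  console.error('Usage: node scope-service-connection.js <service-principal-object-id>');
  process.exit(1);
}

const scope = '/subscriptions/' + subscriptionId + '/resourceGroups/' + resourceGroup;

const cmd = [
  'az role assignment create',
  '--assignee-object-id ' + principalId,
  '--assignee-principal-type ServicePrincipal',
  '--role Contributor',
  '--scope ' + scope
].join(' ');

console.log(execSync(cmd, { encoding: 'utf8' }));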

Workload Identity Federation

Stop using client secrets for service connections. Workload Identity Federation eliminates the need for stored secrets entirely.

# Service connection using Workload Identity Federation
# Configured in Project Settings > Service Connections
# Type: Azure Resource Manager using Workload Identity Federation

# In your pipeline, reference it like this:
steps:
  - task: AzureCLI@2
    displayName: 'Deploy with Federated Identity'
    inputs:
      azureSubscription: 'prod-azure-connection'
      scriptType: 'bash'
      scriptLocation: 'inlineScript'
      inlineScript: |
        az account show
        echo "Deploying with federated identity - no secrets stored"

Service Connection Checks

Add approval and business hours checks to production service connections:

# This is configured via the UI, but here is the equivalent API call
# POST https://dev.azure.com/{org}/{project}/_apis/pipelines/checks/configurations

# Required checks for production service connections:
# 1. Approval gate (minimum 1 approver, no self-approval)
# 2. Business hours check (Mon-Fri 9AM-5PM)
# 3. Branch control (only main branch)
# 4. Required template (must extend approved template)
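
As a sketch of that API call, the following adds an approval check to a production service connection. Treat the check type id, approver id, connection id, and the exact settings property names as assumptions to verify against a check created once in the UI (fetch it back with a GET to the same endpoint) before relying on this shape.

// add-approval-check.js
// Sketch only: adds an approval check to a production service connection through the
// checks configurations endpoint referenced above. The check type id, approver id,
// and service connection id are placeholders; verify the settings shape by GETting
// a check you created in the UI.
const org = process.env.AZDO_ORG;
const project = process.env.AZDO_PROJECT;
const pat = process.env.AZDO_PAT;
const auth = Buffer.from(':' + pat).toString('base64');

const body = {
  type: { id: '<approval-check-type-id>', name: 'Approval' },              // placeholder
  resource: { type: 'endpoint', id: '<service-connection-id>', name: 'prod-azure-connection' },
  settings: {
    approvers: [{ id: '<approver-identity-id>' }],                         // placeholder
    minRequiredApprovers: 1,
    requesterCannotBeApprover: true,
    executionOrder: 'anyOrder',
    instructions: 'Verify the change ticket before approving.'
  },
  timeout: 4320 // minutes (72 hours)
};

fetch('https://dev.azure.com/' + org + '/' + project +
      '/_apis/pipelines/checks/configurations?api-version=7.1-preview.1', {
  method: 'POST',
  headers: { 'Authorization': 'Basic ' + auth, 'Content-Type': 'application/json' },
  body: JSON.stringify(body)
})
  .then(function (res) { return res.json(); })
  .then(function (data) { console.log('Check created with id:', data.id); })
  .catch(function (err) { console.error('Failed to create check:', err.message); process.exit(1); });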

Variable Group Security

Variable groups are where secrets live, and they need special attention.

Secrets Management

Never store secrets directly in pipeline YAML. Use variable groups linked to Azure Key Vault.

variables:
  # WRONG - secret in YAML (even if marked as secret, it is in source control)
  # - name: dbPassword
  #   value: 'supersecret123'

  # RIGHT - reference a variable group linked to Key Vault
  - group: production-keyvault-secrets

  # RIGHT - use runtime parameters for sensitive inputs
  # (these are not stored in source control)

Key Vault Integration

Link your variable groups to Azure Key Vault so secrets are fetched at runtime and never stored in Azure DevOps.

// validate-variable-groups.js
// Checks that all variable groups use Key Vault backing
var https = require('https');

var org = process.env.AZDO_ORG;
var project = process.env.AZDO_PROJECT;
var pat = process.env.AZDO_PAT;
var auth = Buffer.from(':' + pat).toString('base64');

function listVariableGroups(callback) {
  var options = {
    hostname: 'dev.azure.com',
    path: '/' + org + '/' + project + '/_apis/distributedtask/variablegroups?api-version=7.1-preview.2',
    method: 'GET',
    headers: {
      'Authorization': 'Basic ' + auth,
      'Content-Type': 'application/json'
    }
  };

  var req = https.request(options, function(res) {
    var body = '';
    res.on('data', function(chunk) { body += chunk; });
    res.on('end', function() {
      callback(null, JSON.parse(body));
    });
  });

  req.on('error', function(err) { callback(err); });
  req.end();
}

listVariableGroups(function(err, result) {
  if (err) {
    console.error('Error:', err.message);
    process.exit(1);
  }

  var groups = result.value || [];
  var insecure = [];

  groups.forEach(function(group) {
    var isKeyVaultBacked = group.type === 'AzureKeyVault';
    var hasSecrets = Object.keys(group.variables || {}).some(function(key) {
      return group.variables[key].isSecret;
    });

    if (hasSecrets && !isKeyVaultBacked) {
      insecure.push({
        name: group.name,
        id: group.id,
        secretCount: Object.keys(group.variables || {}).filter(function(key) {
          return group.variables[key].isSecret;
        }).length
      });
    }
  });

  if (insecure.length > 0) {
    console.error('SECURITY FINDING: Variable groups with inline secrets:');
    insecure.forEach(function(g) {
      console.error('  - ' + g.name + ' (ID: ' + g.id + ') - ' + g.secretCount + ' inline secrets');
    });
    console.error('\nRecommendation: Migrate these to Azure Key Vault-backed variable groups');
    process.exit(1);
  } else {
    console.log('All variable groups with secrets are Key Vault-backed. PASS');
  }
});

Expected output for a secure configuration:

All variable groups with secrets are Key Vault-backed. PASS

Expected output for an insecure configuration:

SECURITY FINDING: Variable groups with inline secrets:
  - legacy-deploy-secrets (ID: 42) - 3 inline secrets
  - api-keys (ID: 67) - 5 inline secrets

Recommendation: Migrate these to Azure Key Vault-backed variable groups

Approval Gates and Checks

Approval gates are non-negotiable for production environments. Azure DevOps supports several types of checks that should be layered together.

Environment Checks

# Define environments with checks in your pipeline
stages:
  - stage: DeployProduction
    displayName: 'Deploy to Production'
    dependsOn: DeployStaging
    condition: succeeded()
    jobs:
      - deployment: ProductionDeploy
        environment: 'production'
        strategy:
          runOnce:
            deploy:
              steps:
                - script: echo "Deploying to production"

Configure these checks on the production environment in the Azure DevOps UI:

  1. Approvals: Require at least 2 approvers. Disable "allow approvers to approve their own runs." Set a 72-hour timeout.
  2. Branch control: Only allow deployments from the main branch.
  3. Business hours: Restrict deployments to weekdays 9AM-4PM in your timezone.
  4. Invoke Azure Function: Call a custom validation function that checks deployment prerequisites.
  5. Required template: Force pipelines to extend an approved template.
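
These checks only protect you while they stay configured. A small audit sketch that lists whatever checks are attached to the production environment, using the checks configurations endpoint with its resourceType and resourceId query parameters (the environment id is a placeholder; it appears in the environment's URL):

// list-environment-checks.js
// Sketch: lists the checks attached to an environment so the controls above can be
// verified from an audit job. The environment id is a placeholder.
const org = process.env.AZDO_ORG;
const project = process.env.AZDO_PROJECT;
const pat = process.env.AZDO_PAT;
const auth = Buffer.from(':' + pat).toString('base64');

const environmentId = process.argv[2]; // numeric id of the 'production' environment
if (!environmentId) {
  console.error('Usage: node list-environment-checks.js <environment-id>');
  process.exit(1);
}

const url = 'https://dev.azure.com/' + org + '/' + project +
  '/_apis/pipelines/checks/configurations?resourceType=environment&resourceId=' +
  environmentId + '&api-version=7.1-preview.1';

fetch(url, { headers: { 'Authorization': 'Basic ' + auth } })
  .then(function (res) { return res.json(); })
  .then(function (data) {
    const checks = data.value || [];
    console.log(checks.length + ' checks configured on environment ' + environmentId);
    checks.forEach(function (c) {
      console.log('  - ' + (c.type && c.type.name ? c.type.name : 'unknown check type'));
    });
    if (checks.length === 0) {
      console.error('No checks configured - the environment is unprotected');
      process.exit(1);
    }
  })
  .catch(function (err) { console.error('Failed to list checks:', err.message); process.exit(1); });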

Exclusive Lock Check

Prevent concurrent deployments to the same environment:

jobs:
  - deployment: ProductionDeploy
    environment: 'production'
    lockBehavior: sequential  # Queue deployments instead of running in parallel
    strategy:
      runOnce:
        deploy:
          steps:
            - script: echo "Safe sequential deployment"

Template Restrictions

Templates are the most powerful security enforcement mechanism in Azure Pipelines. By requiring pipelines to extend approved templates, you can enforce security controls organization-wide.

Required Template

Create a secured base template:

# templates/secured-build.yml
parameters:
  - name: buildSteps
    type: stepList
    default: []
  - name: environment
    type: string
    values:
      - development
      - staging
      - production

steps:
  # Security: Always run credential scanner first
  - task: CredScan@3
    displayName: 'Scan for credentials'
    inputs:
      toolVersion: 'Latest'

  # Security: Verify no secrets in source
  - script: |
      echo "Scanning for hardcoded secrets..."
      if grep -rn "password\s*=\s*['\"]" --include="*.js" --include="*.json" --include="*.yml" .; then
        echo "##vso[task.logissue type=error]Hardcoded secrets detected in source code"
        exit 1
      fi
    displayName: 'Secret detection scan'

  # Run user-defined build steps in a controlled context
  - ${{ each step in parameters.buildSteps }}:
    - ${{ step }}

  # Security: Always run dependency vulnerability scan
  - script: |
      npm audit --audit-level=high
      if [ $? -ne 0 ]; then
        echo "##vso[task.logissue type=error]High severity vulnerabilities detected"
        exit 1
      fi
    displayName: 'Dependency vulnerability scan'

  # Security: Sign build artifacts
  - script: |
      HASH=$(sha256sum $(Build.ArtifactStagingDirectory)/* | tee checksums.txt)
      echo "##vso[task.setvariable variable=artifactHash]$HASH"
    displayName: 'Generate artifact checksums'

  - publish: checksums.txt
    artifact: 'security-checksums'
    displayName: 'Publish artifact checksums'

Consuming the Required Template

# azure-pipelines.yml - Must extend the secured template
extends:
  template: templates/secured-build.yml
  parameters:
    environment: production
    buildSteps:
      - script: npm ci
        displayName: 'Install dependencies'
      - script: npm test
        displayName: 'Run tests'
      - script: npm run build
        displayName: 'Build application'

To enforce that all pipelines in a project use this template, configure it as a Required Template check on your environments and service connections.

Agent Pool Security

Self-hosted agents are a common attack vector. If you use them, you need to harden them aggressively.

Agent Pool Configuration

Agent Pool Security Checklist:
├── Use ephemeral agents (destroy after each build)
├── Run agents as non-root users
├── Disable interactive logon on agent machines
├── Use dedicated VNets with NSG rules
├── Install only required tools on agents
├── Enable disk encryption
├── Rotate agent registration tokens monthly
├── Use separate pools for different trust levels
│   ├── untrusted-pool → PR builds from forks
│   ├── internal-pool  → Builds from internal branches
│   └── deploy-pool    → Production deployments only
└── Monitor agent machine health and patch levels

Ephemeral Agent Setup with Azure Container Instances

# Use Azure Container Instances for ephemeral agents
pool:
  name: 'EphemeralPool'

steps:
  - script: |
      echo "Agent $(Agent.MachineName) is ephemeral"
      echo "This container will be destroyed after this job"
    displayName: 'Ephemeral agent confirmation'

The supporting infrastructure for ephemeral agents requires provisioning a new container for each job and destroying it immediately after. This eliminates persistence-based attacks entirely.

// provision-ephemeral-agent.js
// Called by an Azure Function to spin up a container agent per job
var childProcess = require('child_process');

function provisionAgent(jobId, poolName) {
  var agentName = 'agent-' + jobId.substring(0, 8);
  var containerName = 'azdo-agent-' + jobId.substring(0, 8);

  var cmd = [
    'az container create',
    '--resource-group pipeline-agents-rg',
    '--name ' + containerName,
    '--image myregistry.azurecr.io/azdo-agent:latest',
    '--cpu 2',
    '--memory 4',
    '--restart-policy Never',
    '--environment-variables',
    'AZP_URL=https://dev.azure.com/myorg',
    'AZP_POOL=' + poolName,
    'AZP_AGENT_NAME=' + agentName,
    '--secure-environment-variables',
    'AZP_TOKEN=$AGENT_PAT',
    '--vnet pipeline-agents-vnet',
    '--subnet agents-subnet'
  ].join(' ');

  console.log('Provisioning ephemeral agent:', agentName);

  childProcess.exec(cmd, function(err, stdout, stderr) {
    if (err) {
      console.error('Failed to provision agent:', stderr);
      process.exit(1);
    }
    console.log('Agent provisioned:', stdout);

    // Schedule teardown after job completes
    setTimeout(function() {
      destroyAgent(containerName);
    }, 3600000); // 1 hour max lifetime
  });
}

function destroyAgent(containerName) {
  var cmd = 'az container delete --resource-group pipeline-agents-rg --name ' + containerName + ' --yes';

  childProcess.exec(cmd, function(err, stdout, stderr) {
    if (err) {
      console.error('Failed to destroy agent:', stderr);
      return;
    }
    console.log('Agent destroyed:', containerName);
  });
}

var jobId = process.argv[2];
var poolName = process.argv[3] || 'EphemeralPool';

if (!jobId) {
  console.error('Usage: node provision-ephemeral-agent.js <job-id> [pool-name]');
  process.exit(1);
}

provisionAgent(jobId, poolName);

Artifact Integrity Verification

Every artifact your pipeline produces should be verifiable. This means checksums at minimum, and ideally cryptographic signing.

# Artifact integrity verification steps
steps:
  - script: |
      # Generate SHA-256 checksums for all artifacts
      cd $(Build.ArtifactStagingDirectory)
      find . -type f ! -name checksums.sha256 -exec sha256sum {} \; > checksums.sha256
      echo "Generated checksums:"
      cat checksums.sha256
    displayName: 'Generate artifact checksums'

  - script: |
      # Verify no tampering occurred during staging
      cd $(Build.ArtifactStagingDirectory)
      sha256sum -c checksums.sha256
      if [ $? -ne 0 ]; then
        echo "##vso[task.logissue type=error]Artifact integrity check failed!"
        exit 1
      fi
      echo "All artifacts verified"
    displayName: 'Verify artifact integrity'

  - publish: $(Build.ArtifactStagingDirectory)
    artifact: 'verified-build'
    displayName: 'Publish verified artifacts'

For container images, use Docker Content Trust:

steps:
  - script: |
      export DOCKER_CONTENT_TRUST=1
      export DOCKER_CONTENT_TRUST_SERVER=https://notary.myregistry.io
      docker build -t myregistry.azurecr.io/myapp:$(Build.BuildId) .
      docker push myregistry.azurecr.io/myapp:$(Build.BuildId)
    displayName: 'Build and push signed container image'
    env:
      DOCKER_CONTENT_TRUST_REPOSITORY_PASSPHRASE: $(SIGNING_PASSPHRASE)

Pipeline Decorators for Enforcement

Pipeline decorators inject steps into every pipeline in your organization. They are the ultimate enforcement mechanism because developers cannot bypass them.

Decorators are deployed as Azure DevOps extensions. Here is the structure of a security decorator:

{
  "manifestVersion": 1,
  "id": "security-decorator",
  "version": "1.0.0",
  "name": "Security Pipeline Decorator",
  "description": "Injects mandatory security scanning into all pipelines",
  "publisher": "myorg",
  "targets": [
    {
      "id": "Microsoft.VisualStudio.Services"
    }
  ],
  "contributions": [
    {
      "id": "security-scan-injector",
      "type": "ms.azure-pipelines.pipeline-decorator",
      "targets": [
        "ms.azure-pipelines-agent-job"
      ],
      "properties": {
        "template": "decorator.yml",
        "targetsExclude": [],
        "targetsInclude": []
      }
    }
  ]
}

# decorator.yml - Injected into every pipeline
steps:
  - task: Bash@3
    displayName: '[Security] Mandatory compliance scan'
    inputs:
      targetType: 'inline'
      script: |
        echo "Running mandatory security compliance scan..."
        echo "Pipeline: $(Build.DefinitionName)"
        echo "Branch: $(Build.SourceBranch)"
        echo "Triggered by: $(Build.RequestedFor)"

        # Check branch policy compliance
        if [ "$(Build.Reason)" == "IndividualCI" ] && [ "$(Build.SourceBranch)" == "refs/heads/main" ]; then
          echo "Direct push to main detected - flagging for review"
          echo "##vso[task.logissue type=warning]Direct push to main branch detected"
        fi

        # Verify build is not running as root
        CURRENT_USER=$(whoami)
        if [ "$CURRENT_USER" == "root" ]; then
          echo "##vso[task.logissue type=error]Pipeline is running as root - this violates security policy"
          exit 1
        fi
    condition: always()

Audit Trail and Monitoring

Azure DevOps has built-in auditing, but you need to actively monitor it.

Streaming Audit Logs

Configure audit log streaming to your SIEM or Log Analytics workspace. The following Node.js script pulls audit events and flags suspicious activity:

// monitor-pipeline-audit.js
var https = require('https');

var org = process.env.AZDO_ORG;
var pat = process.env.AZDO_PAT;
var auth = Buffer.from(':' + pat).toString('base64');

var SUSPICIOUS_ACTIONS = [
  'Pipeline.ModifiedPipeline',
  'Policy.PolicyConfigRemoved',
  'Git.RepositoryPermissionChanged',
  'Security.ModifyPermission',
  'Library.ServiceConnectionModified',
  'Library.VariableGroupModified'
];

function getAuditLog(startTime, callback) {
  var path = '/' + org + '/_apis/audit/auditlog?startTime=' +
    encodeURIComponent(startTime) + '&api-version=7.1-preview.1';

  var options = {
    hostname: 'auditservice.dev.azure.com',
    path: path,
    method: 'GET',
    headers: {
      'Authorization': 'Basic ' + auth,
      'Content-Type': 'application/json'
    }
  };

  var req = https.request(options, function(res) {
    var body = '';
    res.on('data', function(chunk) { body += chunk; });
    res.on('end', function() {
      callback(null, JSON.parse(body));
    });
  });

  req.on('error', function(err) { callback(err); });
  req.end();
}

function analyzeLogs() {
  var oneDayAgo = new Date(Date.now() - 86400000).toISOString();

  getAuditLog(oneDayAgo, function(err, result) {
    if (err) {
      console.error('Failed to fetch audit log:', err.message);
      process.exit(1);
    }

    var events = result.decoratedAuditLogEntries || [];
    var alerts = [];

    events.forEach(function(event) {
      if (SUSPICIOUS_ACTIONS.indexOf(event.actionId) !== -1) {
        alerts.push({
          action: event.actionId,
          actor: event.actorDisplayName,
          timestamp: event.timestamp,
          details: event.details,
          ip: event.ipAddress
        });
      }
    });

    console.log('Audit Summary (' + events.length + ' events in last 24 hours)');
    console.log('---');

    if (alerts.length === 0) {
      console.log('No suspicious activity detected');
    } else {
      console.error('ALERTS: ' + alerts.length + ' suspicious events detected');
      alerts.forEach(function(alert) {
        console.error('  [' + alert.timestamp + '] ' + alert.action);
        console.error('    Actor: ' + alert.actor);
        console.error('    Details: ' + alert.details);
        console.error('');
      });
      process.exit(1);
    }
  });
}

analyzeLogs();

Sample output when suspicious activity is detected:

Audit Summary (847 events in last 24 hours)
---
ALERTS: 2 suspicious events detected
  [2026-02-13T14:22:00Z] Security.ModifyPermission
    Actor: [email protected]
    Details: Modified permission for pipeline 'deploy-prod' - added Queue builds for Contributors

  [2026-02-13T16:45:00Z] Library.ServiceConnectionModified
    Actor: [email protected]
    Details: Modified service connection 'prod-azure-connection' - changed authorization scope

Runtime Security Controls

Runtime controls protect your pipeline while it is executing.

Restrict Fork Builds

Fork builds are a major attack vector. A malicious fork can modify the YAML to exfiltrate secrets.

# In your pipeline settings (configured via UI or REST API)
# Settings > Triggers > Pull request validation
# - Do NOT automatically run pipelines for fork PRs
# - If you must, use a separate pool with no secret access

trigger: none  # Disable CI trigger - use PR validation only

pr:
  branches:
    include:
      - main
  # Forks must be handled separately

Secret Masking and Access Controls

steps:
  - script: |
      # Azure DevOps automatically masks variables marked as secret
      # But you should also avoid writing secrets to files

      # WRONG - secret written to disk
      # echo $(mySecret) > /tmp/secret.txt

      # RIGHT - pass secrets via environment variables only
      echo "Connecting to database..."
      node deploy.js
    displayName: 'Secure deployment'
    env:
      DB_PASSWORD: $(dbPassword)  # Mapped as env var, auto-masked in logs
      API_KEY: $(apiKey)

Conditional Access Based on Branch

steps:
  - script: |
      echo "Running on branch: $(Build.SourceBranch)"
    displayName: 'Branch info'

  - script: |
      echo "Deploying to production..."
    displayName: 'Production deploy'
    condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))

  # Never expose production secrets on non-main branches
  - task: AzureKeyVault@2
    displayName: 'Fetch production secrets'
    condition: eq(variables['Build.SourceBranch'], 'refs/heads/main')
    inputs:
      azureSubscription: 'prod-azure-connection'
      KeyVaultName: 'prod-keyvault'
      SecretsFilter: 'db-password,api-key'

Network Isolation for Build Agents

Self-hosted agents should run in isolated network segments. Microsoft-hosted agents are shared infrastructure and cannot be network-isolated, which is one reason to prefer self-hosted agents for sensitive workloads.

VNet Configuration

Network Architecture for Secure Agents:
┌─────────────────────────────────────────────┐
│  Azure VNet: pipeline-agents-vnet           │
│  Address space: 10.0.0.0/16                 │
│                                             │
│  ┌─────────────────────────────────────┐    │
│  │  Subnet: agents-subnet              │    │
│  │  10.0.1.0/24                        │    │
│  │  NSG Rules:                         │    │
│  │  - Outbound: Allow 443 to AzDO      │    │
│  │  - Outbound: Allow 443 to ACR       │    │
│  │  - Outbound: Allow 443 to Key Vault │    │
│  │  - Outbound: Deny all other         │    │
│  │  - Inbound:  Deny all               │    │
│  └─────────────────────────────────────┘    │
│                                             │
│  ┌─────────────────────────────────────┐    │
│  │  Subnet: deploy-subnet              │    │
│  │  10.0.2.0/24                        │    │
│  │  (Private endpoint for target apps) │    │
│  └─────────────────────────────────────┘    │
└─────────────────────────────────────────────┘
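
The outbound allowances in this diagram map to NSG rules keyed on Azure service tags. Below is a sketch that creates the Azure DevOps allow rule and the low-priority deny-all; the resource group and NSG names are assumptions, and the ACR and Key Vault rules follow the same pattern with the AzureContainerRegistry and AzureKeyVault service tags at priorities between the two.

// create-agent-nsg-rules.js
// Sketch: creates the outbound rules from the diagram above using az CLI service tags.
// The resource group and NSG names are assumptions.
const { execSync } = require('child_process');

const resourceGroup = 'pipeline-agents-rg';   // assumption
const nsgName = 'agents-subnet-nsg';          // assumption

function run(cmd) {
  console.log('> ' + cmd);
  execSync(cmd, { stdio: 'inherit' });
}

run([
  'az network nsg rule create',
  '--resource-group ' + resourceGroup,
  '--nsg-name ' + nsgName,
  '--name allow-azdo-outbound',
  '--priority 100',
  '--direction Outbound',
  '--access Allow',
  '--protocol Tcp',
  '--destination-port-ranges 443',
  '--destination-address-prefixes AzureDevOps'
].join(' '));

run([
  'az network nsg rule create',
  '--resource-group ' + resourceGroup,
  '--nsg-name ' + nsgName,
  '--name deny-all-outbound',
  '--priority 4000',
  '--direction Outbound',
  '--access Deny',
  '--protocol "*"',
  '--destination-port-ranges "*"',
  '--destination-address-prefixes "*"'
].join(' '));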

NSG Rules in Practice

# Pipeline step to verify network isolation
steps:
  - script: |
      echo "Verifying network isolation..."

      # Should succeed - Azure DevOps connectivity
      curl -sf https://dev.azure.com > /dev/null && echo "AzDO: CONNECTED" || echo "AzDO: BLOCKED"

      # Should fail - general internet access should be blocked
      curl -sf --connect-timeout 5 https://example.com > /dev/null 2>&1 && \
        echo "##vso[task.logissue type=error]General internet access is not blocked!" && exit 1 || \
        echo "General internet: BLOCKED (expected)"

      echo "Network isolation verified"
    displayName: 'Verify network isolation'

Complete Working Example

Here is a fully hardened Azure Pipeline YAML configuration that incorporates all of the security controls discussed above, along with a Node.js validation script.

# azure-pipelines-hardened.yml
# Fully hardened pipeline configuration

trigger:
  branches:
    include:
      - main
  paths:
    exclude:
      - docs/*
      - '*.md'
      - '*.txt'

pr:
  branches:
    include:
      - main

variables:
  # Key Vault-backed group; supplies the db-connection-string and api-key
  # secrets referenced in the deploy steps below
  - group: production-keyvault-secrets

# Require this pipeline to extend a secured template
extends:
  template: templates/secured-base.yml
  parameters:
    pool: 'SecureAgentPool'
    environment: 'production'
    buildSteps:
      - script: |
          node --version
          npm --version
        displayName: 'Verify tool versions'

      - script: npm ci --ignore-scripts
        displayName: 'Install dependencies (no lifecycle scripts)'

      - script: npm run lint
        displayName: 'Lint check'

      - script: npm test
        displayName: 'Run tests'

      - script: npm run build
        displayName: 'Build application'

      - script: npm audit --audit-level=high
        displayName: 'Security audit'

    deploySteps:
      - script: |
          echo "Deploying build $(Build.BuildId)"
          echo "Artifact hash: $(artifactHash)"
        displayName: 'Deploy with integrity verification'
        env:
          DB_CONNECTION: $(db-connection-string)
          API_KEY: $(api-key)

    postDeploySteps:
      - script: |
          curl -sf https://myapp.azurewebsites.net/health || exit 1
        displayName: 'Post-deployment health check'

# templates/secured-base.yml
parameters:
  - name: pool
    type: string
  - name: environment
    type: string
    values:
      - development
      - staging
      - production
  - name: buildSteps
    type: stepList
    default: []
  - name: deploySteps
    type: stepList
    default: []
  - name: postDeploySteps
    type: stepList
    default: []

stages:
  - stage: Build
    displayName: 'Secure Build'
    pool:
      name: ${{ parameters.pool }}
    jobs:
      - job: SecureBuild
        timeoutInMinutes: 30
        cancelTimeoutInMinutes: 5
        workspace:
          clean: all
        steps:
          - checkout: self
            clean: true
            fetchDepth: 1
            persistCredentials: false

          - script: |
              echo "Build started at $(date -u)"
              echo "Agent: $(Agent.MachineName)"
              echo "Branch: $(Build.SourceBranch)"
              whoami | grep -v root || (echo "ERROR: Running as root" && exit 1)
            displayName: '[Security] Pre-flight checks'

          - ${{ each step in parameters.buildSteps }}:
            - ${{ step }}

          - script: |
              cd $(Build.ArtifactStagingDirectory)
              find . -type f ! -name checksums.sha256 -exec sha256sum {} \; > checksums.sha256
            displayName: '[Security] Generate checksums'

          - publish: $(Build.ArtifactStagingDirectory)
            artifact: 'secure-build'

  - stage: Deploy
    displayName: 'Secure Deploy'
    dependsOn: Build
    condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
    pool:
      name: ${{ parameters.pool }}
    jobs:
      - deployment: SecureDeploy
        environment: ${{ parameters.environment }}
        timeoutInMinutes: 15
        cancelTimeoutInMinutes: 5
        strategy:
          runOnce:
            deploy:
              steps:
                - download: current
                  artifact: 'secure-build'

                - script: |
                    cd $(Pipeline.Workspace)/secure-build
                    sha256sum -c checksums.sha256
                  displayName: '[Security] Verify artifact integrity'

                - ${{ each step in parameters.deploySteps }}:
                  - ${{ step }}

            on:
              success:
                steps:
                  - ${{ each step in parameters.postDeploySteps }}:
                    - ${{ step }}
              failure:
                steps:
                  - script: |
                      echo "##vso[task.logissue type=error]Deployment failed - initiating rollback"
                    displayName: '[Security] Alert on failure'

Validation Script

This Node.js script validates that a pipeline YAML file meets security requirements:

// validate-pipeline-security.js
var fs = require('fs');
var path = require('path');

var RULES = [
  {
    name: 'Checkout persistence disabled',
    pattern: /persistCredentials:\s*false/,
    severity: 'CRITICAL',
    message: 'checkout step must set persistCredentials: false'
  },
  {
    name: 'Clean checkout enabled',
    pattern: /clean:\s*true/,
    severity: 'HIGH',
    message: 'checkout step should set clean: true'
  },
  {
    name: 'Timeout configured',
    pattern: /timeoutInMinutes:\s*\d+/,
    severity: 'MEDIUM',
    message: 'Jobs should have explicit timeoutInMinutes'
  },
  {
    name: 'No inline secrets',
    pattern: /(password|secret|token|api[_-]?key)\s*[:=]\s*['"][^'"$]+['"]/i,
    severity: 'CRITICAL',
    message: 'Potential inline secret detected',
    inverted: true  // Fail if pattern IS found
  },
  {
    name: 'Uses variable group',
    pattern: /group:\s*\S+/,
    severity: 'HIGH',
    message: 'Pipeline should use variable groups for secrets'
  },
  {
    name: 'Shallow fetch',
    pattern: /fetchDepth:\s*[0-9]+/,
    severity: 'LOW',
    message: 'Consider setting fetchDepth for faster, more secure checkout'
  },
  {
    name: 'Uses extends template',
    pattern: /extends:\s*\n\s+template:/,
    severity: 'HIGH',
    message: 'Pipeline should extend a secured template'
  },
  {
    name: 'Branch condition on deploy',
    pattern: /eq\(variables\['Build\.SourceBranch'\],\s*'refs\/heads\/main'\)/,
    severity: 'CRITICAL',
    message: 'Deploy stages must have branch conditions'
  }
];

function validatePipeline(filePaths) {
  // Concatenate all supplied files so an extends-based pipeline can be validated
  // together with the template it extends.
  var content = '';
  filePaths.forEach(function(filePath) {
    try {
      content += fs.readFileSync(filePath, 'utf8') + '\n';
    } catch (err) {
      console.error('Cannot read file:', filePath, '-', err.message);
      process.exit(1);
    }
  });

  console.log('Pipeline Security Validation: ' + filePaths.map(function(filePath) {
    return path.basename(filePath);
  }).join(' + '));
  console.log('='.repeat(60));

  var results = { CRITICAL: 0, HIGH: 0, MEDIUM: 0, LOW: 0 };
  var passed = 0;
  var failed = 0;

  RULES.forEach(function(rule) {
    var match = rule.pattern.test(content);
    var pass = rule.inverted ? !match : match;

    if (pass) {
      console.log('  PASS  [' + rule.severity + '] ' + rule.name);
      passed++;
    } else {
      console.log('  FAIL  [' + rule.severity + '] ' + rule.name);
      console.log('         -> ' + rule.message);
      results[rule.severity]++;
      failed++;
    }
  });

  console.log('');
  console.log('Results: ' + passed + ' passed, ' + failed + ' failed');
  console.log('  Critical: ' + results.CRITICAL);
  console.log('  High:     ' + results.HIGH);
  console.log('  Medium:   ' + results.MEDIUM);
  console.log('  Low:      ' + results.LOW);

  if (results.CRITICAL > 0) {
    console.error('\nPipeline FAILED security validation (critical issues found)');
    process.exit(1);
  } else if (results.HIGH > 0) {
    console.warn('\nPipeline passed with warnings (high severity issues found)');
    process.exit(0);
  } else {
    console.log('\nPipeline PASSED security validation');
    process.exit(0);
  }
}

var targetFiles = process.argv.slice(2);
if (targetFiles.length === 0) {
  console.error('Usage: node validate-pipeline-security.js <pipeline-yaml-file> [template-yaml-files...]');
  process.exit(1);
}

validatePipeline(targetFiles);

Running this against the hardened pipeline together with the template it extends:

$ node validate-pipeline-security.js azure-pipelines-hardened.yml templates/secured-base.yml
Pipeline Security Validation: azure-pipelines-hardened.yml + secured-base.yml
============================================================
  PASS  [CRITICAL] Checkout persistence disabled
  PASS  [HIGH] Clean checkout enabled
  PASS  [MEDIUM] Timeout configured
  PASS  [CRITICAL] No inline secrets
  PASS  [HIGH] Uses variable group
  PASS  [LOW] Shallow fetch
  PASS  [HIGH] Uses extends template
  PASS  [CRITICAL] Branch condition on deploy

Results: 8 passed, 0 failed
  Critical: 0
  High:     0
  Medium:   0
  Low:      0

Pipeline PASSED security validation

Common Issues and Troubleshooting

1. Service Connection Authorization Fails After Hardening

##[error]Pipeline does not have permissions to use service connection 'prod-azure-connection'.

This happens after you disable "Allow access to all pipelines" on a service connection. You need to explicitly authorize each pipeline that needs access. Go to Project Settings > Service Connections > (your connection) > Security and add the specific pipeline.

2. Variable Group Access Denied

##[error]There was a resource authorization issue: "The pipeline is not valid. Job BuildJob: Step AzureKeyVault input azureSubscription references service connection 'keyvault-connection' which could not be found."

This error is misleading. It usually means the pipeline has not been authorized to use the variable group, not that the service connection is missing. Navigate to Pipelines > Library > (variable group) > Pipeline permissions and authorize the pipeline.
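
Both of these errors come down to the same root cause: a pipeline that has not been authorized on a protected resource. The UI steps work, but if a lockdown breaks many pipelines at once, the pipeline permissions endpoint used in the audit script earlier can grant access from a script instead. A sketch (the ids are placeholders; the resource type is endpoint for service connections and variablegroup for variable groups):

// authorize-pipeline-resource.js
// Sketch: authorizes one pipeline on one protected resource via the pipeline
// permissions endpoint used in the audit script earlier. resourceType is 'endpoint'
// for service connections and 'variablegroup' for variable groups.
const org = process.env.AZDO_ORG;
const project = process.env.AZDO_PROJECT;
const pat = process.env.AZDO_PAT;
const auth = Buffer.from(':' + pat).toString('base64');

const resourceType = process.argv[2]; // 'endpoint' or 'variablegroup'
const resourceId = process.argv[3];   // service connection GUID or variable group id
const pipelineId = parseInt(process.argv[4], 10);

if (!resourceType || !resourceId || !pipelineId) {
  console.error('Usage: node authorize-pipeline-resource.js <resource-type> <resource-id> <pipeline-id>');
  process.exit(1);
}

fetch('https://dev.azure.com/' + org + '/' + project +
      '/_apis/pipelines/pipelinepermissions/' + resourceType + '/' + resourceId +
      '?api-version=7.1-preview.1', {
  method: 'PATCH',
  headers: { 'Authorization': 'Basic ' + auth, 'Content-Type': 'application/json' },
  body: JSON.stringify({ pipelines: [{ id: pipelineId, authorized: true }] })
})
  .then(function (res) { return res.json(); })
  .then(function () {
    console.log('Pipeline ' + pipelineId + ' authorized on ' + resourceType + ' ' + resourceId);
  })
  .catch(function (err) { console.error('Failed to authorize:', err.message); process.exit(1); });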

3. Required Template Check Blocks Pipeline

##[error]Stage DeployProduction fails check 'Required template' for resource environment 'production'.
Pipeline must extend from template 'templates/secured-base.yml@security-templates'.

This means the environment has a Required Template check configured, but your pipeline is not extending the correct template. Ensure your pipeline YAML has the correct extends block with the exact template path and repository reference. The @security-templates suffix means the template is in a separate repository resource named security-templates that your pipeline must also declare.

4. Agent Pool Authorization Issues After Lockdown

##[error]No agent found in pool 'SecureAgentPool' which satisfies the specified demands: Agent.OS -equals Linux
Could not queue the build because the agent pool 'SecureAgentPool' does not have any compatible agents.

After restricting agent pool access, pipelines may lose the ability to see available agents. Check two things: first, that the pipeline is authorized on the agent pool under Organization Settings > Agent Pools > Security. Second, verify the agents in the pool actually satisfy your demands. The demands field is case-sensitive.
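
The second point can be checked without logging onto the agent machines. A sketch that lists the agents in a pool along with the Agent.OS capability each one reports (the pool id is a placeholder; it is visible in the agent pool URL under Organization Settings):

// list-pool-agents.js
// Sketch: lists the agents in a pool with the Agent.OS capability each one reports,
// to diagnose unmet demands. The pool id is a placeholder.
const org = process.env.AZDO_ORG;
const pat = process.env.AZDO_PAT;
const auth = Buffer.from(':' + pat).toString('base64');

const poolId = process.argv[2];
if (!poolId) {
  console.error('Usage: node list-pool-agents.js <pool-id>');
  process.exit(1);
}

const url = 'https://dev.azure.com/' + org +
  '/_apis/distributedtask/pools/' + poolId + '/agents?includeCapabilities=true&api-version=7.1-preview.1';

fetch(url, { headers: { 'Authorization': 'Basic ' + auth } })
  .then(function (res) { return res.json(); })
  .then(function (data) {
    (data.value || []).forEach(function (agent) {
      const caps = agent.systemCapabilities || {};
      console.log(agent.name + ' | status: ' + agent.status +
        ' | Agent.OS: ' + (caps['Agent.OS'] || 'not reported'));
    });
  })
  .catch(function (err) { console.error('Failed to list agents:', err.message); process.exit(1); });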

5. Fork PR Builds Accessing Secrets

##[warning]Secret variables are not available for pull request builds from forks.
##[error]Variable 'apiKey' is not set or is empty.

This is actually the desired behavior. Azure DevOps intentionally blocks secret access for fork PRs. If your PR validation build needs to succeed without secrets, you need to restructure your pipeline so that steps requiring secrets are conditional on the build source.
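
One way to restructure is to gate secret-dependent steps on a variable set by an early guard step. The sketch below relies on the predefined System.PullRequest.IsFork variable (exposed to scripts as SYSTEM_PULLREQUEST_ISFORK) and sets a flag that later steps can reference in their conditions, for example condition: eq(variables['skipSecretSteps'], 'false').

// skip-secrets-on-fork.js
// Sketch: run as an early pipeline step. Detects fork PR builds via the predefined
// System.PullRequest.IsFork variable and sets a flag for later step conditions.
const isFork = (process.env.SYSTEM_PULLREQUEST_ISFORK || 'false').toLowerCase() === 'true';

if (isFork) {
  console.log('Fork PR detected - secret-dependent steps should be skipped');
  console.log('##vso[task.setvariable variable=skipSecretSteps]true');
} else {
  console.log('Internal build - secret-dependent steps may run');
  console.log('##vso[task.setvariable variable=skipSecretSteps]false');
}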

Best Practices

  • Use Workload Identity Federation for all service connections. Client secrets are a liability. Federated credentials cannot be exfiltrated because there is no secret to steal.

  • Enforce required templates on all environments and service connections. Templates are the only way to guarantee that every pipeline in your organization runs mandatory security controls like credential scanning and dependency auditing.

  • Never run agents as root. This is non-negotiable. A compromised agent running as root gives the attacker full control of the host. Create a dedicated service account with minimal permissions.

  • Use ephemeral agents for all sensitive workloads. Persistent agents accumulate state, cached credentials, and potential malware. Destroy the agent after every job and provision a fresh one.

  • Stream audit logs to an external SIEM. Azure DevOps audit logs are essential for incident response, but they are only useful if someone is watching them. Forward them to Splunk, Sentinel, or whatever your organization uses.

  • Rotate all tokens and credentials on a schedule. PATs, agent registration tokens, and any remaining client secrets should be rotated at least quarterly. Automate this with Azure Automation or a pipeline.

  • Separate agent pools by trust level. Fork PR builds, internal branch builds, and production deployments should use different pools with different network access and secret availability.

  • Review pipeline permissions quarterly. Permission drift is real. Set a calendar reminder to audit who can queue builds, modify pipelines, and access service connections. Use the REST API scripts in this article to automate the review.

  • Pin task versions in your YAML. Using task: AzureCLI@2 instead of task: AzureCLI prevents unexpected behavior if a new major version of a task introduces breaking changes or, in a worst-case scenario, supply chain compromise.
