
Working with Pipeline Artifacts and Build Outputs

Complete guide to managing pipeline artifacts and build outputs in Azure DevOps, covering publishing, downloading, retention, optimization with .artifactignore, and cross-pipeline sharing.


Overview

Pipeline artifacts are the mechanism Azure DevOps uses to pass files between stages, share build outputs with deployment pipelines, and retain deliverables for auditing or rollback. If you have ever wondered why your deployment stage cannot find the binaries your build stage just compiled, or why your pipeline takes twenty minutes uploading node_modules to artifact storage, this article is for you. Understanding the artifact system -- its two competing task families, retention rules, optimization knobs, and REST API -- is fundamental to building CI/CD pipelines that are fast, reliable, and maintainable.

Prerequisites

  • An Azure DevOps organization with at least one project
  • A YAML pipeline with at least two stages (build and deploy)
  • Familiarity with YAML pipeline syntax (trigger, stages, jobs, steps)
  • Node.js 18+ installed locally for the code examples
  • Basic understanding of npm workspaces or monorepo tooling

Pipeline Artifacts vs Build Artifacts

Azure DevOps has two competing task families for artifact management. This causes real confusion because both do roughly the same thing but with different performance characteristics, retention behavior, and API surfaces.

PublishBuildArtifacts (Legacy)

The original artifact task. It uploads files to Azure DevOps as "build artifacts" stored in a file container.

- task: PublishBuildArtifacts@1
  inputs:
    pathToPublish: '$(Build.ArtifactStagingDirectory)'
    artifactName: 'drop'
    publishLocation: 'Container'

Key characteristics:

  • Stores artifacts in Azure DevOps file containers
  • Supports publishLocation: 'FilePath' to publish to a UNC file share
  • Uses older REST APIs under _apis/build/builds/{buildId}/artifacts
  • Single-threaded upload -- slow for large artifacts
  • Well-documented and battle-tested

PublishPipelineArtifact (Modern)

The newer, faster replacement. It uses the deduplication-backed storage behind Azure Artifacts.

- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(Build.ArtifactStagingDirectory)'
    artifactName: 'drop'
    publishLocation: 'pipeline'

Key characteristics:

  • Uses dedup-based storage (content-addressable chunks)
  • Multi-threaded upload and download -- significantly faster
  • Only supports pipeline-level storage (no UNC file share option)
  • Better integration with multi-stage YAML pipelines
  • Artifacts are associated with the pipeline run, not just the build

My recommendation: Use PublishPipelineArtifact and DownloadPipelineArtifact for all new pipelines. The legacy tasks still work, but the performance difference is substantial. On a recent project, switching from PublishBuildArtifacts to PublishPipelineArtifact for a 450MB artifact dropped upload time from 3 minutes 40 seconds to 48 seconds.

The Shorthand Syntax

For simple cases, Azure DevOps provides shorthand keywords that map to these tasks:

steps:
  - publish: $(Build.ArtifactStagingDirectory)
    artifact: drop
    displayName: 'Publish build output'

This is equivalent to PublishPipelineArtifact@1. The corresponding download shorthand:

steps:
  - download: current
    artifact: drop
    displayName: 'Download build output'

This maps to DownloadPipelineArtifact@2. I use the shorthand syntax in most pipelines because it reduces visual noise.

Publishing Artifacts from Build Stages

The typical pattern is straightforward: build your project, copy the outputs to a staging directory, then publish.

Node.js Build and Publish

stages:
  - stage: Build
    displayName: 'Build'
    jobs:
      - job: BuildApp
        displayName: 'Build Node.js Application'
        pool:
          vmImage: 'ubuntu-latest'
        steps:
          - task: NodeTool@0
            inputs:
              versionSpec: '20.x'
            displayName: 'Install Node.js 20'

          - script: npm ci
            displayName: 'Install dependencies'

          - script: npm run build
            displayName: 'Build application'

          - script: npm test
            displayName: 'Run tests'

          - script: npm prune --production
            displayName: 'Remove dev dependencies'

          - task: CopyFiles@2
            inputs:
              sourceFolder: '$(System.DefaultWorkingDirectory)'
              contents: |
                package.json
                package-lock.json
                dist/**
                node_modules/**
                config/**
              targetFolder: '$(Build.ArtifactStagingDirectory)/app'
            displayName: 'Copy files to staging'

          - publish: $(Build.ArtifactStagingDirectory)/app
            artifact: app-build
            displayName: 'Publish application artifact'

Notice the npm prune --production step before copying (on newer npm versions, the non-deprecated spelling is npm prune --omit=dev). This strips out dev dependencies, reducing the artifact size. On one of my Express.js projects, this took the artifact from 280MB down to 62MB.

Publishing Multiple Artifacts

When you need to publish separate artifacts from a single job -- say, the application build plus a database migration bundle -- publish them with distinct names:

steps:
  - publish: $(Build.ArtifactStagingDirectory)/app
    artifact: app-build
    displayName: 'Publish app artifact'

  - publish: $(Build.ArtifactStagingDirectory)/migrations
    artifact: db-migrations
    displayName: 'Publish migration scripts'

  - publish: $(Build.ArtifactStagingDirectory)/infrastructure
    artifact: infra-templates
    displayName: 'Publish ARM/Bicep templates'

Each artifact appears as a separate downloadable unit in the pipeline run summary. This matters because deployment stages can selectively download only what they need.

Downloading Artifacts in Deployment Stages

The download step is where most people first encounter artifact issues. In a multi-stage pipeline, each stage runs on a fresh agent. Nothing from the build stage exists unless you explicitly download it.

  - stage: DeployStaging
    displayName: 'Deploy to Staging'
    dependsOn: Build
    jobs:
      - deployment: DeployApp
        displayName: 'Deploy Application'
        environment: 'staging'
        strategy:
          runOnce:
            deploy:
              steps:
                - download: current
                  artifact: app-build
                  displayName: 'Download app artifact'

                - script: |
                    ls -la $(Pipeline.Workspace)/app-build/
                  displayName: 'Verify artifact contents'

                - script: |
                    cd $(Pipeline.Workspace)/app-build
                    node dist/server.js &
                    sleep 5
                    curl -f http://localhost:3000/health || exit 1
                  displayName: 'Smoke test'

Important detail: When using deployment jobs (the deployment keyword with environment), Azure DevOps automatically downloads all artifacts from the current pipeline run. You can override this by specifying - download: none and then explicitly downloading only what you need:

strategy:
  runOnce:
    deploy:
      steps:
        - download: none

        - download: current
          artifact: db-migrations
          displayName: 'Download only migrations'

This is a real performance win when your pipeline publishes five or six artifacts but a specific deployment stage only needs one.

Artifact Download Paths

Downloaded artifacts land in $(Pipeline.Workspace) organized by artifact name:

$(Pipeline.Workspace)/
  app-build/
    package.json
    dist/
      server.js
    node_modules/
    config/
  db-migrations/
    001-create-tables.sql
    002-add-indexes.sql

For build artifacts (the legacy task), files land in $(System.ArtifactsDirectory) instead. Mixing the two systems causes path confusion -- another reason to standardize on pipeline artifacts.

Artifact Naming Conventions and Organization

Artifact names become part of your deployment scripts and templates. Pick a convention and stick with it. Here is what I use:

{component}-{type}

Examples:
  api-build         # compiled API server
  api-tests         # test results and coverage
  web-build         # compiled frontend
  db-migrations     # SQL migration scripts
  infra-templates   # infrastructure as code
  e2e-results       # end-to-end test screenshots/logs

Avoid putting version numbers or timestamps in artifact names. The pipeline run ID already provides uniqueness. Naming an artifact api-build-v1.2.3-20260208 makes download steps fragile because the name changes every run.
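To keep the convention honest across a large pipeline codebase, you can lint artifact names in a pre-commit or CI script. A hypothetical validator for the {component}-{type} pattern described above (the regexes are my own sketch, not anything Azure DevOps enforces):

```javascript
// Validate artifact names against the {component}-{type} convention:
// lowercase hyphen-separated segments, no version numbers or date stamps.
function isValidArtifactName(name) {
    // At least two segments, each starting with a letter
    if (!/^[a-z][a-z0-9]*(-[a-z][a-z0-9]*)+$/.test(name)) return false;
    // Reject segments that look like versions (v1) -- run ID already provides uniqueness
    if (/(^|-)v?\d+(\.\d+)*(-|$)/.test(name)) return false;
    return true;
}

console.log(isValidArtifactName("api-build"));
console.log(isValidArtifactName("api-build-v1.2.3"));
```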

For monorepos with multiple packages, prefix with the package name:

- publish: $(Build.ArtifactStagingDirectory)/auth-service
  artifact: auth-service-build

- publish: $(Build.ArtifactStagingDirectory)/gateway
  artifact: gateway-build

- publish: $(Build.ArtifactStagingDirectory)/shared-lib
  artifact: shared-lib-build

Universal Packages vs Pipeline Artifacts

Azure DevOps offers Universal Packages through Azure Artifacts as an alternative to pipeline artifacts. They solve different problems.

Feature     | Pipeline Artifacts         | Universal Packages
------------|----------------------------|---------------------------
Scope       | Single pipeline run        | Organization-wide
Versioning  | Tied to run ID             | Semantic versioning
Retention   | Follows pipeline retention | Independent retention
Access      | Pipeline-internal          | Feed permissions
CLI         | Not available              | az artifacts universal
Size limit  | ~2GB per artifact          | ~4TB per package

Use pipeline artifacts when passing files between stages in the same pipeline run. Use universal packages when you need to share versioned binaries across multiple pipelines, teams, or even organizations.

Publishing a universal package from a pipeline:

- task: UniversalPackages@0
  inputs:
    command: 'publish'
    publishDirectory: '$(Build.ArtifactStagingDirectory)/app'
    feedsToUsePublish: 'internal'
    vstsFeedPublish: 'my-project/shared-packages'
    vstsFeedPackagePublish: 'api-server'
    versionOption: 'custom'
    versionPublish: '$(Build.BuildNumber)'
  displayName: 'Publish to Universal Packages feed'

Consuming it in another pipeline:

- task: UniversalPackages@0
  inputs:
    command: 'download'
    feedsToUse: 'internal'
    vstsFeed: 'my-project/shared-packages'
    vstsFeedPackage: 'api-server'
    vstsPackageVersion: '*'
    downloadDirectory: '$(Build.ArtifactStagingDirectory)/api'
  displayName: 'Download latest API server package'

Artifact Retention Policies

This catches teams off guard. Pipeline artifacts follow the pipeline run retention policy, not an independent retention setting.

Default retention varies by project settings, but it is typically 30 days for non-production runs. After that, the run and all its artifacts are deleted.
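The practical consequence: if a run finishes on day X and the retention window is N days, the run and its artifacts become eligible for deletion on day X+N. Tooling that warns before a production artifact expires only needs this arithmetic; a trivial sketch:

```javascript
// Given a run's finishTime (ISO 8601, as returned by the Build REST API)
// and the retention window in days, compute when the run and its
// artifacts become eligible for automatic deletion.
function retentionExpiry(finishTimeIso, retentionDays) {
    var expires = new Date(finishTimeIso);
    expires.setUTCDate(expires.getUTCDate() + retentionDays);
    return expires;
}

console.log(retentionExpiry("2026-01-01T00:00:00Z", 30).toISOString());
```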

Configuring Retention

At the project level: Project Settings > Pipelines > Settings > Retention.

There is no YAML property that sets retention for the current pipeline; retention is controlled through the project settings UI or the REST API. What you can do from YAML is mark a specific run for retention with a script step:

- script: |
    curl -X PATCH \
      -H "Authorization: Bearer $(System.AccessToken)" \
      -H "Content-Type: application/json" \
      -d '{"keepForever": true}' \
      "$(System.CollectionUri)$(System.TeamProject)/_apis/build/builds/$(Build.BuildId)?api-version=7.1"
  displayName: 'Mark build for retention'
  condition: eq(variables['Build.SourceBranch'], 'refs/heads/main')

This prevents the run (and its artifacts) from being cleaned up by automatic retention policies. Use it sparingly -- retained runs accumulate storage costs.

Leasing Artifacts

For more granular control, use build retention leases:

// Run from a pipeline script step with the job access token mapped in:
//   env:
//     SYSTEM_ACCESSTOKEN: $(System.AccessToken)
var https = require("https");
var url = require("url");

var orgUrl = process.env.SYSTEM_COLLECTIONURI;
var project = process.env.SYSTEM_TEAMPROJECT;
var buildId = process.env.BUILD_BUILDID;
var token = process.env.SYSTEM_ACCESSTOKEN;

var leasePayload = JSON.stringify([{
    daysValid: 365,
    definitionId: parseInt(process.env.SYSTEM_DEFINITIONID, 10),
    ownerId: "User:" + process.env.BUILD_REQUESTEDFORID,
    protectPipeline: false,
    runId: parseInt(buildId, 10)
}]);

var parsed = url.parse(orgUrl + project + "/_apis/build/retention/leases?api-version=7.1");

var options = {
    hostname: parsed.hostname,
    path: parsed.path,
    method: "POST",
    headers: {
        "Content-Type": "application/json",
        "Authorization": "Basic " + Buffer.from(":" + token).toString("base64"),
        "Content-Length": Buffer.byteLength(leasePayload)
    }
};

var req = https.request(options, function(res) {
    var body = "";
    res.on("data", function(chunk) { body += chunk; });
    res.on("end", function() {
        if (res.statusCode >= 200 && res.statusCode < 300) {
            console.log("Lease created:", body);
        } else {
            console.error("Lease request failed (" + res.statusCode + "):", body);
            process.exit(1);
        }
    });
});

req.on("error", function(err) {
    console.error("Failed to create lease:", err.message);
    process.exit(1);
});

req.write(leasePayload);
req.end();

Sharing Artifacts Across Pipelines

Sometimes you need Pipeline B to consume artifacts from Pipeline A. The resources.pipelines block handles this:

resources:
  pipelines:
    - pipeline: api-build
      source: 'API-Server-CI'
      trigger:
        branches:
          include:
            - main

stages:
  - stage: Deploy
    jobs:
      - job: DeployFromUpstream
        steps:
          - download: api-build
            artifact: app-build
            displayName: 'Download artifact from API build pipeline'

          - script: |
              ls -la $(Pipeline.Workspace)/api-build/app-build/
            displayName: 'List downloaded files'

The source value must match the pipeline name exactly as it appears in Azure DevOps (not the YAML filename, the display name). The pipeline alias (api-build here) is what you reference in download steps.

You can pin to a specific run or branch:

resources:
  pipelines:
    - pipeline: api-build
      source: 'API-Server-CI'
      version: '20260208.3'  # specific run number
      branch: main           # or latest from branch

Cross-Project Artifact Access

For artifacts from a different Azure DevOps project:

resources:
  pipelines:
    - pipeline: shared-lib
      source: 'Shared-Library-CI'
      project: 'Platform-Team'

The service account running your pipeline needs read access to the source project's pipelines.

Artifact Size Optimization

Node.js projects are notorious for bloated artifacts because node_modules can easily exceed 500MB. Here are the strategies I use.

The .artifactignore File

Place a .artifactignore file in the root of the directory you are publishing. It uses .gitignore syntax:

# .artifactignore

# Test files
**/*.test.js
**/*.spec.js
__tests__/
coverage/
.nyc_output/

# Source files (if publishing compiled output)
src/
*.ts
tsconfig.json

# Development config
.eslintrc*
.prettierrc*
jest.config.*
.editorconfig

# Documentation
*.md
docs/
LICENSE

# OS files
.DS_Store
Thumbs.db

# Git
.git/
.gitignore

# Logs
*.log
npm-debug.log*

# Environment
.env
.env.*

The .artifactignore file must be at the root of the published directory, not the repository root. If you publish $(Build.ArtifactStagingDirectory)/app, then .artifactignore must be inside that app directory.

You can copy it in during the build:

- script: cp .artifactignore $(Build.ArtifactStagingDirectory)/app/.artifactignore
  displayName: 'Copy .artifactignore to staging'

Excluding node_modules Entirely

The better approach for most Node.js projects: do not include node_modules in the artifact at all. Instead, run npm ci --production in the deployment stage.

# Build stage
steps:
  - script: npm ci && npm run build && npm test
    displayName: 'Build and test'

  - task: CopyFiles@2
    inputs:
      sourceFolder: '$(System.DefaultWorkingDirectory)'
      contents: |
        package.json
        package-lock.json
        dist/**
        config/**
      targetFolder: '$(Build.ArtifactStagingDirectory)/app'

  - publish: $(Build.ArtifactStagingDirectory)/app
    artifact: app-build

# Deploy stage
steps:
  - download: current
    artifact: app-build

  - script: |
      cd $(Pipeline.Workspace)/app-build
      npm ci --production
    displayName: 'Install production dependencies'

This trades artifact size for deployment time. On a typical Express.js API, the artifact drops from 150MB to 8MB, but the deployment stage adds 15-30 seconds for npm ci. Almost always worth it.

Size Comparison

Here is a real example from a Node.js monorepo with three services:

Strategy                          | Artifact Size | Upload Time | Download Time
----------------------------------|---------------|-------------|--------------
Everything including node_modules | 487MB         | 2m 15s      | 1m 42s
npm prune --production            | 124MB         | 38s         | 28s
.artifactignore (aggressive)      | 89MB          | 24s         | 18s
Dist only, no node_modules        | 11MB          | 3s          | 2s

Caching vs Artifacts: When to Use Which

Azure DevOps has a Cache@2 task that often gets confused with artifacts. They solve different problems.

Caching is for speeding up repeated operations within the same pipeline definition across multiple runs. The classic example is caching node_modules so npm ci does not download every package on every run.

- task: Cache@2
  inputs:
    key: 'npm | "$(Agent.OS)" | package-lock.json'
    path: '$(System.DefaultWorkingDirectory)/node_modules'
    restoreKeys: |
      npm | "$(Agent.OS)"
    cacheHitVar: 'CACHE_RESTORED'
  displayName: 'Cache node_modules'

- script: npm ci
  displayName: 'Install dependencies'
  condition: ne(variables.CACHE_RESTORED, 'true')

Note the cacheHitVar input: the Cache task only sets a hit variable if you name one, so without it the condition on the install step would never skip anything.

Artifacts pass files between stages or pipeline runs. They are the output of your build.

Use Case                                | Cache | Artifact
----------------------------------------|-------|---------
Speed up npm install                    | Yes   | No
Pass build output to deploy stage       | No    | Yes
Share compiled library across pipelines | No    | Yes
Preserve test results                   | No    | Yes
Reuse Docker layer cache                | Yes   | No
Store deployment package                | No    | Yes

You can use both together. Cache the dependencies to speed up the build, then publish the build output as an artifact for deployment:

steps:
  # Cache for speed
  - task: Cache@2
    inputs:
      key: 'npm | "$(Agent.OS)" | package-lock.json'
      path: '$(System.DefaultWorkingDirectory)/node_modules'

  - script: npm ci
    displayName: 'Install dependencies'

  - script: npm run build
    displayName: 'Build'

  # Artifact for deployment
  - publish: $(Build.ArtifactStagingDirectory)/app
    artifact: app-build

Accessing Artifacts via REST API

The Azure DevOps REST API lets you download artifacts programmatically. This is useful for external deployment tools, custom dashboards, or rollback scripts.

Listing Artifacts for a Build

var https = require("https");
var url = require("url");

var orgUrl = "https://dev.azure.com/myorg";
var project = "myproject";
var buildId = 12345;
var pat = process.env.AZURE_DEVOPS_PAT;

var apiUrl = orgUrl + "/" + project + "/_apis/build/builds/" + buildId + "/artifacts?api-version=7.1";
var parsed = url.parse(apiUrl);

var options = {
    hostname: parsed.hostname,
    path: parsed.path,
    method: "GET",
    headers: {
        "Authorization": "Basic " + Buffer.from(":" + pat).toString("base64")
    }
};

var req = https.request(options, function(res) {
    var body = "";
    res.on("data", function(chunk) { body += chunk; });
    res.on("end", function() {
        var result = JSON.parse(body);
        result.value.forEach(function(artifact) {
            console.log("Artifact:", artifact.name);
            console.log("  Resource URL:", artifact.resource.downloadUrl);
            console.log("  Type:", artifact.resource.type);
            console.log("");
        });
    });
});

req.on("error", function(err) {
    console.error("Error:", err.message);
});

req.end();

Sample output:

Artifact: app-build
  Resource URL: https://dev.azure.com/myorg/abc123/_apis/build/builds/12345/artifacts?artifactName=app-build&api-version=7.1&%24format=zip
  Type: Container

Artifact: db-migrations
  Resource URL: https://dev.azure.com/myorg/abc123/_apis/build/builds/12345/artifacts?artifactName=db-migrations&api-version=7.1&%24format=zip
  Type: Container
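Note the %24format=zip in those URLs: the $ in the $format query parameter is percent-encoded so that shells and string templating do not mangle it. A helper that assembles the download URL (the function name is my own, not part of any SDK):

```javascript
// Build the zip-download URL for a named build artifact.
// "$format" is written as "%24format" so the "$" survives shell interpolation.
function artifactDownloadUrl(orgUrl, project, buildId, artifactName) {
    return orgUrl + "/" + encodeURIComponent(project) +
        "/_apis/build/builds/" + buildId +
        "/artifacts?artifactName=" + encodeURIComponent(artifactName) +
        "&api-version=7.1&%24format=zip";
}

console.log(artifactDownloadUrl("https://dev.azure.com/myorg", "myproject", 12345, "app-build"));
```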

Downloading an Artifact

var https = require("https");
var fs = require("fs");
var url = require("url");

function downloadArtifact(orgUrl, project, buildId, artifactName, pat, outputPath) {
    var apiUrl = orgUrl + "/" + project +
        "/_apis/build/builds/" + buildId +
        "/artifacts?artifactName=" + encodeURIComponent(artifactName) +
        "&api-version=7.1&%24format=zip";

    var parsed = url.parse(apiUrl);

    var options = {
        hostname: parsed.hostname,
        path: parsed.path,
        method: "GET",
        headers: {
            "Authorization": "Basic " + Buffer.from(":" + pat).toString("base64")
        }
    };

    var file = fs.createWriteStream(outputPath);

    var req = https.request(options, function(res) {
        if (res.statusCode === 302 || res.statusCode === 303) {
            // The artifact endpoint may redirect to blob storage; follow it once
            https.get(res.headers.location, function(redirectRes) {
                redirectRes.pipe(file);
                file.on("finish", function() {
                    file.close();
                    console.log("Downloaded to " + outputPath);
                });
            });
            return;
        }

        res.pipe(file);
        file.on("finish", function() {
            file.close();
            var stats = fs.statSync(outputPath);
            console.log("Downloaded to " + outputPath + " (" + (stats.size / 1024 / 1024).toFixed(2) + " MB)");
        });
    });

    req.on("error", function(err) {
        fs.unlink(outputPath, function() {});
        console.error("Download failed:", err.message);
    });

    req.end();
}

downloadArtifact(
    "https://dev.azure.com/myorg",
    "myproject",
    12345,
    "app-build",
    process.env.AZURE_DEVOPS_PAT,
    "./app-build.zip"
);

Rollback Script Using Artifact API

Here is a practical rollback script that downloads an artifact from a previous successful build and deploys it:

var https = require("https");
var url = require("url");
var execSync = require("child_process").execSync;
var fs = require("fs");

var ORG_URL = "https://dev.azure.com/myorg";
var PROJECT = "myproject";
var DEFINITION_ID = 42;
var PAT = process.env.AZURE_DEVOPS_PAT;

function apiRequest(path, callback) {
    var apiUrl = ORG_URL + "/" + PROJECT + "/_apis" + path;
    var parsed = url.parse(apiUrl);

    var options = {
        hostname: parsed.hostname,
        path: parsed.path,
        method: "GET",
        headers: {
            "Authorization": "Basic " + Buffer.from(":" + PAT).toString("base64")
        }
    };

    var req = https.request(options, function(res) {
        var body = "";
        res.on("data", function(chunk) { body += chunk; });
        res.on("end", function() {
            callback(null, JSON.parse(body));
        });
    });

    req.on("error", function(err) { callback(err); });
    req.end();
}

function findLastSuccessfulBuild(callback) {
    var path = "/build/builds?definitions=" + DEFINITION_ID +
        "&resultFilter=succeeded&statusFilter=completed" +
        "&$top=1&branchName=refs/heads/main&api-version=7.1";

    apiRequest(path, function(err, data) {
        if (err) return callback(err);
        if (!data.value || data.value.length === 0) {
            return callback(new Error("No successful builds found"));
        }
        var build = data.value[0];
        console.log("Found build #" + build.buildNumber + " (ID: " + build.id + ")");
        console.log("  Finished: " + build.finishTime);
        console.log("  Commit: " + build.sourceVersion.substring(0, 8));
        callback(null, build);
    });
}

function rollback() {
    findLastSuccessfulBuild(function(err, build) {
        if (err) {
            console.error("Rollback failed:", err.message);
            process.exit(1);
        }

        console.log("\nDownloading artifact from build " + build.id + "...");

        var downloadUrl = ORG_URL + "/" + PROJECT +
            "/_apis/build/builds/" + build.id +
            "/artifacts?artifactName=app-build&api-version=7.1&%24format=zip";

        // Note: passing the PAT on the command line exposes it to local process listings
        execSync("curl -s -L -o rollback-artifact.zip -u :" + PAT + ' "' + downloadUrl + '"');

        console.log("Extracting...");
        execSync("unzip -o rollback-artifact.zip -d rollback-deploy/");

        console.log("Deploying...");
        execSync("cd rollback-deploy/app-build && npm ci --production");

        console.log("Rollback to build #" + build.buildNumber + " complete.");
    });
}

rollback();

Complete Working Example

Here is a production-ready multi-stage pipeline for a Node.js monorepo with three packages: an API server, a worker service, and a shared library.

Repository Structure

monorepo/
  packages/
    api/
      src/
      package.json
      .artifactignore
    worker/
      src/
      package.json
      .artifactignore
    shared/
      src/
      package.json
  package.json           # workspace root
  package-lock.json
  azure-pipelines.yml

.artifactignore (shared across packages)

# packages/api/.artifactignore
node_modules/
src/
**/*.test.js
**/*.spec.js
__tests__/
coverage/
.nyc_output/
tsconfig.json
.eslintrc.json
*.map
.env
.env.*
jest.config.js
.git/
*.md

azure-pipelines.yml

trigger:
  branches:
    include:
      - main
  paths:
    exclude:
      - '**/*.md'
      - 'docs/**'

variables:
  nodeVersion: '20.x'
  npmCacheFolder: $(Pipeline.Workspace)/.npm

stages:
  # ============================================================
  # Stage 1: Build all packages
  # ============================================================
  - stage: Build
    displayName: 'Build & Test'
    jobs:
      - job: BuildShared
        displayName: 'Build Shared Library'
        pool:
          vmImage: 'ubuntu-latest'
        steps:
          - task: NodeTool@0
            inputs:
              versionSpec: $(nodeVersion)

          - task: Cache@2
            inputs:
              key: 'npm | "$(Agent.OS)" | package-lock.json'
              path: $(npmCacheFolder)
            displayName: 'Cache npm packages'

          - script: npm ci --workspace=packages/shared
            displayName: 'Install shared dependencies'

          - script: npm run build --workspace=packages/shared
            displayName: 'Build shared library'

          - script: npm test --workspace=packages/shared
            displayName: 'Test shared library'

          - task: CopyFiles@2
            inputs:
              sourceFolder: 'packages/shared'
              contents: |
                package.json
                dist/**
              targetFolder: '$(Build.ArtifactStagingDirectory)/shared'

          - publish: $(Build.ArtifactStagingDirectory)/shared
            artifact: shared-lib-build

      - job: BuildAPI
        displayName: 'Build API Server'
        dependsOn: BuildShared
        pool:
          vmImage: 'ubuntu-latest'
        steps:
          - task: NodeTool@0
            inputs:
              versionSpec: $(nodeVersion)

          - download: current
            artifact: shared-lib-build
            patterns: '**'

          - script: |
              mkdir -p packages/shared/dist
              cp -r $(Pipeline.Workspace)/shared-lib-build/dist/* packages/shared/dist/
              cp $(Pipeline.Workspace)/shared-lib-build/package.json packages/shared/package.json
            displayName: 'Restore shared library build'

          - script: npm ci --workspace=packages/api
            displayName: 'Install API dependencies'

          - script: npm run build --workspace=packages/api
            displayName: 'Build API'

          - script: npm test --workspace=packages/api -- --ci --reporters=default --reporters=jest-junit
            displayName: 'Test API'
            env:
              JEST_JUNIT_OUTPUT_DIR: $(Build.ArtifactStagingDirectory)/test-results

          - task: PublishTestResults@2
            inputs:
              testResultsFormat: 'JUnit'
              testResultsFiles: '$(Build.ArtifactStagingDirectory)/test-results/*.xml'
            condition: always()
            displayName: 'Publish test results'

          - task: CopyFiles@2
            inputs:
              sourceFolder: 'packages/api'
              contents: |
                package.json
                package-lock.json
                dist/**
                config/**
                .artifactignore
              targetFolder: '$(Build.ArtifactStagingDirectory)/api'

          - publish: $(Build.ArtifactStagingDirectory)/api
            artifact: api-build

      - job: BuildWorker
        displayName: 'Build Worker Service'
        dependsOn: BuildShared
        pool:
          vmImage: 'ubuntu-latest'
        steps:
          - task: NodeTool@0
            inputs:
              versionSpec: $(nodeVersion)

          - download: current
            artifact: shared-lib-build
            patterns: '**'

          - script: |
              mkdir -p packages/shared/dist
              cp -r $(Pipeline.Workspace)/shared-lib-build/dist/* packages/shared/dist/
              cp $(Pipeline.Workspace)/shared-lib-build/package.json packages/shared/package.json
            displayName: 'Restore shared library build'

          - script: npm ci --workspace=packages/worker
            displayName: 'Install Worker dependencies'

          - script: npm run build --workspace=packages/worker
            displayName: 'Build Worker'

          - script: npm test --workspace=packages/worker
            displayName: 'Test Worker'

          - task: CopyFiles@2
            inputs:
              sourceFolder: 'packages/worker'
              contents: |
                package.json
                package-lock.json
                dist/**
                config/**
                .artifactignore
              targetFolder: '$(Build.ArtifactStagingDirectory)/worker'

          - publish: $(Build.ArtifactStagingDirectory)/worker
            artifact: worker-build

  # ============================================================
  # Stage 2: Deploy to Staging
  # ============================================================
  - stage: DeployStaging
    displayName: 'Deploy to Staging'
    dependsOn: Build
    condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
    jobs:
      - deployment: DeployAPIStaging
        displayName: 'Deploy API to Staging'
        environment: 'staging-api'
        pool:
          vmImage: 'ubuntu-latest'
        strategy:
          runOnce:
            deploy:
              steps:
                - download: none

                - download: current
                  artifact: api-build
                  displayName: 'Download API artifact'

                - script: |
                    cd $(Pipeline.Workspace)/api-build
                    npm ci --production
                    echo "API artifact contents:"
                    du -sh .
                    ls -la dist/
                  displayName: 'Install production deps and verify'

                - task: AzureWebApp@1
                  inputs:
                    azureSubscription: 'Production-ServiceConnection'
                    appName: 'myapp-api-staging'
                    package: '$(Pipeline.Workspace)/api-build'
                  displayName: 'Deploy to Azure App Service'

      - deployment: DeployWorkerStaging
        displayName: 'Deploy Worker to Staging'
        environment: 'staging-worker'
        pool:
          vmImage: 'ubuntu-latest'
        strategy:
          runOnce:
            deploy:
              steps:
                - download: none

                - download: current
                  artifact: worker-build
                  displayName: 'Download Worker artifact'

                - script: |
                    cd $(Pipeline.Workspace)/worker-build
                    npm ci --production
                  displayName: 'Install production deps'

                - task: AzureWebApp@1
                  inputs:
                    azureSubscription: 'Production-ServiceConnection'
                    appName: 'myapp-worker-staging'
                    package: '$(Pipeline.Workspace)/worker-build'

  # ============================================================
  # Stage 3: Deploy to Production (manual approval gate
  # configured on the production-* environments)
  # ============================================================
  - stage: DeployProduction
    displayName: 'Deploy to Production'
    dependsOn: DeployStaging
    condition: succeeded()
    jobs:
      - deployment: DeployAPIProd
        displayName: 'Deploy API to Production'
        environment: 'production-api'
        pool:
          vmImage: 'ubuntu-latest'
        strategy:
          runOnce:
            deploy:
              steps:
                - download: none

                - download: current
                  artifact: api-build
                  displayName: 'Download API artifact'

                - script: |
                    cd $(Pipeline.Workspace)/api-build
                    npm ci --production
                  displayName: 'Install production deps'

                - task: AzureWebApp@1
                  inputs:
                    azureSubscription: 'Production-ServiceConnection'
                    appName: 'myapp-api-prod'
                    package: '$(Pipeline.Workspace)/api-build'
                  displayName: 'Deploy to Production'

                # Retain this build since it went to production
                - script: |
                    curl -X POST \
                      -H "Authorization: Bearer $(System.AccessToken)" \
                      -H "Content-Type: application/json" \
                      -d '[{"daysValid": 365, "definitionId": $(System.DefinitionId), "ownerId": "User:$(Build.RequestedForId)", "protectPipeline": false, "runId": $(Build.BuildId)}]' \
                      "$(System.CollectionUri)$(System.TeamProject)/_apis/build/retention/leases?api-version=7.1"
                  displayName: 'Create retention lease for production deploy'

Verifying Artifact Contents

Add a utility script to your monorepo for verifying artifact contents before deployment:

// scripts/verify-artifact.js
const fs = require("fs");
const path = require("path");

const artifactDir = process.argv[2];

if (!artifactDir) {
    console.error("Usage: node verify-artifact.js <artifact-directory>");
    process.exit(1);
}

const requiredFiles = [
    "package.json",
    "dist/server.js"
];

const forbiddenPatterns = [
    "node_modules",
    ".env",
    ".git",
    "src/"
];

const errors = [];

// Check required files exist
for (const file of requiredFiles) {
    if (!fs.existsSync(path.join(artifactDir, file))) {
        errors.push(`MISSING required file: ${file}`);
    }
}

// Check forbidden paths are absent
for (const pattern of forbiddenPatterns) {
    if (fs.existsSync(path.join(artifactDir, pattern))) {
        errors.push(`FORBIDDEN path found in artifact: ${pattern}`);
    }
}

// Recursively sum file sizes to compute the artifact footprint
function getDirSize(dir) {
    let total = 0;
    for (const entry of fs.readdirSync(dir)) {
        const entryPath = path.join(dir, entry);
        const stat = fs.statSync(entryPath);
        total += stat.isDirectory() ? getDirSize(entryPath) : stat.size;
    }
    return total;
}

const sizeMB = getDirSize(artifactDir) / 1024 / 1024;
console.log(`Artifact size: ${sizeMB.toFixed(2)} MB`);

if (sizeMB > 100) {
    errors.push(`Artifact size exceeds 100MB threshold: ${sizeMB.toFixed(2)} MB`);
}

if (errors.length > 0) {
    console.error("\nArtifact verification FAILED:");
    for (const err of errors) {
        console.error(`  - ${err}`);
    }
    process.exit(1);
}

console.log("Artifact verification PASSED");
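
The script can then run as a gate between staging files and publishing the artifact. A minimal sketch of such a step, assuming the script is committed at scripts/verify-artifact.js and the API files are staged under $(Build.ArtifactStagingDirectory)/api:

```yaml
# Fails the build (non-zero exit) if the staged artifact is malformed
- script: node scripts/verify-artifact.js "$(Build.ArtifactStagingDirectory)/api"
  displayName: 'Verify API artifact before publish'
```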

Common Issues and Troubleshooting

1. "No artifacts were found" in Download Step

##[error]No artifacts were found for pattern: **/app-build

This happens when the artifact name in the download step does not match the published artifact name, or when the publishing stage failed or was skipped. Check three things:

  • The artifact name is spelled identically in both publish and download steps (case-sensitive)
  • The build stage actually completed successfully
  • If downloading from another pipeline via resources.pipelines, verify the pipeline resource alias matches what you use in the download step
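
For the resources.pipelines case, the download step must reference the resource alias, not the source pipeline's definition name. A minimal sketch (pipeline names here are hypothetical):

```yaml
resources:
  pipelines:
    - pipeline: upstreamBuild   # alias used by the download step
      source: 'MyApp-CI'        # name of the source pipeline definition
      branch: main

steps:
  - download: upstreamBuild     # must match the alias, not 'MyApp-CI'
    artifact: api-build
```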

2. Artifact Upload Timeout on Large Files

##[error]Upload '/home/vsts/work/1/a/app.zip' failed after 3 retries: ESOCKETTIMEDOUT

The PublishBuildArtifacts task is single-threaded and struggles with large artifacts over slow connections. Solutions:

  • Switch to PublishPipelineArtifact which uses multi-threaded uploads
  • Reduce artifact size using .artifactignore or by excluding node_modules
  • If you must use the legacy task, split the artifact into smaller pieces

3. .artifactignore Not Being Applied

Published artifact 'app-build': 487 MB (expected ~11 MB)

The .artifactignore file must be in the root of the directory being published, not the repository root. If you publish $(Build.ArtifactStagingDirectory)/app, the file must be at $(Build.ArtifactStagingDirectory)/app/.artifactignore.

Also, .artifactignore only works with PublishPipelineArtifact, not with PublishBuildArtifacts. The legacy task ignores it entirely.

4. Permission Denied When Downloading Cross-Pipeline Artifacts

##[error]VS30063: You are not authorized to access https://dev.azure.com/myorg/OtherProject

The build service account needs explicit permission to read pipelines in the source project. Go to the source project's Project Settings > Permissions > Build Service and grant Read access to pipelines. Also verify that the pipeline resource definition includes the project property if the source pipeline is in a different project.
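
A minimal sketch of a cross-project pipeline resource; the project property is what many teams forget (names here are hypothetical):

```yaml
resources:
  pipelines:
    - pipeline: sharedLib
      project: 'OtherProject'   # required when the source pipeline is in another project
      source: 'SharedLib-CI'
      branch: main
```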

5. Artifact Path Confusion Between Pipeline and Build Artifacts

##[error]Path '/home/vsts/work/1/a/app-build/dist/server.js' does not exist

Pipeline artifacts download to $(Pipeline.Workspace)/{artifactName}/, while build artifacts download to $(System.ArtifactsDirectory)/{artifactName}/. If you switch from DownloadBuildArtifacts to DownloadPipelineArtifact, you must update all path references in subsequent steps. A useful debug step:

- script: |
    echo "Pipeline.Workspace: $(Pipeline.Workspace)"
    echo "System.ArtifactsDirectory: $(System.ArtifactsDirectory)"
    echo "---"
    find $(Pipeline.Workspace) -maxdepth 3 -type f | head -50
  displayName: 'Debug: show artifact paths'

6. Retention Lease Conflicts

Cannot delete build 12345 because it has active retention leases.

Retention leases prevent deletion of pipeline runs, which is the intended behavior. But if you create leases automatically on every production deploy without cleaning up old ones, you accumulate runs indefinitely. Write a cleanup script that deletes leases older than your desired retention window:

// scripts/cleanup-leases.js
// Uses the global fetch available in Node.js 18+
const ORG_URL = "https://dev.azure.com/myorg";
const PROJECT = "myproject";
const PAT = process.env.AZURE_DEVOPS_PAT;
const MAX_AGE_DAYS = 365;

const AUTH_HEADER = "Basic " + Buffer.from(":" + PAT).toString("base64");

async function apiRequest(method, apiPath) {
    const res = await fetch(`${ORG_URL}/${PROJECT}/_apis${apiPath}`, {
        method,
        headers: {
            "Authorization": AUTH_HEADER,
            "Content-Type": "application/json"
        }
    });
    if (!res.ok) {
        throw new Error(`${method} ${apiPath} failed with HTTP ${res.status}`);
    }
    // DELETE returns an empty body; GET returns JSON
    const text = await res.text();
    return text ? JSON.parse(text) : null;
}

async function main() {
    const data = await apiRequest("GET", "/build/retention/leases?api-version=7.1");

    const now = new Date();
    const expired = data.value.filter((lease) => {
        const ageDays = (now - new Date(lease.createdOn)) / (1000 * 60 * 60 * 24);
        return ageDays > MAX_AGE_DAYS;
    });

    console.log(`Found ${expired.length} expired leases out of ${data.value.length} total`);

    for (const lease of expired) {
        try {
            await apiRequest("DELETE", `/build/retention/leases?ids=${lease.leaseId}&api-version=7.1`);
            console.log(`Deleted lease ${lease.leaseId} (build ${lease.runId})`);
        } catch (err) {
            console.error(`Failed to delete lease ${lease.leaseId}:`, err.message);
        }
    }
}

main().catch((err) => {
    console.error("Lease cleanup failed:", err.message);
    process.exit(1);
});

Best Practices

  • Always use PublishPipelineArtifact over PublishBuildArtifacts. The dedup-backed storage and multi-threaded transfers are faster in every scenario I have tested. The only exception is if you need to publish to a UNC file share.

  • Exclude node_modules from artifacts and run npm ci --production at deploy time. The size reduction is dramatic and the npm ci overhead is minimal. Lock files ensure reproducible installs regardless of when the deployment runs.

  • Use .artifactignore aggressively. Source files, test files, documentation, and development configurations have no place in a deployment artifact. Every megabyte you exclude saves time on every deployment.

  • Name artifacts by component and purpose, not by version or date. Use api-build instead of api-v2.3.1-20260208. The pipeline run ID provides uniqueness, and stable names keep your download steps maintainable.

  • Download selectively in deployment stages. Use download: none followed by explicit download steps for only the artifacts that stage needs. This is particularly important in monorepo pipelines where a single build stage publishes five or six artifacts.

  • Create retention leases for production deployments. When a build goes to production, protect it from automatic cleanup. This gives you a reliable rollback target. Set a sensible expiration (90-365 days) to avoid unbounded storage growth.

  • Verify artifact contents before deployment. A one-second verification script that checks for required files and forbidden paths (like .env or node_modules) catches configuration errors before they reach production.

  • Use caching for dependencies, artifacts for outputs. The Cache@2 task speeds up repeated builds by preserving node_modules or other downloaded dependencies. Artifacts carry your built code to downstream stages. They complement each other but are not interchangeable.

  • Pin cross-pipeline artifact sources to a branch. When using resources.pipelines to consume artifacts from another pipeline, specify branch: main to avoid accidentally deploying from a feature branch.

  • Monitor artifact storage consumption. Azure DevOps includes artifact storage in your billing. A pipeline that publishes 500MB on every commit to a busy repository can accumulate terabytes of artifact storage in months. Combine .artifactignore, sensible retention policies, and lease cleanup scripts to keep costs under control.
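
To illustrate the cache-vs-artifact split from the last point, a typical Cache@2 setup for npm dependencies looks roughly like this (the cache directory variable and key layout are conventional choices, not requirements):

```yaml
variables:
  npm_config_cache: $(Pipeline.Workspace)/.npm

steps:
  # Restores node_modules downloads keyed on the lock file;
  # a changed package-lock.json produces a new cache entry
  - task: Cache@2
    inputs:
      key: 'npm | "$(Agent.OS)" | package-lock.json'
      restoreKeys: |
        npm | "$(Agent.OS)"
      path: $(npm_config_cache)
    displayName: 'Cache npm downloads'

  - script: npm ci
    displayName: 'Install dependencies (fast on cache hit)'
```

The cache speeds up the build job itself; the built output still travels to deployment stages as a pipeline artifact.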
