Artifact Retention Policies and Cost Optimization

A practical guide to managing Azure Artifacts storage costs through retention policies, automated cleanup, and storage monitoring, covering NuGet, npm, Maven, and Python feeds.

Overview

Azure Artifacts includes 2 GB of free storage per organization. After that, you pay per gigabyte per month. That sounds generous until you realize that a single active .NET project publishing NuGet packages from every CI build can burn through that allowance in weeks. Every pre-release version, every snapshot, every build artifact accumulates -- and Azure Artifacts never deletes anything by default. If you are not managing retention proactively, you are paying for storage that provides zero value.

I have helped organizations reduce their Azure Artifacts bills by 60-80% through a combination of retention policies, automated cleanup scripts, and versioning discipline. The tools are straightforward: Azure DevOps provides built-in retention settings at the feed and organization level, and the REST API gives you fine-grained control for custom cleanup workflows. This article covers the complete cost optimization strategy, from understanding the billing model through building automated cleanup tools.

Prerequisites

  • An Azure DevOps organization with active Azure Artifacts feeds
  • Organization administrator access (for org-level retention settings)
  • An Azure DevOps Personal Access Token (PAT) with Packaging (Read & Write) scope
  • Node.js 18+ for the cleanup automation scripts
  • Basic familiarity with the Azure Artifacts feed structure and package types

Understanding Azure Artifacts Billing

Azure Artifacts billing has changed several times, so let me be specific about the current model:

  • Free tier: 2 GB of storage per organization (shared across all feeds, all projects, all package types)
  • Paid tier: Billed per GB per month after exceeding the free tier
  • What counts as storage: Every package version in every feed. This includes NuGet packages, npm tarballs, Maven JARs, Python wheels, and Universal Packages
  • What also counts: Packages cached from upstream sources (PyPI, nuget.org, npmjs.com, Maven Central). When your feed proxies a public package, the cached copy counts against your storage

The billing is cumulative. If you have 500 versions of a package across 10 feeds, every single version counts. Old pre-release versions you published six months ago and never used again? They count. Snapshot builds from feature branches that were merged and deleted? They count.
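To make the impact concrete, here is a back-of-the-envelope bill estimator. The per-GB rate below is a placeholder, not the actual Azure price -- check the current Azure Artifacts pricing page for your region:

```javascript
// estimate-bill.js -- rough monthly cost model (sketch only)
// RATE_PER_GB is an assumed placeholder, not the real Azure price.
var FREE_TIER_GB = 2;
var RATE_PER_GB = 2.0; // USD per GB per month (placeholder)

function estimateMonthlyCost(totalGB) {
  // Only storage above the free tier is billable
  var billableGB = Math.max(0, totalGB - FREE_TIER_GB);
  return billableGB * RATE_PER_GB;
}

console.log(estimateMonthlyCost(1.5)); // 0 -- still inside the free tier
console.log(estimateMonthlyCost(50));  // 96 at the placeholder rate
```

Even at a modest per-GB rate, hundreds of stale versions add up quickly, which is why the cumulative counting matters.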

To check your current storage usage, navigate to Organization Settings > Billing > Azure Artifacts. Or use the REST API:

// check-storage.js
var https = require("https");

var org = process.env.AZURE_DEVOPS_ORG || "my-organization";
var pat = process.env.AZURE_DEVOPS_PAT;
var auth = Buffer.from(":" + pat).toString("base64");

function apiRequest(hostname, path, callback) {
  var options = {
    hostname: hostname,
    path: path,
    method: "GET",
    headers: {
      "Authorization": "Basic " + auth,
      "Accept": "application/json"
    }
  };

  var req = https.request(options, function(res) {
    var data = "";
    res.on("data", function(chunk) { data += chunk; });
    res.on("end", function() { callback(null, res.statusCode, data); });
  });
  req.on("error", function(err) { callback(err); });
  req.end();
}

// List all feeds with package counts
var path = "/" + org + "/_apis/packaging/feeds?api-version=7.1";
apiRequest("feeds.dev.azure.com", path, function(err, status, data) {
  if (err) return console.error("Error:", err.message);

  var result = JSON.parse(data);
  var totalPackages = 0;

  console.log("Azure Artifacts Storage Report");
  console.log("Organization: " + org);
  console.log("==============================");
  console.log("");

  result.value.forEach(function(feed) {
    console.log("Feed: " + feed.name);
    console.log("  Scope: " + (feed.project ? "Project (" + feed.project.name + ")" : "Organization"));
    console.log("  Upstream: " + (feed.upstreamEnabled ? "enabled" : "disabled"));
    console.log("  Packages: " + (feed.packageCount || 0));
    totalPackages += (feed.packageCount || 0);
    console.log("");
  });

  console.log("==============================");
  console.log("Total feeds: " + result.count);
  console.log("Total packages: " + totalPackages);
  console.log("");
  console.log("Note: Storage usage details available in Organization Settings > Billing");
});

Run it:

node check-storage.js

Output:

Azure Artifacts Storage Report
Organization: my-organization
==============================

Feed: dotnet-packages
  Scope: Project (platform)
  Upstream: enabled
  Packages: 47

Feed: npm-packages
  Scope: Organization
  Upstream: enabled
  Packages: 132

Feed: python-packages
  Scope: Project (data-engineering)
  Upstream: enabled
  Packages: 23

==============================
Total feeds: 3
Total packages: 202

Note: Storage usage details available in Organization Settings > Billing

Feed-Level Retention Settings

Each Azure Artifacts feed has built-in retention settings that control how many versions of each package to keep.

Maximum Versions Per Package

Navigate to Artifacts > Feed Settings > Retention to configure:

  • Maximum versions per package: How many versions to retain (default: unlimited). When a new version is published that exceeds this limit, the oldest version is automatically deleted.
  • Days to keep recently downloaded packages: Packages downloaded within this window are exempt from version-count deletion.

I recommend these settings as a starting point:

Feed Type                          Max Versions    Days to Keep
Internal packages (active dev)     30              30
Release packages                   100             90
CI/CD builds (pre-release)         10              7
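If you manage many feeds, these starting points can be encoded as a small lookup for scripted configuration. The preset names are my own shorthand, not Azure DevOps concepts:

```javascript
// retention-presets.js -- starting-point settings from the table above
var RETENTION_PRESETS = {
  internal: { countLimit: 30, daysToKeepRecentlyDownloadedPackages: 30 },
  release: { countLimit: 100, daysToKeepRecentlyDownloadedPackages: 90 },
  ci: { countLimit: 10, daysToKeepRecentlyDownloadedPackages: 7 }
};

function policyFor(feedPurpose) {
  // Fall back to the conservative "internal" preset for unknown feeds
  return RETENTION_PRESETS[feedPurpose] || RETENTION_PRESETS.internal;
}

console.log(policyFor("ci")); // { countLimit: 10, daysToKeepRecentlyDownloadedPackages: 7 }
```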

To configure retention via the REST API:

// set-retention.js
var https = require("https");

var org = "my-organization";
var project = "my-project";
var feedId = "dotnet-packages";
var pat = process.env.AZURE_DEVOPS_PAT;
var auth = Buffer.from(":" + pat).toString("base64");

var retentionPolicy = {
  countLimit: 30,
  daysToKeepRecentlyDownloadedPackages: 30
};

var body = JSON.stringify(retentionPolicy);

var options = {
  hostname: "feeds.dev.azure.com",
  path: "/" + org + "/" + project + "/_apis/packaging/feeds/" + feedId +
    "/retentionpolicies?api-version=7.1",
  method: "PUT",
  headers: {
    "Content-Type": "application/json",
    "Authorization": "Basic " + auth,
    "Content-Length": Buffer.byteLength(body)
  }
};

var req = https.request(options, function(res) {
  var data = "";
  res.on("data", function(chunk) { data += chunk; });
  res.on("end", function() {
    if (res.statusCode === 200) {
      console.log("Retention policy set:");
      console.log("  Max versions per package: " + retentionPolicy.countLimit);
      console.log("  Days to keep downloaded: " + retentionPolicy.daysToKeepRecentlyDownloadedPackages);
    } else {
      console.error("Failed (" + res.statusCode + "):", data);
    }
  });
});

req.write(body);
req.end();

Important Caveats

The feed-level retention policy has limitations you should know about:

  1. It applies per package, not per feed. If you set max versions to 30, each package keeps 30 versions. If you have 100 packages, that is up to 3,000 versions.
  2. Promoted versions in views are exempt. Packages promoted to the @Release or @Prerelease view are not deleted by retention policies, even if they are the oldest.
  3. Upstream cached packages follow their own rules. Packages cached from upstream sources are governed by the upstream source retention, not your feed retention.
  4. Deletion is effectively permanent. A version deleted by retention goes to the feed's Recycle Bin for 30 days and is then permanently removed. If someone depends on that exact version, their builds will break as soon as it disappears from the feed.
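Caveat 2 is worth honoring in custom cleanup scripts too: skip any version that has been promoted to a view. This sketch assumes the version objects returned by the package-versions API include a views array listing the views a version was promoted to -- verify the field against your API version before relying on it:

```javascript
// Skip promoted versions during cleanup. Assumes a "views" array on
// each version object (an assumption to verify against your API version).
function isPromoted(version) {
  return Array.isArray(version.views) && version.views.length > 0;
}

var versions = [
  { version: "1.0.0", views: [{ name: "Release" }] }, // promoted: keep
  { version: "1.1.0-beta.1", views: [] }              // not promoted: deletable
];

var deletable = versions.filter(function(v) { return !isPromoted(v); });
console.log(deletable.map(function(v) { return v.version; })); // [ '1.1.0-beta.1' ]
```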

Organization-Level Retention

Organization administrators can set a global retention policy that acts as a ceiling. Navigate to Organization Settings > Storage > Azure Artifacts to configure:

  • Maximum storage per organization: Set an upper bound on total storage
  • Default retention for new feeds: Automatically apply retention policies to newly created feeds

The organization policy overrides feed-level settings. If the org policy says maximum 20 versions per package and a feed says 50, the org policy wins.
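In code terms, the effective limit is simply the stricter of the two values. Azure DevOps applies this server-side; the function below is only an illustration of the rule:

```javascript
// Effective per-package version limit when both an organization-level
// and a feed-level policy exist: the stricter (smaller) value wins.
function effectiveCountLimit(orgLimit, feedLimit) {
  if (orgLimit == null) return feedLimit;
  if (feedLimit == null) return orgLimit;
  return Math.min(orgLimit, feedLimit);
}

console.log(effectiveCountLimit(20, 50));   // 20 -- the org ceiling wins
console.log(effectiveCountLimit(null, 50)); // 50 -- no org policy set
```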

Automated Cleanup Scripts

Built-in retention policies work for simple cases, but real-world cleanup needs are more nuanced. You want to:

  • Delete pre-release versions older than 14 days but keep all release versions
  • Remove packages from deleted branches
  • Clean up upstream cached packages that nobody has downloaded in 90 days
  • Run cleanup in dry-run mode first to see what would be deleted

Here is a comprehensive cleanup utility:

// artifact-cleanup.js -- Automated Azure Artifacts cleanup with dry-run support
var https = require("https");

var org = process.env.AZURE_DEVOPS_ORG || "my-organization";
var project = process.env.AZURE_DEVOPS_PROJECT || "my-project";
var feedId = process.argv[2];
var pat = process.env.AZURE_DEVOPS_PAT;
var dryRun = process.argv.indexOf("--dry-run") !== -1;
var maxAge = parseInt(process.argv[3]) || 30; // days

if (!pat) {
  console.error("Error: AZURE_DEVOPS_PAT is required");
  process.exit(1);
}

if (!feedId) {
  console.error("Usage: node artifact-cleanup.js <feedName> [maxAgeDays] [--dry-run]");
  console.error("Example: node artifact-cleanup.js dotnet-packages 30 --dry-run");
  process.exit(1);
}

var auth = Buffer.from(":" + pat).toString("base64");
var cutoffDate = new Date();
cutoffDate.setDate(cutoffDate.getDate() - maxAge);

var stats = {
  packagesScanned: 0,
  versionsScanned: 0,
  versionsToDelete: 0,
  versionsDeleted: 0,
  errors: 0,
  estimatedSavedMB: 0
};

function apiRequest(method, hostname, path, body, callback) {
  var options = {
    hostname: hostname,
    path: path,
    method: method,
    headers: {
      "Content-Type": "application/json",
      "Authorization": "Basic " + auth
    }
  };
  if (body) options.headers["Content-Length"] = Buffer.byteLength(body);

  var req = https.request(options, function(res) {
    var data = "";
    res.on("data", function(chunk) { data += chunk; });
    res.on("end", function() { callback(null, res.statusCode, data); });
  });
  req.on("error", function(err) { callback(err); });
  if (body) req.write(body);
  req.end();
}

function isPreRelease(version) {
  // NuGet: contains hyphen (1.0.0-beta.1)
  // Python: contains a, b, rc, dev (1.0.0a1, 1.0.0.dev1)
  // npm: contains hyphen (1.0.0-alpha.1)
  // Maven: contains SNAPSHOT
  return version.indexOf("-") !== -1 ||
    /\d+\.\d+\.\d+[a-z]/.test(version) ||
    version.indexOf("SNAPSHOT") !== -1 ||
    version.indexOf(".dev") !== -1;
}

function getPackages(continuationToken, allPackages, callback) {
  var path = "/" + org + "/" + project + "/_apis/packaging/feeds/" + feedId +
    "/packages?api-version=7.1&$top=100";
  if (continuationToken) {
    path += "&$skip=" + continuationToken;
  }

  apiRequest("GET", "feeds.dev.azure.com", path, null, function(err, status, data) {
    if (err) return callback(err);
    var result = JSON.parse(data);
    allPackages = allPackages.concat(result.value || []);

    if (result.value && result.value.length === 100) {
      getPackages((continuationToken || 0) + 100, allPackages, callback);
    } else {
      callback(null, allPackages);
    }
  });
}

function getPackageVersions(packageId, protocolType, callback) {
  var path = "/" + org + "/" + project + "/_apis/packaging/feeds/" + feedId +
    "/packages/" + packageId + "/versions?api-version=7.1&$top=500";

  apiRequest("GET", "feeds.dev.azure.com", path, null, function(err, status, data) {
    if (err) return callback(err);
    callback(null, JSON.parse(data));
  });
}

function deleteVersion(packageName, version, protocolType, callback) {
  var protocol = protocolType.toLowerCase();
  var path = "/" + org + "/" + project + "/_apis/packaging/feeds/" + feedId +
    "/" + protocol + "/packages/" + packageName + "/versions/" + version + "?api-version=7.1";

  if (dryRun) {
    console.log("  [DRY RUN] Would delete " + packageName + "@" + version);
    stats.versionsToDelete++;
    return callback(null);
  }

  // A hard DELETE frees storage. A PATCH with { listed: false } would only
  // unlist the version, which hides it from search but keeps the storage.
  apiRequest("DELETE", "pkgs.dev.azure.com", path, null, function(err, status) {
    if (err) {
      stats.errors++;
      return callback(err);
    }
    if (status === 200 || status === 202 || status === 204) {
      console.log("  Deleted " + packageName + "@" + version);
      stats.versionsDeleted++;
    } else {
      console.error("  Failed to delete " + packageName + "@" + version + " (" + status + ")");
      stats.errors++;
    }
    callback(null);
  });
}

function processPackage(pkg, callback) {
  stats.packagesScanned++;
  var protocolType = pkg.protocolType || "nuget";

  getPackageVersions(pkg.id, protocolType, function(err, result) {
    if (err) {
      console.error("Error getting versions for " + pkg.name + ":", err.message);
      stats.errors++;
      return callback();
    }

    var versions = result.value || [];
    stats.versionsScanned += versions.length;

    var toDelete = versions.filter(function(v) {
      var publishDate = new Date(v.publishDate);
      var isOld = publishDate < cutoffDate;
      var isPreRel = isPreRelease(v.version);

      // Delete pre-release versions older than the cutoff
      // Keep all release versions
      return isOld && isPreRel;
    });

    if (toDelete.length === 0) return callback();

    console.log(pkg.name + ": " + toDelete.length + " pre-release versions to clean up");

    var index = 0;
    function next() {
      if (index >= toDelete.length) return callback();
      var v = toDelete[index++];
      deleteVersion(pkg.name, v.version, protocolType, function() {
        // Throttle API calls
        setTimeout(next, 200);
      });
    }
    next();
  });
}

function runCleanup() {
  console.log("Azure Artifacts Cleanup");
  console.log("=======================");
  console.log("Feed: " + feedId);
  console.log("Max age for pre-release versions: " + maxAge + " days");
  console.log("Cutoff date: " + cutoffDate.toISOString().split("T")[0]);
  console.log("Mode: " + (dryRun ? "DRY RUN (no deletions)" : "LIVE (will delete)"));
  console.log("");

  getPackages(0, [], function(err, packages) {
    if (err) {
      console.error("Error fetching packages:", err.message);
      process.exit(1);
    }

    console.log("Found " + packages.length + " packages to scan");
    console.log("");

    var index = 0;
    function processNext() {
      if (index >= packages.length) {
        printSummary();
        return;
      }
      processPackage(packages[index++], function() {
        processNext();
      });
    }
    processNext();
  });
}

function printSummary() {
  console.log("");
  console.log("Cleanup Summary");
  console.log("===============");
  console.log("Packages scanned: " + stats.packagesScanned);
  console.log("Versions scanned: " + stats.versionsScanned);
  if (dryRun) {
    console.log("Versions to delete: " + stats.versionsToDelete);
    console.log("");
    console.log("Run without --dry-run to perform deletions.");
  } else {
    console.log("Versions deleted: " + stats.versionsDeleted);
  }
  console.log("Errors: " + stats.errors);
}

runCleanup();

Run it:

# Dry run first -- see what would be deleted
node artifact-cleanup.js dotnet-packages 30 --dry-run

# Output:
# Azure Artifacts Cleanup
# =======================
# Feed: dotnet-packages
# Max age for pre-release versions: 30 days
# Cutoff date: 2026-01-10
# Mode: DRY RUN (no deletions)
#
# Found 47 packages to scan
#
# MyCompany.Utilities: 8 pre-release versions to clean up
#   [DRY RUN] Would delete MyCompany.Utilities@1.4.0-ci.118
#   [DRY RUN] Would delete MyCompany.Utilities@1.4.0-ci.117
#   ...
# MyCompany.Auth: 3 pre-release versions to clean up
#   [DRY RUN] Would delete MyCompany.Auth@2.1.0-beta.4
#   ...
#
# Cleanup Summary
# ===============
# Packages scanned: 47
# Versions scanned: 312
# Versions to delete: 11
# Errors: 0
#
# Run without --dry-run to perform deletions.

# Satisfied with the preview? Run for real:
node artifact-cleanup.js dotnet-packages 30

Monitoring Storage Over Time

Set up a scheduled pipeline that reports storage usage weekly. This catches storage growth before it surprises you on the bill:

# azure-pipelines-storage-report.yml
schedules:
  - cron: "0 8 * * 1"  # Every Monday at 8 AM
    displayName: Weekly storage report
    branches:
      include:
        - main
    always: true

pool:
  vmImage: ubuntu-latest

steps:
  - task: NodeTool@0
    inputs:
      versionSpec: 18.x

  - script: node scripts/check-storage.js
    displayName: Generate storage report
    env:
      AZURE_DEVOPS_PAT: $(System.AccessToken)  # the build identity needs at least Reader access on the feeds
      AZURE_DEVOPS_ORG: my-organization  # org name only -- $(System.CollectionUri) is a full URL, which the script does not parse

For a more detailed storage breakdown by package, use this extended report script:

// storage-report.js -- Detailed storage analysis per package
var https = require("https");

var org = process.env.AZURE_DEVOPS_ORG || "my-organization";
var project = process.env.AZURE_DEVOPS_PROJECT || "my-project";
var pat = process.env.AZURE_DEVOPS_PAT;
var auth = Buffer.from(":" + pat).toString("base64");

function apiRequest(hostname, path, callback) {
  var options = {
    hostname: hostname,
    path: path,
    method: "GET",
    headers: {
      "Authorization": "Basic " + auth,
      "Accept": "application/json"
    }
  };
  var req = https.request(options, function(res) {
    var data = "";
    res.on("data", function(chunk) { data += chunk; });
    res.on("end", function() { callback(null, res.statusCode, data); });
  });
  req.on("error", function(err) { callback(err); });
  req.end();
}

function getFeeds(callback) {
  var path = "/" + org + "/_apis/packaging/feeds?api-version=7.1";
  apiRequest("feeds.dev.azure.com", path, function(err, status, data) {
    if (err) return callback(err);
    callback(null, JSON.parse(data).value);
  });
}

function getPackages(feed, callback) {
  // Project-scoped feeds require the project segment in the route
  var projectSegment = feed.project ? "/" + feed.project.name : "";
  var path = "/" + org + projectSegment + "/_apis/packaging/feeds/" + feed.id +
    "/packages?api-version=7.1&$top=500&includeAllVersions=true";
  apiRequest("feeds.dev.azure.com", path, function(err, status, data) {
    if (err) return callback(err);
    callback(null, JSON.parse(data).value || []);
  });
}

function analyzeFeed(feed, callback) {
  getPackages(feed, function(err, packages) {
    if (err) return callback(err);

    var analysis = {
      name: feed.name,
      packageCount: packages.length,
      totalVersions: 0,
      preReleaseVersions: 0,
      releaseVersions: 0,
      oldestVersion: null,
      newestVersion: null
    };

    packages.forEach(function(pkg) {
      var versions = pkg.versions || [];
      analysis.totalVersions += versions.length;

      versions.forEach(function(v) {
        if (v.version.indexOf("-") !== -1 || v.version.indexOf("SNAPSHOT") !== -1) {
          analysis.preReleaseVersions++;
        } else {
          analysis.releaseVersions++;
        }

        var publishDate = new Date(v.publishDate);
        if (!analysis.oldestVersion || publishDate < new Date(analysis.oldestVersion)) {
          analysis.oldestVersion = v.publishDate;
        }
        if (!analysis.newestVersion || publishDate > new Date(analysis.newestVersion)) {
          analysis.newestVersion = v.publishDate;
        }
      });
    });

    callback(null, analysis);
  });
}

getFeeds(function(err, feeds) {
  if (err) {
    console.error("Error:", err.message);
    process.exit(1);
  }

  console.log("===========================================");
  console.log("  Azure Artifacts Storage Analysis Report");
  console.log("  Generated: " + new Date().toISOString().split("T")[0]);
  console.log("===========================================");
  console.log("");

  var completed = 0;
  var grandTotals = {
    packages: 0,
    versions: 0,
    preRelease: 0,
    release: 0
  };

  feeds.forEach(function(feed) {
    analyzeFeed(feed, function(err, analysis) {
      completed++;
      if (err) {
        console.error("Error analyzing " + feed.name + ":", err.message);
      } else {
        grandTotals.packages += analysis.packageCount;
        grandTotals.versions += analysis.totalVersions;
        grandTotals.preRelease += analysis.preReleaseVersions;
        grandTotals.release += analysis.releaseVersions;

        console.log("Feed: " + analysis.name);
        console.log("  Packages: " + analysis.packageCount);
        console.log("  Total versions: " + analysis.totalVersions);
        console.log("  Release versions: " + analysis.releaseVersions);
        console.log("  Pre-release versions: " + analysis.preReleaseVersions);
        if (analysis.oldestVersion) {
          console.log("  Oldest: " + analysis.oldestVersion.split("T")[0]);
          console.log("  Newest: " + analysis.newestVersion.split("T")[0]);
        }
        console.log("");
      }

      if (completed === feeds.length) {
        console.log("===========================================");
        console.log("  Grand Totals");
        console.log("===========================================");
        console.log("  Total packages: " + grandTotals.packages);
        console.log("  Total versions: " + grandTotals.versions);
        console.log("  Release versions: " + grandTotals.release);
        console.log("  Pre-release versions: " + grandTotals.preRelease);
        console.log("");

        var preRelPct = grandTotals.versions > 0 ?
          (grandTotals.preRelease / grandTotals.versions * 100).toFixed(1) : 0;
        console.log("  Pre-release versions are " + preRelPct + "% of total storage");

        if (parseFloat(preRelPct) > 50) {
          console.log("  >> WARNING: More than half your storage is pre-release versions.");
          console.log("  >> Consider setting up retention policies or running cleanup.");
        }
      }
    });
  });
});

Cost Optimization Strategies

Strategy 1: Aggressive Pre-Release Retention

Pre-release versions are the biggest storage consumers in most organizations. Every CI build from every feature branch produces one, and nobody cleans them up. Set the feed retention to keep at most 5-10 pre-release versions per package, with a 7-day download window.

Strategy 2: Separate Feeds for Different Lifecycles

Create separate feeds with different retention policies:

Feed               Purpose                   Max Versions    Retention
dev-packages       CI builds, pre-release    5               7 days
release-packages   Promoted releases         50              180 days
stable-packages    LTS / widely-used         Unlimited       Unlimited

This way, aggressive cleanup on dev-packages does not risk deleting a release version that production depends on.

Strategy 3: Clean Up Upstream Cached Packages

Upstream cached packages (from nuget.org, PyPI, Maven Central, npmjs) accumulate silently. Your feed caches every public package your developers ever installed. Most of these are transient dependencies that nobody directly references.

You cannot selectively delete upstream cached packages through the UI, but you can manage them through the REST API. A practical approach: periodically create a new feed with upstream sources, migrate your internal packages to it, and delete the old feed. The new feed will re-cache public packages on demand, keeping only the ones currently in use.

Strategy 4: Version Numbering Discipline

The single most impactful change is publishing fewer versions:

  • Do not publish from every CI build. Only publish from main branch and release branches.
  • Use pre-release versions for feature branch builds -- these get cleaned up by retention policies.
  • Consolidate patch releases. If you published 1.0.1, 1.0.2, 1.0.3 in one week, consider whether 1.0.1 and 1.0.2 still need to exist.
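One way to enforce this discipline in a build script is to derive the published version from the branch: clean versions from main, pre-release labels everywhere else so retention can reclaim them later. The labeling scheme below is illustrative, not a NuGet or npm requirement:

```javascript
// build-version.js -- branch-aware version labeling (illustrative scheme)
function buildVersion(baseVersion, branch, buildId) {
  if (branch === "main") return baseVersion; // releases get clean versions
  // Sanitize the branch name into a valid pre-release label
  var label = branch.replace(/[^0-9A-Za-z-]/g, "-").toLowerCase();
  return baseVersion + "-" + label + "." + buildId;
}

console.log(buildVersion("1.4.0", "main", 512));             // 1.4.0
console.log(buildVersion("1.4.0", "feature/new-auth", 512)); // 1.4.0-feature-new-auth.512
```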

Strategy 5: Pipeline Artifact Retention vs Feed Retention

Do not confuse Azure Pipelines artifact retention with Azure Artifacts feed retention. They are separate systems:

  • Pipeline artifacts (produced by PublishBuildArtifacts@1) are governed by pipeline retention policies. They are stored in pipeline storage, not Azure Artifacts.
  • Feed packages (produced by dotnet nuget push, npm publish, etc.) are governed by feed retention policies. They live in Azure Artifacts storage.

If you are publishing to a feed from your pipeline, the package exists in both places. Set pipeline artifact retention to a short duration (7-14 days) since the permanent copy is in the feed.

Complete Working Example

This Node.js utility combines storage analysis, cleanup, and scheduling into a single tool:

// artifact-manager.js -- Complete artifact lifecycle management tool
// Note: the API routes below assume organization-scoped feeds. For a
// project-scoped feed, insert "/<project>" after the org segment in each path.
var https = require("https");

var config = {
  org: process.env.AZURE_DEVOPS_ORG || "my-organization",
  pat: process.env.AZURE_DEVOPS_PAT,
  dryRun: process.argv.indexOf("--dry-run") !== -1
};

if (!config.pat) {
  console.error("Error: AZURE_DEVOPS_PAT environment variable is required");
  process.exit(1);
}

var auth = Buffer.from(":" + config.pat).toString("base64");

function apiRequest(method, hostname, path, body, callback) {
  var options = {
    hostname: hostname,
    path: path,
    method: method,
    headers: {
      "Content-Type": "application/json",
      "Authorization": "Basic " + auth
    }
  };
  if (body) options.headers["Content-Length"] = Buffer.byteLength(body);

  var req = https.request(options, function(res) {
    var data = "";
    res.on("data", function(chunk) { data += chunk; });
    res.on("end", function() { callback(null, res.statusCode, data); });
  });
  req.on("error", function(err) { callback(err); });
  if (body) req.write(body);
  req.end();
}

function getAllFeeds(callback) {
  var path = "/" + config.org + "/_apis/packaging/feeds?api-version=7.1";
  apiRequest("GET", "feeds.dev.azure.com", path, null, function(err, status, data) {
    if (err) return callback(err);
    callback(null, JSON.parse(data).value || []);
  });
}

function getFeedPackages(feedId, skip, allPackages, callback) {
  var path = "/" + config.org + "/_apis/packaging/feeds/" + feedId +
    "/packages?api-version=7.1&$top=100&$skip=" + skip;
  apiRequest("GET", "feeds.dev.azure.com", path, null, function(err, status, data) {
    if (err) return callback(err);
    var result = JSON.parse(data);
    allPackages = allPackages.concat(result.value || []);
    if (result.value && result.value.length === 100) {
      getFeedPackages(feedId, skip + 100, allPackages, callback);
    } else {
      callback(null, allPackages);
    }
  });
}

function getVersions(feedId, packageId, callback) {
  var path = "/" + config.org + "/_apis/packaging/feeds/" + feedId +
    "/packages/" + packageId + "/versions?api-version=7.1&$top=500";
  apiRequest("GET", "feeds.dev.azure.com", path, null, function(err, status, data) {
    if (err) return callback(err);
    callback(null, JSON.parse(data).value || []);
  });
}

function unlistVersion(feedId, protocol, pkgName, version, callback) {
  // Despite the name, this issues a hard DELETE: a PATCH with
  // { listed: false } would only unlist the version, hiding it from
  // search without freeing any storage.
  var path = "/" + config.org + "/_apis/packaging/feeds/" + feedId +
    "/" + protocol.toLowerCase() + "/packages/" + pkgName +
    "/versions/" + version + "?api-version=7.1";

  if (config.dryRun) {
    return callback(null, true);
  }

  apiRequest("DELETE", "pkgs.dev.azure.com", path, null, function(err, status) {
    callback(err, status === 200 || status === 202 || status === 204);
  });
}

// Command: report
function runReport() {
  getAllFeeds(function(err, feeds) {
    if (err) return console.error("Error:", err.message);

    console.log("Storage Report - " + new Date().toISOString().split("T")[0]);
    console.log("============================================");

    var pending = feeds.length;
    if (pending === 0) {
      console.log("No feeds found.");
      return;
    }

    feeds.forEach(function(feed) {
      getFeedPackages(feed.id, 0, [], function(err, packages) {
        pending--;

        if (!err) {
          var totalVersions = 0;
          packages.forEach(function(p) {
            totalVersions += (p.versions || []).length;
          });
          console.log(feed.name + ": " + packages.length + " packages, " +
            totalVersions + " versions");
        }

        if (pending === 0) {
          console.log("============================================");
        }
      });
    });
  });
}

// Command: cleanup
function runCleanup(feedName, maxAgeDays) {
  var cutoff = new Date();
  cutoff.setDate(cutoff.getDate() - maxAgeDays);

  console.log("Cleanup: " + feedName);
  console.log("Removing pre-release versions older than " + maxAgeDays + " days");
  console.log("Mode: " + (config.dryRun ? "DRY RUN" : "LIVE"));
  console.log("");

  var feedPath = "/" + config.org + "/_apis/packaging/feeds/" + feedName + "?api-version=7.1";
  apiRequest("GET", "feeds.dev.azure.com", feedPath, null, function(err, status, data) {
    if (err || status !== 200) {
      console.error("Feed not found: " + feedName);
      return;
    }

    var feed = JSON.parse(data);
    getFeedPackages(feed.id, 0, [], function(err, packages) {
      if (err) return console.error("Error:", err.message);

      var totalDeleted = 0;
      var index = 0;

      function processNext() {
        if (index >= packages.length) {
          console.log("");
          console.log("Total versions " + (config.dryRun ? "to delete" : "deleted") +
            ": " + totalDeleted);
          return;
        }

        var pkg = packages[index++];
        getVersions(feed.id, pkg.id, function(err, versions) {
          if (err) return processNext();

          var stale = versions.filter(function(v) {
            var isOld = new Date(v.publishDate) < cutoff;
            var isPreRel = v.version.indexOf("-") !== -1 ||
              v.version.indexOf("SNAPSHOT") !== -1 ||
              v.version.indexOf(".dev") !== -1;
            return isOld && isPreRel;
          });

          if (stale.length === 0) return processNext();

          var protocol = pkg.protocolType || "nuget";
          var vIdx = 0;

          function deleteNext() {
            if (vIdx >= stale.length) return processNext();
            var v = stale[vIdx++];
            var prefix = config.dryRun ? "[DRY RUN] " : "";
            console.log(prefix + "Delete " + pkg.name + "@" + v.version);
            totalDeleted++;

            unlistVersion(feed.id, protocol, pkg.name, v.version, function() {
              setTimeout(deleteNext, 100);
            });
          }
          deleteNext();
        });
      }
      processNext();
    });
  });
}

// Route commands
var command = process.argv[2];

switch (command) {
  case "report":
    runReport();
    break;
  case "cleanup":
    var feedName = process.argv[3];
    var maxAge = parseInt(process.argv[4]) || 30;
    if (!feedName) {
      console.error("Usage: node artifact-manager.js cleanup <feedName> [maxAgeDays] [--dry-run]");
      process.exit(1);
    }
    runCleanup(feedName, maxAge);
    break;
  default:
    console.log("Azure Artifacts Manager");
    console.log("");
    console.log("Commands:");
    console.log("  report                              Show storage usage for all feeds");
    console.log("  cleanup <feed> [days] [--dry-run]   Remove old pre-release versions");
    console.log("");
    console.log("Environment:");
    console.log("  AZURE_DEVOPS_PAT   Personal Access Token (required)");
    console.log("  AZURE_DEVOPS_ORG   Organization name (default: my-organization)");
}

Run it:

# Storage report
node artifact-manager.js report

# Cleanup with dry run
node artifact-manager.js cleanup dotnet-packages 30 --dry-run

# Cleanup for real
node artifact-manager.js cleanup dotnet-packages 30

Common Issues and Troubleshooting

1. Storage Usage Not Decreasing After Deletion

Symptom: You deleted packages, but the storage meter in Organization Settings has not changed.

Azure Artifacts recalculates storage usage asynchronously, so it can take up to 24 hours for deletions to show up in the billing dashboard. Deleted packages also land in the feed's Recycle Bin first and keep counting against storage until they are permanently removed -- automatically after 30 days, or immediately if you empty the Recycle Bin. Finally, verify you are actually deleting rather than unlisting: unlisting hides the package from search but does not free storage.

2. Retention Policy Not Deleting Old Versions

Symptom: You set max versions to 10 but a package still has 50 versions.

Feed retention policies run on a schedule, not immediately. It can take up to 24 hours for the policy to process all packages in a feed. Also check if the old versions were promoted to a view (@Release, @Prerelease) -- promoted versions are exempt from retention.
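
When auditing why a version survived retention, check its views. A minimal sketch, assuming the package-version details payload exposes a `views` array (as the Packaging REST API does for package versions):

```javascript
// Sketch: decide whether a version is exempt from retention because it was
// promoted to a feed view. The payload shape is an assumption based on the
// package-version details endpoint.
function isProtectedByView(versionDetails) {
  var views = versionDetails.views || [];
  return views.some(function (v) {
    return v.name === "Release" || v.name === "Prerelease";
  });
}
```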

3. Build Breaks After Cleanup Deletes a Dependency

Error:

error NU1101: Unable to find package MyCompany.Utilities version 1.0.3

A build references a specific version that was deleted by cleanup. This is why the daysToKeepRecentlyDownloadedPackages setting matters -- it protects versions that are actively in use. Set this to at least 30 days. For release versions, never delete them through automated cleanup -- only clean up pre-release versions automatically.
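
A sketch of the request that sets both limits on a feed. The field names (countLimit, daysToKeepRecentlyDownloadedPackages) follow the Azure DevOps retention-policies API, but confirm them against the current api-version before automating this:

```javascript
// Sketch: request shape for configuring a feed retention policy.
// Field names assume the retention-policies endpoint of the Azure DevOps
// REST API; verify before use.
function buildRetentionPolicyRequest(org, feedId, maxVersions, protectDays) {
  return {
    method: "PUT",
    url: "https://feeds.dev.azure.com/" + org +
      "/_apis/packaging/feeds/" + feedId +
      "/retentionpolicies?api-version=7.1-preview.1",
    body: {
      countLimit: maxVersions,
      // Versions downloaded within this window are never deleted,
      // which is what protects versions still in active use.
      daysToKeepRecentlyDownloadedPackages: protectDays
    }
  };
}
```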

4. Upstream Cached Packages Consuming Excessive Storage

Symptom: Your storage is growing even though you are not publishing new internal packages.

Upstream cached packages accumulate as developers install new public dependencies. Every unique version of every transitive dependency gets cached. Manage this by periodically reviewing cached packages and deleting stale ones, and consider disabling upstream sources on feeds that do not need them.

5. Cannot Delete Packages -- 403 Forbidden

Error:

403 Forbidden: The current user does not have permission to delete packages from this feed.

Package deletion requires Owner permission on the feed, not just Contributor. For automated cleanup scripts, create a dedicated service account PAT with Owner access on the specific feeds it manages.

6. Retention Policy Deletes Packages Still Referenced by Lock Files

Symptom: Package restore fails in CI because a pinned version no longer exists in the feed.

Lock files (package-lock.json, packages.lock.json, poetry.lock) pin exact versions. If retention deletes that version, builds break. Set daysToKeepRecentlyDownloadedPackages high enough to cover your longest-lived branch. Or better yet, do not rely on retention for release versions -- only apply aggressive retention to pre-release versions.

Best Practices

  1. Set retention policies on every feed from day one. Do not wait until storage costs surprise you. A reasonable default is 30 maximum versions per package with 30 days of download protection.

  2. Separate pre-release and release packages into different feeds. Apply aggressive retention (5 versions, 7 days) to pre-release feeds and conservative retention (unlimited versions) to release feeds.

  3. Run cleanup scripts on a schedule, not manually. Automate cleanup as a weekly pipeline. Manual cleanup happens once after the bill shocks someone, then never again.

  4. Always run cleanup in dry-run mode first. The cost of re-publishing a deleted package is high (broken builds, wasted developer time). Preview deletions before executing them.

  5. Monitor storage weekly. Set up a scheduled pipeline that reports storage usage and trends. Catch growth early before it becomes a billing problem.

  6. Do not publish from every CI build. Only publish packages from main/release branches. Feature branch builds should use pipeline artifacts for testing, not feed packages. This is the single highest-impact change for reducing storage consumption.

  7. Use feed views instead of separate feeds when possible. Feed views (@Release, @Prerelease) give you the same lifecycle management as separate feeds but with shared storage. Promoted versions in views are protected from retention policies.

  8. Clean up feeds for decommissioned projects. When a project is archived, delete its feed entirely rather than letting packages accumulate indefinitely.

  9. Set the organization-level storage budget. Configure an upper bound on Azure Artifacts storage in Organization Settings. This prevents runaway costs from misconfigured feeds or forgotten projects.

  10. Document your retention strategy. Write down which feeds exist, what their retention policies are, and why. New team members need to understand why their pre-release build from last month is gone.
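
The weekly monitoring in practice 5 boils down to comparing snapshots of per-feed usage. A minimal sketch, where the snapshot shape ({ feedName: bytes }) and the alert threshold are assumptions for illustration:

```javascript
// Sketch: flag feeds whose storage grew more than a threshold between two
// weekly snapshots. Snapshot shape is an assumption for illustration.
function findGrowingFeeds(lastWeek, thisWeek, thresholdPct) {
  var alerts = [];
  Object.keys(thisWeek).forEach(function (feed) {
    var before = lastWeek[feed] || 0;
    var after = thisWeek[feed];
    if (before === 0) return; // new feed: no baseline to compare against
    var growth = ((after - before) / before) * 100;
    if (growth > thresholdPct) {
      alerts.push({ feed: feed, growthPct: Math.round(growth) });
    }
  });
  return alerts;
}
```

Feed the report command's output into this each week and post the alerts to a team channel.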
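
Practice 6 is usually enforced with a branch check ahead of the publish step. A sketch of the guard; the branch ref conventions here are assumptions, so match them to your repository:

```javascript
// Sketch: publish packages only from main or release/* branches.
// Branch ref naming is an assumption; adjust to your conventions.
function shouldPublish(branchRef) {
  return branchRef === "refs/heads/main" ||
    branchRef.indexOf("refs/heads/release/") === 0;
}
```

In a pipeline, call this with the built-in source-branch variable and skip the publish step when it returns false.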
