Migrating from Nexus/Artifactory to Azure Artifacts

A practical guide to migrating package repositories from Sonatype Nexus and JFrog Artifactory to Azure Artifacts, covering planning, package export, client reconfiguration, pipeline updates, and rollback strategies.

Overview

Migrating from a self-hosted package repository like Sonatype Nexus or JFrog Artifactory to Azure Artifacts eliminates infrastructure management, integrates directly with Azure DevOps pipelines, and consolidates your toolchain. But the migration itself is not trivial -- you are moving potentially thousands of package versions across multiple ecosystems while keeping builds running and developers productive. A botched migration means broken CI pipelines, failed deployments, and angry developers who cannot install dependencies.

I have led three Nexus-to-Azure-Artifacts migrations and two Artifactory-to-Azure-Artifacts migrations for organizations of varying sizes. The migrations that went smoothly had one thing in common: they ran both systems in parallel for at least two weeks before cutting over. The migrations that caused outages tried to do a hard cutover over a weekend. This article covers the complete migration process, from assessment and planning through execution, validation, and rollback.

Prerequisites

  • Administrative access to your existing Nexus or Artifactory instance
  • Azure DevOps organization with Azure Artifacts enabled
  • Azure DevOps Personal Access Token (PAT) with Packaging (Read & Write) scope
  • Node.js 18+ for the migration scripts
  • Access to modify pipeline YAML files and client configuration (nuget.config, .npmrc, pip.conf)
  • A maintenance window for the final cutover (even with parallel running)

Migration Assessment

Before writing any scripts, audit what you have. Most organizations underestimate the scope of their package repositories.

Inventory Checklist

Run this assessment against your existing repository:

// migration-assessment.js -- Audit your existing package repository
var https = require("https");
var http = require("http");

// Configure for your existing repository
var repoConfig = {
  type: process.argv[2] || "nexus", // "nexus" or "artifactory"
  baseUrl: process.env.REPO_URL || "http://nexus.internal:8081",
  username: process.env.REPO_USER || "admin",
  password: process.env.REPO_PASS
};

if (!repoConfig.password) {
  console.error("Error: REPO_PASS environment variable is required");
  process.exit(1);
}

var auth = Buffer.from(repoConfig.username + ":" + repoConfig.password).toString("base64");

function apiGet(url, callback) {
  var parsed = new URL(url);
  var client = parsed.protocol === "https:" ? https : http;

  var options = {
    hostname: parsed.hostname,
    port: parsed.port,
    path: parsed.pathname + parsed.search,
    method: "GET",
    headers: {
      "Authorization": "Basic " + auth,
      "Accept": "application/json"
    }
  };

  var req = client.request(options, function(res) {
    var data = "";
    res.on("data", function(chunk) { data += chunk; });
    res.on("end", function() { callback(null, res.statusCode, data); });
  });
  req.on("error", function(err) { callback(err); });
  req.end();
}

function assessNexus() {
  var nexusApi = repoConfig.baseUrl + "/service/rest/v1";

  // List repositories
  apiGet(nexusApi + "/repositories", function(err, status, data) {
    if (err) return console.error("Error connecting to Nexus:", err.message);
    if (status !== 200) return console.error("Nexus API error (" + status + "):", data);

    var repos = JSON.parse(data);

    console.log("Migration Assessment Report");
    console.log("===========================");
    console.log("Source: Nexus (" + repoConfig.baseUrl + ")");
    console.log("Date: " + new Date().toISOString().split("T")[0]);
    console.log("");

    var summary = {
      totalRepos: repos.length,
      byFormat: {},
      byType: { hosted: 0, proxy: 0, group: 0 }
    };

    repos.forEach(function(repo) {
      var format = repo.format || "unknown";
      var type = repo.type || "unknown";

      summary.byFormat[format] = (summary.byFormat[format] || 0) + 1;
      if (summary.byType[type] !== undefined) summary.byType[type]++;

      console.log(repo.name);
      console.log("  Format: " + format);
      console.log("  Type: " + type);
      console.log("  URL: " + repo.url);
      console.log("");
    });

    console.log("Summary");
    console.log("-------");
    console.log("Total repositories: " + summary.totalRepos);
    console.log("By format:");
    Object.keys(summary.byFormat).forEach(function(f) {
      console.log("  " + f + ": " + summary.byFormat[f]);
    });
    console.log("By type:");
    console.log("  Hosted (to migrate): " + summary.byType.hosted);
    console.log("  Proxy (replaced by upstreams): " + summary.byType.proxy);
    console.log("  Group (replaced by feed config): " + summary.byType.group);
    console.log("");
    console.log("Migration scope: " + summary.byType.hosted + " hosted repositories");
    console.log("Note: Proxy repos are replaced by Azure Artifacts upstream sources");
    console.log("Note: Group repos are replaced by feed + upstream configuration");
  });
}

function assessArtifactory() {
  var artApi = repoConfig.baseUrl + "/api";

  apiGet(artApi + "/repositories", function(err, status, data) {
    if (err) return console.error("Error connecting to Artifactory:", err.message);
    if (status !== 200) return console.error("Artifactory API error (" + status + "):", data);

    var repos = JSON.parse(data);

    console.log("Migration Assessment Report");
    console.log("===========================");
    console.log("Source: Artifactory (" + repoConfig.baseUrl + ")");
    console.log("Date: " + new Date().toISOString().split("T")[0]);
    console.log("");

    var summary = { totalRepos: repos.length, byPackageType: {}, byRclass: {} };

    repos.forEach(function(repo) {
      var pkgType = repo.packageType || "unknown";
      var rclass = repo.rclass || "unknown";

      summary.byPackageType[pkgType] = (summary.byPackageType[pkgType] || 0) + 1;
      summary.byRclass[rclass] = (summary.byRclass[rclass] || 0) + 1;

      console.log(repo.key);
      console.log("  Package type: " + pkgType);
      console.log("  Repository class: " + rclass);
      console.log("  URL: " + repo.url);
      console.log("");
    });

    console.log("Summary");
    console.log("-------");
    console.log("Total repositories: " + summary.totalRepos);
    console.log("By package type: " + JSON.stringify(summary.byPackageType));
    console.log("By class: " + JSON.stringify(summary.byRclass));
  });
}

if (repoConfig.type === "nexus") {
  assessNexus();
} else {
  assessArtifactory();
}
Run the assessment:

REPO_URL=http://nexus.internal:8081 REPO_PASS=admin123 node migration-assessment.js nexus

What Migrates and What Does Not

Nexus/Artifactory Feature        Azure Artifacts Equivalent
-------------------------        --------------------------
Hosted repositories              Azure Artifacts feeds
Proxy repositories               Upstream sources on feeds
Group repositories               Feed with multiple upstreams
Repository permissions           Feed roles (Reader/Contributor/Owner)
Cleanup policies                 Feed retention policies
NuGet/npm/Maven/PyPI support     Native support for all four
Docker registries                Azure Container Registry (separate service)
Raw/generic repositories         Universal Packages
LDAP/AD authentication           Azure AD integration
Webhooks                         Azure DevOps service hooks
Build promotion                  Feed views (@Release, @Prerelease)

Does not migrate directly:

  • Docker images (migrate to Azure Container Registry)
  • Helm charts (migrate to ACR or use helm-specific tools)
  • Custom repository formats
  • Repository-level statistics and download counts
  • Routing rules and virtual repositories
  • Cleanup policy configurations (must be recreated)

Migration Strategy: Parallel Running

The safest migration approach runs both systems simultaneously:

Phase 1: Setup (1 day)

  • Create Azure Artifacts feeds matching your Nexus/Artifactory hosted repositories
  • Configure upstream sources (replaces proxy repositories)
  • Set up authentication

Phase 2: Package Migration (1-3 days)

  • Export packages from Nexus/Artifactory
  • Import into Azure Artifacts
  • Validate package integrity

Phase 3: Parallel Running (2-4 weeks)

  • Point new builds at Azure Artifacts
  • Existing builds continue using Nexus/Artifactory
  • Publish new packages to both systems

Phase 4: Cutover (1 day)

  • Update all remaining client configurations
  • Update all pipeline definitions
  • Stop publishing to Nexus/Artifactory

Phase 5: Decommission (after 30 days)

  • Remove Nexus/Artifactory from network
  • Archive the server for compliance if needed
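
The Phase 4 cutover hinges on finding every client config and pipeline that still points at the old server. A minimal sketch of that check (the findStaleReferences helper and the sample file map are illustrative, not part of any existing tool):

```javascript
// find-stale-refs.js -- Flag files that still reference the old repository host
// (hypothetical helper: feed it the contents of .npmrc, nuget.config, pipeline YAML, etc.)
function findStaleReferences(fileContents, oldHost) {
  return Object.keys(fileContents).filter(function(filePath) {
    return fileContents[filePath].indexOf(oldHost) !== -1;
  });
}

var stale = findStaleReferences({
  ".npmrc": "registry=http://nexus.internal:8081/repository/npm-group/",
  "nuget.config": "https://pkgs.dev.azure.com/my-org/my-project/_packaging/nuget-packages/nuget/v3/index.json",
  "azure-pipelines.yml": "feedUrl: http://nexus.internal:8081/repository/nuget-hosted"
}, "nexus.internal");

console.log(stale); // files that still need updating before cutover
```

Running this over all checked-out repositories before the cutover window turns "did we miss anything?" into a concrete list.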

Exporting Packages

Exporting from Nexus 3

Nexus 3 stores packages on the filesystem. The export approach depends on the repository format:

// export-nexus-npm.js -- Export npm packages from Nexus
var https = require("https");
var http = require("http");
var fs = require("fs");
var path = require("path");

var nexusUrl = process.env.REPO_URL || "http://nexus.internal:8081";
var repoName = process.argv[2] || "npm-hosted";
var outputDir = process.argv[3] || "./export-npm";
var username = process.env.REPO_USER || "admin";
var password = process.env.REPO_PASS;

if (!password) {
  console.error("Error: REPO_PASS is required");
  process.exit(1);
}

var auth = Buffer.from(username + ":" + password).toString("base64");

function nexusGet(apiPath, callback) {
  var parsed = new URL(nexusUrl + apiPath);
  var client = parsed.protocol === "https:" ? https : http;
  var options = {
    hostname: parsed.hostname,
    port: parsed.port,
    path: parsed.pathname + parsed.search,
    method: "GET",
    headers: { "Authorization": "Basic " + auth, "Accept": "application/json" }
  };
  var req = client.request(options, function(res) {
    var data = "";
    res.on("data", function(chunk) { data += chunk; });
    res.on("end", function() { callback(null, res.statusCode, data); });
  });
  req.on("error", callback);
  req.end();
}

function downloadFile(url, destPath, callback) {
  var parsed = new URL(url);
  var client = parsed.protocol === "https:" ? https : http;
  var options = {
    hostname: parsed.hostname,
    port: parsed.port,
    path: parsed.pathname,
    headers: { "Authorization": "Basic " + auth }
  };

  var file = fs.createWriteStream(destPath);
  var req = client.get(options, function(res) {
    // Guard against saving an HTML error page as a tarball
    if (res.statusCode !== 200) {
      file.close();
      return fs.unlink(destPath, function() {
        callback(new Error("HTTP " + res.statusCode + " for " + url));
      });
    }
    res.pipe(file);
    file.on("finish", function() { file.close(callback); });
  });
  req.on("error", function(err) {
    file.close();
    fs.unlink(destPath, function() { callback(err); });
  });
}

function listComponents(continuationToken, allComponents, callback) {
  var apiPath = "/service/rest/v1/components?repository=" + encodeURIComponent(repoName);
  if (continuationToken) {
    apiPath += "&continuationToken=" + encodeURIComponent(continuationToken);
  }

  nexusGet(apiPath, function(err, status, data) {
    if (err) return callback(err);
    if (status !== 200) return callback(new Error("Nexus API returned " + status));
    var result = JSON.parse(data);
    allComponents = allComponents.concat(result.items || []);

    if (result.continuationToken) {
      listComponents(result.continuationToken, allComponents, callback);
    } else {
      callback(null, allComponents);
    }
  });
}

// Create output directory
if (!fs.existsSync(outputDir)) {
  fs.mkdirSync(outputDir, { recursive: true });
}

console.log("Exporting npm packages from Nexus");
console.log("Repository: " + repoName);
console.log("Output: " + outputDir);
console.log("");

listComponents(null, [], function(err, components) {
  if (err) return console.error("Error:", err.message);

  console.log("Found " + components.length + " components");

  var downloadQueue = [];
  components.forEach(function(component) {
    (component.assets || []).forEach(function(asset) {
      if (asset.path.endsWith(".tgz")) {
        downloadQueue.push({
          name: component.name,
          version: component.version,
          url: asset.downloadUrl,
          filename: path.basename(asset.path)
        });
      }
    });
  });

  console.log("Downloading " + downloadQueue.length + " tarballs...");
  console.log("");

  var total = downloadQueue.length;
  var completed = 0;
  var errors = 0;
  var active = 0;

  function processNext() {
    if (downloadQueue.length === 0) {
      // Report only after the last in-flight download finishes,
      // otherwise each of the 3 workers prints its own "complete" line early
      if (active === 0) {
        console.log("");
        console.log("Export complete: " + completed + " packages, " + errors + " errors");
      }
      return;
    }

    var item = downloadQueue.shift();
    var destPath = path.join(outputDir, item.filename);
    active++;

    downloadFile(item.url, destPath, function(err) {
      active--;
      if (err) {
        console.error("  FAILED: " + item.name + "@" + item.version + " -- " + err.message);
        errors++;
      } else {
        completed++;
        process.stdout.write("\r  Downloaded: " + completed + "/" + total);
      }
      processNext();
    });
  }

  // Download up to 3 tarballs concurrently
  for (var i = 0; i < Math.min(3, downloadQueue.length); i++) {
    processNext();
  }
});

Exporting from Artifactory

Artifactory provides an AQL (Artifactory Query Language) API for bulk operations:

// export-artifactory.js -- Export packages from JFrog Artifactory
var https = require("https");
var http = require("http");
var fs = require("fs");
var path = require("path");

var artUrl = process.env.REPO_URL || "http://artifactory.internal:8082";
var repoName = process.argv[2] || "npm-local";
var outputDir = process.argv[3] || "./export-art";
var username = process.env.REPO_USER || "admin";
var password = process.env.REPO_PASS;

if (!password) {
  console.error("Error: REPO_PASS is required");
  process.exit(1);
}

var auth = Buffer.from(username + ":" + password).toString("base64");

function artPost(apiPath, body, callback) {
  var parsed = new URL(artUrl + apiPath);
  var client = parsed.protocol === "https:" ? https : http;
  var bodyStr = typeof body === "string" ? body : JSON.stringify(body);
  var options = {
    hostname: parsed.hostname,
    port: parsed.port,
    path: parsed.pathname,
    method: "POST",
    headers: {
      "Authorization": "Basic " + auth,
      "Content-Type": "text/plain",
      "Content-Length": Buffer.byteLength(bodyStr)
    }
  };
  var req = client.request(options, function(res) {
    var data = "";
    res.on("data", function(chunk) { data += chunk; });
    res.on("end", function() { callback(null, res.statusCode, data); });
  });
  req.on("error", callback);
  req.write(bodyStr);
  req.end();
}

function downloadFile(filePath, destPath, callback) {
  var parsed = new URL(artUrl + "/" + repoName + "/" + filePath);
  var client = parsed.protocol === "https:" ? https : http;
  var options = {
    hostname: parsed.hostname,
    port: parsed.port,
    path: parsed.pathname,
    headers: { "Authorization": "Basic " + auth }
  };
  var file = fs.createWriteStream(destPath);
  var req = client.get(options, function(res) {
    // Bail out on non-200 responses instead of saving the error body
    if (res.statusCode !== 200) {
      file.close();
      return fs.unlink(destPath, function() {
        callback(new Error("HTTP " + res.statusCode + " for " + filePath));
      });
    }
    res.pipe(file);
    file.on("finish", function() { file.close(callback); });
  });
  req.on("error", function(err) {
    file.close();
    fs.unlink(destPath, function() { callback(err); });
  });
}

if (!fs.existsSync(outputDir)) {
  fs.mkdirSync(outputDir, { recursive: true });
}

// Query all artifacts in the repository
var aql = 'items.find({"repo":"' + repoName + '"}).include("name","path","size","actual_sha1")';

artPost("/api/search/aql", aql, function(err, status, data) {
  if (err) return console.error("Error:", err.message);
  if (status !== 200) return console.error("AQL query failed (" + status + "):", data);

  var result = JSON.parse(data);
  var artifacts = (result.results || []).filter(function(item) {
    return item.name.endsWith(".tgz") || item.name.endsWith(".nupkg") ||
      item.name.endsWith(".jar") || item.name.endsWith(".whl");
  });

  console.log("Found " + artifacts.length + " artifacts to export");

  var completed = 0;
  var index = 0;

  function next() {
    if (index >= artifacts.length) {
      console.log("\nExport complete: " + completed + " files");
      return;
    }
    var artifact = artifacts[index++];
    var filePath = artifact.path + "/" + artifact.name;
    var destPath = path.join(outputDir, artifact.name);

    downloadFile(filePath, destPath, function(err) {
      if (err) {
        console.error("  FAILED: " + filePath + " -- " + err.message);
      } else {
        completed++;
        process.stdout.write("\r  Exported: " + completed + "/" + artifacts.length);
      }
      next();
    });
  }
  next();
});

Importing into Azure Artifacts

npm Package Import

#!/bin/bash
# import-npm.sh -- Bulk import npm tarballs to Azure Artifacts

FEED_URL="https://pkgs.dev.azure.com/my-org/my-project/_packaging/my-feed/npm/registry/"
EXPORT_DIR="./export-npm"

echo "Importing npm packages to Azure Artifacts..."

for tarball in "$EXPORT_DIR"/*.tgz; do
  echo "Publishing: $(basename "$tarball")"
  # Pass --registry per publish instead of mutating the machine's global npm config
  npm publish "$tarball" --registry "$FEED_URL" 2>&1
  if [ $? -ne 0 ]; then
    echo "  WARNING: Failed (may already exist)"
  fi
done

echo "Import complete."

NuGet Package Import

#!/bin/bash
# import-nuget.sh -- Bulk import NuGet packages to Azure Artifacts

FEED_URL="https://pkgs.dev.azure.com/my-org/my-project/_packaging/my-feed/nuget/v3/index.json"
EXPORT_DIR="./export-nuget"

echo "Importing NuGet packages to Azure Artifacts..."

for nupkg in "$EXPORT_DIR"/*.nupkg; do
  echo "Publishing: $(basename "$nupkg")"
  dotnet nuget push "$nupkg" \
    --source "$FEED_URL" \
    --api-key az \
    --skip-duplicate 2>&1
done

echo "Import complete."

Python Package Import

#!/bin/bash
# import-python.sh -- Bulk import Python packages to Azure Artifacts

FEED_URL="https://pkgs.dev.azure.com/my-org/my-project/_packaging/my-feed/pypi/upload/"
EXPORT_DIR="./export-python"

echo "Importing Python packages to Azure Artifacts..."

twine upload \
  --repository-url "$FEED_URL" \
  --username azure \
  --password "$AZURE_DEVOPS_PAT" \
  "$EXPORT_DIR"/*.whl "$EXPORT_DIR"/*.tar.gz

echo "Import complete."

Client Reconfiguration

npm Migration

Before (Nexus):

registry=http://nexus.internal:8081/repository/npm-group/
_auth=YWRtaW46YWRtaW4xMjM=
always-auth=true

After (Azure Artifacts):

registry=https://pkgs.dev.azure.com/my-org/my-project/_packaging/npm-packages/npm/registry/
always-auth=true
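
The old _auth line does not carry over; the Azure Artifacts registry needs its own credentials. On machines without the Azure Artifacts credential provider, the documented pattern is per-registry credential lines in the user-level ~/.npmrc (the feed path below matches the example above; the base64-encoded PAT is a placeholder):

```ini
; ~/.npmrc (user level) -- credentials for the Azure Artifacts registry
//pkgs.dev.azure.com/my-org/my-project/_packaging/npm-packages/npm/registry/:username=my-org
//pkgs.dev.azure.com/my-org/my-project/_packaging/npm-packages/npm/registry/:_password=BASE64_ENCODED_PAT
//pkgs.dev.azure.com/my-org/my-project/_packaging/npm-packages/npm/registry/:email=not-used@example.com
```

Keep the credential lines in the user-level file, not the project .npmrc that gets committed.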

NuGet Migration

Before (Artifactory):

<configuration>
  <packageSources>
    <add key="artifactory" value="http://artifactory.internal:8082/api/nuget/nuget-local" />
  </packageSources>
</configuration>

After (Azure Artifacts):

<configuration>
  <packageSources>
    <clear />
    <add key="azure-artifacts"
         value="https://pkgs.dev.azure.com/my-org/my-project/_packaging/nuget-packages/nuget/v3/index.json" />
  </packageSources>
</configuration>

pip Migration

Before (Nexus):

[global]
index-url = http://nexus.internal:8081/repository/pypi-group/simple/
trusted-host = nexus.internal

After (Azure Artifacts):

[global]
index-url = https://pkgs.dev.azure.com/my-org/my-project/_packaging/python-packages/pypi/simple/
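
pip also needs credentials for the new index. With the artifacts-keyring package installed, the URL above works as-is; without it, one option is to embed a PAT in the index URL (shown here with an environment-variable placeholder, which pip expands in pip.conf; treat the PAT as a secret and prefer artifacts-keyring where possible):

```ini
[global]
index-url = https://my-org:${AZURE_DEVOPS_PAT}@pkgs.dev.azure.com/my-org/my-project/_packaging/python-packages/pypi/simple/
```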

Complete Working Example

This is a comprehensive migration orchestration script that handles the full lifecycle:

// migrate.js -- Complete migration orchestrator
var fs = require("fs");
var path = require("path");
var https = require("https");
var http = require("http");

var config = {
  sourceType: process.argv[2] || "nexus",
  sourceUrl: process.env.SOURCE_URL,
  sourceUser: process.env.SOURCE_USER || "admin",
  sourcePass: process.env.SOURCE_PASS,
  targetOrg: process.env.AZURE_DEVOPS_ORG || "my-organization",
  targetProject: process.env.AZURE_DEVOPS_PROJECT || "my-project",
  targetPat: process.env.AZURE_DEVOPS_PAT,
  dryRun: process.argv.indexOf("--dry-run") !== -1
};

if (!config.sourceUrl || !config.sourcePass || !config.targetPat) {
  console.error("Required environment variables:");
  console.error("  SOURCE_URL     - Nexus/Artifactory base URL");
  console.error("  SOURCE_PASS    - Source repository password");
  console.error("  AZURE_DEVOPS_PAT - Azure DevOps PAT");
  console.error("");
  console.error("Usage: node migrate.js [nexus|artifactory] [--dry-run]");
  process.exit(1);
}

var sourceAuth = Buffer.from(config.sourceUser + ":" + config.sourcePass).toString("base64");
var targetAuth = Buffer.from(":" + config.targetPat).toString("base64");

function logStep(step, message) {
  var prefix = config.dryRun ? "[DRY RUN] " : "";
  console.log(prefix + "[Step " + step + "] " + message);
}

function sourceGet(apiPath, callback) {
  var parsed = new URL(config.sourceUrl + apiPath);
  var client = parsed.protocol === "https:" ? https : http;
  var options = {
    hostname: parsed.hostname,
    port: parsed.port,
    path: parsed.pathname + parsed.search,
    method: "GET",
    headers: { "Authorization": "Basic " + sourceAuth, "Accept": "application/json" }
  };
  var req = client.request(options, function(res) {
    var data = "";
    res.on("data", function(chunk) { data += chunk; });
    res.on("end", function() { callback(null, res.statusCode, data); });
  });
  req.on("error", callback);
  req.end();
}

function targetApi(method, path, body, callback) {
  var bodyStr = body ? JSON.stringify(body) : null;
  var options = {
    hostname: "feeds.dev.azure.com",
    path: path,
    method: method,
    headers: {
      "Content-Type": "application/json",
      "Authorization": "Basic " + targetAuth
    }
  };
  if (bodyStr) options.headers["Content-Length"] = Buffer.byteLength(bodyStr);

  var req = https.request(options, function(res) {
    var data = "";
    res.on("data", function(chunk) { data += chunk; });
    res.on("end", function() { callback(null, res.statusCode, data); });
  });
  req.on("error", callback);
  if (bodyStr) req.write(bodyStr);
  req.end();
}

// Step 1: Assess source
function assessSource(callback) {
  logStep(1, "Assessing source repository...");

  if (config.sourceType === "nexus") {
    sourceGet("/service/rest/v1/repositories", function(err, status, data) {
      if (err) return callback(err);
      if (status !== 200) return callback(new Error("Source API returned " + status));
      var repos = JSON.parse(data);
      var hosted = repos.filter(function(r) { return r.type === "hosted"; });
      console.log("  Found " + repos.length + " repositories (" + hosted.length + " hosted)");
      callback(null, hosted);
    });
  } else {
    sourceGet("/api/repositories", function(err, status, data) {
      if (err) return callback(err);
      if (status !== 200) return callback(new Error("Source API returned " + status));
      var repos = JSON.parse(data);
      var local = repos.filter(function(r) { return r.rclass === "local"; });
      console.log("  Found " + repos.length + " repositories (" + local.length + " local)");
      callback(null, local);
    });
  }
}

// Step 2: Create target feeds
function createTargetFeeds(sourceRepos, callback) {
  logStep(2, "Creating Azure Artifacts feeds...");

  var feedsToCreate = sourceRepos.map(function(repo) {
    var format = repo.format || repo.packageType || "unknown";
    return {
      name: repo.name || repo.key,
      format: format,
      upstreams: getUpstreamsForFormat(format)
    };
  });

  if (feedsToCreate.length === 0) return callback(null, feedsToCreate);

  var completed = 0;
  feedsToCreate.forEach(function(feed) {
    if (config.dryRun) {
      console.log("  Would create feed: " + feed.name + " (" + feed.format + ")");
      completed++;
      if (completed === feedsToCreate.length) callback(null, feedsToCreate);
      return;
    }

    var feedDef = {
      name: feed.name.replace(/[^a-zA-Z0-9-]/g, "-"),
      description: "Migrated from " + config.sourceType + " repository: " + feed.name,
      upstreamEnabled: feed.upstreams.length > 0,
      upstreamSources: feed.upstreams
    };

    // Named apiPath to avoid shadowing the "path" module required above
    var apiPath = "/" + config.targetOrg + "/" + config.targetProject +
      "/_apis/packaging/feeds?api-version=7.1";
    targetApi("POST", apiPath, feedDef, function(err, status) {
      completed++;
      if (status === 200 || status === 201) {
        console.log("  Created: " + feedDef.name);
      } else if (status === 409) {
        console.log("  Exists: " + feedDef.name);
      } else {
        console.log("  Failed: " + feedDef.name + " (" + status + ")");
      }
      if (completed === feedsToCreate.length) callback(null, feedsToCreate);
    });
  });
}

function getUpstreamsForFormat(format) {
  var upstreams = {
    npm: [{ name: "npmjs", protocol: "npm", location: "https://registry.npmjs.org/", upstreamSourceType: "public" }],
    nuget: [{ name: "NuGet Gallery", protocol: "nuget", location: "https://api.nuget.org/v3/index.json", upstreamSourceType: "public" }],
    pypi: [{ name: "PyPI", protocol: "pypi", location: "https://pypi.org/", upstreamSourceType: "public" }],
    maven2: [{ name: "Maven Central", protocol: "maven", location: "https://repo.maven.apache.org/maven2/", upstreamSourceType: "public" }]
  };
  return upstreams[format] || [];
}

// Run migration
assessSource(function(err, repos) {
  if (err) return console.error("Assessment failed:", err.message);
  createTargetFeeds(repos, function(err) {
    if (err) return console.error("Feed creation failed:", err.message);
    logStep(3, "Feed creation complete.");
    console.log("");
    console.log("Next steps:");
    console.log("  1. Run package export scripts for each repository");
    console.log("  2. Run import scripts to populate Azure Artifacts feeds");
    console.log("  3. Update client configurations (.npmrc, nuget.config, pip.conf)");
    console.log("  4. Update pipeline YAML files");
    console.log("  5. Run parallel for 2-4 weeks");
    console.log("  6. Cut over and decommission source repository");
  });
});
Run the orchestrator:

# Dry run first
SOURCE_URL=http://nexus.internal:8081 SOURCE_PASS=admin123 \
  AZURE_DEVOPS_PAT=your-pat node migrate.js nexus --dry-run

# Execute
SOURCE_URL=http://nexus.internal:8081 SOURCE_PASS=admin123 \
  AZURE_DEVOPS_PAT=your-pat node migrate.js nexus

Common Issues and Troubleshooting

1. Package Version Already Exists During Import

Error:

409 Conflict: Package version already exists

This is expected when re-running the import after a partial failure. Use --skip-duplicate for NuGet and ignore 409 errors for npm. The import scripts should be idempotent.

2. Authentication Fails After Switching to Azure Artifacts

Error:

401 Unauthorized when restoring packages from Azure Artifacts

Developer machines still have cached credentials for the old Nexus/Artifactory server. Clear cached credentials: dotnet nuget locals all --clear for NuGet, delete ~/.npmrc entries for npm, and remove ~/.pip/pip.conf entries for pip. Install the Azure Artifacts credential provider for seamless authentication.

3. Missing Packages After Import

Symptom: A build fails because a specific package version is not in Azure Artifacts.

The export script missed some packages, likely due to pagination or format filtering. Re-run the assessment to verify the total count, then re-run the export for the missing packages. Also check if the missing package came from a proxy/remote repository -- these are not exported; instead, they are resolved through upstream sources.
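
To turn "some packages are missing" into an exact list, diff the two inventories. A small sketch (findMissing is a hypothetical helper; the name@version arrays would come from the assessment script on the source side and the Azure Artifacts packages API on the target side):

```javascript
// diff-inventories.js -- List versions present in the source but absent from the target
function findMissing(sourceVersions, targetVersions) {
  var present = {};
  targetVersions.forEach(function(v) { present[v] = true; });
  return sourceVersions.filter(function(v) { return !present[v]; });
}

var missing = findMissing(
  ["lodash@4.17.21", "left-pad@1.3.0", "express@4.18.2"], // from source assessment
  ["lodash@4.17.21", "express@4.18.2"]                    // from target feed
);
console.log(missing); // re-export and re-import exactly these versions
```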

4. Build Times Increase After Migration

Symptom: Pipeline builds take 2-3 minutes longer after switching to Azure Artifacts.

The first restore after switching feeds will be slower because Azure Artifacts has not cached upstream packages yet. Subsequent builds will be faster as the feed caches packages. Pre-warm the feed by running a full restore before cutting over production pipelines.

5. Feed Views Missing After Migration

Symptom: Packages imported to Azure Artifacts are all in the @Local view, not in @Release.

Imported packages land in the @Local view by default. You need to promote them to @Prerelease or @Release views using the REST API or the promote script from the feed views article.

Best Practices

  1. Run both systems in parallel for at least two weeks. Publish new packages to both Nexus/Artifactory and Azure Artifacts during the parallel period. This gives you a rollback path if issues surface.

  2. Migrate one package type at a time. Do npm first, validate it is working, then NuGet, then Python. Migrating everything at once multiplies the surface area for problems.

  3. Pre-warm Azure Artifacts upstream caches. Before cutting over, run a full npm install / dotnet restore / pip install against the Azure Artifacts feed to cache all upstream packages. This eliminates cold-start latency for the team.

  4. Communicate the timeline clearly. Give developers at least a week's notice before the cutover. Provide the new feed URLs and updated configuration files. Hold a brief walkthrough session.

  5. Keep the old repository read-only for 30 days after cutover. Do not delete Nexus/Artifactory immediately. Someone will have an old branch or an old pipeline that still points at it. Keeping it read-only prevents new publishes while allowing emergency restores.

  6. Validate package integrity after import. For critical packages, compare checksums between the source and destination. A corrupted package in your feed will cause mysterious build failures.

  7. Update all pipeline YAML files in a single PR. Changing feed URLs piecemeal across dozens of pipelines leads to inconsistency. Make one PR that updates every pipeline, review it thoroughly, and merge it at the start of the cutover window.

  8. Document the rollback procedure. Write down exactly how to revert to Nexus/Artifactory if the migration fails: which config files to change, which pipeline variables to update, and who to notify.
