Python Package Feeds with Azure Artifacts
A practical guide to publishing and consuming Python packages with Azure Artifacts feeds, covering pip configuration, twine publishing, authentication, setuptools, pyproject.toml, and upstream PyPI integration.
Overview
Azure Artifacts supports Python package feeds, giving you a private PyPI-compatible repository that integrates with pip, twine, and Azure Pipelines. If your organization builds internal Python libraries -- data science utilities, shared SDK clients, internal tooling -- Azure Artifacts lets you distribute them with the same workflow developers already use for public PyPI packages. Once pip is pointed at your feed, pip install your-internal-package works exactly like installing any public package from PyPI, alongside public packages proxied from PyPI.
I have set up Python feeds in Azure Artifacts for teams ranging from data engineering groups sharing ETL utilities to platform teams distributing internal SDKs. The integration with pip and twine is solid once you get authentication configured correctly, which is the part that trips most people up. This article covers the complete setup: feed creation, pip configuration, building and publishing with twine, authentication methods, upstream PyPI proxying, and pipeline automation.
Prerequisites
- An Azure DevOps organization with an active project
- An Azure Artifacts feed (or we will create one below)
- Python 3.8 or later installed
- pip, setuptools, and twine installed (pip install setuptools twine build)
- An Azure DevOps Personal Access Token (PAT) with Packaging (Read & Write) scope
- Node.js 18+ for the automation scripts
- Familiarity with Python package structure (setup.py or pyproject.toml)
Creating a Python-Enabled Feed
Azure Artifacts feeds support multiple protocols simultaneously. A single feed can host NuGet, npm, Maven, and Python packages. When you create a feed, Python support is available automatically.
If you do not have a feed yet, create one through the REST API:
// create-python-feed.js
var https = require("https");

var org = "my-organization";
var project = "my-project";
var pat = process.env.AZURE_DEVOPS_PAT;
var auth = Buffer.from(":" + pat).toString("base64");

var feedDefinition = {
  name: "python-packages",
  description: "Internal Python packages and libraries",
  upstreamEnabled: true,
  upstreamSources: [
    {
      name: "PyPI",
      protocol: "pypi",
      location: "https://pypi.org/",
      upstreamSourceType: "public"
    }
  ]
};

var body = JSON.stringify(feedDefinition);

var options = {
  hostname: "feeds.dev.azure.com",
  path: "/" + org + "/" + project + "/_apis/packaging/feeds?api-version=7.1",
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "Authorization": "Basic " + auth,
    "Content-Length": Buffer.byteLength(body)
  }
};

var req = https.request(options, function(res) {
  var data = "";
  res.on("data", function(chunk) { data += chunk; });
  res.on("end", function() {
    if (res.statusCode === 201) {
      var feed = JSON.parse(data);
      console.log("Feed created: " + feed.name);
      console.log("Feed ID: " + feed.id);
      console.log("pip install URL: https://pkgs.dev.azure.com/" + org +
        "/" + project + "/_packaging/" + feed.name + "/pypi/simple/");
      console.log("twine upload URL: https://pkgs.dev.azure.com/" + org +
        "/" + project + "/_packaging/" + feed.name + "/pypi/upload/");
    } else {
      console.error("Failed (" + res.statusCode + "):", data);
    }
  });
});

req.on("error", function(err) { console.error("Error:", err.message); });
req.write(body);
req.end();
export AZURE_DEVOPS_PAT="your-pat-here"
node create-python-feed.js
Output:
Feed created: python-packages
Feed ID: 5c8d2e3f-1a2b-3c4d-5e6f-789012345678
pip install URL: https://pkgs.dev.azure.com/my-organization/my-project/_packaging/python-packages/pypi/simple/
twine upload URL: https://pkgs.dev.azure.com/my-organization/my-project/_packaging/python-packages/pypi/upload/
Configuring pip for Azure Artifacts
pip.conf / pip.ini Setup
pip reads its configuration from a config file. The location depends on your operating system:
- Linux/macOS: ~/.config/pip/pip.conf or ~/.pip/pip.conf
- Windows: %APPDATA%\pip\pip.ini
- Per-project: pip.conf in your virtual environment directory
Here is the configuration:
[global]
index-url = https://pkgs.dev.azure.com/my-organization/my-project/_packaging/python-packages/pypi/simple/
This sets your Azure Artifacts feed as the primary package index. If you have PyPI configured as an upstream source on your feed, public packages will resolve transparently through your feed.
If you want to keep PyPI as a separate fallback (not recommended when you have upstream sources configured), use extra-index-url:
[global]
index-url = https://pkgs.dev.azure.com/my-organization/my-project/_packaging/python-packages/pypi/simple/
extra-index-url = https://pypi.org/simple/
I do not recommend the extra-index-url approach. When pip sees multiple indexes, it picks the best version across all of them, which creates a dependency confusion attack vector -- an attacker can publish a higher version of your internal package name on public PyPI, and pip will prefer it. Using upstream sources through your Azure Artifacts feed avoids this because your feed controls the resolution order.
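As a defensive check against this attack, you can periodically verify that none of your internal package names exist on public PyPI. Here is a minimal sketch using PyPI's JSON API, which returns 404 for names that have never been published (the package names below are this article's examples; substitute your own):

# check-name-collisions.py -- flag internal names that also exist on public PyPI
import urllib.error
import urllib.request

INTERNAL_PACKAGES = ["mycompany-utilities", "mycompany-etl-helpers"]

for name in INTERNAL_PACKAGES:
    url = "https://pypi.org/pypi/" + name + "/json"
    try:
        with urllib.request.urlopen(url):
            # A 200 response means someone has published this name publicly
            print("WARNING: " + name + " exists on public PyPI")
    except urllib.error.HTTPError as e:
        if e.code == 404:
            print("OK: " + name + " is not on public PyPI")
        else:
            raise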
Authentication for pip
pip needs credentials to access your private feed. There are three approaches:
Option 1: URL-embedded credentials (quick and dirty)
[global]
index-url = https://azure:${AZURE_DEVOPS_PAT}@pkgs.dev.azure.com/my-organization/my-project/_packaging/python-packages/pypi/simple/
Replace ${AZURE_DEVOPS_PAT} with your actual PAT. This works but is insecure for shared machines.
Option 2: keyring authentication (recommended for local development)
Install the Azure Artifacts keyring backend:
pip install artifacts-keyring
Then set your pip.conf with just the URL (no credentials):
[global]
index-url = https://pkgs.dev.azure.com/my-organization/my-project/_packaging/python-packages/pypi/simple/
When pip tries to access the feed, the keyring backend intercepts the authentication challenge and opens a browser for Azure DevOps login. The token is cached locally.
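To confirm pip will actually use the backend, you can ask the keyring library which backend it resolved. This is a generic keyring call, not Azure-specific, and the exact backend name it prints varies by keyring version:

# verify-keyring.py -- check which keyring backend is active
import keyring

# With artifacts-keyring installed, the Azure Artifacts backend should appear
# here, possibly wrapped in a ChainerBackend depending on the keyring version.
print(keyring.get_keyring())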
Option 3: Environment variable (recommended for CI/CD)
export PIP_INDEX_URL="https://azure:${AZURE_DEVOPS_PAT}@pkgs.dev.azure.com/my-organization/my-project/_packaging/python-packages/pypi/simple/"
pip install your-package
The PIP_INDEX_URL environment variable overrides the config file. This is the cleanest approach for CI/CD because the PAT never touches disk.
Building Python Packages
Using setuptools with setup.py
The traditional approach uses setup.py:
# setup.py
from setuptools import setup, find_packages

setup(
    name="mycompany-utilities",
    version="1.0.0",
    author="Platform Team",
    author_email="platform@mycompany.com",
    description="Internal utility library for data processing",
    long_description=open("README.md").read(),
    long_description_content_type="text/markdown",
    url="https://dev.azure.com/my-organization/my-project/_git/python-utilities",
    packages=find_packages(where="src"),
    package_dir={"": "src"},
    python_requires=">=3.8",
    install_requires=[
        "requests>=2.28.0",
        "python-dateutil>=2.8.0",
    ],
    extras_require={
        "dev": [
            "pytest>=7.0",
            "pytest-cov>=4.0",
            "black>=23.0",
        ],
    },
    classifiers=[
        "Development Status :: 4 - Beta",
        "Intended Audience :: Developers",
        "Programming Language :: Python :: 3",
        "Programming Language :: Python :: 3.8",
        "Programming Language :: Python :: 3.9",
        "Programming Language :: Python :: 3.10",
        "Programming Language :: Python :: 3.11",
    ],
)
Using pyproject.toml (Modern Approach)
The modern approach uses pyproject.toml, which is now the recommended standard:
[build-system]
requires = ["setuptools>=68.0", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "mycompany-utilities"
version = "1.0.0"
description = "Internal utility library for data processing"
readme = "README.md"
license = {text = "Proprietary"}
requires-python = ">=3.8"
authors = [
    {name = "Platform Team", email = "platform@mycompany.com"},
]
dependencies = [
    "requests>=2.28.0",
    "python-dateutil>=2.8.0",
]

[project.optional-dependencies]
dev = [
    "pytest>=7.0",
    "pytest-cov>=4.0",
    "black>=23.0",
]

[project.urls]
Repository = "https://dev.azure.com/my-organization/my-project/_git/python-utilities"

[tool.setuptools.packages.find]
where = ["src"]
Package Structure
python-utilities/
    pyproject.toml
    setup.py              # optional if using pyproject.toml
    README.md
    src/
        mycompany_utilities/
            __init__.py
            data_helpers.py
            api_client.py
    tests/
        test_data_helpers.py
        test_api_client.py
# src/mycompany_utilities/__init__.py
__version__ = "1.0.0"
from .data_helpers import chunked, flatten, deduplicate
from .api_client import APIClient
# src/mycompany_utilities/data_helpers.py

def chunked(iterable, size):
    """Split an iterable into chunks of a given size."""
    chunk = []
    for item in iterable:
        chunk.append(item)
        if len(chunk) == size:
            yield chunk
            chunk = []
    if chunk:
        yield chunk


def flatten(nested_list):
    """Flatten a list of lists into a single list."""
    result = []
    for item in nested_list:
        if isinstance(item, (list, tuple)):
            result.extend(flatten(item))
        else:
            result.append(item)
    return result


def deduplicate(items, key=None):
    """Remove duplicates while preserving order."""
    seen = set()
    result = []
    for item in items:
        k = key(item) if key else item
        if k not in seen:
            seen.add(k)
            result.append(item)
    return result
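A quick usage sketch of the helpers above, with expected output in comments:

from mycompany_utilities.data_helpers import chunked, flatten, deduplicate

print(list(chunked([1, 2, 3, 4, 5], 2)))  # [[1, 2], [3, 4], [5]]
print(flatten([1, [2, [3, [4]]]]))        # [1, 2, 3, 4]
print(deduplicate([3, 1, 2, 1, 3]))       # [3, 1, 2]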
Building the Distribution
# Build source distribution and wheel
python -m build
# Output:
# * Creating venv isolated environment...
# * Installing packages in isolated environment... (setuptools>=68.0, wheel)
# * Getting build dependencies for sdist...
# * Building sdist...
# * Building wheel...
# Successfully built mycompany_utilities-1.0.0.tar.gz and mycompany_utilities-1.0.0-py3-none-any.whl
The build module creates both a source distribution (.tar.gz) and a wheel (.whl) in the dist/ directory.
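Before uploading, it is worth confirming the wheel contains what you expect. A wheel is just a zip archive, so the standard library can list its contents; a minimal sketch (the script name is hypothetical, and the filename assumes the 1.0.0 build above):

# inspect-wheel.py -- wheels are zip archives, so you can list their contents
import zipfile

with zipfile.ZipFile("dist/mycompany_utilities-1.0.0-py3-none-any.whl") as whl:
    for name in whl.namelist():
        print(name)

Running twine check dist/* is also useful at this point: it validates the package metadata and long description rendering before you upload.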
Publishing with twine
twine is the standard tool for uploading Python packages to PyPI-compatible repositories.
.pypirc Configuration
Create or update ~/.pypirc:
[distutils]
index-servers =
    azure-artifacts

[azure-artifacts]
repository = https://pkgs.dev.azure.com/my-organization/my-project/_packaging/python-packages/pypi/upload/
username = azure
password = <your-pat-here>
Then upload:
twine upload --repository azure-artifacts dist/*
Output:
Uploading distributions to https://pkgs.dev.azure.com/my-organization/my-project/_packaging/python-packages/pypi/upload/
Uploading mycompany_utilities-1.0.0-py3-none-any.whl
100% ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 12.5/12.5 kB • 00:00 • ?
Uploading mycompany_utilities-1.0.0.tar.gz
100% ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 11.8/11.8 kB • 00:00 • ?
Publishing Without .pypirc
You can pass credentials directly to twine, which is useful for CI/CD:
twine upload \
--repository-url https://pkgs.dev.azure.com/my-organization/my-project/_packaging/python-packages/pypi/upload/ \
--username azure \
--password $AZURE_DEVOPS_PAT \
dist/*
Upstream PyPI Integration
When you configure PyPI as an upstream source on your Azure Artifacts feed, your feed acts as a caching proxy. The first time someone installs a public package through your feed, it fetches from PyPI and caches it. Subsequent installs pull from the cache.
This provides several benefits:
- Faster installs -- cached packages are served from Azure's CDN, not PyPI
- Availability -- your builds keep working even if PyPI has an outage
- Audit trail -- you can see exactly which public packages your organization depends on
- Dependency confusion protection -- your feed resolves internal packages first, then checks upstream
To verify upstream sources are working:
# Install a public package through your Azure Artifacts feed
pip install requests --index-url https://pkgs.dev.azure.com/my-organization/my-project/_packaging/python-packages/pypi/simple/
# Output:
# Looking in indexes: https://pkgs.dev.azure.com/my-organization/my-project/_packaging/python-packages/pypi/simple/
# Collecting requests
# Downloading https://pkgs.dev.azure.com/my-organization/my-project/_packaging/python-packages/pypi/packages/requests/2.31.0/requests-2.31.0-py3-none-any.whl (62.6 kB)
# Installing collected packages: requests
# Successfully installed requests-2.31.0
Notice the download URL points to your Azure Artifacts feed, not pypi.org. The package is fetched from PyPI on first access and cached in your feed.
Pipeline Integration
Basic Python Pipeline with Azure Artifacts
trigger:
  branches:
    include:
      - main

pool:
  vmImage: ubuntu-latest

variables:
  pythonVersion: '3.11'
  feedName: python-packages

steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: $(pythonVersion)
    displayName: Set Python version

  - task: PipAuthenticate@1
    inputs:
      artifactFeeds: $(feedName)
    displayName: Authenticate pip with Azure Artifacts

  - script: |
      python -m pip install --upgrade pip
      pip install -r requirements.txt
    displayName: Install dependencies

  - script: |
      pip install pytest pytest-cov
      pytest tests/ --cov=src --cov-report=xml --junitxml=test-results.xml
    displayName: Run tests

  - task: PublishTestResults@2
    inputs:
      testResultsFormat: JUnit
      testResultsFiles: test-results.xml
    condition: always()
    displayName: Publish test results
The PipAuthenticate@1 task modifies the pip configuration for the pipeline run so that pip can access your private feed. You do not need to manually set PIP_INDEX_URL or manage credentials when using this task.
Publishing Pipeline
trigger:
  branches:
    include:
      - main

pool:
  vmImage: ubuntu-latest

variables:
  pythonVersion: '3.11'
  feedName: python-packages

stages:
  - stage: Build
    jobs:
      - job: BuildAndTest
        steps:
          - task: UsePythonVersion@0
            inputs:
              versionSpec: $(pythonVersion)

          - task: PipAuthenticate@1
            inputs:
              artifactFeeds: $(feedName)
            displayName: Authenticate pip

          - script: |
              python -m pip install --upgrade pip build
              pip install -r requirements.txt
              pip install pytest pytest-cov
            displayName: Install dependencies

          - script: pytest tests/ --cov=src --junitxml=test-results.xml
            displayName: Run tests

          - script: python -m build
            displayName: Build package

          - task: PublishBuildArtifacts@1
            inputs:
              pathToPublish: dist
              artifactName: python-dist

  - stage: Publish
    dependsOn: Build
    condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
    jobs:
      - job: PublishPackage
        steps:
          - task: UsePythonVersion@0
            inputs:
              versionSpec: $(pythonVersion)

          - task: DownloadBuildArtifacts@1
            inputs:
              buildType: current
              downloadType: single
              artifactName: python-dist
              downloadPath: $(System.ArtifactsDirectory)

          - task: TwineAuthenticate@1
            inputs:
              artifactFeed: $(feedName)
            displayName: Authenticate twine

          - script: |
              pip install twine
              twine upload -r $(feedName) --config-file $(PYPIRC_PATH) $(System.ArtifactsDirectory)/python-dist/*
            displayName: Publish to Azure Artifacts
The TwineAuthenticate@1 task creates a .pypirc file with the correct credentials and sets the PYPIRC_PATH environment variable. Use --config-file $(PYPIRC_PATH) with twine to pick up the credentials.
Complete Working Example
This is a complete Python utility library project with build configuration, tests, and a pipeline that publishes to Azure Artifacts.
Project Structure
python-utilities/
    pyproject.toml
    README.md
    requirements.txt
    src/
        mycompany_utilities/
            __init__.py
            data_helpers.py
            api_client.py
    tests/
        __init__.py
        test_data_helpers.py
        test_api_client.py
    azure-pipelines.yml
The Library Code
# src/mycompany_utilities/api_client.py
import requests
import time
import logging

logger = logging.getLogger(__name__)


class APIClient:
    """A resilient HTTP client with retry logic and timeout handling."""

    def __init__(self, base_url, api_key=None, timeout=30, max_retries=3):
        self.base_url = base_url.rstrip("/")
        self.timeout = timeout
        self.max_retries = max_retries
        self.session = requests.Session()
        if api_key:
            self.session.headers["Authorization"] = "Bearer " + api_key

    def get(self, path, params=None):
        return self._request("GET", path, params=params)

    def post(self, path, data=None, json_data=None):
        return self._request("POST", path, data=data, json=json_data)

    def put(self, path, data=None, json_data=None):
        return self._request("PUT", path, data=data, json=json_data)

    def delete(self, path):
        return self._request("DELETE", path)

    def _request(self, method, path, **kwargs):
        url = self.base_url + "/" + path.lstrip("/")
        kwargs["timeout"] = self.timeout
        last_error = None
        for attempt in range(1, self.max_retries + 1):
            try:
                response = self.session.request(method, url, **kwargs)
                response.raise_for_status()
                return response.json() if response.content else None
            except requests.exceptions.HTTPError as e:
                if e.response.status_code < 500:
                    raise
                last_error = e
                logger.warning(
                    "Attempt %d/%d failed: %s %s returned %d",
                    attempt, self.max_retries, method, url,
                    e.response.status_code
                )
            except requests.exceptions.ConnectionError as e:
                last_error = e
                logger.warning(
                    "Attempt %d/%d failed: connection error for %s %s",
                    attempt, self.max_retries, method, url
                )
            except requests.exceptions.Timeout as e:
                last_error = e
                logger.warning(
                    "Attempt %d/%d failed: timeout for %s %s",
                    attempt, self.max_retries, method, url
                )
            if attempt < self.max_retries:
                wait_time = 2 ** attempt
                logger.info("Retrying in %d seconds...", wait_time)
                time.sleep(wait_time)
        raise last_error

    def close(self):
        self.session.close()

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self.close()
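A short usage sketch of the client (api.example.com is a placeholder endpoint):

from mycompany_utilities.api_client import APIClient

# The context manager form guarantees the underlying session is closed
with APIClient("https://api.example.com", api_key="demo-key") as client:
    users = client.get("/users", params={"active": "true"})
    client.post("/events", json_data={"type": "sync-complete"})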
Tests
# tests/test_data_helpers.py
from mycompany_utilities.data_helpers import chunked, flatten, deduplicate


class TestChunked:
    def test_even_split(self):
        result = list(chunked([1, 2, 3, 4, 5, 6], 2))
        assert result == [[1, 2], [3, 4], [5, 6]]

    def test_uneven_split(self):
        result = list(chunked([1, 2, 3, 4, 5], 2))
        assert result == [[1, 2], [3, 4], [5]]

    def test_empty_input(self):
        result = list(chunked([], 3))
        assert result == []

    def test_chunk_larger_than_input(self):
        result = list(chunked([1, 2], 5))
        assert result == [[1, 2]]


class TestFlatten:
    def test_nested_lists(self):
        result = flatten([[1, 2], [3, 4], [5]])
        assert result == [1, 2, 3, 4, 5]

    def test_deeply_nested(self):
        result = flatten([1, [2, [3, [4]]]])
        assert result == [1, 2, 3, 4]

    def test_empty_input(self):
        result = flatten([])
        assert result == []


class TestDeduplicate:
    def test_simple_dedup(self):
        result = deduplicate([1, 2, 2, 3, 1, 4])
        assert result == [1, 2, 3, 4]

    def test_with_key_function(self):
        items = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}, {"id": 1, "name": "c"}]
        result = deduplicate(items, key=lambda x: x["id"])
        assert len(result) == 2
        assert result[0]["name"] == "a"

    def test_preserves_order(self):
        result = deduplicate([3, 1, 2, 1, 3])
        assert result == [3, 1, 2]
# tests/test_api_client.py
import pytest
import requests
from unittest.mock import patch, MagicMock
from mycompany_utilities.api_client import APIClient


class TestAPIClient:
    def test_get_request(self):
        client = APIClient("https://api.example.com", api_key="test-key")
        mock_response = MagicMock()
        mock_response.status_code = 200
        mock_response.content = b'{"result": "ok"}'
        mock_response.json.return_value = {"result": "ok"}
        mock_response.raise_for_status = MagicMock()
        with patch.object(client.session, "request", return_value=mock_response) as mock_req:
            result = client.get("/users")
            assert result == {"result": "ok"}
            mock_req.assert_called_once()

    def test_retry_on_server_error(self):
        client = APIClient("https://api.example.com", max_retries=2)
        error_response = MagicMock()
        error_response.status_code = 503
        # Raise an HTTPError carrying the 503 response so the client's retry
        # logic (which inspects e.response.status_code) is actually exercised
        error_response.raise_for_status.side_effect = requests.exceptions.HTTPError(
            response=error_response
        )
        # Exhausts the retries, then re-raises the last HTTPError
        with patch.object(client.session, "request", return_value=error_response):
            with pytest.raises(requests.exceptions.HTTPError):
                client.get("/health")
Pipeline YAML
trigger:
  branches:
    include:
      - main
      - feature/*

pool:
  vmImage: ubuntu-latest

variables:
  pythonVersion: '3.11'
  feedName: python-packages

stages:
  - stage: Test
    jobs:
      - job: RunTests
        steps:
          - task: UsePythonVersion@0
            inputs:
              versionSpec: $(pythonVersion)

          - task: PipAuthenticate@1
            inputs:
              artifactFeeds: $(feedName)

          - script: |
              python -m pip install --upgrade pip
              pip install -e ".[dev]"
            displayName: Install package and dev dependencies

          - script: |
              pytest tests/ \
                --cov=src/mycompany_utilities \
                --cov-report=xml:coverage.xml \
                --junitxml=test-results.xml \
                -v
            displayName: Run tests with coverage

          - task: PublishTestResults@2
            inputs:
              testResultsFormat: JUnit
              testResultsFiles: test-results.xml
            condition: always()

          - task: PublishCodeCoverageResults@2
            inputs:
              summaryFileLocation: coverage.xml
            condition: always()

  - stage: Publish
    dependsOn: Test
    condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
    jobs:
      - job: PublishPackage
        steps:
          - task: UsePythonVersion@0
            inputs:
              versionSpec: $(pythonVersion)

          - script: |
              python -m pip install --upgrade pip build twine
            displayName: Install build tools

          - script: python -m build
            displayName: Build package

          - task: TwineAuthenticate@1
            inputs:
              artifactFeed: $(feedName)

          - script: |
              twine upload \
                -r $(feedName) \
                --config-file $(PYPIRC_PATH) \
                dist/*
            displayName: Upload to Azure Artifacts
Feed Management Script
// python-feed-manager.js
var https = require("https");

var org = process.env.AZURE_DEVOPS_ORG || "my-organization";
var project = process.env.AZURE_DEVOPS_PROJECT || "my-project";
var feedId = process.env.AZURE_DEVOPS_FEED || "python-packages";
var pat = process.env.AZURE_DEVOPS_PAT;

if (!pat) {
  console.error("Error: AZURE_DEVOPS_PAT is required");
  process.exit(1);
}

var auth = Buffer.from(":" + pat).toString("base64");

function apiRequest(method, hostname, path, body, callback) {
  var options = {
    hostname: hostname,
    path: path,
    method: method,
    headers: {
      "Content-Type": "application/json",
      "Authorization": "Basic " + auth
    }
  };
  if (body) options.headers["Content-Length"] = Buffer.byteLength(body);
  var req = https.request(options, function(res) {
    var data = "";
    res.on("data", function(chunk) { data += chunk; });
    res.on("end", function() { callback(null, res.statusCode, data); });
  });
  req.on("error", function(err) { callback(err); });
  if (body) req.write(body);
  req.end();
}

function listPackages() {
  var path = "/" + org + "/" + project + "/_apis/packaging/feeds/" + feedId +
    "/packages?api-version=7.1&protocolType=PyPi&$top=50";
  apiRequest("GET", "feeds.dev.azure.com", path, null, function(err, status, data) {
    if (err) return console.error("Error:", err.message);
    var result = JSON.parse(data);
    console.log("Python packages in " + feedId + " (" + result.count + " total):");
    console.log("---");
    result.value.forEach(function(pkg) {
      var latest = pkg.versions[0];
      console.log("  " + pkg.name + " @ " + latest.version);
      console.log("    Published: " + new Date(latest.publishDate).toLocaleDateString());
      console.log("---");
    });
  });
}

function getPackageVersions(packageName) {
  var path = "/" + org + "/" + project + "/_apis/packaging/feeds/" + feedId +
    "/packages?api-version=7.1&protocolType=PyPi&packageNameQuery=" + packageName;
  apiRequest("GET", "feeds.dev.azure.com", path, null, function(err, status, data) {
    if (err) return console.error("Error:", err.message);
    var result = JSON.parse(data);
    if (result.count === 0) {
      console.log("Package not found: " + packageName);
      return;
    }
    var pkg = result.value[0];
    console.log("Versions of " + pkg.name + ":");
    pkg.versions.forEach(function(v) {
      console.log("  " + v.version + " (" + new Date(v.publishDate).toLocaleDateString() + ")");
    });
  });
}

// Unlists a version (hides it from the simple index) rather than deleting it
function unlistVersion(packageName, version) {
  var path = "/" + org + "/" + project + "/_apis/packaging/feeds/" + feedId +
    "/pypi/packages/" + packageName + "/versions/" + version + "?api-version=7.1";
  var body = JSON.stringify({ listed: false });
  apiRequest("PATCH", "pkgs.dev.azure.com", path, body, function(err, status, data) {
    if (err) return console.error("Error:", err.message);
    if (status === 200) {
      console.log("Unlisted " + packageName + " " + version);
    } else {
      console.error("Failed (" + status + "):", data);
    }
  });
}

var command = process.argv[2];
switch (command) {
  case "list":
    listPackages();
    break;
  case "versions":
    getPackageVersions(process.argv[3]);
    break;
  case "unlist":
    unlistVersion(process.argv[3], process.argv[4]);
    break;
  default:
    console.log("Usage:");
    console.log("  node python-feed-manager.js list");
    console.log("  node python-feed-manager.js versions <package-name>");
    console.log("  node python-feed-manager.js unlist <package-name> <version>");
}
node python-feed-manager.js list
# Output:
# Python packages in python-packages (3 total):
# ---
# mycompany-utilities @ 1.0.0
# Published: 2/7/2026
# ---
# mycompany-etl-helpers @ 2.1.0
# Published: 2/5/2026
# ---
Common Issues and Troubleshooting
1. 401 Unauthorized When Installing Packages
Error:
ERROR: Could not find a version that satisfies the requirement mycompany-utilities
ERROR: No matching distribution found for mycompany-utilities
WARNING: 401 Unauthorized - https://pkgs.dev.azure.com/.../simple/mycompany-utilities/
pip silently swallows 401 errors and reports them as "package not found." If you see "no matching distribution found" for a package you know exists, it is almost certainly an authentication problem. Check your PAT, verify it has Packaging scope, and ensure the feed URL is correct. Install the artifacts-keyring package for interactive authentication, or set PIP_INDEX_URL with embedded credentials.
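To confirm it is an authentication problem rather than a missing package, you can request the package's simple index page directly with explicit credentials. A diagnostic sketch (the script name is hypothetical; the org, project, feed, and package are this article's examples):

# check-feed-auth.py -- distinguish a 401 (credentials) from a genuinely missing package
import base64
import os
import urllib.error
import urllib.request

pat = os.environ["AZURE_DEVOPS_PAT"]
url = ("https://pkgs.dev.azure.com/my-organization/my-project"
       "/_packaging/python-packages/pypi/simple/mycompany-utilities/")

request = urllib.request.Request(url)
# Azure DevOps accepts Basic auth with any username and the PAT as the password
token = base64.b64encode(("azure:" + pat).encode()).decode()
request.add_header("Authorization", "Basic " + token)

try:
    with urllib.request.urlopen(request) as response:
        print("Authenticated OK (HTTP", response.getcode(), ") -- package is in the feed")
except urllib.error.HTTPError as e:
    if e.code == 401:
        print("401: the PAT is wrong, expired, or missing the Packaging scope")
    elif e.code == 404:
        print("404: credentials are fine; the package is not in the feed")
    else:
        print("HTTP", e.code, e.reason)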
2. twine Upload Fails with 403
Error:
HTTPError: 403 Forbidden from https://pkgs.dev.azure.com/.../pypi/upload/
The PAT has Packaging Read but not Write scope. Generate a new PAT with Packaging (Read & Write). Also verify the build service identity has Contributor permission on the feed if this happens in a pipeline.
3. Package Version Already Exists
Error:
HTTPError: 409 Conflict
The feed already contains the package 'mycompany-utilities' at version '1.0.0'.
Python feeds in Azure Artifacts are immutable -- you cannot overwrite a published version. Increment the version number. If you are iterating during development, use pre-release versions:
1.0.0.dev1
1.0.0.dev2
1.0.0a1 # alpha
1.0.0b1 # beta
1.0.0rc1 # release candidate
1.0.0 # final release
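These forms have a defined ordering under PEP 440. If you want to confirm how a set of versions will sort before publishing, the packaging library (installable with pip install packaging) implements the rules; a quick sketch:

# pep440-order.py -- check how PEP 440 sorts a set of versions
from packaging.version import Version

versions = ["1.0.0", "1.0.0rc1", "1.0.0b1", "1.0.0a1", "1.0.0.dev1"]
for v in sorted(versions, key=Version):
    print(v)
# Prints: dev release, then alpha, beta, release candidate, final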
4. Dependency Confusion with Extra Index URL
Error: pip installs a malicious package from PyPI instead of your internal package.
This happens when you use extra-index-url for PyPI alongside your private feed. pip picks the highest version across all indexes. Remove extra-index-url and use upstream sources on your Azure Artifacts feed instead. The upstream source resolves internal packages first, then falls back to PyPI.
5. PipAuthenticate Task Not Setting Credentials
Error: PipAuthenticate@1 runs successfully but pip still gets 401 errors.
The task sets the PIP_INDEX_URL and PIP_EXTRA_INDEX_URL environment variables for subsequent steps. If your pip install runs in a different scope (e.g., inside a Docker container or a script that resets environment variables), the credentials are lost. Use PIP_INDEX_URL explicitly:
- script: |
    echo "Using index: $PIP_INDEX_URL"
    pip install -r requirements.txt
  displayName: Install with explicit index
6. Slow Package Resolution on First Access
Error: pip install takes 30+ seconds for packages available on PyPI.
The first time a package is accessed through an upstream source, Azure Artifacts fetches it from PyPI, which adds latency. Subsequent installs are fast because the package is cached. Pre-warm your feed by installing common dependencies once:
pip install requests flask sqlalchemy numpy pandas
Best Practices
- Use upstream sources instead of extra-index-url. Upstream sources on your Azure Artifacts feed provide dependency confusion protection by resolving internal packages first. The extra-index-url approach is vulnerable to name squatting attacks on public PyPI.
- Use artifacts-keyring for local development. It handles authentication transparently without storing PATs in config files. Developers authenticate through their browser once, and the token is cached in the system keyring.
- Use PipAuthenticate@1 and TwineAuthenticate@1 in pipelines. These tasks handle credential injection cleanly. Never hardcode PATs in pipeline variables -- use $(System.AccessToken) or the authenticate tasks.
- Build with python -m build instead of python setup.py sdist bdist_wheel. The build module creates an isolated environment for the build process, which prevents your local development dependencies from leaking into the package.
- Publish both source distributions and wheels. Wheels install faster because they skip the build step. Source distributions provide a fallback for platforms where pre-built wheels are not available.
- Use pre-release versions for development builds. PEP 440 defines a clear versioning scheme: .devN, aN (alpha), bN (beta), rcN (release candidate). Use these for CI builds from feature branches so consumers on stable versions are not affected.
- Pin your dependencies in requirements.txt for applications, use ranges in setup.py/pyproject.toml for libraries. Applications should be reproducible (requests==2.31.0). Libraries should be flexible (requests>=2.28.0) to avoid version conflicts.
- Run pip install -e ".[dev]" for local development. Editable installs let you test your package as consumers would see it, while still being able to modify code without reinstalling.
- Include a py.typed marker file for type-hinted packages. If your library includes type annotations, add an empty py.typed file in your package directory. This tells type checkers that your package supports typing; a minimal sketch follows this list.
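A minimal sketch of creating the marker, assuming the src layout used throughout this article (the script name is hypothetical):

# add-py-typed.py -- create the PEP 561 marker file (run from the repo root)
from pathlib import Path

marker = Path("src/mycompany_utilities/py.typed")
marker.touch()
print("Created", marker)

Make sure your build configuration also ships the file in the wheel; with setuptools this is typically done through its package data settings.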