Azure DevOps REST API: Complete Reference

Master the Azure DevOps REST API with Node.js for work items, builds, PRs, pipelines, and automated project management

The Azure DevOps REST API is the backbone of every integration, automation, and custom tool you will build against Azure DevOps Services or Server. Every button click in the portal maps to an API call, and understanding this API gives you programmatic control over work items, builds, releases, repositories, pull requests, pipelines, tests, wikis, and project settings. This guide covers the full surface area with working Node.js examples so you can build production-grade automation.

Prerequisites

  • An Azure DevOps organization (Azure DevOps Services) or an Azure DevOps Server 2020+ collection
  • Node.js 14 or later installed
  • A Personal Access Token (PAT) with appropriate scopes
  • Basic familiarity with REST APIs and HTTP methods
  • node-fetch or the built-in https module for HTTP requests

Install the HTTP dependency we will use throughout:

npm install node-fetch@2

We use version 2 because it supports CommonJS require() without ESM configuration.

API Versioning and Base URLs

Every Azure DevOps REST API call requires an api-version query parameter. Microsoft versions the API independently from the product release cycle. The general format is:

https://dev.azure.com/{organization}/{project}/_apis/{area}/{resource}?api-version=7.1

Some APIs live under different subdomains:

Subdomain                   Purpose
dev.azure.com               Core APIs (work items, repos, builds)
vssps.dev.azure.com         Identity and profile APIs
vsrm.dev.azure.com          Release management APIs
feeds.dev.azure.com         Artifact feeds
almsearch.dev.azure.com     Search APIs

For Azure DevOps Server (on-premises), the base URL pattern is:

https://{server}/{collection}/{project}/_apis/{area}/{resource}?api-version=7.1

API version previews use the format 7.1-preview.1. Once a version graduates from preview, you drop the preview suffix. Always pin your integrations to a specific version rather than omitting it — the default version behavior is unpredictable and has caused breaking changes in production.
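
A small helper keeps the version pinned consistently across every call. This is a minimal sketch under the URL format shown above; buildUrl and API_VERSION are names introduced here, and the base would change for Azure DevOps Server:

// Minimal sketch: build a versioned API URL with a pinned api-version.
var API_VERSION = "7.1";

function buildUrl(organization, project, path) {
    var base = "https://dev.azure.com/" + organization + "/";
    if (project) {
        base += project + "/";
    }
    // Append api-version whether or not the path already has a query string.
    var separator = path.indexOf("?") === -1 ? "?" : "&";
    return base + path + separator + "api-version=" + API_VERSION;
}

// buildUrl("myorg", "MyProject", "_apis/wit/workitems/123")
// => https://dev.azure.com/myorg/MyProject/_apis/wit/workitems/123?api-version=7.1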

Authentication

Personal Access Tokens (PAT)

PATs are the simplest authentication method. You create them in User Settings > Personal Access Tokens. The token is sent as a Basic auth header with an empty username:

var fetch = require("node-fetch");

var token = process.env.AZURE_DEVOPS_PAT;
var organization = "myorg";

var headers = {
    "Authorization": "Basic " + Buffer.from(":" + token).toString("base64"),
    "Content-Type": "application/json"
};

function callApi(path) {
    var url = "https://dev.azure.com/" + organization + "/" + path;
    return fetch(url, { headers: headers }).then(function(res) {
        if (!res.ok) {
            throw new Error("API returned " + res.status + ": " + res.statusText);
        }
        return res.json();
    });
}

PATs expire after a maximum of one year. For production automations, use service principals or managed identities instead.

OAuth 2.0

OAuth is appropriate for applications that act on behalf of users. You register your app at https://app.vssps.visualstudio.com/app/register, then implement the standard authorization code flow:

var crypto = require("crypto");

var clientId = process.env.AZURE_DEVOPS_CLIENT_ID;
var clientSecret = process.env.AZURE_DEVOPS_CLIENT_SECRET;
var redirectUri = "https://myapp.example.com/callback";
var scope = "vso.work vso.code vso.build";

function getAuthorizationUrl() {
    var state = crypto.randomBytes(16).toString("hex");
    return "https://app.vssps.visualstudio.com/oauth2/authorize" +
        "?client_id=" + clientId +
        "&response_type=Assertion" +
        "&state=" + state +
        "&scope=" + encodeURIComponent(scope) +
        "&redirect_uri=" + encodeURIComponent(redirectUri);
}

function exchangeCodeForToken(code) {
    return fetch("https://app.vssps.visualstudio.com/oauth2/token", {
        method: "POST",
        headers: { "Content-Type": "application/x-www-form-urlencoded" },
        body: "client_assertion_type=urn:ietf:params:oauth:client-assertion-type:jwt-bearer" +
            "&client_assertion=" + encodeURIComponent(clientSecret) +
            "&grant_type=urn:ietf:params:oauth:grant-type:jwt-bearer" +
            "&assertion=" + encodeURIComponent(code) +
            "&redirect_uri=" + encodeURIComponent(redirectUri)
    }).then(function(res) { return res.json(); });
}
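
Access tokens from this flow are short-lived, and the token response also includes a refresh_token. The sketch below shows a refresh request against the same token endpoint, reusing the variables above; refreshAccessToken is a name I am introducing here, so treat it as an illustration rather than the one canonical implementation:

function refreshAccessToken(refreshToken) {
    // Same token endpoint; grant_type switches to refresh_token and the
    // assertion carries the refresh token instead of the authorization code.
    return fetch("https://app.vssps.visualstudio.com/oauth2/token", {
        method: "POST",
        headers: { "Content-Type": "application/x-www-form-urlencoded" },
        body: "client_assertion_type=urn:ietf:params:oauth:client-assertion-type:jwt-bearer" +
            "&client_assertion=" + encodeURIComponent(clientSecret) +
            "&grant_type=refresh_token" +
            "&assertion=" + encodeURIComponent(refreshToken) +
            "&redirect_uri=" + encodeURIComponent(redirectUri)
    }).then(function(res) { return res.json(); });
}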

Service Principal (Microsoft Entra ID)

For automated services without user context, register an application in Entra ID, grant it access to your Azure DevOps organization, and use client credentials:

function getServicePrincipalToken() {
    var tenantId = process.env.AZURE_TENANT_ID;
    var clientId = process.env.AZURE_CLIENT_ID;
    var clientSecret = process.env.AZURE_CLIENT_SECRET;

    var url = "https://login.microsoftonline.com/" + tenantId + "/oauth2/v2.0/token";
    var body = "grant_type=client_credentials" +
        "&client_id=" + clientId +
        "&client_secret=" + encodeURIComponent(clientSecret) +
        "&scope=499b84ac-1321-427f-aa17-267ca6975798/.default";

    return fetch(url, {
        method: "POST",
        headers: { "Content-Type": "application/x-www-form-urlencoded" },
        body: body
    }).then(function(res) { return res.json(); })
    .then(function(data) { return data.access_token; });
}

The scope 499b84ac-1321-427f-aa17-267ca6975798 is the well-known resource ID for Azure DevOps. This is the approach I recommend for CI/CD integrations and background services.
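
Unlike PATs, the Entra ID token is sent as a Bearer header rather than Basic auth. A minimal sketch of wiring it into a request; callWithServicePrincipal is a hypothetical wrapper name:

function callWithServicePrincipal(path) {
    return getServicePrincipalToken().then(function(accessToken) {
        var url = "https://dev.azure.com/" + organization + "/" + path;
        return fetch(url, {
            headers: {
                // Entra ID tokens use Bearer, not the Basic scheme used for PATs.
                "Authorization": "Bearer " + accessToken,
                "Content-Type": "application/json"
            }
        }).then(function(res) { return res.json(); });
    });
}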

Work Item CRUD Operations

Work items are the central data model — bugs, user stories, tasks, epics, and features all share the same API surface.

Creating a Work Item

Work items use JSON Patch operations for creation and updates:

function createWorkItem(project, type, fields) {
    var url = project + "/_apis/wit/workitems/$" + encodeURIComponent(type) + "?api-version=7.1";
    var patchDoc = Object.keys(fields).map(function(key) {
        return {
            op: "add",
            path: "/fields/" + key,
            value: fields[key]
        };
    });

    return fetch("https://dev.azure.com/" + organization + "/" + url, {
        method: "POST",
        headers: {
            "Authorization": "Basic " + Buffer.from(":" + token).toString("base64"),
            "Content-Type": "application/json-patch+json"
        },
        body: JSON.stringify(patchDoc)
    }).then(function(res) { return res.json(); });
}

// Usage
createWorkItem("MyProject", "Bug", {
    "System.Title": "Login page throws 500 on invalid email",
    "System.Description": "Steps to reproduce: enter foo@@ in email field",
    "System.AssignedTo": "[email protected]",
    "Microsoft.VSTS.Common.Priority": 2,
    "System.AreaPath": "MyProject\\Backend",
    "System.IterationPath": "MyProject\\Sprint 42"
}).then(function(item) {
    console.log("Created work item #" + item.id);
});

Note the Content-Type header — it must be application/json-patch+json, not regular application/json. This is the most common mistake people make with this API.

Reading Work Items

function getWorkItem(id, expand) {
    var url = "_apis/wit/workitems/" + id + "?api-version=7.1";
    if (expand) {
        url += "&$expand=" + expand;
    }
    return callApi(url);
}

// Get a single item with relations
getWorkItem(12345, "relations").then(function(item) {
    console.log(item.fields["System.Title"]);
    console.log(item.fields["System.State"]);
});

// Batch get multiple items
function getWorkItems(ids) {
    return callApi("_apis/wit/workitems?ids=" + ids.join(",") + "&api-version=7.1");
}

Querying with WIQL

WIQL (Work Item Query Language) is a SQL-like syntax for filtering work items:

function queryWorkItems(project, wiql) {
    return fetch("https://dev.azure.com/" + organization + "/" + project + "/_apis/wit/wiql?api-version=7.1", {
        method: "POST",
        headers: headers,
        body: JSON.stringify({ query: wiql })
    }).then(function(res) { return res.json(); });
}

queryWorkItems("MyProject",
    "SELECT [System.Id], [System.Title], [System.State] " +
    "FROM WorkItems " +
    "WHERE [System.TeamProject] = @project " +
    "AND [System.State] = 'Active' " +
    "AND [System.AssignedTo] = @me " +
    "ORDER BY [Microsoft.VSTS.Common.Priority] ASC"
).then(function(result) {
    var ids = result.workItems.map(function(wi) { return wi.id; });
    return getWorkItems(ids);
});

WIQL returns only IDs by default. You need a second call to fetch the full work item details. This is by design — it prevents accidentally loading thousands of full objects.

Updating Work Items

function updateWorkItem(id, fields) {
    var patchDoc = Object.keys(fields).map(function(key) {
        return {
            op: "replace",
            path: "/fields/" + key,
            value: fields[key]
        };
    });

    return fetch("https://dev.azure.com/" + organization + "/_apis/wit/workitems/" + id + "?api-version=7.1", {
        method: "PATCH",
        headers: {
            "Authorization": "Basic " + Buffer.from(":" + token).toString("base64"),
            "Content-Type": "application/json-patch+json"
        },
        body: JSON.stringify(patchDoc)
    }).then(function(res) { return res.json(); });
}

// Move to closed state
updateWorkItem(12345, {
    "System.State": "Closed",
    "Microsoft.VSTS.Common.ResolvedReason": "Fixed"
});
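
Links between work items (parent/child, related, and so on) go through the same PATCH endpoint, using JSON Patch operations against /relations/-. This is a minimal sketch assuming the standard hierarchy link type; addParentLink is a name introduced here for illustration:

function addParentLink(childId, parentId) {
    var patchDoc = [{
        op: "add",
        path: "/relations/-",
        value: {
            rel: "System.LinkTypes.Hierarchy-Reverse", // child points to its parent
            url: "https://dev.azure.com/" + organization + "/_apis/wit/workItems/" + parentId
        }
    }];

    return fetch("https://dev.azure.com/" + organization + "/_apis/wit/workitems/" + childId + "?api-version=7.1", {
        method: "PATCH",
        headers: {
            "Authorization": "Basic " + Buffer.from(":" + token).toString("base64"),
            "Content-Type": "application/json-patch+json"
        },
        body: JSON.stringify(patchDoc)
    }).then(function(res) { return res.json(); });
}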

Build and Release APIs

Triggering a Build

function queueBuild(project, definitionId, sourceBranch, parameters) {
    var body = {
        definition: { id: definitionId },
        sourceBranch: sourceBranch || "refs/heads/main"
    };
    if (parameters) {
        body.parameters = JSON.stringify(parameters);
    }

    return fetch("https://dev.azure.com/" + organization + "/" + project + "/_apis/build/builds?api-version=7.1", {
        method: "POST",
        headers: headers,
        body: JSON.stringify(body)
    }).then(function(res) { return res.json(); });
}

// Queue a build with custom parameters
queueBuild("MyProject", 42, "refs/heads/release/v2.1", {
    "deployTarget": "staging",
    "runIntegrationTests": "true"
}).then(function(build) {
    console.log("Build " + build.buildNumber + " queued (ID: " + build.id + ")");
});

Getting Build Results

function getBuild(project, buildId) {
    return callApi(project + "/_apis/build/builds/" + buildId + "?api-version=7.1");
}

function getBuildLogs(project, buildId) {
    return callApi(project + "/_apis/build/builds/" + buildId + "/logs?api-version=7.1");
}

function listBuilds(project, definitionId, top) {
    var url = project + "/_apis/build/builds?api-version=7.1";
    if (definitionId) url += "&definitions=" + definitionId;
    if (top) url += "&$top=" + top;
    return callApi(url);
}
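
To wait for a queued build to finish, poll getBuild until its status reaches "completed" and then inspect the result field. A minimal polling sketch; waitForBuild and the 15-second interval are my own additions:

function waitForBuild(project, buildId, intervalMs) {
    intervalMs = intervalMs || 15000;
    return getBuild(project, buildId).then(function(build) {
        if (build.status === "completed") {
            // build.result is e.g. "succeeded", "failed", "partiallySucceeded", "canceled"
            return build;
        }
        // Not finished yet: wait and poll again.
        return new Promise(function(resolve) {
            setTimeout(resolve, intervalMs);
        }).then(function() {
            return waitForBuild(project, buildId, intervalMs);
        });
    });
}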

Release APIs

Release APIs live on the vsrm.dev.azure.com subdomain:

function createRelease(project, definitionId, description) {
    var url = "https://vsrm.dev.azure.com/" + organization + "/" + project +
        "/_apis/release/releases?api-version=7.1";

    var body = {
        definitionId: definitionId,
        description: description || "Automated release",
        isDraft: false
    };

    return fetch(url, {
        method: "POST",
        headers: headers,
        body: JSON.stringify(body)
    }).then(function(res) { return res.json(); });
}

function deployRelease(project, releaseId, environmentId) {
    var url = "https://vsrm.dev.azure.com/" + organization + "/" + project +
        "/_apis/release/releases/" + releaseId +
        "/environments/" + environmentId + "?api-version=7.1";

    return fetch(url, {
        method: "PATCH",
        headers: headers,
        body: JSON.stringify({
            status: "inProgress",
            comment: "Deploying via automation"
        })
    }).then(function(res) { return res.json(); });
}

Git Repository Operations

Listing Repositories and Browsing Files

function listRepos(project) {
    return callApi(project + "/_apis/git/repositories?api-version=7.1");
}

function getFileContent(project, repoId, path, branch) {
    var url = project + "/_apis/git/repositories/" + repoId +
        "/items?path=" + encodeURIComponent(path) +
        "&versionDescriptor.version=" + encodeURIComponent(branch || "main") +
        "&versionDescriptor.versionType=branch" +
        "&includeContent=true&api-version=7.1";
    return callApi(url);
}

function listCommits(project, repoId, branch, top) {
    var url = project + "/_apis/git/repositories/" + repoId +
        "/commits?searchCriteria.itemVersion.version=" + encodeURIComponent(branch || "main") +
        "&$top=" + (top || 50) + "&api-version=7.1";
    return callApi(url);
}

Creating a Branch

function createBranch(project, repoId, branchName, sourceCommitId) {
    var url = project + "/_apis/git/repositories/" + repoId + "/refs?api-version=7.1";

    var body = [{
        name: "refs/heads/" + branchName,
        oldObjectId: "0000000000000000000000000000000000000000",
        newObjectId: sourceCommitId
    }];

    return fetch("https://dev.azure.com/" + organization + "/" + url, {
        method: "POST",
        headers: headers,
        body: JSON.stringify(body)
    }).then(function(res) { return res.json(); });
}

Pushing a Commit

function pushCommit(project, repoId, branch, changes, commitMessage, oldObjectId) {
    var url = project + "/_apis/git/repositories/" + repoId + "/pushes?api-version=7.1";

    var body = {
        refUpdates: [{
            name: "refs/heads/" + branch,
            oldObjectId: "" // Needs the current commit SHA
        }],
        commits: [{
            comment: commitMessage,
            changes: changes.map(function(change) {
                return {
                    changeType: change.type || "edit",
                    item: { path: change.path },
                    newContent: {
                        content: change.content,
                        contentType: "rawtext"
                    }
                };
            })
        }]
    };

    return fetch("https://dev.azure.com/" + organization + "/" + url, {
        method: "POST",
        headers: headers,
        body: JSON.stringify(body)
    }).then(function(res) { return res.json(); });
}
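
The push is rejected unless oldObjectId matches the branch's current tip, so resolve the ref immediately before pushing. A sketch under that assumption; getBranchTip is a name introduced here, and "my-repo-id" is a placeholder repository ID:

function getBranchTip(project, repoId, branch) {
    // Filter refs to the branch and read its current objectId (the tip commit SHA).
    var url = project + "/_apis/git/repositories/" + repoId +
        "/refs?filter=heads/" + encodeURIComponent(branch) + "&api-version=7.1";
    return callApi(url).then(function(result) {
        if (!result.value || result.value.length === 0) {
            throw new Error("Branch not found: " + branch);
        }
        return result.value[0].objectId;
    });
}

// Usage: resolve the tip, then push with it as oldObjectId
getBranchTip("MyProject", "my-repo-id", "main").then(function(tip) {
    return pushCommit("MyProject", "my-repo-id", "main", [
        { type: "edit", path: "/README.md", content: "Updated via API" }
    ], "Update README", tip);
});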

Pull Request Management

Creating and Managing PRs

function createPullRequest(project, repoId, sourceBranch, targetBranch, title, description) {
    var url = project + "/_apis/git/repositories/" + repoId + "/pullrequests?api-version=7.1";

    var body = {
        sourceRefName: "refs/heads/" + sourceBranch,
        targetRefName: "refs/heads/" + targetBranch,
        title: title,
        description: description
    };

    return fetch("https://dev.azure.com/" + organization + "/" + url, {
        method: "POST",
        headers: headers,
        body: JSON.stringify(body)
    }).then(function(res) { return res.json(); });
}

function addPullRequestReviewer(project, repoId, prId, reviewerId) {
    var url = project + "/_apis/git/repositories/" + repoId +
        "/pullrequests/" + prId + "/reviewers/" + reviewerId + "?api-version=7.1";

    return fetch("https://dev.azure.com/" + organization + "/" + url, {
        method: "PUT",
        headers: headers,
        body: JSON.stringify({ vote: 0, isRequired: true })
    }).then(function(res) { return res.json(); });
}

function completePullRequest(project, repoId, prId, lastMergeSourceCommit, deleteSource) {
    var url = project + "/_apis/git/repositories/" + repoId +
        "/pullrequests/" + prId + "?api-version=7.1";

    return fetch("https://dev.azure.com/" + organization + "/" + url, {
        method: "PATCH",
        headers: headers,
        body: JSON.stringify({
            status: "completed",
            lastMergeSourceCommit: { commitId: lastMergeSourceCommit },
            completionOptions: {
                mergeStrategy: "squash",
                deleteSourceBranch: deleteSource !== false,
                transitionWorkItems: true
            }
        })
    }).then(function(res) { return res.json(); });
}

PR Comments and Threads

function addPrComment(project, repoId, prId, content, filePath, line) {
    var url = project + "/_apis/git/repositories/" + repoId +
        "/pullrequests/" + prId + "/threads?api-version=7.1";

    var thread = {
        comments: [{ content: content, commentType: 1 }],
        status: 1 // Active
    };

    if (filePath) {
        thread.threadContext = {
            filePath: filePath,
            rightFileStart: { line: line, offset: 1 },
            rightFileEnd: { line: line, offset: 1 }
        };
    }

    return fetch("https://dev.azure.com/" + organization + "/" + url, {
        method: "POST",
        headers: headers,
        body: JSON.stringify(thread)
    }).then(function(res) { return res.json(); });
}

Pipeline APIs (YAML Pipelines)

The newer YAML pipelines use a different API surface than classic builds:

function listPipelines(project) {
    return callApi(project + "/_apis/pipelines?api-version=7.1");
}

function runPipeline(project, pipelineId, branch, variables) {
    var url = project + "/_apis/pipelines/" + pipelineId + "/runs?api-version=7.1";

    var body = {
        resources: {
            repositories: {
                self: { refName: "refs/heads/" + (branch || "main") }
            }
        }
    };

    if (variables) {
        body.variables = {};
        Object.keys(variables).forEach(function(key) {
            body.variables[key] = { value: variables[key] };
        });
    }

    return fetch("https://dev.azure.com/" + organization + "/" + url, {
        method: "POST",
        headers: headers,
        body: JSON.stringify(body)
    }).then(function(res) { return res.json(); });
}

function getPipelineRun(project, pipelineId, runId) {
    return callApi(project + "/_apis/pipelines/" + pipelineId + "/runs/" + runId + "?api-version=7.1");
}

Test Management APIs

function createTestRun(project, planId, name, pointIds) {
    var url = project + "/_apis/test/runs?api-version=7.1";

    return fetch("https://dev.azure.com/" + organization + "/" + url, {
        method: "POST",
        headers: headers,
        body: JSON.stringify({
            name: name,
            plan: { id: planId },
            pointIds: pointIds
        })
    }).then(function(res) { return res.json(); });
}

function updateTestResults(project, runId, results) {
    var url = project + "/_apis/test/runs/" + runId + "/results?api-version=7.1";

    return fetch("https://dev.azure.com/" + organization + "/" + url, {
        method: "PATCH",
        headers: headers,
        body: JSON.stringify(results.map(function(r) {
            return {
                testCaseTitle: r.title,
                outcome: r.passed ? "Passed" : "Failed",
                state: "Completed",
                errorMessage: r.error || "",
                durationInMs: r.duration || 0
            };
        }))
    }).then(function(res) { return res.json(); });
}

Wiki APIs

function createWikiPage(project, wikiId, path, content) {
    var url = project + "/_apis/wiki/wikis/" + wikiId +
        "/pages?path=" + encodeURIComponent(path) + "&api-version=7.1";

    return fetch("https://dev.azure.com/" + organization + "/" + url, {
        method: "PUT",
        headers: headers,
        body: JSON.stringify({ content: content })
    }).then(function(res) { return res.json(); });
}

function getWikiPage(project, wikiId, path) {
    var url = project + "/_apis/wiki/wikis/" + wikiId +
        "/pages?path=" + encodeURIComponent(path) +
        "&includeContent=true&api-version=7.1";
    return callApi(url);
}
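
Updating an existing page uses the same PUT, but it needs an If-Match header carrying the page's current version, which comes back in the ETag header of a GET. A sketch under that assumption; updateWikiPage is a name introduced here:

function updateWikiPage(project, wikiId, path, newContent) {
    var url = "https://dev.azure.com/" + organization + "/" + project +
        "/_apis/wiki/wikis/" + wikiId +
        "/pages?path=" + encodeURIComponent(path) + "&api-version=7.1";

    // First GET the page to read its current version from the ETag response header.
    return fetch(url, { headers: headers }).then(function(res) {
        var version = res.headers.get("etag");
        return fetch(url, {
            method: "PUT",
            headers: {
                "Authorization": headers["Authorization"],
                "Content-Type": "application/json",
                "If-Match": version
            },
            body: JSON.stringify({ content: newContent })
        });
    }).then(function(res) { return res.json(); });
}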

Project and Team APIs

function listProjects() {
    return callApi("_apis/projects?api-version=7.1");
}

function getProject(projectId) {
    return callApi("_apis/projects/" + projectId + "?includeCapabilities=true&api-version=7.1");
}

function listTeams(projectId) {
    return callApi("_apis/projects/" + projectId + "/teams?api-version=7.1");
}

function getTeamMembers(projectId, teamId) {
    return callApi("_apis/projects/" + projectId + "/teams/" + teamId + "/members?api-version=7.1");
}

Pagination and Continuation Tokens

Most list endpoints return paginated results. Azure DevOps uses two approaches: $top/$skip-style parameters on simpler endpoints, and continuation tokens for larger datasets.

Batching WIQL results into work item fetches of 200 IDs per request:

function getAllWorkItemsFromQuery(project, wiql) {
    return queryWorkItems(project, wiql).then(function(result) {
        // WIQL returns max 20,000 IDs
        var ids = result.workItems.map(function(wi) { return wi.id; });
        var batches = [];
        // Batch in groups of 200 (API limit per request)
        for (var i = 0; i < ids.length; i += 200) {
            batches.push(ids.slice(i, i + 200));
        }
        return Promise.all(batches.map(function(batch) {
            return getWorkItems(batch);
        }));
    }).then(function(results) {
        var items = [];
        results.forEach(function(r) {
            items = items.concat(r.value);
        });
        return items;
    });
}

Continuation token pagination for larger datasets:

function getAllBuilds(project, definitionId) {
    var allBuilds = [];

    function fetchPage(continuationToken) {
        var url = project + "/_apis/build/builds?api-version=7.1&definitions=" + definitionId;
        if (continuationToken) {
            url += "&continuationToken=" + continuationToken;
        }

        return fetch("https://dev.azure.com/" + organization + "/" + url, {
            headers: headers
        }).then(function(res) {
            var nextToken = res.headers.get("x-ms-continuationtoken");
            return res.json().then(function(data) {
                allBuilds = allBuilds.concat(data.value);
                if (nextToken) {
                    return fetchPage(nextToken);
                }
                return allBuilds;
            });
        });
    }

    return fetchPage(null);
}

The continuation token comes back in the response header x-ms-continuationtoken. If it is absent, you have reached the last page. Do not guess or construct these tokens yourself — they are opaque server-generated values.

Rate Limiting and Throttling

Azure DevOps Services throttles requests when an identity consumes too many resources within a sliding time window. When you exceed the limit, the API returns a 429 Too Many Requests status with a Retry-After header indicating how long to wait.

Build retry logic into your client:

function fetchWithRetry(url, options, maxRetries) {
    maxRetries = maxRetries || 5;
    var attempt = 0;

    function doFetch() {
        attempt++;
        return fetch(url, options).then(function(res) {
            if (res.status === 429 && attempt < maxRetries) {
                var retryAfter = parseInt(res.headers.get("Retry-After") || "5", 10);
                console.log("Rate limited. Retrying in " + retryAfter + " seconds (attempt " + attempt + ")");
                return new Promise(function(resolve) {
                    setTimeout(resolve, retryAfter * 1000);
                }).then(doFetch);
            }
            if (res.status === 503 && attempt < maxRetries) {
                // Service temporarily unavailable — back off exponentially
                var backoff = Math.pow(2, attempt) * 1000;
                return new Promise(function(resolve) {
                    setTimeout(resolve, backoff);
                }).then(doFetch);
            }
            return res;
        });
    }

    return doFetch();
}

In practice, I have found that staying under 30 requests per second per identity avoids throttling for most workloads. If you are doing bulk operations, add deliberate delays between batches.

Complete Working Example: Reusable API Client

Here is a comprehensive Node.js client library that wraps everything above into a production-ready module:

var fetch = require("node-fetch");

function AzureDevOpsClient(options) {
    this.organization = options.organization;
    this.project = options.project;
    this.token = options.token;
    this.apiVersion = options.apiVersion || "7.1";
    this.maxRetries = options.maxRetries || 5;

    this.baseUrl = "https://dev.azure.com/" + this.organization;
    this.vsrmUrl = "https://vsrm.dev.azure.com/" + this.organization;
    this.authHeader = "Basic " + Buffer.from(":" + this.token).toString("base64");
}

// --- Core HTTP Methods ---

AzureDevOpsClient.prototype._headers = function(contentType) {
    return {
        "Authorization": this.authHeader,
        "Content-Type": contentType || "application/json"
    };
};

AzureDevOpsClient.prototype._url = function(path, baseOverride) {
    var base = baseOverride || this.baseUrl;
    var separator = path.indexOf("?") === -1 ? "?" : "&";
    return base + "/" + path + separator + "api-version=" + this.apiVersion;
};

AzureDevOpsClient.prototype._fetch = function(url, options) {
    var self = this;
    var attempt = 0;

    function doFetch() {
        attempt++;
        return fetch(url, options).then(function(res) {
            if (res.status === 429 && attempt < self.maxRetries) {
                var retryAfter = parseInt(res.headers.get("Retry-After") || "5", 10);
                return new Promise(function(resolve) {
                    setTimeout(resolve, retryAfter * 1000);
                }).then(doFetch);
            }
            if (!res.ok) {
                return res.text().then(function(body) {
                    var err = new Error("Azure DevOps API error " + res.status + ": " + body);
                    err.status = res.status;
                    err.body = body;
                    throw err;
                });
            }
            var contentType = res.headers.get("content-type") || "";
            if (contentType.indexOf("json") !== -1) {
                return res.json();
            }
            return res.text();
        });
    }

    return doFetch();
};

AzureDevOpsClient.prototype.get = function(path, baseOverride) {
    return this._fetch(this._url(path, baseOverride), {
        headers: this._headers()
    });
};

AzureDevOpsClient.prototype.post = function(path, body, contentType, baseOverride) {
    return this._fetch(this._url(path, baseOverride), {
        method: "POST",
        headers: this._headers(contentType),
        body: JSON.stringify(body)
    });
};

AzureDevOpsClient.prototype.patch = function(path, body, contentType) {
    return this._fetch(this._url(path), {
        method: "PATCH",
        headers: this._headers(contentType),
        body: JSON.stringify(body)
    });
};

AzureDevOpsClient.prototype.put = function(path, body) {
    return this._fetch(this._url(path), {
        method: "PUT",
        headers: this._headers(),
        body: JSON.stringify(body)
    });
};

// --- Work Items ---

AzureDevOpsClient.prototype.createWorkItem = function(type, fields) {
    var patchDoc = Object.keys(fields).map(function(key) {
        return { op: "add", path: "/fields/" + key, value: fields[key] };
    });
    return this.post(
        this.project + "/_apis/wit/workitems/$" + encodeURIComponent(type),
        patchDoc,
        "application/json-patch+json"
    );
};

AzureDevOpsClient.prototype.getWorkItem = function(id, expand) {
    var path = "_apis/wit/workitems/" + id;
    if (expand) path += "?$expand=" + expand;
    return this.get(path);
};

AzureDevOpsClient.prototype.updateWorkItem = function(id, fields) {
    var patchDoc = Object.keys(fields).map(function(key) {
        return { op: "replace", path: "/fields/" + key, value: fields[key] };
    });
    return this.patch(
        "_apis/wit/workitems/" + id,
        patchDoc,
        "application/json-patch+json"
    );
};

AzureDevOpsClient.prototype.queryWorkItems = function(wiql) {
    return this.post(this.project + "/_apis/wit/wiql", { query: wiql });
};

// --- Builds ---

AzureDevOpsClient.prototype.queueBuild = function(definitionId, sourceBranch, parameters) {
    var body = {
        definition: { id: definitionId },
        sourceBranch: sourceBranch || "refs/heads/main"
    };
    if (parameters) {
        body.parameters = JSON.stringify(parameters);
    }
    return this.post(this.project + "/_apis/build/builds", body);
};

AzureDevOpsClient.prototype.getBuild = function(buildId) {
    return this.get(this.project + "/_apis/build/builds/" + buildId);
};

AzureDevOpsClient.prototype.listBuilds = function(definitionId, top) {
    var path = this.project + "/_apis/build/builds";
    var params = [];
    if (definitionId) params.push("definitions=" + definitionId);
    if (top) params.push("$top=" + top);
    if (params.length) path += "?" + params.join("&");
    return this.get(path);
};

// --- Pull Requests ---

AzureDevOpsClient.prototype.createPullRequest = function(repoId, sourceBranch, targetBranch, title, description) {
    return this.post(this.project + "/_apis/git/repositories/" + repoId + "/pullrequests", {
        sourceRefName: "refs/heads/" + sourceBranch,
        targetRefName: "refs/heads/" + targetBranch,
        title: title,
        description: description
    });
};

AzureDevOpsClient.prototype.getPullRequest = function(repoId, prId) {
    return this.get(this.project + "/_apis/git/repositories/" + repoId + "/pullrequests/" + prId);
};

AzureDevOpsClient.prototype.completePullRequest = function(repoId, prId, lastMergeCommit) {
    return this.patch(
        this.project + "/_apis/git/repositories/" + repoId + "/pullrequests/" + prId,
        {
            status: "completed",
            lastMergeSourceCommit: { commitId: lastMergeCommit },
            completionOptions: {
                mergeStrategy: "squash",
                deleteSourceBranch: true,
                transitionWorkItems: true
            }
        }
    );
};

// --- Pipelines ---

AzureDevOpsClient.prototype.listPipelines = function() {
    return this.get(this.project + "/_apis/pipelines");
};

AzureDevOpsClient.prototype.runPipeline = function(pipelineId, branch, variables) {
    var body = {
        resources: {
            repositories: {
                self: { refName: "refs/heads/" + (branch || "main") }
            }
        }
    };
    if (variables) {
        body.variables = {};
        Object.keys(variables).forEach(function(key) {
            body.variables[key] = { value: variables[key] };
        });
    }
    return this.post(this.project + "/_apis/pipelines/" + pipelineId + "/runs", body);
};

// --- Releases ---

AzureDevOpsClient.prototype.createRelease = function(definitionId, description) {
    return this.post(
        this.project + "/_apis/release/releases",
        { definitionId: definitionId, description: description || "Automated release", isDraft: false },
        "application/json",
        this.vsrmUrl
    );
};

// --- Repositories ---

AzureDevOpsClient.prototype.listRepos = function() {
    return this.get(this.project + "/_apis/git/repositories");
};

AzureDevOpsClient.prototype.listCommits = function(repoId, branch, top) {
    var path = this.project + "/_apis/git/repositories/" + repoId + "/commits" +
        "?searchCriteria.itemVersion.version=" + encodeURIComponent(branch || "main") +
        "&$top=" + (top || 50);
    return this.get(path);
};

// --- Projects ---

AzureDevOpsClient.prototype.listProjects = function() {
    return this.get("_apis/projects");
};

AzureDevOpsClient.prototype.getProject = function(projectId) {
    return this.get("_apis/projects/" + projectId + "?includeCapabilities=true");
};

module.exports = AzureDevOpsClient;

Using the Client

var AzureDevOpsClient = require("./azure-devops-client");

var client = new AzureDevOpsClient({
    organization: "myorg",
    project: "MyProject",
    token: process.env.AZURE_DEVOPS_PAT
});

// Create a bug and link it to a build
client.createWorkItem("Bug", {
    "System.Title": "API returns 500 when payload exceeds 1MB",
    "System.Description": "The upload endpoint crashes with large files",
    "Microsoft.VSTS.Common.Priority": 1,
    "System.Tags": "api; production; urgent"
}).then(function(bug) {
    console.log("Created bug #" + bug.id);

    // Queue a hotfix build
    return client.queueBuild(15, "refs/heads/hotfix/upload-limit");
}).then(function(build) {
    console.log("Build queued: " + build.buildNumber);

    // Create a PR for the fix
    return client.createPullRequest(
        "my-repo-id",
        "hotfix/upload-limit",
        "main",
        "Fix: Increase upload payload limit to 10MB",
        "Fixes bug where API crashes with large file uploads.\n\nIncreases body parser limit and adds streaming for large payloads."
    );
}).then(function(pr) {
    console.log("PR #" + pr.pullRequestId + " created");
}).catch(function(err) {
    console.error("Failed:", err.message);
});

Common Issues and Troubleshooting

1. 400 Bad Request on Work Item Creation

The most common cause is using application/json instead of application/json-patch+json as the Content-Type header. Work item endpoints require JSON Patch format. Another frequent cause is referencing field names incorrectly — use the full reference name like System.Title, not the display name.

2. 203 Non-Authoritative Information Instead of 401

When your PAT is invalid or expired, Azure DevOps returns a 203 status with an HTML login page instead of a proper 401. Your code sees a 2xx status and tries to parse HTML as JSON, causing a confusing parse error. Always validate that the response Content-Type is application/json before parsing.

function safeJsonParse(res) {
    var contentType = res.headers.get("content-type") || "";
    if (contentType.indexOf("json") === -1) {
        throw new Error("Expected JSON but got " + contentType + " — check authentication");
    }
    return res.json();
}

3. Continuation Token Ignored on First Page

When calling paginated endpoints, do not send continuationToken=null or continuationToken=undefined on the first request. Some endpoints interpret these as literal string values and return empty results. Only append the parameter when you actually have a token from a previous response.

4. WIQL Returns IDs But getWorkItems Fails with 404

WIQL queries can return work item IDs from projects the PAT does not have access to, especially with cross-project queries. When you batch-fetch those IDs, the API returns a 404. Always scope your WIQL queries with [System.TeamProject] = @project unless you explicitly need cross-project results.

5. Push API Rejects Commits with "Old Object ID Does Not Match"

The push endpoint requires the current tip commit SHA in refUpdates[].oldObjectId. If another commit was pushed between your read and your push, you get a conflict. Always fetch the latest ref immediately before pushing (see the getBranchTip sketch in the Pushing a Commit section), and implement retry logic for concurrent environments.

6. Rate Limiting Hits Unexpectedly During Bulk Operations

Azure DevOps uses a sliding window rate limiter. If you run parallel requests (such as Promise.all with 50 work item updates), you will exhaust your budget instantly. Implement concurrency control — limit yourself to 5-10 parallel requests and add a small delay between batches.
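
One simple way to enforce that limit is to process requests in fixed-size chunks with a short pause between them. This is an illustrative sketch; runInBatches, the batch size, and the delay are my own names and numbers:

function runInBatches(items, worker, batchSize, delayMs) {
    batchSize = batchSize || 5;
    delayMs = delayMs || 500;
    var results = [];

    function processBatch(start) {
        if (start >= items.length) {
            return Promise.resolve(results);
        }
        var batch = items.slice(start, start + batchSize);
        // Run one small batch in parallel, then pause before the next one.
        return Promise.all(batch.map(worker)).then(function(batchResults) {
            results = results.concat(batchResults);
            return new Promise(function(resolve) {
                setTimeout(resolve, delayMs);
            }).then(function() {
                return processBatch(start + batchSize);
            });
        });
    }

    return processBatch(0);
}

// Example: close 500 work items, 5 at a time
// runInBatches(ids, function(id) { return updateWorkItem(id, { "System.State": "Closed" }); });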

Best Practices

  • Always pin the API version. Never omit api-version from your requests. The default version is not stable and can change without notice, breaking your integration.

  • Use service principals for automated workflows. PATs are tied to individual users. When that person leaves the organization, every automation using their PAT breaks simultaneously. Service principals survive personnel changes.

  • Implement exponential backoff for all API calls. Azure DevOps has transient failures more often than you might expect, especially around deployment windows. A simple retry with backoff eliminates most intermittent failures.

  • Cache read-heavy data locally. Project metadata, team members, area paths, and iteration paths change infrequently. Cache them with a TTL of 5-10 minutes to reduce API calls by 80% or more (see the TTL cache sketch after this list).

  • Use batch endpoints when available. Fetching 200 work items in one call is dramatically faster than 200 individual requests. The GET _apis/wit/workitems?ids=1,2,3... endpoint accepts up to 200 IDs per call.

  • Validate PATs on startup. Make a lightweight API call (like listing projects) when your service starts. Fail fast with a clear error message if authentication is broken rather than discovering it when a user triggers a workflow.

  • Log request IDs for debugging. Every Azure DevOps API response includes an ActivityId header. Log it with your request metadata. When you open a support ticket with Microsoft, this ID lets them trace exactly what happened on their side.

  • Prefer WIQL over OData for work item queries. The OData analytics endpoint is powerful but has different rate limits and can be slow for simple queries. WIQL is faster for filtering and sorting work items by field values.

  • Handle the 203 authentication trap. As mentioned in troubleshooting, always check the response Content-Type. A 203 with HTML is not a successful response — it means your credentials have expired.

  • Scope PATs to minimum required permissions. A PAT with full access is a security liability. Create separate PATs for separate integrations, each with only the scopes they need. If a work item bot does not need code access, do not grant vso.code.
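
As promised in the caching bullet above, here is a minimal in-memory TTL cache built on the callApi helper from earlier. cachedGet and the 10-minute TTL are illustrative choices, not a prescribed design:

var cache = {};
var CACHE_TTL_MS = 10 * 60 * 1000; // 10 minutes

function cachedGet(path) {
    var entry = cache[path];
    // Serve from cache while the entry is still fresh.
    if (entry && Date.now() - entry.fetchedAt < CACHE_TTL_MS) {
        return Promise.resolve(entry.data);
    }
    return callApi(path).then(function(data) {
        cache[path] = { data: data, fetchedAt: Date.now() };
        return data;
    });
}

// Example: team membership rarely changes, so serve it from cache
// cachedGet("_apis/projects/MyProject/teams/MyTeam/members?api-version=7.1");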
