Building Custom Azure DevOps Extensions
Build custom Azure DevOps extensions with pipeline tasks, dashboard widgets, and hub pages using the Extension SDK
Azure DevOps is extensible by design. When the built-in features fall short, you can build custom extensions that plug directly into the platform — pipeline tasks, dashboard widgets, hub pages, context menus, and more. I have built several production extensions over the years, and the process is straightforward once you understand the architecture. This article walks through the full lifecycle of building, testing, packaging, and publishing Azure DevOps extensions with Node.js.
Prerequisites
- Node.js 16+ installed
- An Azure DevOps organization (free tier works)
- A publisher account on the Visual Studio Marketplace
- Basic familiarity with Azure DevOps pipelines and dashboards
- tfx-cli installed globally: npm install -g tfx-cli
Extension Architecture Overview
An Azure DevOps extension is a packaged bundle of HTML, JavaScript, CSS, and configuration files that gets installed into an Azure DevOps organization. Every extension is defined by a manifest file (vss-extension.json) that declares what the extension contributes to the platform.
Extensions operate through contribution points. Azure DevOps exposes dozens of contribution points — locations in the UI or pipeline execution where your extension can inject functionality. The most common contribution types are:
- Hub/Tab Extensions — Full pages or tabs within the Azure DevOps UI
- Pipeline Tasks — Custom build/release tasks that run during pipeline execution
- Dashboard Widgets — Cards on the project dashboard
- Context Menu Actions — Items added to right-click menus on work items, repos, etc.
- Service Hooks — Custom integrations triggered by Azure DevOps events
Each extension lives under a publisher identity. You create a publisher on the Visual Studio Marketplace, and all your extensions are scoped under that publisher ID. Extensions can be private (shared only with specific organizations) or public (listed on the marketplace).
The runtime model differs by contribution type. Hub pages and widgets run as iframes in the browser, communicating with the host frame via the Extension SDK. Pipeline tasks run as Node.js processes on the build agent. This distinction matters — the SDK, APIs, and debugging approaches are different for each.
Development Environment Setup
Start by creating the project structure. I recommend separating pipeline tasks from UI contributions since they have different build processes.
my-extension/
  vss-extension.json
  overview.md
  images/
    logo.png
  hub/
    hub.html
    hub.js
  widget/
    widget.html
    widget.js
    widget.css
  tasks/
    deployment-check/
      task.json
      index.js
      package.json
  package.json
Initialize the project:
mkdir my-extension && cd my-extension
npm init -y
npm install vss-web-extension-sdk --save
npm install azure-pipelines-task-lib --save
npm install -g tfx-cli
The vss-web-extension-sdk package provides the classic VSS.* Extension SDK used by the UI contributions (hubs, widgets) in this article. The azure-pipelines-task-lib is for pipeline tasks that run on the agent. They serve completely different purposes, and you will often need both in the same extension.
Extension Manifest (vss-extension.json)
The manifest is the spine of your extension. It declares metadata, contributions, and file mappings. Here is a manifest that registers a pipeline task, a dashboard widget, and a hub page:
{
"manifestVersion": 1,
"id": "deployment-toolkit",
"version": "1.0.0",
"name": "Deployment Toolkit",
"publisher": "your-publisher-id",
"description": "Pipeline task for deployment validation and dashboard status widget",
"categories": ["Azure Pipelines", "Azure Boards"],
"targets": [
{
"id": "Microsoft.VisualStudio.Services"
}
],
"icons": {
"default": "images/logo.png"
},
"files": [
{
"path": "tasks/deployment-check"
},
{
"path": "hub",
"addressable": true
},
{
"path": "widget",
"addressable": true
},
{
"path": "node_modules/vss-web-extension-sdk/lib",
"addressable": true,
"packagePath": "lib"
}
],
"contributions": [
{
"id": "deployment-check-task",
"type": "ms.vss-distributed-task.task",
"targets": [
"ms.vss-distributed-task.tasks"
],
"properties": {
"name": "tasks/deployment-check"
}
},
{
"id": "deployment-status-widget",
"type": "ms.vss-dashboards-web.widget",
"targets": [
"ms.vss-dashboards-web.widget-catalog"
],
"properties": {
"name": "Deployment Status",
"description": "Shows recent deployment validation results",
"catalogIconUrl": "images/logo.png",
"uri": "widget/widget.html",
"supportedSizes": [
{ "rowSpan": 1, "columnSpan": 2 }
],
"supportedScopes": ["project_team"]
}
},
{
"id": "deployment-hub",
"type": "ms.vss-web.hub",
"targets": [
"ms.vss-build-web.build-release-hub-group"
],
"properties": {
"name": "Deployment Readiness",
"uri": "hub/hub.html",
"icon": "images/logo.png"
}
}
],
"scopes": [
"vso.build",
"vso.release",
"vso.project"
]
}
A few things to note. The files array controls what gets packaged into the .vsix file. Setting addressable: true means those files are served via URL and can be loaded in iframes. Pipeline task folders do not need to be addressable — they are extracted onto the agent filesystem.
The scopes array declares what permissions the extension requests. Only request what you actually need. Users see these scopes during installation, and excessive permissions will make them hesitate.
The targets in each contribution specify where in the Azure DevOps UI your extension appears. Getting the right target ID is half the battle. Microsoft documents these in the extension points reference, but I find it faster to inspect existing extensions on the marketplace to see which targets they use.
Creating Hub/Tab Extensions
Hub extensions are full pages within the Azure DevOps navigation. They run as iframes loaded from the files you package.
<!-- hub/hub.html -->
<!DOCTYPE html>
<html>
<head>
<script src="../lib/VSS.SDK.min.js"></script>
<script src="hub.js"></script>
<style>
body { font-family: -apple-system, BlinkMacSystemFont, sans-serif; padding: 20px; }
.status-table { width: 100%; border-collapse: collapse; margin-top: 16px; }
.status-table th, .status-table td { padding: 8px 12px; border: 1px solid #e0e0e0; text-align: left; }
.status-pass { color: #107c10; font-weight: bold; }
.status-fail { color: #d83b01; font-weight: bold; }
</style>
</head>
<body>
<h2>Deployment Readiness Dashboard</h2>
<div id="content">Loading...</div>
</body>
</html>
// hub/hub.js
function initializeHub() {
VSS.init({
explicitNotifyLoaded: true,
usePlatformStyles: true
});
VSS.ready(function() {
VSS.require(["VSS/Service", "TFS/Build/RestClient"], function(VSS_Service, Build_Client) {
var buildClient = VSS_Service.getCollectionClient(Build_Client.BuildHttpClient);
var projectId = VSS.getWebContext().project.id;
buildClient.getBuilds(projectId, null, null, null, null, null, null, null, null, null, null, null, 10).then(function(builds) {
renderBuilds(builds);
VSS.notifyLoadSucceeded();
}, function(error) {
document.getElementById("content").innerHTML = "Error: " + error.message;
VSS.notifyLoadFailed(error.message);
});
});
});
}
function renderBuilds(builds) {
var html = '<table class="status-table">';
html += '<tr><th>Build</th><th>Branch</th><th>Status</th><th>Date</th></tr>';
builds.forEach(function(build) {
var statusClass = build.result === 2 ? "status-pass" : "status-fail";
var statusText = build.result === 2 ? "Succeeded" : "Failed";
var date = new Date(build.finishTime).toLocaleDateString();
html += '<tr>';
html += '<td>' + build.buildNumber + '</td>';
html += '<td>' + build.sourceBranch + '</td>';
html += '<td class="' + statusClass + '">' + statusText + '</td>';
html += '<td>' + date + '</td>';
html += '</tr>';
});
html += '</table>';
document.getElementById("content").innerHTML = html;
}
initializeHub();
The SDK initialization pattern is critical. You must call VSS.init() before anything else, and you must call VSS.notifyLoadSucceeded() when your page is ready. If you forget that notification, the host frame shows a loading spinner forever.
Custom Pipeline Tasks
Pipeline tasks are the most impactful type of extension. They run as Node.js scripts on the build agent and integrate directly into YAML or classic pipelines.
Every pipeline task has a task.json that defines inputs, outputs, and execution entry point:
{
"id": "a1b2c3d4-e5f6-7890-abcd-ef1234567890",
"name": "DeploymentCheck",
"friendlyName": "Deployment Readiness Check",
"description": "Validates deployment prerequisites before release",
"category": "Deploy",
"visibility": ["Build", "Release"],
"author": "Your Name",
"version": {
"Major": 1,
"Minor": 0,
"Patch": 0
},
"instanceNameFormat": "Check deployment readiness for $(targetEnvironment)",
"inputs": [
{
"name": "targetEnvironment",
"type": "pickList",
"label": "Target Environment",
"required": true,
"options": {
"dev": "Development",
"staging": "Staging",
"production": "Production"
},
"defaultValue": "staging"
},
{
"name": "healthEndpoint",
"type": "string",
"label": "Health Check URL",
"required": true,
"helpMarkDown": "URL to check for service health before deployment"
},
{
"name": "requiredApprovals",
"type": "string",
"label": "Required Approvals Count",
"required": false,
"defaultValue": "1"
},
{
"name": "failOnWarning",
"type": "boolean",
"label": "Fail on Warnings",
"required": false,
"defaultValue": "false"
}
],
"execution": {
"Node16": {
"target": "index.js"
}
}
}
The id must be a unique GUID. Generate one and never change it — Azure DevOps uses this to track the task across versions. The execution block specifies the Node.js handler version. Use Node16 for broad compatibility (newer agents also support a Node20_1 handler). Older tasks used the Node or Node10 handlers, but those are deprecated.
Here is the task implementation:
// tasks/deployment-check/index.js
var tl = require("azure-pipelines-task-lib/task");
var https = require("https");
var http = require("http");
function run() {
try {
var targetEnvironment = tl.getInput("targetEnvironment", true);
var healthEndpoint = tl.getInput("healthEndpoint", true);
var requiredApprovals = parseInt(tl.getInput("requiredApprovals", false) || "1", 10);
var failOnWarning = tl.getBoolInput("failOnWarning", false);
console.log("=== Deployment Readiness Check ===");
console.log("Target Environment: " + targetEnvironment);
console.log("Health Endpoint: " + healthEndpoint);
var checks = [];
// Check 1: Health endpoint
checks.push(checkHealthEndpoint(healthEndpoint));
// Check 2: Branch policy (production requires main branch)
checks.push(checkBranchPolicy(targetEnvironment));
// Check 3: Verify no active incidents
checks.push(checkActiveIncidents());
Promise.all(checks).then(function(results) {
var failures = results.filter(function(r) { return r.status === "fail"; });
var warnings = results.filter(function(r) { return r.status === "warn"; });
results.forEach(function(result) {
var icon = result.status === "pass" ? "✓" : result.status === "warn" ? "⚠" : "✗";
console.log("[" + icon + "] " + result.name + ": " + result.message);
// the console.log above already records passing checks;
// task.logissue only supports warning/error types
if (result.status === "warn") {
tl.warning(result.name + ": " + result.message);
} else if (result.status === "fail") {
tl.error(result.name + ": " + result.message);
}
});
// Set output variables (isOutput: true makes them addressable
// from later jobs and stages as <stepName>.<variableName>)
tl.setVariable("DeploymentReady", failures.length === 0 ? "true" : "false", false, true);
tl.setVariable("CheckResults", JSON.stringify(results), false, true);
if (failures.length > 0) {
tl.setResult(tl.TaskResult.Failed, failures.length + " check(s) failed. Deployment blocked.");
} else if (warnings.length > 0 && failOnWarning) {
tl.setResult(tl.TaskResult.Failed, warnings.length + " warning(s) found and failOnWarning is enabled.");
} else {
tl.setResult(tl.TaskResult.Succeeded, "All deployment readiness checks passed.");
}
}).catch(function(err) {
tl.setResult(tl.TaskResult.Failed, "Unexpected error: " + err.message);
});
} catch (err) {
tl.setResult(tl.TaskResult.Failed, err.message);
}
}
function checkHealthEndpoint(url) {
return new Promise(function(resolve) {
var client = url.startsWith("https") ? https : http;
var timeout = setTimeout(function() {
resolve({ name: "Health Check", status: "fail", message: "Timeout after 10 seconds" });
}, 10000);
client.get(url, function(res) {
clearTimeout(timeout);
if (res.statusCode === 200) {
resolve({ name: "Health Check", status: "pass", message: "Endpoint returned 200 OK" });
} else {
resolve({ name: "Health Check", status: "fail", message: "Endpoint returned " + res.statusCode });
}
}).on("error", function(err) {
clearTimeout(timeout);
resolve({ name: "Health Check", status: "fail", message: "Connection failed: " + err.message });
});
});
}
function checkBranchPolicy(environment) {
return new Promise(function(resolve) {
var sourceBranch = tl.getVariable("Build.SourceBranch") || "";
if (environment === "production" && sourceBranch !== "refs/heads/main") {
resolve({
name: "Branch Policy",
status: "fail",
message: "Production deployments require main branch. Current: " + sourceBranch
});
} else if (environment === "staging" && sourceBranch.indexOf("refs/heads/release") !== 0) {
resolve({
name: "Branch Policy",
status: "warn",
message: "Staging deployments should use release branches. Current: " + sourceBranch
});
} else {
resolve({
name: "Branch Policy",
status: "pass",
message: "Branch " + sourceBranch + " is valid for " + environment
});
}
});
}
function checkActiveIncidents() {
return new Promise(function(resolve) {
// In production, this would check your incident management system
// (PagerDuty, OpsGenie, etc.) via API
var incidentApiUrl = tl.getVariable("INCIDENT_API_URL");
if (!incidentApiUrl) {
resolve({
name: "Active Incidents",
status: "warn",
message: "No incident API configured. Skipping check."
});
return;
}
var client = incidentApiUrl.startsWith("https") ? https : http;
client.get(incidentApiUrl, function(res) {
var data = "";
res.on("data", function(chunk) { data += chunk; });
res.on("end", function() {
try {
var incidents = JSON.parse(data);
var activeCount = incidents.filter(function(i) { return i.status === "active"; }).length;
if (activeCount > 0) {
resolve({
name: "Active Incidents",
status: "fail",
message: activeCount + " active incident(s) found. Resolve before deploying."
});
} else {
resolve({
name: "Active Incidents",
status: "pass",
message: "No active incidents"
});
}
} catch (e) {
resolve({ name: "Active Incidents", status: "warn", message: "Could not parse incident response" });
}
});
}).on("error", function() {
resolve({ name: "Active Incidents", status: "warn", message: "Could not reach incident API" });
});
});
}
run();
The pipeline task uses output variables (tl.setVariable) so downstream tasks can react to the results. You would reference $(DeploymentReady) in conditions on later pipeline stages.
Here is how the task is used in a YAML pipeline:
stages:
- stage: ValidateDeployment
jobs:
- job: ReadinessCheck
pool:
vmImage: 'ubuntu-latest'
steps:
- task: DeploymentCheck@1
name: readiness
inputs:
targetEnvironment: 'production'
healthEndpoint: 'https://api.example.com/health'
requiredApprovals: '2'
failOnWarning: true
- stage: Deploy
dependsOn: ValidateDeployment
condition: and(succeeded(), eq(dependencies.ValidateDeployment.outputs['ReadinessCheck.readiness.DeploymentReady'], 'true'))
jobs:
- deployment: Production
environment: production
strategy:
runOnce:
deploy:
steps:
- script: echo "Deploying to production"
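Under the hood, output variables are just ##vso[task.setvariable] logging commands written to stdout, which the agent parses; the isOutput=true property is what makes $(DeploymentReady) resolvable from a later stage. A minimal sketch of the wire format (the helper name setVariableCommand is mine, for illustration):

```javascript
// Sketch of the logging command the task library emits for variables.
// The agent scans stdout for "##vso[...]" lines and interprets them;
// isOutput=true exposes the variable to downstream jobs/stages.
function setVariableCommand(name, value, isOutput) {
  const props = "variable=" + name + (isOutput ? ";isOutput=true" : "");
  return "##vso[task.setvariable " + props + "]" + value;
}

console.log(setVariableCommand("DeploymentReady", "true", true));
// prints: ##vso[task.setvariable variable=DeploymentReady;isOutput=true]true
```

This is also why log output matters: anything that accidentally echoes a well-formed ##vso line can change pipeline state.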
Widget Extensions for Dashboards
Dashboard widgets are small components that display information on Azure DevOps project dashboards. They run inside iframes and use the Extension SDK to communicate with the host.
<!-- widget/widget.html -->
<!DOCTYPE html>
<html>
<head>
<script src="../lib/VSS.SDK.min.js"></script>
<link rel="stylesheet" href="widget.css">
<script src="widget.js"></script>
</head>
<body>
<div class="widget-container">
<h3 class="widget-title">Deployment Status</h3>
<div id="status-content">
<div class="loading">Loading deployment data...</div>
</div>
</div>
</body>
</html>
/* widget/widget.css */
body {
margin: 0;
padding: 10px;
font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", sans-serif;
font-size: 13px;
}
.widget-container {
height: 100%;
}
.widget-title {
font-size: 14px;
margin: 0 0 10px 0;
color: #333;
}
.env-row {
display: flex;
justify-content: space-between;
padding: 6px 0;
border-bottom: 1px solid #f0f0f0;
}
.env-name {
font-weight: 600;
}
.env-status {
padding: 2px 8px;
border-radius: 3px;
font-size: 11px;
font-weight: 600;
}
.env-status.healthy { background: #dff6dd; color: #107c10; }
.env-status.degraded { background: #fff4ce; color: #835c00; }
.env-status.down { background: #fde7e9; color: #a80000; }
// widget/widget.js
VSS.init({
explicitNotifyLoaded: true,
usePlatformStyles: false
});
VSS.ready(function() {
VSS.require([
"TFS/Dashboards/WidgetHelpers",
"VSS/Service",
"TFS/Build/RestClient"
], function(WidgetHelpers, VSS_Service, Build_Client) {
WidgetHelpers.IncludeWidgetStyles();
VSS.register("deployment-status-widget", function() {
return {
load: function(widgetSettings) {
return loadWidgetData(VSS_Service, Build_Client).then(function() {
return WidgetHelpers.WidgetStatusHelper.Success();
}, function(error) {
return WidgetHelpers.WidgetStatusHelper.Failure(error.message);
});
},
reload: function(widgetSettings) {
return loadWidgetData(VSS_Service, Build_Client).then(function() {
return WidgetHelpers.WidgetStatusHelper.Success();
});
}
};
});
VSS.notifyLoadSucceeded();
});
});
function loadWidgetData(VSS_Service, Build_Client) {
var buildClient = VSS_Service.getCollectionClient(Build_Client.BuildHttpClient);
var projectId = VSS.getWebContext().project.id;
return buildClient.getBuilds(projectId, null, null, null, null, null, null, null, null, null, null, null, 5).then(function(builds) {
var environments = [
{ name: "Production", branch: "refs/heads/main" },
{ name: "Staging", branch: "refs/heads/release" },
{ name: "Development", branch: "refs/heads/develop" }
];
var html = "";
environments.forEach(function(env) {
var envBuild = null;
for (var i = 0; i < builds.length; i++) {
if (builds[i].sourceBranch === env.branch || builds[i].sourceBranch.indexOf(env.branch) === 0) {
envBuild = builds[i];
break;
}
}
var status = "down";
var statusText = "Unknown";
if (envBuild) {
if (envBuild.result === 2) {
status = "healthy";
statusText = "Healthy";
} else if (envBuild.result === 8) {
status = "degraded";
statusText = "Partial";
} else {
status = "down";
statusText = "Failed";
}
}
html += '<div class="env-row">';
html += '<span class="env-name">' + env.name + '</span>';
html += '<span class="env-status ' + status + '">' + statusText + '</span>';
html += '</div>';
});
document.getElementById("status-content").innerHTML = html;
});
}
The widget registration ID (deployment-status-widget) must match the contribution ID in vss-extension.json. This is a common source of "widget failed to load" errors.
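A quick consistency check catches the mismatch before you package. This is a sketch, not part of any SDK — the helper name and the in-memory manifest object are mine; in practice you would JSON.parse the real vss-extension.json and collect the IDs your code passes to VSS.register():

```javascript
// Sketch: find widget contributions in the manifest whose IDs are not
// registered by the widget JavaScript. Any result here means the widget
// will show "failed to load" after installation.
function findUnregisteredWidgets(manifest, registeredIds) {
  return manifest.contributions
    .filter((c) => c.type === "ms.vss-dashboards-web.widget")
    .map((c) => c.id)
    .filter((id) => !registeredIds.includes(id));
}

// Hypothetical trimmed-down manifest for illustration
const manifest = {
  contributions: [
    { id: "deployment-status-widget", type: "ms.vss-dashboards-web.widget" },
    { id: "deployment-hub", type: "ms.vss-web.hub" },
  ],
};
console.log(findUnregisteredWidgets(manifest, ["deployment-status-widget"])); // []
```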
Using the Azure DevOps Extension SDK
The Extension SDK handles communication between your extension iframe and the Azure DevOps host. The key patterns to know:
// Get the current project context
var context = VSS.getWebContext();
var projectName = context.project.name;
var projectId = context.project.id;
var userId = context.user.id;
var orgUrl = context.collection.uri;
// Access REST API clients
VSS.require(["VSS/Service", "TFS/WorkItemTracking/RestClient"], function(Service, WIT_Client) {
var witClient = Service.getCollectionClient(WIT_Client.WorkItemTrackingHttpClient);
// Query work items
witClient.queryByWiql({ query: "SELECT [System.Id] FROM WorkItems WHERE [System.State] = 'Active'" }, projectName)
.then(function(result) {
console.log("Found " + result.workItems.length + " active work items");
});
});
// Store extension data (per-user or per-project)
VSS.require(["VSS/Service", "VSS/Settings/RestClient"], function(Service, Settings_Client) {
var settingsClient = Service.getCollectionClient(Settings_Client.SettingsHttpClient);
// Save settings
var entries = { "myExtension.lastCheck": new Date().toISOString() };
settingsClient.setEntries(entries, "me");
});
REST API Access from Extensions
Pipeline tasks access the Azure DevOps REST API using the system access token. The task library provides helpers:
var tl = require("azure-pipelines-task-lib/task");
var https = require("https");
var url = require("url");
function callAzureDevOpsApi(apiPath, method, body) {
var token = tl.getVariable("System.AccessToken");
var orgUrl = tl.getVariable("System.CollectionUri");
var project = tl.getVariable("System.TeamProject");
var fullUrl = orgUrl + project + "/_apis/" + apiPath;
var parsed = url.parse(fullUrl);
var options = {
hostname: parsed.hostname,
path: parsed.path,
method: method || "GET",
headers: {
"Authorization": "Basic " + Buffer.from(":" + token).toString("base64"),
"Content-Type": "application/json",
"Accept": "application/json; api-version=7.0"
}
};
return new Promise(function(resolve, reject) {
var req = https.request(options, function(res) {
var data = "";
res.on("data", function(chunk) { data += chunk; });
res.on("end", function() {
try {
resolve(JSON.parse(data));
} catch (e) {
resolve(data);
}
});
});
req.on("error", function(err) { reject(err); });
if (body) {
req.write(JSON.stringify(body));
}
req.end();
});
}
// Example: Get recent releases
callAzureDevOpsApi("release/releases?$top=5&api-version=7.0", "GET").then(function(releases) {
console.log("Recent releases: " + releases.count);
});
Important: the System.AccessToken must be explicitly mapped in the pipeline YAML for the task to access it:
steps:
- task: DeploymentCheck@1
inputs:
targetEnvironment: 'production'
healthEndpoint: 'https://api.example.com/health'
env:
SYSTEM_ACCESSTOKEN: $(System.AccessToken)
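A detail worth calling out from the REST helper earlier: the Basic authorization header encodes an empty username and the token, separated by a colon. Forgetting the leading colon is a classic cause of 401 responses. In isolation:

```javascript
// Azure DevOps access tokens and PATs go into a Basic header as
// ":<token>" — empty username, token as the password, base64-encoded.
function basicAuthHeader(token) {
  return "Basic " + Buffer.from(":" + token).toString("base64");
}

console.log(basicAuthHeader("abc")); // "Basic OmFiYw=="
```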
Authentication and Permissions
Extensions declare required scopes in the manifest. Here are the most common:
| Scope | Access |
|---|---|
| vso.build | Build definitions and builds (read) |
| vso.build_execute | Queue and manage builds |
| vso.release | Release definitions and releases (read) |
| vso.release_manage | Create and manage releases |
| vso.work | Work items (read) |
| vso.work_write | Create and update work items |
| vso.project | Project metadata (read) |
| vso.code | Source code (read) |
Pipeline tasks inherit the agent's permissions, which are controlled by the project's build service account. If your task needs to update work items or create tags, the build service identity needs those permissions in project settings.
For UI extensions, the SDK authenticates through the host frame session — no additional login is required. But the scopes you declared in the manifest limit what API calls succeed. If you add a scope after initial installation, users must re-authorize the extension.
Packaging and Publishing to Marketplace
Package the extension into a .vsix file:
tfx extension create --manifest-globs vss-extension.json
This produces a file like your-publisher-id.deployment-toolkit-1.0.0.vsix.
Before publishing, make sure your pipeline task has its own node_modules. Navigate to the task directory and install its dependencies:
cd tasks/deployment-check
npm install --production
cd ../..
tfx extension create --manifest-globs vss-extension.json
Publish to the marketplace:
# Create a PAT (Personal Access Token) in Azure DevOps with Marketplace scope
tfx extension publish --manifest-globs vss-extension.json --token YOUR_PAT
# Or publish as private and share with specific orgs
tfx extension publish --manifest-globs vss-extension.json --token YOUR_PAT --share-with your-org-name
For your first extension, I recommend publishing as private. Share it with your organization, install it, and verify everything works before going public.
Testing Extensions Locally
Testing hub and widget extensions locally before packaging saves a lot of time. Serve the extension files with webpack-dev-server or a simple HTTP server, and point a development version of the extension at it with a baseUri override:
# Serve the extension files locally over HTTPS — the Azure DevOps host
# will not load iframe content over plain HTTP
npx http-server . --port 3000 --cors -S -C cert.pem -K key.pem
# Then install a dev version of your extension that points at this server
Add a baseUri to the manifest during development:
{
"baseUri": "https://localhost:3000"
}
For pipeline tasks, test locally by mocking the task library inputs:
// test/test-deployment-check.js
var path = require("path");
// Mock environment variables that the task library reads
process.env["INPUT_TARGETENVIRONMENT"] = "staging";
process.env["INPUT_HEALTHENDPOINT"] = "https://httpbin.org/status/200";
process.env["INPUT_REQUIREDAPPROVALS"] = "1";
process.env["INPUT_FAILONWARNING"] = "false";
process.env["BUILD_SOURCEBRANCH"] = "refs/heads/release/1.0";
// Run the task
require("../tasks/deployment-check/index.js");
This approach lets you iterate quickly without pushing to a pipeline every time. Mock the tl.getVariable calls to simulate different pipeline contexts.
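The environment-variable names follow a fixed convention, which is what makes this mocking possible. A sketch of the mapping (helper names are mine; the behavior mirrors what azure-pipelines-task-lib does internally):

```javascript
// Sketch of the task library's name-to-env-var mapping:
// inputs become INPUT_<NAME> (uppercased), pipeline variables have
// dots replaced with underscores and are uppercased.
function inputEnvName(inputName) {
  return "INPUT_" + inputName.replace(/ /g, "_").toUpperCase();
}

function variableEnvName(variableName) {
  return variableName.replace(/[. ]/g, "_").toUpperCase();
}

console.log(inputEnvName("targetEnvironment"));    // "INPUT_TARGETENVIRONMENT"
console.log(variableEnvName("Build.SourceBranch")); // "BUILD_SOURCEBRANCH"
```

This is why the test script above sets INPUT_TARGETENVIRONMENT and BUILD_SOURCEBRANCH rather than anything resembling the original names.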
Debugging Extensions
For UI extensions running in the browser, standard browser DevTools work. The extension runs in an iframe, so you need to switch to the correct frame context in the console.
For pipeline tasks, add diagnostic logging:
var tl = require("azure-pipelines-task-lib/task");
// Debug messages appear only when the pipeline run has the
// System.Debug variable set to true (set it on the run or in the
// pipeline definition — setting it from inside the task has no effect)
// Different log levels
tl.debug("Debug message - only visible when System.Debug is true");
tl.warning("Warning message - appears with ⚠ icon");
tl.error("Error message - appears with ✗ icon");
// Emit a structured issue with a source location
console.log("##vso[task.logissue type=warning;sourcepath=index.js;linenumber=42]Something looks off");
// Group logs for readability
console.log("##[group]Deployment Check Details");
console.log("Checking endpoint...");
console.log("##[endgroup]");
The ##vso logging commands are pipeline-specific formatting directives. They create collapsible sections, color-coded messages, and structured log output.
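Since these are just specially formatted stdout lines, wrapping related output in a collapsible section is easy to factor into a helper. A small sketch (the logGroup helper is mine, not part of the task library):

```javascript
// Sketch: wrap related log lines in a collapsible ##[group] section.
// The agent renders everything between ##[group] and ##[endgroup] as
// one expandable block in the pipeline log view.
function logGroup(title, lines) {
  const out = ["##[group]" + title, ...lines, "##[endgroup]"];
  out.forEach((line) => console.log(line));
  return out;
}

logGroup("Deployment Check Details", ["Checking endpoint...", "Endpoint OK"]);
```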
Versioning and Updates
When you release a new version, update three places:
- The version in vss-extension.json (the extension version)
- The version in task.json (the task version: Major, Minor, Patch)
- The package.json version, if applicable
The task version matters for pipeline compatibility. When users reference DeploymentCheck@1 in their YAML, they get the latest 1.x.x version. Bumping the Major version to 2 means they must update their pipeline YAML to DeploymentCheck@2.
My recommendation: bump Major only for breaking changes to inputs or behavior. Use Minor for new features and Patch for bug fixes. This follows semantic versioning and gives your users predictable upgrades.
# Bump and publish
tfx extension create --manifest-globs vss-extension.json --rev-version
tfx extension publish --manifest-globs vss-extension.json --token YOUR_PAT
The --rev-version flag auto-increments the patch version, which is handy for iterative development.
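The semver policy above can be expressed directly against task.json's version object. A sketch (the bumpTaskVersion helper is mine; you would read and write the real task.json around it):

```javascript
// Sketch of the recommended bump policy for the task.json version object:
// Major = breaking input/behavior changes (forces users to update YAML),
// Minor = new features, Patch = bug fixes.
function bumpTaskVersion(version, level) {
  const v = { ...version };
  if (level === "major") {
    v.Major += 1; v.Minor = 0; v.Patch = 0;
  } else if (level === "minor") {
    v.Minor += 1; v.Patch = 0;
  } else {
    v.Patch += 1;
  }
  return v;
}

console.log(bumpTaskVersion({ Major: 1, Minor: 2, Patch: 3 }, "minor"));
// { Major: 1, Minor: 3, Patch: 0 }
```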
Common Issues and Troubleshooting
Widget shows "Failed to load" after installation. The most common cause is a mismatch between the contribution ID in vss-extension.json and the ID passed to VSS.register() in your widget JavaScript. These must be identical. Also verify the uri path in the contribution properties points to the correct HTML file relative to the package root.
Pipeline task cannot find node modules. Each pipeline task directory needs its own node_modules. The task runs in isolation on the agent — it does not share the extension root's node_modules. Run npm install --production inside each task directory before packaging. Also ensure the task directory is listed in the files array of the manifest.
Extension installs but hub page is blank. Check that the files containing your HTML and JavaScript have addressable: true in the manifest's files array. Non-addressable files are included in the package but not served via URL, so the iframe has nothing to load. Also confirm you are calling VSS.notifyLoadSucceeded() — without it, the host frame keeps showing a loading indicator.
REST API calls return 401 Unauthorized from pipeline task. The System.AccessToken is not available by default in pipeline tasks. You must explicitly map it via the env block in your YAML step, or check "Allow scripts to access the OAuth token" in classic pipeline options. Also verify the build service account has the necessary permissions in project settings.
Extension works in dev but breaks after publishing. This usually means baseUri is still set in the manifest. Remove it before publishing — when baseUri is present, Azure DevOps tries to load files from that URL instead of the packaged files. Also check that all file paths in the manifest use forward slashes, not backslashes.
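The leftover-baseUri pitfall is easy to guard against with a pre-publish check. A sketch (the assertNoBaseUri helper is mine; wire it into an npm prepublish script against the parsed manifest):

```javascript
// Sketch: pre-publish guard that rejects a manifest still carrying a
// development baseUri override, which would make the published
// extension load files from your dev server instead of the package.
function assertNoBaseUri(manifest) {
  if (manifest.baseUri) {
    throw new Error("Remove baseUri before publishing: " + manifest.baseUri);
  }
  return true;
}

console.log(assertNoBaseUri({ id: "deployment-toolkit", version: "1.0.0" })); // true
```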
Best Practices
Scope permissions minimally. Request only the scopes your extension actually needs. Overly broad permissions discourage adoption and may violate organizational security policies. You can always add scopes later with a version bump.
Bundle dependencies for pipeline tasks. Always include node_modules inside each task directory. Do not rely on the agent having any packages pre-installed. Use npm install --production to keep the package size small.

Handle errors gracefully in widgets. Widgets that throw unhandled errors display a generic "failed to load" message. Wrap all async operations in try-catch and return WidgetStatusHelper.Failure() with a meaningful message.

Use output variables in pipeline tasks. Setting output variables lets downstream tasks and stages react to your task's results without parsing logs. This is cleaner and more reliable than log scraping.
Version your task independently from the extension. The extension version and task version serve different purposes. The extension version controls marketplace updates. The task version controls what pipelines consume. Increment them independently based on what changed.
Test with multiple Azure DevOps organization configurations. Extensions behave differently in organizations with different process templates (Agile, Scrum, CMMI) and permission structures. Test against at least two different configurations before publishing publicly.
Include a comprehensive overview.md. The marketplace listing pulls from overview.md. Include screenshots, configuration examples, and known limitations. A well-documented extension gets installed; a poorly documented one gets skipped, regardless of quality.

Use extension data storage instead of external databases. The Azure DevOps Extension Data Service provides per-extension storage scoped to the organization. Use it for settings and state instead of standing up external storage, unless you have specific scalability requirements.

Pin your SDK versions. The extension SDK and azure-pipelines-task-lib packages can introduce breaking changes between major versions. Pin to specific versions in your package.json to avoid surprises during CI builds.