Building Custom Service Hooks
Complete guide to building custom service hooks in Azure DevOps for real-time event processing, including webhook consumers, service bus integrations, and event-driven automation pipelines.
Overview
Service hooks are Azure DevOps's event system — they fire when something happens in your project and let you wire that event to an external system. Out of the box, Azure DevOps supports sending events to Slack, Teams, Jenkins, and a handful of other services. But the real power comes when you build custom service hook consumers that process events your way. I have used custom service hooks to trigger deployment workflows, sync data between systems in real time, and build audit logs that capture every change across an organization.
Prerequisites
- An Azure DevOps organization with Project Administrator or Project Collection Administrator permissions
- Node.js 16 or later installed
- A publicly accessible URL for receiving webhooks (ngrok works for development)
- Basic understanding of Azure DevOps REST API and event-driven architecture
- A Personal Access Token with `vso.servicehooks` and `vso.servicehooks_write` scopes
Understanding Service Hook Events
Azure DevOps publishes events across several categories. Each event type carries a specific payload with details about what happened.
Event Categories and Types
| Category | Event | Trigger |
|---|---|---|
| Build | build.complete | A build finishes (success or failure) |
| Code | git.push | Commits pushed to a repository |
| Code | git.pullrequest.created | New pull request opened |
| Code | git.pullrequest.updated | PR updated (votes, status, merge) |
| Code | git.pullrequest.merged | PR merge completes |
| Work Items | workitem.created | New work item created |
| Work Items | workitem.updated | Work item fields changed |
| Work Items | workitem.commented | Comment added to work item |
| Pipelines | ms.vss-pipelines.run-state-changed-event | Pipeline run status changes |
| Release | ms.vss-release.deployment-completed-event | Deployment finishes |
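The event type string encodes its category as a prefix, which a router can exploit before doing any payload-specific work. A minimal classifier sketch (the function name and category labels are my own):

```javascript
// Map an Azure DevOps eventType string to a coarse category for routing.
// The prefix table mirrors the categories above; "other" catches anything new.
function categorizeEvent(eventType) {
  var prefixes = [
    ["git.", "code"],
    ["build.", "build"],
    ["workitem.", "work-items"],
    ["ms.vss-pipelines.", "pipelines"],
    ["ms.vss-release.", "release"]
  ];
  for (var i = 0; i < prefixes.length; i++) {
    if (eventType.indexOf(prefixes[i][0]) === 0) {
      return prefixes[i][1];
    }
  }
  return "other";
}
```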
Every event payload follows the same envelope structure:
{
"subscriptionId": "subscription-guid",
"notificationId": 42,
"id": "event-guid",
"eventType": "git.push",
"publisherId": "tfs",
"message": {
"text": "User pushed 3 commits to branch refs/heads/main"
},
"detailedMessage": {
"text": "User pushed 3 commits to branch refs/heads/main..."
},
"resource": {
// Event-specific data lives here
},
"resourceVersion": "1.0",
"resourceContainers": {
"collection": { "id": "collection-guid" },
"account": { "id": "account-guid" },
"project": { "id": "project-guid" }
},
"createdDate": "2026-02-09T10:30:00.000Z"
}
The resource object is where the interesting data lives, and its shape changes depending on the event type.
Creating Service Hook Subscriptions
Via the REST API
The UI only covers basic scenarios. For production automation, manage subscriptions through the API:
var https = require("https");
function ServiceHookManager(orgUrl, pat) {
this.orgUrl = orgUrl;
this.orgName = orgUrl.split("/").pop();
this.pat = pat;
this.authHeader = "Basic " + Buffer.from(":" + pat).toString("base64");
}
ServiceHookManager.prototype._request = function(method, path, body, callback) {
var bodyStr = body ? JSON.stringify(body) : "";
var options = {
hostname: "dev.azure.com",
path: "/" + this.orgName + path,
method: method,
headers: {
"Content-Type": "application/json",
"Authorization": this.authHeader
}
};
if (body) {
options.headers["Content-Length"] = Buffer.byteLength(bodyStr);
}
var req = https.request(options, function(res) {
var data = "";
res.on("data", function(chunk) { data += chunk; });
res.on("end", function() {
if (res.statusCode >= 200 && res.statusCode < 300) {
try { callback(null, JSON.parse(data)); }
catch (e) { callback(null, data); }
} else {
callback(new Error(method + " " + path + ": " + res.statusCode + " - " + data));
}
});
});
req.on("error", callback);
if (body) req.write(bodyStr);
req.end();
};
ServiceHookManager.prototype.createSubscription = function(options, callback) {
var subscription = {
publisherId: options.publisherId || "tfs",
eventType: options.eventType,
resourceVersion: options.resourceVersion || "1.0",
consumerId: "webHooks",
consumerActionId: "httpRequest",
publisherInputs: options.filters || {},
consumerInputs: {
url: options.webhookUrl,
httpHeaders: options.headers || "",
resourceDetailsToSend: options.detailLevel || "all",
messagesToSend: options.messages || "all"
}
};
if (options.projectId) {
subscription.publisherInputs.projectId = options.projectId;
}
this._request("POST", "/_apis/hooks/subscriptions?api-version=7.1", subscription, callback);
};
ServiceHookManager.prototype.listSubscriptions = function(callback) {
this._request("GET", "/_apis/hooks/subscriptions?api-version=7.1", null, callback);
};
ServiceHookManager.prototype.deleteSubscription = function(subscriptionId, callback) {
this._request("DELETE", "/_apis/hooks/subscriptions/" + subscriptionId + "?api-version=7.1", null, callback);
};
ServiceHookManager.prototype.getNotifications = function(subscriptionId, callback) {
this._request(
"GET",
"/_apis/hooks/subscriptions/" + subscriptionId + "/notifications?api-version=7.1",
null,
callback
);
};
Subscription Filtering
You can filter which events trigger the hook. For example, subscribing to pushes on a specific branch:
var manager = new ServiceHookManager(process.env.ORG_URL, process.env.PAT);
// Only trigger on pushes to the main branch
manager.createSubscription({
eventType: "git.push",
projectId: "my-project-id",
webhookUrl: "https://myapp.example.com/hooks/push",
filters: {
projectId: "my-project-id",
repository: "my-repo-id",
branch: "refs/heads/main"
},
headers: "X-Hook-Secret: my-shared-secret-123"
}, function(err, result) {
if (err) {
console.error("Subscription creation failed:", err.message);
return;
}
console.log("Subscription created:", result.id);
console.log("Status:", result.status);
});
// Only trigger on completed builds that failed
manager.createSubscription({
eventType: "build.complete",
projectId: "my-project-id",
webhookUrl: "https://myapp.example.com/hooks/build-failed",
filters: {
projectId: "my-project-id",
definitionName: "production-pipeline",
buildStatus: "failed"
}
}, function(err, result) {
if (err) return console.error(err.message);
console.log("Build failure hook created:", result.id);
});
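Malformed filter values are the usual cause of the TF400898 error covered under troubleshooting below, so a pre-flight lint of the publisherInputs can save a failed round trip. A hedged sketch (the helper name and warning texts are mine, and it assumes the refs/heads/ convention used in the examples above):

```javascript
// Sanity-check common publisherInputs mistakes before creating a subscription.
// Returns a list of warnings; an empty list means the filters look plausible.
var GUID_RE = /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$/i;

function lintFilters(filters) {
  var warnings = [];
  if (filters.projectId && !GUID_RE.test(filters.projectId)) {
    warnings.push("projectId should be a project GUID, not a project name");
  }
  if (filters.repository && !GUID_RE.test(filters.repository)) {
    warnings.push("repository should be a repository GUID, not a repository name");
  }
  if (filters.branch && filters.branch.indexOf("refs/heads/") !== 0) {
    warnings.push("branch should carry the refs/heads/ prefix");
  }
  return warnings;
}
```

Run it against the filters object before calling createSubscription and log any warnings it returns.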
Building a Webhook Consumer
The webhook consumer is the server that receives and processes events. Here is a consumer with signature verification, deduplication of retried deliveries, and event routing:
var express = require("express");
var crypto = require("crypto");
var app = express();
app.use(express.json({ limit: "1mb" }));
var HOOK_SECRET = process.env.HOOK_SECRET || "my-shared-secret-123";
// Middleware: Verify webhook secret using a constant-time comparison
function verifySignature(req, res, next) {
  var provided = Buffer.from(req.headers["x-hook-secret"] || "");
  var expected = Buffer.from(HOOK_SECRET);
  // timingSafeEqual throws on length mismatch, so compare lengths first
  var valid = provided.length === expected.length &&
    crypto.timingSafeEqual(provided, expected);
  if (!valid) {
    console.warn("Invalid signature from " + req.ip + ", rejecting");
    return res.status(401).json({ error: "Invalid signature" });
  }
  next();
}
// Middleware: Log all incoming events
function logEvent(req, res, next) {
var event = req.body;
console.log("[%s] Event: %s | Subscription: %s | Notification: %d",
new Date().toISOString(),
event.eventType,
event.subscriptionId,
event.notificationId
);
next();
}
// Middleware: Deduplication
var processedNotifications = {};
var DEDUP_WINDOW_MS = 5 * 60 * 1000;
function deduplicate(req, res, next) {
var notifId = req.body.subscriptionId + ":" + req.body.notificationId;
if (processedNotifications[notifId]) {
console.log("Duplicate notification %s — skipping", notifId);
return res.status(200).json({ status: "duplicate" });
}
processedNotifications[notifId] = Date.now();
// Clean old entries periodically
var keys = Object.keys(processedNotifications);
if (keys.length > 1000) {
var cutoff = Date.now() - DEDUP_WINDOW_MS;
keys.forEach(function(key) {
if (processedNotifications[key] < cutoff) {
delete processedNotifications[key];
}
});
}
next();
}
app.use("/hooks", verifySignature, logEvent, deduplicate);
// Event handlers
var handlers = {};
handlers["git.push"] = function(event, callback) {
  var push = event.resource;
  var repo = push.repository.name;
  var branch = push.refUpdates[0].name.replace("refs/heads/", "");
  // commits is only populated when resourceDetailsToSend is "all"
  var commits = push.commits || [];
  var pusher = push.pushedBy.displayName;
  console.log("Push: %s pushed %d commits to %s/%s", pusher, commits.length, repo, branch);
  commits.forEach(function(commit) {
    console.log("  - %s: %s", commit.commitId.substring(0, 8), commit.comment);
  });
  callback(null, { processed: true, commits: commits.length });
};
handlers["git.pullrequest.created"] = function(event, callback) {
var pr = event.resource;
console.log("PR #%d created: %s", pr.pullRequestId, pr.title);
console.log(" Author: %s", pr.createdBy.displayName);
console.log(" Source: %s -> Target: %s", pr.sourceRefName, pr.targetRefName);
console.log(" Reviewers: %s", pr.reviewers.map(function(r) {
return r.displayName;
}).join(", "));
callback(null, { processed: true, prId: pr.pullRequestId });
};
handlers["git.pullrequest.merged"] = function(event, callback) {
var pr = event.resource;
console.log("PR #%d merged: %s", pr.pullRequestId, pr.title);
console.log(" Merge commit: %s", pr.lastMergeCommit.commitId);
callback(null, { processed: true, prId: pr.pullRequestId, merged: true });
};
handlers["build.complete"] = function(event, callback) {
var build = event.resource;
var status = build.result;
var duration = 0;
if (build.startTime && build.finishTime) {
duration = Math.round((new Date(build.finishTime) - new Date(build.startTime)) / 1000);
}
console.log("Build %s #%d: %s (took %ds)", build.definition.name, build.id, status, duration);
if (status === "failed") {
console.log(" FAILURE — triggering alert workflow");
}
callback(null, {
processed: true,
buildId: build.id,
status: status,
duration: duration
});
};
handlers["workitem.created"] = function(event, callback) {
var wi = event.resource;
var fields = wi.fields || {};
console.log("Work item #%d created: [%s] %s",
wi.id,
fields["System.WorkItemType"],
fields["System.Title"]
);
console.log(" Assigned to: %s", fields["System.AssignedTo"] || "Unassigned");
console.log(" State: %s", fields["System.State"]);
callback(null, { processed: true, workItemId: wi.id });
};
handlers["workitem.updated"] = function(event, callback) {
  // For workitem.updated, resource.id is the update ID, not the work item ID.
  // resource.fields holds only the changed fields as { oldValue, newValue }
  // pairs; the full current snapshot lives in resource.revision.
  var update = event.resource;
  var workItemId = update.workItemId || (update.revision && update.revision.id);
  var title = update.revision && update.revision.fields
    ? update.revision.fields["System.Title"]
    : "";
  console.log("Work item #%d updated: %s", workItemId, title);
  if (update.fields && update.fields["System.State"]) {
    console.log("  State changed: %s -> %s",
      update.fields["System.State"].oldValue,
      update.fields["System.State"].newValue
    );
  }
  callback(null, { processed: true, workItemId: workItemId });
};
// Main hook endpoint with event routing
app.post("/hooks/:channel", function(req, res) {
var event = req.body;
var channel = req.params.channel;
var handler = handlers[event.eventType];
if (!handler) {
console.log("No handler for event type: %s (channel: %s)", event.eventType, channel);
return res.status(200).json({ status: "ignored", eventType: event.eventType });
}
handler(event, function(err, result) {
if (err) {
console.error("Handler error for %s: %s", event.eventType, err.message);
return res.status(500).json({ error: err.message });
}
res.json({ status: "processed", result: result });
});
});
// Health check endpoint for monitoring, registered outside /hooks so the
// signature middleware does not block unauthenticated probes
app.get("/health", function(req, res) {
  res.json({
    status: "healthy",
    registeredHandlers: Object.keys(handlers),
    processedCount: Object.keys(processedNotifications).length,
    uptime: process.uptime()
  });
});
app.listen(process.env.PORT || 3000, function() {
  console.log("Webhook consumer listening on port " + (process.env.PORT || 3000));
});
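Because slow responses count as delivery failures, heavier handlers are better run off the request path: acknowledge with 200 immediately, then drain the work from a queue. A minimal in-process sketch (the names are mine; a production system would use a durable queue such as Azure Service Bus so events survive a restart):

```javascript
// Acknowledge webhooks immediately and drain the work on a background loop.
var queue = [];
var draining = false;

function enqueue(event, handler) {
  queue.push({ event: event, handler: handler });
  drain();
}

function drain() {
  if (draining) return;
  draining = true;
  setImmediate(function step() {
    var job = queue.shift();
    if (!job) {
      draining = false;
      return;
    }
    job.handler(job.event, function(err) {
      if (err) console.error("Async handler failed:", err.message);
      setImmediate(step); // yield between jobs so the event loop stays responsive
    });
  });
}
```

In the route this becomes `enqueue(event, handler); res.status(200).json({ status: "accepted" });` instead of responding after the handler finishes.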
Event-Driven Automation Pipeline
Beyond simple event processing, you can chain service hooks into complex automation workflows. Here is an event pipeline that routes events through multiple processing stages:
function EventPipeline(name) {
this.name = name;
this.stages = [];
this.errorHandler = null;
}
EventPipeline.prototype.addStage = function(name, processor) {
this.stages.push({ name: name, processor: processor });
return this;
};
EventPipeline.prototype.onError = function(handler) {
this.errorHandler = handler;
return this;
};
EventPipeline.prototype.execute = function(event, callback) {
var self = this;
var context = {
event: event,
results: {},
startTime: Date.now()
};
var index = 0;
function next(err) {
if (err) {
console.error("Pipeline '%s' failed at stage '%s': %s",
self.name, self.stages[index - 1].name, err.message);
if (self.errorHandler) {
return self.errorHandler(err, context, callback);
}
return callback(err);
}
if (index >= self.stages.length) {
context.duration = Date.now() - context.startTime;
console.log("Pipeline '%s' completed in %dms", self.name, context.duration);
return callback(null, context);
}
var stage = self.stages[index];
index++;
console.log("Pipeline '%s' executing stage '%s'", self.name, stage.name);
try {
stage.processor(context, function(stageErr, result) {
if (result) {
context.results[stage.name] = result;
}
next(stageErr);
});
} catch (e) {
next(e);
}
}
next();
};
// Example: PR merge triggers deploy verification pipeline
var deployVerificationPipeline = new EventPipeline("deploy-verification");
deployVerificationPipeline
.addStage("validate-branch", function(ctx, done) {
var pr = ctx.event.resource;
var targetBranch = pr.targetRefName.replace("refs/heads/", "");
if (targetBranch !== "main" && targetBranch !== "release") {
return done(new Error("Skipping: target branch " + targetBranch + " is not deployable"));
}
done(null, { branch: targetBranch });
})
.addStage("check-approvals", function(ctx, done) {
var pr = ctx.event.resource;
var approvals = pr.reviewers.filter(function(r) {
return r.vote === 10;
});
if (approvals.length < 2) {
return done(new Error("Insufficient approvals: " + approvals.length + " of 2 required"));
}
done(null, { approvals: approvals.length });
})
.addStage("trigger-deploy", function(ctx, done) {
  var https = require("https");
  var branch = ctx.results["validate-branch"].branch;
  var environment = branch === "main" ? "production" : "staging";
  console.log("Triggering deployment to %s", environment);
  // Create a release via the Release REST API (note the separate vsrm host)
  var orgName = process.env.ORG_URL.split("/").pop();
  var projectId = ctx.event.resourceContainers.project.id;
  var body = JSON.stringify({
    definitionId: parseInt(process.env.RELEASE_DEFINITION_ID, 10),
    description: "Auto-triggered by PR #" + ctx.event.resource.pullRequestId,
    isDraft: false,
    reason: "continuousIntegration",
    environmentsMetadata: [{
      definitionEnvironmentId: parseInt(process.env[environment.toUpperCase() + "_ENV_ID"], 10),
      scheduledDeploymentTime: null
    }]
  });
  var req = https.request({
    hostname: "vsrm.dev.azure.com",
    path: "/" + orgName + "/" + projectId + "/_apis/release/releases?api-version=7.1",
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Content-Length": Buffer.byteLength(body),
      "Authorization": "Basic " + Buffer.from(":" + process.env.PAT).toString("base64")
    }
  }, function(res) {
    res.resume();
    if (res.statusCode >= 200 && res.statusCode < 300) {
      done(null, { environment: environment, triggered: true });
    } else {
      done(new Error("Release API returned " + res.statusCode));
    }
  });
  req.on("error", done);
  req.write(body);
  req.end();
})
.onError(function(err, ctx, done) {
console.log("Deploy verification failed: %s — notifying team", err.message);
done(null, ctx);
});
Subscription Health Monitoring
Service hook subscriptions can silently stop working. Azure DevOps disables subscriptions that fail too many consecutive deliveries. Here is a monitoring utility:
function SubscriptionMonitor(manager) {
this.manager = manager;
}
SubscriptionMonitor.prototype.checkHealth = function(callback) {
var self = this;
this.manager.listSubscriptions(function(err, result) {
if (err) return callback(err);
var subscriptions = result.value || [];
var report = {
total: subscriptions.length,
enabled: 0,
disabled: 0,
probation: 0,
details: []
};
var pending = subscriptions.length;
if (pending === 0) return callback(null, report);
subscriptions.forEach(function(sub) {
var status = sub.status || "unknown";
if (status === "enabled") report.enabled++;
else if (status === "disabledBySystem") report.disabled++;
else if (status === "enabledWithRestrictions") report.probation++;
self.manager.getNotifications(sub.id, function(notifErr, notifs) {
var recentFailures = 0;
var lastSuccess = null;
var lastFailure = null;
if (!notifErr && notifs.value) {
notifs.value.forEach(function(n) {
if (n.status === "completed") {
if (!lastSuccess || n.createdDate > lastSuccess) {
lastSuccess = n.createdDate;
}
} else if (n.status === "failed") {
recentFailures++;
if (!lastFailure || n.createdDate > lastFailure) {
lastFailure = n.createdDate;
}
}
});
}
report.details.push({
id: sub.id,
eventType: sub.eventType,
consumerUrl: sub.consumerInputs ? sub.consumerInputs.url : "N/A",
status: status,
recentFailures: recentFailures,
lastSuccess: lastSuccess,
lastFailure: lastFailure
});
pending--;
if (pending === 0) {
report.details.sort(function(a, b) {
return b.recentFailures - a.recentFailures;
});
callback(null, report);
}
});
});
});
};
// Usage
var monitor = new SubscriptionMonitor(manager);
monitor.checkHealth(function(err, report) {
if (err) {
console.error("Health check failed:", err.message);
return;
}
console.log("=== Service Hook Health Report ===");
console.log("Total: %d | Enabled: %d | Disabled: %d | Probation: %d",
report.total, report.enabled, report.disabled, report.probation);
report.details.forEach(function(sub) {
if (sub.recentFailures > 0 || sub.status !== "enabled") {
console.log("\n [%s] %s -> %s", sub.status.toUpperCase(), sub.eventType, sub.consumerUrl);
console.log(" Recent failures: %d", sub.recentFailures);
console.log(" Last success: %s", sub.lastSuccess || "never");
console.log(" Last failure: %s", sub.lastFailure || "never");
}
});
});
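When wiring the report into alerting, keeping the threshold logic as a pure function makes it easy to test in isolation. A small sketch (the name and threshold semantics are mine):

```javascript
// Pick out subscriptions that need human attention: anything not enabled,
// or anything with recent failures at or above the given threshold.
function subscriptionsNeedingAttention(details, maxFailures) {
  return details.filter(function(d) {
    return d.status !== "enabled" || d.recentFailures >= maxFailures;
  });
}
```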
Complete Working Example: Multi-Channel Event Router
This is a complete Express.js application that receives Azure DevOps service hook events and routes them to multiple channels — Slack, a log file, and a custom metrics collector. It includes subscription management endpoints and health monitoring.
var express = require("express");
var https = require("https");
var fs = require("fs");
var path = require("path");
var app = express();
app.use(express.json({ limit: "2mb" }));
var PORT = process.env.PORT || 4000;
var HOOK_SECRET = process.env.HOOK_SECRET;
var SLACK_WEBHOOK_URL = process.env.SLACK_WEBHOOK_URL;
// Event log for audit trail
var LOG_DIR = path.join(__dirname, "event-logs");
if (!fs.existsSync(LOG_DIR)) {
fs.mkdirSync(LOG_DIR, { recursive: true });
}
// Metrics counters
var metrics = {
totalReceived: 0,
byEventType: {},
byChannel: {},
errors: 0,
lastEvent: null
};
// Channel: File Logger
function logToFile(event, callback) {
var date = new Date().toISOString().split("T")[0];
var logFile = path.join(LOG_DIR, "events-" + date + ".jsonl");
var entry = JSON.stringify({
timestamp: new Date().toISOString(),
eventType: event.eventType,
subscriptionId: event.subscriptionId,
notificationId: event.notificationId,
message: event.message ? event.message.text : "",
resource: event.resource
}) + "\n";
fs.appendFile(logFile, entry, function(err) {
if (err) return callback(err);
callback(null, { channel: "file", file: logFile });
});
}
// Channel: Slack Notification
function sendToSlack(event, callback) {
if (!SLACK_WEBHOOK_URL) {
return callback(null, { channel: "slack", skipped: true, reason: "no webhook configured" });
}
  // Default green; red when the payload reports a failed result or status
  var color = "#36a64f";
  if (event.resource && (event.resource.result === "failed" || event.resource.status === "failed")) {
    color = "#cc0000";
  }
var payload = JSON.stringify({
attachments: [{
color: color,
title: event.eventType,
text: event.message ? event.message.text : "No message",
footer: "Azure DevOps Service Hook",
ts: Math.floor(Date.now() / 1000)
}]
});
  // url.parse is deprecated; the WHATWG URL class is built into Node 16+
  var parsed = new URL(SLACK_WEBHOOK_URL);
  var options = {
    hostname: parsed.hostname,
    path: parsed.pathname + parsed.search,
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Content-Length": Buffer.byteLength(payload)
    }
  };
var req = https.request(options, function(res) {
var data = "";
res.on("data", function(chunk) { data += chunk; });
res.on("end", function() {
if (res.statusCode === 200) {
callback(null, { channel: "slack", sent: true });
} else {
callback(new Error("Slack returned " + res.statusCode + ": " + data));
}
});
});
req.on("error", callback);
req.write(payload);
req.end();
}
// Channel: Metrics Collector
function collectMetrics(event, callback) {
metrics.totalReceived++;
metrics.lastEvent = new Date().toISOString();
if (!metrics.byEventType[event.eventType]) {
metrics.byEventType[event.eventType] = 0;
}
metrics.byEventType[event.eventType]++;
callback(null, { channel: "metrics", counted: true });
}
// Route events through all channels
function routeEvent(event, channels, callback) {
var results = [];
var errors = [];
var pending = channels.length;
channels.forEach(function(channel) {
channel.handler(event, function(err, result) {
if (err) {
errors.push({ channel: channel.name, error: err.message });
metrics.errors++;
} else {
results.push(result);
}
if (!metrics.byChannel[channel.name]) {
metrics.byChannel[channel.name] = { success: 0, error: 0 };
}
metrics.byChannel[channel.name][err ? "error" : "success"]++;
pending--;
if (pending === 0) {
callback(errors.length > 0 ? errors : null, results);
}
});
});
}
// Define active channels
var channels = [
{ name: "file", handler: logToFile },
{ name: "slack", handler: sendToSlack },
{ name: "metrics", handler: collectMetrics }
];
// Webhook receiver endpoint
app.post("/hooks/receive", function(req, res) {
var secret = req.headers["x-hook-secret"];
if (HOOK_SECRET && secret !== HOOK_SECRET) {
return res.status(401).json({ error: "Unauthorized" });
}
var event = req.body;
if (!event.eventType) {
return res.status(400).json({ error: "Missing eventType" });
}
routeEvent(event, channels, function(errors, results) {
if (errors) {
console.error("Some channels failed:", JSON.stringify(errors));
}
res.json({
status: "processed",
eventType: event.eventType,
channels: results,
errors: errors || []
});
});
});
// Metrics endpoint
app.get("/metrics", function(req, res) {
res.json(metrics);
});
// Event log query endpoint
app.get("/logs", function(req, res) {
var date = req.query.date || new Date().toISOString().split("T")[0];
var logFile = path.join(LOG_DIR, "events-" + date + ".jsonl");
if (!fs.existsSync(logFile)) {
return res.json({ date: date, events: [], count: 0 });
}
var content = fs.readFileSync(logFile, "utf8").trim();
var events = content.split("\n").map(function(line) {
try { return JSON.parse(line); }
catch (e) { return null; }
}).filter(Boolean);
var eventType = req.query.eventType;
if (eventType) {
events = events.filter(function(e) { return e.eventType === eventType; });
}
res.json({ date: date, events: events, count: events.length });
});
// Subscription management
app.post("/subscriptions/setup", function(req, res) {
var orgUrl = process.env.ORG_URL;
var pat = process.env.AZURE_DEVOPS_PAT;
var baseWebhookUrl = req.body.webhookUrl;
var projectId = req.body.projectId;
if (!orgUrl || !pat) {
return res.status(400).json({ error: "ORG_URL and AZURE_DEVOPS_PAT required" });
}
  // ServiceHookManager is the helper from "Creating Service Hook
  // Subscriptions" above; require it at the top of this file
  var manager = new ServiceHookManager(orgUrl, pat);
var eventTypes = [
"git.push",
"git.pullrequest.created",
"git.pullrequest.merged",
"build.complete",
"workitem.created",
"workitem.updated"
];
var created = [];
var pending = eventTypes.length;
eventTypes.forEach(function(eventType) {
manager.createSubscription({
eventType: eventType,
projectId: projectId,
webhookUrl: baseWebhookUrl,
headers: HOOK_SECRET ? "X-Hook-Secret: " + HOOK_SECRET : ""
}, function(err, result) {
if (err) {
created.push({ eventType: eventType, error: err.message });
} else {
created.push({ eventType: eventType, id: result.id, status: result.status });
}
pending--;
if (pending === 0) {
res.json({ subscriptions: created });
}
});
});
});
app.listen(PORT, function() {
console.log("Event Router running on port " + PORT);
console.log("Webhook endpoint: POST /hooks/receive");
console.log("Metrics: GET /metrics");
console.log("Event logs: GET /logs?date=2026-02-09&eventType=git.push");
});
Test it:
# Start the server
node event-router.js
# Simulate a git.push event
curl -X POST http://localhost:4000/hooks/receive \
-H "Content-Type: application/json" \
-H "X-Hook-Secret: my-shared-secret-123" \
-d '{
"subscriptionId": "test-sub-1",
"notificationId": 1,
"eventType": "git.push",
"message": {"text": "Shane pushed 2 commits to main"},
"resource": {
"repository": {"name": "my-app"},
"refUpdates": [{"name": "refs/heads/main"}],
"pushedBy": {"displayName": "Shane"},
"commits": [
{"commitId": "abc12345", "comment": "Fix auth bug"},
{"commitId": "def67890", "comment": "Update tests"}
]
}
}'
# Check metrics
curl http://localhost:4000/metrics
# Query event logs
curl "http://localhost:4000/logs?date=2026-02-09"
Expected metrics output:
{
"totalReceived": 1,
"byEventType": {
"git.push": 1
},
"byChannel": {
"file": {"success": 1, "error": 0},
"slack": {"success": 1, "error": 0},
"metrics": {"success": 1, "error": 0}
},
"errors": 0,
"lastEvent": "2026-02-09T15:30:00.000Z"
}
Common Issues and Troubleshooting
Subscription stuck in "disabledBySystem" state
Azure DevOps disables subscriptions after multiple consecutive delivery failures. You cannot re-enable a disabled subscription — you must delete it and create a new one.
# Check subscription status
curl -u :$PAT "https://dev.azure.com/myorg/_apis/hooks/subscriptions?api-version=7.1" \
| jq '.value[] | select(.status == "disabledBySystem") | {id, eventType, status}'
Common causes: the consumer URL is unreachable, SSL certificate expired, or the server returns 5xx errors consistently.
Events arrive out of order
Service hooks do not guarantee delivery order. If your automation depends on processing events sequentially (for example, workitem.created before workitem.updated), you need to implement ordering yourself:
var eventBuffer = {};
var BUFFER_TIMEOUT_MS = 5000;
function processInOrder(event, callback) {
var resourceId = event.resource.id || event.resource.pullRequestId || "unknown";
if (!eventBuffer[resourceId]) {
eventBuffer[resourceId] = [];
}
eventBuffer[resourceId].push(event);
setTimeout(function() {
var events = eventBuffer[resourceId];
delete eventBuffer[resourceId];
if (events) {
events.sort(function(a, b) {
return new Date(a.createdDate) - new Date(b.createdDate);
});
events.forEach(function(e) { callback(e); });
}
}, BUFFER_TIMEOUT_MS);
}
Webhook receives duplicate events
Azure DevOps retries failed deliveries. If your server takes too long to respond (over 20 seconds), Azure DevOps considers it a failure and retries. Always respond with 200 quickly and process the event asynchronously if needed. Use the notificationId for deduplication.
"TF400898: An Internal Error Occurred" when creating subscriptions
This cryptic error usually means the publisherInputs contain an invalid filter value. Common causes:
- Using a repository name instead of the repository ID in the `repository` filter
- Using a branch name without the `refs/heads/` prefix
- Using a project name instead of the project GUID
// WRONG — uses project name
{ projectId: "MyProject" }
// CORRECT — uses project GUID
{ projectId: "a1b2c3d4-e5f6-7890-abcd-ef1234567890" }
Events missing the resource object details
By default, service hooks send minimal resource information. When creating the subscription, set resourceDetailsToSend to all:
consumerInputs: {
url: webhookUrl,
resourceDetailsToSend: "all",
messagesToSend: "all"
}
Without this, you get the resource ID but not the full object with fields, dates, and related data.
Best Practices
Respond to webhooks quickly. Return a 200 status within 5 seconds and process events asynchronously. Azure DevOps will retry on slow responses, causing duplicates and eventually disabling your subscription.
Implement idempotent event handlers. Between retries, network issues, and duplicate deliveries, your handlers will see the same event more than once. Design handlers so processing an event twice produces the same result as processing it once.
Use shared secrets for authentication. Pass a secret in the custom HTTP headers when creating subscriptions. Verify this secret on every incoming request. Without this, anyone who discovers your webhook URL can send fake events.
Monitor subscription health daily. Azure DevOps silently disables broken subscriptions. Build automated checks that verify all expected subscriptions are active and have recent successful deliveries.
Scope subscriptions narrowly. Subscribe to specific projects, repositories, and branches rather than organization-wide events. Broad subscriptions generate noise and increase processing cost. You can always add more subscriptions later.
Log every event to persistent storage. Even if you only care about builds and PRs today, log everything. When something goes wrong two months from now, you will wish you had the full event history to reconstruct what happened.
Handle subscription recreation gracefully. Subscriptions can be deleted by organization admins or disabled by the system. Build a setup script that checks for existing subscriptions and recreates missing ones. Run this script on application startup.
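The check-and-recreate step reduces to a diff between the subscriptions you expect and the ones the API reports. The pure part can be sketched like this (the function name is mine):

```javascript
// Given the subscriptions Azure DevOps reports and the event types we expect
// to be wired to our webhook URL, return the event types to (re)create.
function findMissingSubscriptions(existing, desiredEventTypes, webhookUrl) {
  var active = {};
  existing.forEach(function(sub) {
    if (sub.status === "enabled" &&
        sub.consumerInputs && sub.consumerInputs.url === webhookUrl) {
      active[sub.eventType] = true;
    }
  });
  return desiredEventTypes.filter(function(type) { return !active[type]; });
}
```

Feed the result to createSubscription on startup; disabled subscriptions count as missing, so they get replaced rather than left dead.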
Test with real events, not just mock payloads. Azure DevOps event payloads have undocumented fields and edge cases. Use a tool like ngrok to point a service hook at your local development server and trigger real events. The payload structure for a PR with 50 reviewers looks very different from one with 2.