Azure DevOps Audit Logging and Compliance
Complete guide to Azure DevOps audit logging for compliance, covering audit log queries, streaming to external SIEM systems, building compliance reports, and automating audit trail analysis with Node.js.
Overview
Every change in Azure DevOps generates an audit event — who modified a pipeline, who changed permissions, who accessed a service connection. For organizations subject to SOC 2, HIPAA, PCI-DSS, or internal compliance requirements, these audit logs are not optional — they are evidence. I have built audit pipelines for several regulated teams, and the difference between "we think we are compliant" and "we can prove it" comes down to whether you are collecting, analyzing, and retaining these logs properly.
Prerequisites
- Azure DevOps organization with Organization Owner or Project Collection Administrator permissions
- Azure DevOps audit feature enabled (requires an organization backed by Microsoft Entra ID; streaming requires Azure DevOps Services)
- Node.js 16 or later for automation scripts
- An Azure Log Analytics workspace or Splunk instance for log streaming (optional but recommended)
- Personal Access Token with the vso.auditlog scope
- Basic understanding of compliance frameworks (SOC 2, ISO 27001, PCI-DSS)
Understanding Azure DevOps Audit Events
Azure DevOps captures audit events across every service area. Each event includes who did what, when, and from where.
Event Structure
{
  "id": "a1b2c3d4-e5f6-7890-abcd-ef1234567890",
  "correlationId": "b2c3d4e5-f6a7-8901-bcde-f12345678901",
  "activityId": "c3d4e5f6-a7b8-9012-cdef-123456789012",
  "actorUserId": "d4e5f6a7-b8c9-0123-def0-456789012345",
  "actorDisplayName": "Shane Larson",
  "actorClientId": "00000000-0000-0000-0000-000000000000",
  "timestamp": "2026-02-10T14:23:45.123Z",
  "scopeType": "organization",
  "scopeDisplayName": "my-org",
  "ipAddress": "203.0.113.42",
  "userAgent": "Mozilla/5.0...",
  "actionId": "Security.ModifyPermission",
  "details": "Permission 'GenericContribute' was modified for user '[email protected]'",
  "area": "Security",
  "category": "modify",
  "categoryDisplayName": "Modify",
  "actorUPN": "[email protected]",
  "data": {
    "PermissionName": "GenericContribute",
    "TargetUser": "[email protected]",
    "PreviousValue": "Deny",
    "NewValue": "Allow"
  }
}
Key Event Categories
| Area | Action Examples | Compliance Relevance |
|---|---|---|
| Security | ModifyPermission, RemovePermission, AddIdentity | Access control changes |
| Policy | PolicyConfigModified, PolicyConfigRemoved | Branch protection changes |
| Git | RefUpdatePoliciesBypassed, RepositoryCreated | Code integrity |
| Pipelines | PipelineModified, PipelineDeleted, RunCompleted | Build/deploy integrity |
| Library | VariableGroupModified, SecureFileModified | Secrets management |
| ServiceEndpoint | ServiceEndpointCreated, ServiceEndpointModified | Third-party access |
| Token | TokenCreated, TokenRevoked | Authentication management |
| Group | GroupMemberAdded, GroupMemberRemoved | Identity changes |
| Project | ProjectCreated, ProjectDeleted | Organizational changes |
Querying the Audit Log API
Basic Audit Query
// scripts/query-audit-log.js
var https = require("https");
var ORG = process.env.AZURE_ORG;
var PAT = process.env.AZURE_PAT;
function queryAuditLog(startTime, endTime, continuationToken, callback) {
var auth = Buffer.from(":" + PAT).toString("base64");
var path = "/" + ORG + "/_apis/audit/auditlog?startTime=" +
encodeURIComponent(startTime) +
"&endTime=" + encodeURIComponent(endTime) +
"&api-version=7.1";
if (continuationToken) {
path += "&continuationToken=" + encodeURIComponent(continuationToken);
}
var options = {
hostname: "auditservice.dev.azure.com",
path: path,
method: "GET",
headers: {
"Authorization": "Basic " + auth,
"Accept": "application/json"
}
};
var req = https.request(options, function(res) {
var data = "";
res.on("data", function(chunk) { data += chunk; });
res.on("end", function() {
if (res.statusCode === 200) {
callback(null, JSON.parse(data));
} else {
callback(new Error("Audit API error (" + res.statusCode + "): " + data.substring(0, 300)));
}
});
});
req.on("error", callback);
req.end();
}
// Collect all events with pagination
function collectAllEvents(startTime, endTime, callback) {
var allEvents = [];
function fetchPage(token) {
queryAuditLog(startTime, endTime, token, function(err, data) {
if (err) return callback(err);
var events = data.decoratedAuditLogEntries || [];
allEvents = allEvents.concat(events);
console.log(" Fetched " + events.length + " events (total: " + allEvents.length + ")");
if (data.hasMore && data.continuationToken) { // hasMore avoids a redundant request on a stale trailing token
fetchPage(data.continuationToken);
} else {
callback(null, allEvents);
}
});
}
fetchPage(null);
}
// Query last 7 days
var endTime = new Date().toISOString();
var startTime = new Date(Date.now() - 7 * 24 * 60 * 60 * 1000).toISOString();
console.log("Querying audit log from " + startTime + " to " + endTime + "\n");
collectAllEvents(startTime, endTime, function(err, events) {
if (err) {
console.error("Failed:", err.message);
process.exit(1);
}
console.log("\nTotal events: " + events.length);
// Group by area
var areas = {};
events.forEach(function(e) {
var area = e.area || "Unknown";
if (!areas[area]) areas[area] = 0;
areas[area]++;
});
console.log("\nEvents by area:");
Object.keys(areas).sort().forEach(function(area) {
console.log(" " + area + ": " + areas[area]);
});
});
Output:
Querying audit log from 2026-02-03T14:00:00.000Z to 2026-02-10T14:00:00.000Z
Fetched 200 events (total: 200)
Fetched 200 events (total: 400)
Fetched 143 events (total: 543)
Total events: 543
Events by area:
Git: 234
Group: 12
Library: 8
Pipelines: 187
Policy: 15
Project: 3
Security: 24
ServiceEndpoint: 6
Token: 54
Filtering for Security-Critical Events
// scripts/security-events.js
var https = require("https");
var ORG = process.env.AZURE_ORG;
var PAT = process.env.AZURE_PAT;
var CRITICAL_ACTIONS = [
"Security.ModifyPermission",
"Security.RemovePermission",
"Security.ModifyAccessControlList",
"Policy.PolicyConfigModified",
"Policy.PolicyConfigRemoved",
"Git.RefUpdatePoliciesBypassed",
"ServiceEndpoint.Create",
"ServiceEndpoint.Modify",
"ServiceEndpoint.Delete",
"Library.VariableGroupModified",
"Library.SecureFileModified",
"Group.UpdateGroupMembership.Add",
"Group.UpdateGroupMembership.Remove",
"Token.PatTokenCreateEvent",
"Token.PatTokenRevokeEvent",
"Pipelines.PipelineModified",
"Pipelines.PipelineDeleted",
"Project.AreaPath.Delete",
"Project.Process.Modify"
];
function queryAuditLog(startTime, continuationToken, callback) {
var auth = Buffer.from(":" + PAT).toString("base64");
var path = "/" + ORG + "/_apis/audit/auditlog?startTime=" +
encodeURIComponent(startTime) + "&api-version=7.1";
if (continuationToken) {
path += "&continuationToken=" + encodeURIComponent(continuationToken);
}
var options = {
hostname: "auditservice.dev.azure.com",
path: path,
method: "GET",
headers: {
"Authorization": "Basic " + auth,
"Accept": "application/json"
}
};
var req = https.request(options, function(res) {
var data = "";
res.on("data", function(chunk) { data += chunk; });
res.on("end", function() {
if (res.statusCode === 200) {
callback(null, JSON.parse(data));
} else {
callback(new Error("API error: " + res.statusCode));
}
});
});
req.on("error", callback);
req.end();
}
function collectSecurityEvents(days, callback) {
var startTime = new Date(Date.now() - days * 24 * 60 * 60 * 1000).toISOString();
var critical = [];
function fetchPage(token) {
queryAuditLog(startTime, token, function(err, data) {
if (err) return callback(err);
var events = data.decoratedAuditLogEntries || [];
events.forEach(function(e) {
if (CRITICAL_ACTIONS.indexOf(e.actionId) !== -1) {
critical.push(e);
}
});
if (data.hasMore && data.continuationToken) {
fetchPage(data.continuationToken);
} else {
callback(null, critical);
}
});
}
fetchPage(null);
}
collectSecurityEvents(30, function(err, events) {
if (err) {
console.error("Failed:", err.message);
process.exit(1);
}
console.log("Security-Critical Events (Last 30 Days)");
console.log("========================================\n");
console.log("Total critical events: " + events.length + "\n");
// Classify by severity
var high = [];
var medium = [];
events.forEach(function(e) {
var isHigh = e.actionId.indexOf("Delete") !== -1 ||
e.actionId.indexOf("PolicyConfigRemoved") !== -1 ||
e.actionId.indexOf("Bypassed") !== -1 ||
e.actionId.indexOf("RemovePermission") !== -1;
if (isHigh) {
high.push(e);
} else {
medium.push(e);
}
});
if (high.length > 0) {
console.log("--- HIGH SEVERITY (" + high.length + ") ---\n");
high.forEach(function(e) {
console.log("[HIGH] " + e.timestamp);
console.log(" Action: " + e.actionId);
console.log(" Actor: " + (e.actorDisplayName || "unknown") + " (" + (e.actorUPN || "") + ")");
console.log(" IP: " + (e.ipAddress || "unknown"));
console.log(" Details: " + (e.details || "none"));
console.log("");
});
}
if (medium.length > 0) {
console.log("--- MEDIUM SEVERITY (" + medium.length + ") ---\n");
medium.forEach(function(e) {
console.log("[MEDIUM] " + e.timestamp);
console.log(" Action: " + e.actionId);
console.log(" Actor: " + (e.actorDisplayName || "unknown"));
console.log(" Details: " + (e.details || "none"));
console.log("");
});
}
});
Streaming Audit Logs to External Systems
Azure DevOps supports streaming audit events to external destinations for long-term retention and SIEM integration.
Configuring Audit Streaming
Navigate to Organization Settings > Auditing > Streams.
Supported destinations:
- Azure Monitor Log Analytics — native Azure integration
- Splunk — via HEC (HTTP Event Collector)
- Azure Event Grid — for custom processing pipelines
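Streams can also be created programmatically, which helps when you manage several organizations. A hedged sketch against the audit streams REST API (the endpoint is in preview, and the consumerInputs field names below are my assumptions; verify them against the current REST reference before relying on this):
# Create a Log Analytics stream via the audit streams REST API (preview)
# WorkspaceId/SharedKey field names are assumptions; check the API reference
curl -s -u ":$AZURE_PAT" \
  -H "Content-Type: application/json" \
  -X POST \
  "https://auditservice.dev.azure.com/$AZURE_ORG/_apis/audit/streams?daysToBackfill=0&api-version=7.1-preview.1" \
  -d '{
    "consumerType": "AzureMonitorLogs",
    "consumerInputs": {
      "WorkspaceId": "<workspace-id>",
      "SharedKey": "<primary-key>"
    }
  }'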
Azure Monitor Log Analytics Setup
# Create a Log Analytics workspace
az monitor log-analytics workspace create \
--resource-group rg-devops-audit \
--workspace-name law-devops-audit \
--location eastus \
--retention-in-days 365
# Get the workspace ID and key
az monitor log-analytics workspace show \
--resource-group rg-devops-audit \
--workspace-name law-devops-audit \
--query customerId -o tsv
az monitor log-analytics workspace get-shared-keys \
--resource-group rg-devops-audit \
--workspace-name law-devops-audit \
--query primarySharedKey -o tsv
Then in Azure DevOps:
- Go to Organization Settings > Auditing > Streams
- Click New stream > Azure Monitor Log Analytics
- Enter the Workspace ID and Primary Key
- Select event categories to stream
- Save
Querying in Log Analytics
Once streaming is configured, query audit events with KQL:
// All security-related events in the last 24 hours
AzureDevOpsAuditing
| where TimeGenerated > ago(24h)
| where Area == "Security" or Area == "Policy"
| project TimeGenerated, ActorDisplayName, ActionId, Details, IpAddress
| order by TimeGenerated desc
// Permission changes with before/after values
AzureDevOpsAuditing
| where ActionId startswith "Security.Modify"
| extend PreviousValue = tostring(Data.PreviousValue)
| extend NewValue = tostring(Data.NewValue)
| project TimeGenerated, ActorDisplayName, Details, PreviousValue, NewValue
| order by TimeGenerated desc
// Policy bypasses (branch protection overrides)
AzureDevOpsAuditing
| where ActionId == "Git.RefUpdatePoliciesBypassed"
| project TimeGenerated, ActorDisplayName, Details, IpAddress
| order by TimeGenerated desc
// Unusual activity: events outside business hours
AzureDevOpsAuditing
| where TimeGenerated > ago(7d)
| extend HourOfDay = datetime_part("Hour", TimeGenerated)
| where HourOfDay < 7 or HourOfDay > 19
| where Area in ("Security", "Policy", "ServiceEndpoint", "Library")
| project TimeGenerated, ActorDisplayName, ActionId, Details, IpAddress
| order by TimeGenerated desc
Splunk Integration
Configure the Splunk HEC endpoint:
# In Splunk, create an HEC token:
# Settings > Data Inputs > HTTP Event Collector > New Token
# Token name: azure-devops-audit
# Source type: _json
# Index: azure_devops
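Before configuring the stream, it is worth confirming the HEC endpoint accepts events. A minimal smoke test (hostname and token are placeholders):
# Send a test event; a healthy collector answers {"text":"Success","code":0}
curl -k "https://splunk.yourcompany.com:8088/services/collector/event" \
  -H "Authorization: Splunk <hec-token>" \
  -d '{"event": {"test": "azure-devops-audit"}, "sourcetype": "_json", "index": "azure_devops"}'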
In Azure DevOps:
- Go to Organization Settings > Auditing > Streams
- Click New stream > Splunk
- Enter the HEC URL: https://splunk.yourcompany.com:8088/services/collector
- Enter the HEC token
- Save
Splunk search queries:
All Azure DevOps audit events:

index=azure_devops sourcetype=_json
| table _time, actorDisplayName, actionId, details, ipAddress

Permission escalation detection:

index=azure_devops sourcetype=_json actionId="Security.ModifyPermission"
| eval isEscalation=if(like(details, "%Allow%") AND like(details, "%Deny%"), "yes", "no")
| where isEscalation="yes"
| table _time, actorDisplayName, details
Custom Streaming with Event Grid
For custom processing, stream to Azure Event Grid and route to Azure Functions, Logic Apps, or your own webhook:
// azure-function/process-audit-event/index.js
var https = require("https");
var SLACK_WEBHOOK = process.env.SLACK_WEBHOOK_URL;
module.exports = function(context, eventGridEvent) {
var event = eventGridEvent.data;
context.log("Audit event: " + event.actionId + " by " + event.actorDisplayName);
// Only alert on high-severity events
var highSeverity = [
"Security.ModifyPermission",
"Security.RemovePermission",
"Git.RefUpdatePoliciesBypassed",
"Policy.PolicyConfigRemoved",
"ServiceEndpoint.Delete",
"Pipelines.PipelineDeleted"
];
if (highSeverity.indexOf(event.actionId) === -1) {
context.done();
return;
}
var message = {
text: ":rotating_light: *Azure DevOps Security Alert*\n" +
"*Action:* `" + event.actionId + "`\n" +
"*Actor:* " + event.actorDisplayName + " (" + (event.actorUPN || "") + ")\n" +
"*Details:* " + (event.details || "none") + "\n" +
"*IP:* " + (event.ipAddress || "unknown") + "\n" +
"*Time:* " + event.timestamp
};
var body = JSON.stringify(message);
var parsed = new URL(SLACK_WEBHOOK);
var options = {
hostname: parsed.hostname,
path: parsed.pathname,
method: "POST",
headers: {
"Content-Type": "application/json",
"Content-Length": Buffer.byteLength(body)
}
};
var req = https.request(options, function(res) {
context.log("Slack alert sent: " + res.statusCode);
context.done();
});
req.on("error", function(err) {
context.log.error("Slack alert failed:", err.message);
context.done();
});
req.write(body);
req.end();
};
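The handler above uses the function.json-based Node programming model, so it needs an Event Grid trigger binding next to it. A minimal azure-function/process-audit-event/function.json:
{
  "bindings": [
    {
      "type": "eventGridTrigger",
      "direction": "in",
      "name": "eventGridEvent"
    }
  ]
}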
Building Compliance Reports
Automated Monthly Compliance Report
// scripts/compliance-report.js
var https = require("https");
var fs = require("fs");
var ORG = process.env.AZURE_ORG;
var PAT = process.env.AZURE_PAT;
function queryAuditLog(startTime, endTime, token, callback) {
var auth = Buffer.from(":" + PAT).toString("base64");
var path = "/" + ORG + "/_apis/audit/auditlog?startTime=" +
encodeURIComponent(startTime) + "&endTime=" + encodeURIComponent(endTime) +
"&api-version=7.1";
if (token) path += "&continuationToken=" + encodeURIComponent(token);
var options = {
hostname: "auditservice.dev.azure.com",
path: path,
method: "GET",
headers: {
"Authorization": "Basic " + auth,
"Accept": "application/json"
}
};
var req = https.request(options, function(res) {
var data = "";
res.on("data", function(chunk) { data += chunk; });
res.on("end", function() {
if (res.statusCode === 200) callback(null, JSON.parse(data));
else callback(new Error("API error: " + res.statusCode));
});
});
req.on("error", callback);
req.end();
}
function collectAll(startTime, endTime, callback) {
var events = [];
function fetch(token) {
queryAuditLog(startTime, endTime, token, function(err, data) {
if (err) return callback(err);
events = events.concat(data.decoratedAuditLogEntries || []);
if (data.hasMore && data.continuationToken) fetch(data.continuationToken);
else callback(null, events);
});
}
fetch(null);
}
function generateReport(events) {
var report = {
generatedAt: new Date().toISOString(),
organization: ORG,
period: {},
summary: {},
accessChanges: [],
policyChanges: [],
policyBypasses: [],
serviceConnectionChanges: [],
tokenActivity: [],
recommendations: []
};
// Classify events
events.forEach(function(e) {
if (e.actionId && e.actionId.indexOf("Security.") === 0) {
report.accessChanges.push({
timestamp: e.timestamp,
actor: e.actorDisplayName,
action: e.actionId,
details: e.details,
ip: e.ipAddress
});
}
if (e.actionId && e.actionId.indexOf("Policy.") === 0) {
report.policyChanges.push({
timestamp: e.timestamp,
actor: e.actorDisplayName,
action: e.actionId,
details: e.details
});
}
if (e.actionId === "Git.RefUpdatePoliciesBypassed") {
report.policyBypasses.push({
timestamp: e.timestamp,
actor: e.actorDisplayName,
details: e.details,
ip: e.ipAddress
});
}
if (e.actionId && e.actionId.indexOf("ServiceEndpoint.") === 0) {
report.serviceConnectionChanges.push({
timestamp: e.timestamp,
actor: e.actorDisplayName,
action: e.actionId,
details: e.details
});
}
if (e.actionId && e.actionId.indexOf("Token.") === 0) {
report.tokenActivity.push({
timestamp: e.timestamp,
actor: e.actorDisplayName,
action: e.actionId,
details: e.details
});
}
});
// Summary
report.summary = {
totalEvents: events.length,
accessChanges: report.accessChanges.length,
policyChanges: report.policyChanges.length,
policyBypasses: report.policyBypasses.length,
serviceConnectionChanges: report.serviceConnectionChanges.length,
tokenActivity: report.tokenActivity.length
};
// Unique actors
var actors = {};
events.forEach(function(e) {
if (e.actorDisplayName) actors[e.actorDisplayName] = true;
});
report.summary.uniqueActors = Object.keys(actors).length;
// Recommendations
if (report.policyBypasses.length > 5) {
report.recommendations.push("HIGH: " + report.policyBypasses.length + " policy bypasses detected. Review bypass permissions and consider restricting the 'Bypass policies' permission.");
}
if (report.serviceConnectionChanges.length > 0) {
report.recommendations.push("MEDIUM: " + report.serviceConnectionChanges.length + " service connection changes. Verify each change was authorized.");
}
var tokenCreations = report.tokenActivity.filter(function(t) {
return t.action.indexOf("Create") !== -1;
});
if (tokenCreations.length > 20) {
report.recommendations.push("MEDIUM: " + tokenCreations.length + " PAT tokens created. Review whether all are necessary and have appropriate scopes.");
}
return report;
}
// Generate report for last 30 days
var endTime = new Date().toISOString();
var startTime = new Date(Date.now() - 30 * 24 * 60 * 60 * 1000).toISOString();
console.log("Generating compliance report...");
console.log("Period: " + startTime.substring(0, 10) + " to " + endTime.substring(0, 10) + "\n");
collectAll(startTime, endTime, function(err, events) {
if (err) {
console.error("Failed:", err.message);
process.exit(1);
}
var report = generateReport(events);
report.period = { start: startTime, end: endTime };
// Write JSON report
var reportPath = "compliance-report-" + endTime.substring(0, 7) + ".json";
fs.writeFileSync(reportPath, JSON.stringify(report, null, 2));
// Print summary
console.log("=== Compliance Report Summary ===\n");
console.log("Total audit events: " + report.summary.totalEvents);
console.log("Unique actors: " + report.summary.uniqueActors);
console.log("Access control changes: " + report.summary.accessChanges);
console.log("Policy changes: " + report.summary.policyChanges);
console.log("Policy bypasses: " + report.summary.policyBypasses);
console.log("Service connection changes: " + report.summary.serviceConnectionChanges);
console.log("Token activity: " + report.summary.tokenActivity);
if (report.recommendations.length > 0) {
console.log("\n=== Recommendations ===\n");
report.recommendations.forEach(function(r) { console.log(" " + r); });
}
console.log("\nReport saved to: " + reportPath);
});
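Auditors usually prefer tabular evidence over raw JSON. A small helper sketch (the filename and the choice to export only access changes are illustrative) that flattens a report into CSV:
// scripts/report-to-csv.js (hypothetical helper): exports accessChanges as CSV
var fs = require("fs");
var report = JSON.parse(fs.readFileSync(process.argv[2], "utf8"));
var rows = [["timestamp", "actor", "action", "details"]];
report.accessChanges.forEach(function(c) {
  rows.push([c.timestamp, c.actor, c.action, c.details]);
});
// Quote every cell and escape embedded quotes per RFC 4180
var csv = rows.map(function(r) {
  return r.map(function(v) {
    return '"' + String(v == null ? "" : v).replace(/"/g, '""') + '"';
  }).join(",");
}).join("\n");
fs.writeFileSync("access-changes.csv", csv);
console.log("Wrote access-changes.csv (" + (rows.length - 1) + " rows)");
Usage: node scripts/report-to-csv.js compliance-report-2026-02.json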
Retention and Archival
Azure DevOps retains audit logs for 90 days by default. For compliance, you need longer retention.
Archival Pipeline
# audit-archive-pipeline.yml
schedules:
- cron: "0 2 1 * *"
  displayName: "Monthly audit archive"
  branches:
    include:
    - main
  always: true

pool:
  vmImage: "ubuntu-latest"

steps:
- script: |
    # Dates are logged for traceability; compliance-report.js queries its own fixed 30-day window
    START_DATE=$(date -u -d "-35 days" +%Y-%m-%dT00:00:00Z)
    END_DATE=$(date -u -d "-5 days" +%Y-%m-%dT00:00:00Z)
    echo "Archiving audit logs from $START_DATE to $END_DATE"
    node scripts/compliance-report.js
  displayName: "Generate compliance report"
  env:
    AZURE_ORG: $(AuditOrgName)  # organization name only; the script builds the URL path itself, so the full $(System.CollectionUri) URL would break it
    AZURE_PAT: $(AuditPAT)
- task: AzureCLI@2
  displayName: "Upload to blob storage"
  inputs:
    azureSubscription: "Azure-Compliance"
    scriptType: "bash"
    scriptLocation: "inlineScript"
    inlineScript: |
      REPORT_FILE=$(ls compliance-report-*.json | head -1)
      az storage blob upload \
        --account-name stauditarchive \
        --container-name audit-logs \
        --name "$(date +%Y/%m)/$REPORT_FILE" \
        --file "$REPORT_FILE" \
        --auth-mode login
      echo "Archived: $REPORT_FILE"
Complete Working Example
A compliance monitoring system that queries the audit log, checks for violations, generates reports, and alerts on anomalies:
// scripts/compliance-monitor.js
var https = require("https");
var ORG = process.env.AZURE_ORG;
var PAT = process.env.AZURE_PAT;
var SLACK_WEBHOOK = process.env.SLACK_WEBHOOK_URL;
// Compliance rules
var RULES = [
{
id: "CC-001",
name: "Policy bypass detected",
description: "Branch policy was bypassed during PR completion",
severity: "HIGH",
match: function(e) { return e.actionId === "Git.RefUpdatePoliciesBypassed"; }
},
{
id: "CC-002",
name: "Branch policy removed",
description: "A branch protection policy was deleted",
severity: "HIGH",
match: function(e) { return e.actionId === "Policy.PolicyConfigRemoved"; }
},
{
id: "CC-003",
name: "Service connection modified",
description: "A service connection was created, modified, or deleted",
severity: "MEDIUM",
match: function(e) { return e.actionId && e.actionId.indexOf("ServiceEndpoint.") === 0; }
},
{
id: "CC-004",
name: "Permission escalation",
description: "A permission was changed from Deny/Not Set to Allow",
severity: "HIGH",
match: function(e) {
return e.actionId === "Security.ModifyPermission" &&
e.data && e.data.NewValue === "Allow";
}
},
{
id: "CC-005",
name: "After-hours activity",
description: "Security-sensitive action performed outside business hours",
severity: "MEDIUM",
match: function(e) {
var securityAreas = ["Security", "Policy", "ServiceEndpoint", "Library"];
if (securityAreas.indexOf(e.area) === -1) return false;
var hour = new Date(e.timestamp).getUTCHours();
return hour < 7 || hour > 19; // UTC business hours
}
},
{
id: "CC-006",
name: "Full-access PAT created",
description: "A PAT with full access scope was created",
severity: "HIGH",
match: function(e) {
return e.actionId === "Token.PatTokenCreateEvent" &&
e.details && e.details.indexOf("full") !== -1;
}
}
];
function queryAuditLog(startTime, token, callback) {
var auth = Buffer.from(":" + PAT).toString("base64");
var path = "/" + ORG + "/_apis/audit/auditlog?startTime=" +
encodeURIComponent(startTime) + "&api-version=7.1";
if (token) path += "&continuationToken=" + encodeURIComponent(token);
var options = {
hostname: "auditservice.dev.azure.com",
path: path,
method: "GET",
headers: {
"Authorization": "Basic " + auth,
"Accept": "application/json"
}
};
var req = https.request(options, function(res) {
var data = "";
res.on("data", function(chunk) { data += chunk; });
res.on("end", function() {
if (res.statusCode === 200) callback(null, JSON.parse(data));
else callback(new Error("API error: " + res.statusCode));
});
});
req.on("error", callback);
req.end();
}
function sendSlackAlert(violations, callback) {
if (!SLACK_WEBHOOK || violations.length === 0) return callback(null);
var blocks = violations.map(function(v) {
return "[" + v.severity + "] *" + v.ruleId + ": " + v.ruleName + "*\n" +
" Actor: " + v.actor + " | IP: " + v.ip + "\n" +
" Details: " + v.details + "\n" +
" Time: " + v.timestamp;
});
var message = {
text: ":shield: *Compliance Violations Detected*\n" +
"Organization: `" + ORG + "`\n\n" +
blocks.join("\n\n") + "\n\n" +
"_" + violations.length + " violation(s) found._"
};
var body = JSON.stringify(message);
var parsed = new URL(SLACK_WEBHOOK);
var options = {
hostname: parsed.hostname,
path: parsed.pathname,
method: "POST",
headers: {
"Content-Type": "application/json",
"Content-Length": Buffer.byteLength(body)
}
};
var req = https.request(options, function(res) {
callback(null);
});
req.on("error", callback);
req.write(body);
req.end();
}
// Check last 24 hours
var startTime = new Date(Date.now() - 24 * 60 * 60 * 1000).toISOString();
var violations = [];
function collectAndCheck(token) {
queryAuditLog(startTime, token, function(err, data) {
if (err) {
console.error("Audit query failed:", err.message);
process.exit(1);
}
var events = data.decoratedAuditLogEntries || [];
events.forEach(function(e) {
RULES.forEach(function(rule) {
if (rule.match(e)) {
violations.push({
ruleId: rule.id,
ruleName: rule.name,
severity: rule.severity,
actor: e.actorDisplayName || "unknown",
ip: e.ipAddress || "unknown",
details: e.details || "none",
timestamp: e.timestamp
});
}
});
});
if (data.hasMore && data.continuationToken) {
collectAndCheck(data.continuationToken);
} else {
// Done collecting
console.log("Compliance Check Results (Last 24 Hours)");
console.log("=========================================\n");
console.log("Rules checked: " + RULES.length);
console.log("Violations found: " + violations.length + "\n");
if (violations.length > 0) {
violations.forEach(function(v) {
console.log("[" + v.severity + "] " + v.ruleId + ": " + v.ruleName);
console.log(" Actor: " + v.actor + " | IP: " + v.ip);
console.log(" Details: " + v.details);
console.log(" Time: " + v.timestamp + "\n");
});
sendSlackAlert(violations, function(err2) {
if (err2) console.error("Slack alert failed:", err2.message);
else console.log("Slack alert sent.");
});
} else {
console.log("No violations detected. All clear.");
}
}
});
}
collectAndCheck(null);
Common Issues and Troubleshooting
"VS403356: You are not authorized to access the audit log"
HTTP 403: {"message":"VS403356: You are not authorized to access the audit log."}
The PAT does not have the vso.auditlog scope, or the user is not an Organization Owner or Project Collection Administrator. Audit log access requires elevated permissions — regular project contributors cannot query it. Create a PAT with the audit scope and use an admin account.
Audit events are delayed by several minutes
Audit events are not real-time. There is a processing delay of 2-15 minutes between when an action occurs and when it appears in the audit log. Do not use the audit API for real-time security monitoring. For near-real-time alerts, use audit streaming to Event Grid with a function trigger.
"Continuation token has expired"
HTTP 400: {"message":"The continuation token is no longer valid."}
Continuation tokens expire after a short period (typically minutes). If your collection script pauses between pages — due to rate limiting, error handling, or processing — the token may expire. Restart the collection from the beginning. For large date ranges, break the query into smaller windows (daily or weekly).
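One way to make this systematic is to page through one day at a time, so each continuation token is consumed immediately after it is issued. A sketch, assuming the collectAll(startTime, endTime, callback) helper from the compliance report script:
// Collect a long range in one-day windows so continuation tokens stay fresh
function collectInWindows(startMs, endMs, callback) {
  var DAY = 24 * 60 * 60 * 1000;
  var all = [];
  function next(windowStart) {
    if (windowStart >= endMs) return callback(null, all);
    var windowEnd = Math.min(windowStart + DAY, endMs);
    collectAll(
      new Date(windowStart).toISOString(),
      new Date(windowEnd).toISOString(),
      function(err, events) {
        if (err) return callback(err);
        all = all.concat(events);
        next(windowEnd); // advance to the next one-day window
      }
    );
  }
  next(startMs);
}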
Audit streaming shows "Disconnected" status
The streaming connection to your Log Analytics workspace or Splunk instance has failed. Common causes: the workspace key was rotated, the Splunk HEC token expired, or a firewall rule is blocking outbound traffic from Azure DevOps. Check the stream status in Organization Settings > Auditing > Streams and verify the destination credentials.
Best Practices
Stream audit logs to an external system for long-term retention. Azure DevOps retains logs for 90 days. Compliance frameworks typically require 1-7 years. Stream to Log Analytics or Splunk and configure appropriate retention policies.
Run daily compliance checks, not just monthly reports. The compliance monitor script takes seconds to run. A daily scheduled pipeline that checks for violations catches problems within 24 hours instead of waiting for the monthly audit.
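A sketch of such a daily pipeline, reusing the monitor script from the complete example (the variable names AuditOrgName, AuditPAT, and SlackWebhook are placeholders you define on the pipeline):
# daily-compliance-check.yml (illustrative)
schedules:
- cron: "0 6 * * *"
  displayName: "Daily compliance check"
  branches:
    include:
    - main
  always: true

pool:
  vmImage: "ubuntu-latest"

steps:
- script: node scripts/compliance-monitor.js
  displayName: "Run compliance monitor"
  env:
    AZURE_ORG: $(AuditOrgName)
    AZURE_PAT: $(AuditPAT)
    SLACK_WEBHOOK_URL: $(SlackWebhook)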
Define compliance rules as code. The rule-based approach in the monitor script makes it easy to add new rules, modify thresholds, and version-control your compliance policy alongside your infrastructure code.
Alert on high-severity violations immediately. Policy bypasses, permission escalations, and service connection deletions should trigger Slack or PagerDuty alerts within minutes, not appear in a monthly report.
Correlate audit events with deployment logs. When investigating an incident, cross-reference audit events (who changed what) with pipeline run logs (what was deployed). The correlationId field in audit events helps link related actions.
Review after-hours security events separately. A permission change at 2 AM deserves more scrutiny than the same change at 2 PM. The compliance monitor script flags after-hours activity as a separate rule.
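With streaming in place, a correlation lookup is a short KQL query (this assumes the AzureDevOpsAuditing table exposes the field as a CorrelationId column; verify the column name in your workspace):
// All audit events sharing a correlation ID with a suspicious event
AzureDevOpsAuditing
| where CorrelationId == "b2c3d4e5-f6a7-8901-bcde-f12345678901"
| project TimeGenerated, ActorDisplayName, ActionId, Details
| order by TimeGenerated asc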
Archive compliance reports to immutable storage. Store monthly reports in Azure Blob Storage with immutability policies enabled. This prevents retroactive tampering with compliance evidence.
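A sketch of enabling a time-based retention policy on the archive container from the pipeline above (the 365-day period is illustrative; match it to your framework's retention requirement):
# Lock blobs in the audit-logs container against modification or deletion for 365 days
az storage container immutability-policy create \
  --resource-group rg-devops-audit \
  --account-name stauditarchive \
  --container-name audit-logs \
  --period 365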