Azure DevOps Audit Logging and Compliance
Implement comprehensive audit logging and compliance reporting for Azure DevOps with automated collection, analysis, and alerting
Overview
Azure DevOps audit logging captures every significant action across your organization — permission changes, pipeline modifications, repository access, and policy updates. If you are operating under SOC 2, ISO 27001, HIPAA, or any regulatory framework that demands accountability over your software delivery lifecycle, audit logs are not optional. They are the foundation of your compliance posture. This article covers how to programmatically collect, analyze, and report on Azure DevOps audit events using Node.js, with practical patterns for streaming to external SIEM systems, automating compliance reports, and alerting on suspicious activity.
Prerequisites
- An Azure DevOps organization with Organization Owner or Project Collection Administrator role
- A Personal Access Token (PAT) with Audit Log: Read scope
- Node.js v16 or later installed
- Familiarity with the Azure DevOps REST API
- For external streaming: access to Splunk, Elasticsearch, or equivalent SIEM
- Basic understanding of compliance frameworks (SOC 2, ISO 27001)
Audit Log Fundamentals
Azure DevOps records audit events at the organization level. Every time a user modifies a security policy, changes a pipeline definition, adds a repository, or adjusts permissions, the platform creates an immutable audit record. These records contain:
- Timestamp — when the action occurred (UTC)
- Actor — who performed the action (user principal name or service identity)
- Action ID — a machine-readable identifier for the event type
- Area — the functional area (e.g., Git, Pipelines, Security)
- Scope — the project or organization context
- Details — a human-readable description of what changed
- IP Address — the originating IP of the request
- Correlation ID — for linking related events across services
Audit logs are retained for 90 days by default in Azure DevOps. If you need longer retention — and you almost certainly do for compliance — you must export and archive them yourself. That is the core problem this article solves.
Accessing Audit Logs via the UI
Before writing code, understand what is available in the portal. Navigate to Organization Settings > Auditing in Azure DevOps. The UI provides:
- A chronological event feed with filtering by date range
- Export to CSV or JSON for a selected time window
- Filtering by area (Security, Policy, Git, etc.)
- Search by actor or action description
The UI is fine for ad-hoc investigations but completely inadequate for compliance programs. You cannot schedule exports, you cannot alert in real-time, and you cannot correlate events across systems. That is where the API comes in.
Accessing Audit Logs via the REST API
The Audit Log API endpoint follows this pattern:
GET https://auditservice.dev.azure.com/{organization}/_apis/audit/auditlog?api-version=7.1-preview.1
Here is a basic Node.js call to retrieve audit events:
var https = require("https");
var url = require("url");
var ORG = process.env.AZURE_DEVOPS_ORG;
var PAT = process.env.AZURE_DEVOPS_PAT;
function getAuditLogs(startTime, endTime, continuationToken, callback) {
var baseUrl = "https://auditservice.dev.azure.com/" + ORG +
"/_apis/audit/auditlog?api-version=7.1-preview.1";
if (startTime) {
baseUrl += "&startTime=" + encodeURIComponent(startTime);
}
if (endTime) {
baseUrl += "&endTime=" + encodeURIComponent(endTime);
}
if (continuationToken) {
baseUrl += "&continuationToken=" + encodeURIComponent(continuationToken);
}
var parsed = url.parse(baseUrl);
var auth = Buffer.from(":" + PAT).toString("base64");
var options = {
hostname: parsed.hostname,
path: parsed.path,
method: "GET",
headers: {
"Authorization": "Basic " + auth,
"Content-Type": "application/json"
}
};
var req = https.request(options, function(res) {
var body = "";
res.on("data", function(chunk) { body += chunk; });
res.on("end", function() {
if (res.statusCode !== 200) {
return callback(new Error("API returned " + res.statusCode + ": " + body));
}
var data = JSON.parse(body);
callback(null, data);
});
});
req.on("error", callback);
req.end();
}
// Fetch last 24 hours
var now = new Date();
var yesterday = new Date(now.getTime() - 24 * 60 * 60 * 1000);
getAuditLogs(yesterday.toISOString(), now.toISOString(), null, function(err, data) {
if (err) {
console.error("Failed to fetch audit logs:", err.message);
return;
}
console.log("Retrieved " + data.decoratedAuditLogEntries.length + " events");
data.decoratedAuditLogEntries.forEach(function(entry) {
console.log("[" + entry.timestamp + "] " + entry.actorDisplayName +
" - " + entry.actionId + " - " + entry.details);
});
});
The API returns paginated results. When hasMore is true, use the continuationToken from the response to fetch the next page. For compliance collection, you must paginate through all results.
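One way to drain every page is a small recursive wrapper around the getAuditLogs helper above. The sketch below is illustrative (the wrapper name is not part of the API); the collector later in this article adds rate limiting and persistent state on top of the same pattern.
function getAllAuditLogs(startTime, endTime, callback) {
  var collected = [];
  function nextPage(token) {
    getAuditLogs(startTime, endTime, token, function(err, data) {
      if (err) return callback(err);
      // Accumulate this page, then follow the continuation token if there is one
      collected = collected.concat(data.decoratedAuditLogEntries || []);
      if (data.hasMore && data.continuationToken) {
        nextPage(data.continuationToken);
      } else {
        callback(null, collected);
      }
    });
  }
  nextPage(null);
}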
Audit Log Event Categories
Azure DevOps organizes audit events into areas. Here are the critical ones for compliance:
| Area | Example Events | Compliance Relevance |
|---|---|---|
| Security | Permission changes, group membership, PAT creation | SOC 2 CC6.1, ISO 27001 A.9 |
| Policy | Branch policy changes, required reviewers | SOC 2 CC8.1, ISO 27001 A.14 |
| Git | Repository creation, deletion, force push | SOC 2 CC6.1, ISO 27001 A.12 |
| Pipelines | Pipeline edits, approvals, variable changes | SOC 2 CC7.1, ISO 27001 A.14 |
| Organization | Billing, extensions, process templates | SOC 2 CC6.2, ISO 27001 A.6 |
| Licensing | License changes, access level modifications | SOC 2 CC6.3, ISO 27001 A.9 |
| Token | PAT creation, revocation, scope changes | SOC 2 CC6.1, ISO 27001 A.9 |
Understanding these categories is essential for building targeted compliance reports. You do not need to analyze every event — focus on the areas your compliance framework requires.
Building a Node.js Audit Log Collector
A production collector needs to handle pagination, rate limiting, incremental collection, and persistent state. Here is a robust implementation:
var https = require("https");
var fs = require("fs");
var path = require("path");
var CONFIG = {
org: process.env.AZURE_DEVOPS_ORG,
pat: process.env.AZURE_DEVOPS_PAT,
stateFile: path.join(__dirname, "collector-state.json"),
outputDir: path.join(__dirname, "audit-logs"),
batchSize: 200,
rateLimitDelay: 1000
};
function AuditCollector(config) {
this.config = config;
this.state = this.loadState();
}
AuditCollector.prototype.loadState = function() {
try {
var raw = fs.readFileSync(this.config.stateFile, "utf8");
return JSON.parse(raw);
} catch (e) {
return { lastCollectionTime: null, totalEventsCollected: 0 };
}
};
AuditCollector.prototype.saveState = function() {
fs.writeFileSync(this.config.stateFile, JSON.stringify(this.state, null, 2));
};
AuditCollector.prototype.makeRequest = function(urlStr, callback) {
var parsed = require("url").parse(urlStr);
var auth = Buffer.from(":" + this.config.pat).toString("base64");
var options = {
hostname: parsed.hostname,
path: parsed.path,
method: "GET",
headers: {
"Authorization": "Basic " + auth,
"Content-Type": "application/json"
}
};
var req = https.request(options, function(res) {
var body = "";
res.on("data", function(chunk) { body += chunk; });
res.on("end", function() {
if (res.statusCode === 429) {
var retryAfter = parseInt(res.headers["retry-after"] || "5", 10);
return callback(null, { rateLimited: true, retryAfter: retryAfter });
}
if (res.statusCode !== 200) {
return callback(new Error("HTTP " + res.statusCode + ": " + body));
}
callback(null, JSON.parse(body));
});
});
req.on("error", callback);
req.end();
};
AuditCollector.prototype.collectAll = function(startTime, endTime, callback) {
var self = this;
var allEvents = [];
var continuationToken = null;
function fetchPage() {
var url = "https://auditservice.dev.azure.com/" + self.config.org +
"/_apis/audit/auditlog?api-version=7.1-preview.1" +
"&startTime=" + encodeURIComponent(startTime) +
"&endTime=" + encodeURIComponent(endTime) +
"&batchSize=" + self.config.batchSize;
if (continuationToken) {
url += "&continuationToken=" + encodeURIComponent(continuationToken);
}
self.makeRequest(url, function(err, data) {
if (err) return callback(err);
if (data.rateLimited) {
console.log("Rate limited. Retrying in " + data.retryAfter + "s...");
setTimeout(fetchPage, data.retryAfter * 1000);
return;
}
var entries = data.decoratedAuditLogEntries || [];
allEvents = allEvents.concat(entries);
console.log("Fetched " + entries.length + " events (total: " + allEvents.length + ")");
if (data.hasMore && data.continuationToken) {
continuationToken = data.continuationToken;
setTimeout(fetchPage, self.config.rateLimitDelay);
} else {
callback(null, allEvents);
}
});
}
fetchPage();
};
AuditCollector.prototype.run = function(callback) {
var self = this;
var endTime = new Date().toISOString();
var startTime = this.state.lastCollectionTime ||
new Date(Date.now() - 24 * 60 * 60 * 1000).toISOString();
console.log("Collecting audit logs from " + startTime + " to " + endTime);
this.collectAll(startTime, endTime, function(err, events) {
if (err) return callback(err);
if (events.length === 0) {
console.log("No new events found.");
return callback(null, []);
}
// Ensure output directory exists
if (!fs.existsSync(self.config.outputDir)) {
fs.mkdirSync(self.config.outputDir, { recursive: true });
}
// Write events to date-stamped file
var dateStr = new Date().toISOString().split("T")[0];
var filename = "audit-" + dateStr + "-" + Date.now() + ".json";
var filepath = path.join(self.config.outputDir, filename);
fs.writeFileSync(filepath, JSON.stringify(events, null, 2));
console.log("Wrote " + events.length + " events to " + filepath);
// Update state
self.state.lastCollectionTime = endTime;
self.state.totalEventsCollected += events.length;
self.saveState();
callback(null, events);
});
};
// Run the collector
var collector = new AuditCollector(CONFIG);
collector.run(function(err, events) {
if (err) {
console.error("Collection failed:", err.message);
process.exit(1);
}
console.log("Collection complete. Total events ever collected: " +
collector.state.totalEventsCollected);
});
Run this on a cron schedule — every hour is typical for compliance. The state file tracks where the last run left off so you never miss or duplicate events.
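If you prefer to keep scheduling inside Node rather than relying on system cron, a minimal in-process scheduler is enough. The interval and log wording below are illustrative, not prescriptive.
// Minimal in-process scheduler: run the collector once per hour.
// A system cron entry invoking "node collector.js" hourly works just as well.
var ONE_HOUR_MS = 60 * 60 * 1000;
function scheduledRun() {
  var collector = new AuditCollector(CONFIG);
  collector.run(function(err, events) {
    if (err) {
      console.error("Scheduled collection failed:", err.message);
    } else {
      console.log("Scheduled collection finished: " + events.length + " new events");
    }
  });
}
scheduledRun();
setInterval(scheduledRun, ONE_HOUR_MS);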
Filtering and Querying Audit Events
Once you have collected events, you need to filter them for specific compliance scenarios. Here is a query engine:
function AuditQueryEngine(events) {
this.events = events;
}
AuditQueryEngine.prototype.byArea = function(area) {
return new AuditQueryEngine(this.events.filter(function(e) {
return e.area === area;
}));
};
AuditQueryEngine.prototype.byActor = function(actorName) {
return new AuditQueryEngine(this.events.filter(function(e) {
return e.actorDisplayName === actorName ||
e.actorUPN === actorName;
}));
};
AuditQueryEngine.prototype.byActionId = function(actionId) {
return new AuditQueryEngine(this.events.filter(function(e) {
return e.actionId === actionId;
}));
};
AuditQueryEngine.prototype.byTimeRange = function(start, end) {
var startMs = new Date(start).getTime();
var endMs = new Date(end).getTime();
return new AuditQueryEngine(this.events.filter(function(e) {
var ts = new Date(e.timestamp).getTime();
return ts >= startMs && ts <= endMs;
}));
};
AuditQueryEngine.prototype.suspicious = function() {
var suspiciousActions = [
"Security.ModifyPermission",
"Security.RemovePermission",
"Git.RepositoryDeleted",
"Policy.PolicyConfigRemoved",
"Pipelines.PipelineModified",
"Token.PatCreateEvent",
"Group.UpdateGroupMembership.Add",
"Group.UpdateGroupMembership.Remove",
"Security.ModifyAccessControlLists"
];
return new AuditQueryEngine(this.events.filter(function(e) {
return suspiciousActions.indexOf(e.actionId) !== -1;
}));
};
AuditQueryEngine.prototype.afterHours = function() {
return new AuditQueryEngine(this.events.filter(function(e) {
var hour = new Date(e.timestamp).getUTCHours();
return hour < 6 || hour > 22;
}));
};
AuditQueryEngine.prototype.results = function() {
return this.events;
};
AuditQueryEngine.prototype.count = function() {
return this.events.length;
};
// Usage
var query = new AuditQueryEngine(events);
// Find all permission changes
var permissionChanges = query.byArea("Security").results();
// Find suspicious after-hours activity
var afterHoursSuspicious = query.suspicious().afterHours().results();
// Find all actions by a specific user
var userActions = query.byActor("[email protected]").results();
console.log("Permission changes:", permissionChanges.length);
console.log("After-hours suspicious events:", afterHoursSuspicious.length);
console.log("Admin actions:", userActions.length);
The chainable query interface makes it straightforward to build complex filters for different compliance scenarios without writing repetitive filter logic.
Compliance Reporting Automation
Auditors want reports, not raw JSON. Here is a compliance report generator:
var fs = require("fs");
function ComplianceReporter(events) {
this.events = events;
this.query = new AuditQueryEngine(events);
}
ComplianceReporter.prototype.generateSOC2Report = function(startDate, endDate) {
var filtered = this.query.byTimeRange(startDate, endDate).results();
var queryFiltered = new AuditQueryEngine(filtered);
var report = {
reportTitle: "SOC 2 Type II Audit Log Report",
organization: process.env.AZURE_DEVOPS_ORG,
reportPeriod: { start: startDate, end: endDate },
generatedAt: new Date().toISOString(),
totalEvents: filtered.length,
controls: {}
};
// CC6.1 - Logical and Physical Access Controls
var accessEvents = queryFiltered.byArea("Security").results();
var tokenEvents = queryFiltered.byArea("Token").results();
report.controls["CC6.1"] = {
name: "Logical and Physical Access Controls",
eventCount: accessEvents.length + tokenEvents.length,
permissionChanges: accessEvents.filter(function(e) {
return e.actionId.indexOf("ModifyPermission") !== -1;
}).length,
patCreations: tokenEvents.filter(function(e) {
return e.actionId === "Token.PatCreateEvent";
}).length,
groupMembershipChanges: accessEvents.filter(function(e) {
return e.actionId.indexOf("GroupMembership") !== -1;
}).length,
status: "REVIEWED"
};
// CC7.1 - System Operations Monitoring
var pipelineEvents = queryFiltered.byArea("Pipelines").results();
report.controls["CC7.1"] = {
name: "System Operations Monitoring",
eventCount: pipelineEvents.length,
pipelineModifications: pipelineEvents.filter(function(e) {
return e.actionId === "Pipelines.PipelineModified";
}).length,
pipelineRuns: pipelineEvents.filter(function(e) {
return e.actionId === "Pipelines.PipelineRunStateChanged";
}).length,
status: "REVIEWED"
};
// CC8.1 - Change Management
var policyEvents = queryFiltered.byArea("Policy").results();
var gitEvents = queryFiltered.byArea("Git").results();
report.controls["CC8.1"] = {
name: "Change Management",
eventCount: policyEvents.length + gitEvents.length,
policyChanges: policyEvents.length,
repositoryChanges: gitEvents.length,
branchPolicyViolations: policyEvents.filter(function(e) {
return e.actionId === "Policy.PolicyConfigRemoved";
}).length,
status: "REVIEWED"
};
return report;
};
ComplianceReporter.prototype.generateISO27001Report = function(startDate, endDate) {
var filtered = this.query.byTimeRange(startDate, endDate).results();
var queryFiltered = new AuditQueryEngine(filtered);
var report = {
reportTitle: "ISO 27001 Annex A Control Evidence Report",
organization: process.env.AZURE_DEVOPS_ORG,
reportPeriod: { start: startDate, end: endDate },
generatedAt: new Date().toISOString(),
totalEvents: filtered.length,
annexAControls: {}
};
// A.9 - Access Control
report.annexAControls["A.9"] = {
name: "Access Control",
description: "User access management and access control policy evidence",
events: queryFiltered.byArea("Security").results().map(function(e) {
return {
timestamp: e.timestamp,
actor: e.actorDisplayName,
action: e.actionId,
details: e.details,
ip: e.ipAddress
};
})
};
// A.12 - Operations Security
report.annexAControls["A.12"] = {
name: "Operations Security",
description: "Operational procedures and change management evidence",
events: queryFiltered.byArea("Git").results().map(function(e) {
return {
timestamp: e.timestamp,
actor: e.actorDisplayName,
action: e.actionId,
details: e.details
};
})
};
// A.14 - System Acquisition, Development and Maintenance
report.annexAControls["A.14"] = {
name: "System Acquisition, Development and Maintenance",
description: "Secure development lifecycle evidence",
events: queryFiltered.byArea("Pipelines").results()
.concat(queryFiltered.byArea("Policy").results())
.map(function(e) {
return {
timestamp: e.timestamp,
actor: e.actorDisplayName,
action: e.actionId,
details: e.details
};
})
};
return report;
};
ComplianceReporter.prototype.writeReport = function(report, outputPath) {
fs.writeFileSync(outputPath, JSON.stringify(report, null, 2));
console.log("Report written to " + outputPath);
console.log(" Total events analyzed: " + report.totalEvents);
console.log(" Report period: " + report.reportPeriod.start +
" to " + report.reportPeriod.end);
};
// Generate quarterly SOC 2 report
var reporter = new ComplianceReporter(allEvents);
var soc2Report = reporter.generateSOC2Report("2026-01-01", "2026-03-31");
reporter.writeReport(soc2Report, "./reports/soc2-q1-2026.json");
Sample output:
Report written to ./reports/soc2-q1-2026.json
Total events analyzed: 14827
Report period: 2026-01-01 to 2026-03-31
SOC 2 and ISO 27001 Control Mapping
Here is how Azure DevOps audit events map to specific compliance controls. Use this table when building targeted collection and reporting:
SOC 2 Trust Services Criteria:
- CC6.1 (Logical Access): Security.ModifyPermission, Security.ModifyAccessControlLists, Group.UpdateGroupMembership.*, Token.Pat*Event
- CC6.2 (Prior to Access): Organization.UserAdded, Licensing.Assigned, Group.CreateGroup
- CC6.3 (Access Removal): Organization.UserRemoved, Licensing.Removed, Token.PatRevokeEvent
- CC7.1 (Monitoring): Pipelines.*, Git.RefUpdatePoliciesBypassed
- CC8.1 (Change Management): Policy.*, Git.RepositoryCreated, Git.RepositoryDeleted, Pipelines.PipelineModified
ISO 27001 Annex A:
- A.9.2 (User Access Management): All Security and Licensing events
- A.9.4 (System Access Control): Token events, authentication events
- A.12.1 (Operational Procedures): Pipeline run events, deployment events
- A.12.4 (Logging and Monitoring): All audit events (meta-compliance)
- A.14.2 (Security in Development): Policy events, code review events, branch protection events
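The mappings above can be encoded directly so that reports and alerts stay in sync with your control catalog. The sketch below derives a SOC 2 mapping from the lists above and buckets events by simple prefix matching on action IDs; the matching rules are an assumption you should adapt to your own framework.
// Map SOC 2 controls to action ID prefixes (derived from the lists above).
var SOC2_CONTROL_MAP = {
  "CC6.1": ["Security.ModifyPermission", "Security.ModifyAccessControlLists",
            "Group.UpdateGroupMembership", "Token.Pat"],
  "CC6.2": ["Organization.UserAdded", "Licensing.Assigned", "Group.CreateGroup"],
  "CC6.3": ["Organization.UserRemoved", "Licensing.Removed", "Token.PatRevokeEvent"],
  "CC7.1": ["Pipelines.", "Git.RefUpdatePoliciesBypassed"],
  "CC8.1": ["Policy.", "Git.RepositoryCreated", "Git.RepositoryDeleted",
            "Pipelines.PipelineModified"]
};
// Bucket events by control using prefix matching on actionId.
function bucketByControl(events, controlMap) {
  var buckets = {};
  Object.keys(controlMap).forEach(function(control) { buckets[control] = []; });
  events.forEach(function(e) {
    Object.keys(controlMap).forEach(function(control) {
      var matches = controlMap[control].some(function(prefix) {
        return e.actionId && e.actionId.indexOf(prefix) === 0;
      });
      if (matches) buckets[control].push(e);
    });
  });
  return buckets;
}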
Retention Policies and Archival
Azure DevOps retains audit logs for 90 days. Most compliance frameworks require significantly longer retention:
| Framework | Minimum Retention |
|---|---|
| SOC 2 | 1 year |
| ISO 27001 | 3 years |
| HIPAA | 6 years |
| PCI DSS | 1 year |
| FedRAMP | 3 years |
Here is a retention and archival strategy in Node.js:
var fs = require("fs");
var path = require("path");
var zlib = require("zlib");
function AuditArchiver(config) {
this.archiveDir = config.archiveDir;
this.retentionDays = config.retentionDays || 1095; // 3 years default
}
AuditArchiver.prototype.archiveFile = function(sourcePath, callback) {
var filename = path.basename(sourcePath);
var dateMatch = filename.match(/audit-(\d{4}-\d{2}-\d{2})/);
if (!dateMatch) {
return callback(new Error("Cannot parse date from filename: " + filename));
}
var yearMonth = dateMatch[1].substring(0, 7); // YYYY-MM
var monthDir = path.join(this.archiveDir, yearMonth);
if (!fs.existsSync(monthDir)) {
fs.mkdirSync(monthDir, { recursive: true });
}
var gzipDest = path.join(monthDir, filename + ".gz");
var input = fs.createReadStream(sourcePath);
var output = fs.createWriteStream(gzipDest);
var gzip = zlib.createGzip({ level: 9 });
input.pipe(gzip).pipe(output);
output.on("finish", function() {
var originalSize = fs.statSync(sourcePath).size;
var compressedSize = fs.statSync(gzipDest).size;
var ratio = ((1 - compressedSize / originalSize) * 100).toFixed(1);
console.log("Archived: " + filename + " (" + ratio + "% compression)");
// Generate SHA-256 hash for integrity verification
var crypto = require("crypto");
var hash = crypto.createHash("sha256");
var fileBuffer = fs.readFileSync(gzipDest);
hash.update(fileBuffer);
var checksum = hash.digest("hex");
// Write checksum file
fs.writeFileSync(gzipDest + ".sha256", checksum + " " + filename + ".gz\n");
// Remove original
fs.unlinkSync(sourcePath);
callback(null, { path: gzipDest, checksum: checksum, compressionRatio: ratio });
});
output.on("error", callback);
};
AuditArchiver.prototype.purgeExpired = function() {
var self = this;
var cutoffDate = new Date(Date.now() - this.retentionDays * 24 * 60 * 60 * 1000);
var purgedCount = 0;
if (!fs.existsSync(this.archiveDir)) return purgedCount;
var monthDirs = fs.readdirSync(this.archiveDir);
monthDirs.forEach(function(monthDir) {
var dirDate = new Date(monthDir + "-01");
if (dirDate < cutoffDate) {
var fullPath = path.join(self.archiveDir, monthDir);
fs.rmSync(fullPath, { recursive: true });
purgedCount++;
console.log("Purged expired archive: " + monthDir);
}
});
return purgedCount;
};
// Usage
var archiver = new AuditArchiver({
archiveDir: "/data/audit-archives",
retentionDays: 1095 // 3 years for ISO 27001
});
archiver.archiveFile("./audit-logs/audit-2026-02-13-1707849600000.json", function(err, result) {
if (err) return console.error(err);
console.log("Checksum:", result.checksum);
});
// Run monthly to purge expired archives
var purged = archiver.purgeExpired();
console.log("Purged " + purged + " expired archive directories");
The SHA-256 checksums provide tamper evidence — a requirement for most compliance frameworks. Store checksums separately from the archives, ideally in a write-once storage system.
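Verification should be tested, not assumed. A minimal integrity check that recomputes the SHA-256 of an archive and compares it against the stored .sha256 file might look like the sketch below; the example archive path is hypothetical, but the filename convention matches the archiver above.
var crypto = require("crypto");
var fs = require("fs");
// Recompute the SHA-256 of an archived .gz file and compare it against
// the checksum recorded at archive time.
function verifyArchive(gzipPath) {
  var recorded = fs.readFileSync(gzipPath + ".sha256", "utf8").split(" ")[0];
  var actual = crypto.createHash("sha256")
    .update(fs.readFileSync(gzipPath))
    .digest("hex");
  return { valid: recorded === actual, recorded: recorded, actual: actual };
}
// Example (hypothetical path):
var check = verifyArchive("/data/audit-archives/2026-02/audit-2026-02-13-1707849600000.json.gz");
console.log(check.valid ? "Archive integrity verified" : "CHECKSUM MISMATCH - possible tampering");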
Streaming Audit Logs to External Systems
Streaming to Splunk via HEC
var http = require("http");
var https = require("https");
function SplunkForwarder(config) {
this.hecUrl = config.hecUrl; // e.g., "https://splunk.company.com:8088"
this.hecToken = config.hecToken;
this.index = config.index || "azure_devops_audit";
this.sourcetype = config.sourcetype || "azure:devops:audit";
}
SplunkForwarder.prototype.sendEvents = function(events, callback) {
var self = this;
var payload = events.map(function(event) {
return JSON.stringify({
time: new Date(event.timestamp).getTime() / 1000,
source: "azure_devops:" + (process.env.AZURE_DEVOPS_ORG || "unknown"),
sourcetype: self.sourcetype,
index: self.index,
event: {
actionId: event.actionId,
area: event.area,
actorDisplayName: event.actorDisplayName,
actorUPN: event.actorUPN,
details: event.details,
ipAddress: event.ipAddress,
scopeDisplayName: event.scopeDisplayName,
projectName: event.projectName,
correlationId: event.correlationId,
timestamp: event.timestamp
}
});
}).join("\n");
var parsed = require("url").parse(self.hecUrl + "/services/collector/event");
var transport = parsed.protocol === "https:" ? https : http;
var options = {
hostname: parsed.hostname,
port: parsed.port,
path: parsed.path,
method: "POST",
headers: {
"Authorization": "Splunk " + self.hecToken,
"Content-Type": "application/json"
},
rejectUnauthorized: true
};
var req = transport.request(options, function(res) {
var body = "";
res.on("data", function(chunk) { body += chunk; });
res.on("end", function() {
if (res.statusCode !== 200) {
return callback(new Error("Splunk HEC error: " + res.statusCode + " " + body));
}
var result = JSON.parse(body);
if (result.code !== 0) {
return callback(new Error("Splunk rejected events: " + result.text));
}
callback(null, { sent: events.length });
});
});
req.on("error", callback);
req.write(payload);
req.end();
};
// Usage
var splunk = new SplunkForwarder({
hecUrl: "https://splunk.company.com:8088",
hecToken: process.env.SPLUNK_HEC_TOKEN,
index: "azure_devops_audit"
});
splunk.sendEvents(events, function(err, result) {
if (err) return console.error("Splunk forwarding failed:", err.message);
console.log("Sent " + result.sent + " events to Splunk");
});
Streaming to Elasticsearch
var http = require("http");
var https = require("https");
function ElasticsearchForwarder(config) {
this.esUrl = config.esUrl; // e.g., "https://elastic.company.com:9200"
this.apiKey = config.apiKey;
this.indexPrefix = config.indexPrefix || "azuredevops-audit";
}
ElasticsearchForwarder.prototype.sendBulk = function(events, callback) {
var self = this;
var dateStr = new Date().toISOString().split("T")[0].replace(/-/g, ".");
var indexName = self.indexPrefix + "-" + dateStr;
var bulkBody = "";
events.forEach(function(event) {
bulkBody += JSON.stringify({ index: { _index: indexName } }) + "\n";
bulkBody += JSON.stringify({
"@timestamp": event.timestamp,
action_id: event.actionId,
area: event.area,
actor: event.actorDisplayName,
actor_upn: event.actorUPN,
details: event.details,
ip_address: event.ipAddress,
scope: event.scopeDisplayName,
project: event.projectName,
correlation_id: event.correlationId
}) + "\n";
});
var parsed = require("url").parse(self.esUrl + "/_bulk");
var transport = parsed.protocol === "https:" ? https : http;
var options = {
hostname: parsed.hostname,
port: parsed.port,
path: parsed.path,
method: "POST",
headers: {
"Authorization": "ApiKey " + self.apiKey,
"Content-Type": "application/x-ndjson"
}
};
var req = transport.request(options, function(res) {
var body = "";
res.on("data", function(chunk) { body += chunk; });
res.on("end", function() {
if (res.statusCode !== 200) {
return callback(new Error("ES bulk error: " + res.statusCode));
}
      var result = JSON.parse(body);
      var failedCount = 0;
      if (result.errors) {
        failedCount = result.items.filter(function(item) {
          return item.index && item.index.status >= 400;
        }).length;
        console.warn(failedCount + " events failed to index");
      }
      callback(null, { indexed: events.length - failedCount });
});
});
req.on("error", callback);
req.write(bulkBody);
req.end();
};
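Usage mirrors the Splunk forwarder. The endpoint and environment variable name below are placeholders for your own cluster details.
var es = new ElasticsearchForwarder({
  esUrl: "https://elastic.company.com:9200", // placeholder endpoint
  apiKey: process.env.ELASTIC_API_KEY,       // placeholder variable name
  indexPrefix: "azuredevops-audit"
});
es.sendBulk(events, function(err, result) {
  if (err) return console.error("Elasticsearch forwarding failed:", err.message);
  console.log("Indexed " + result.indexed + " events");
});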
Alerting on Suspicious Activities
Passive collection is not enough. You need real-time alerting when high-risk events occur:
var https = require("https");
function AuditAlertEngine(config) {
this.webhookUrl = config.slackWebhookUrl;
this.emailEndpoint = config.emailEndpoint;
this.rules = [];
}
AuditAlertEngine.prototype.addRule = function(rule) {
this.rules.push(rule);
};
AuditAlertEngine.prototype.evaluate = function(events, callback) {
var self = this;
var alerts = [];
events.forEach(function(event) {
self.rules.forEach(function(rule) {
if (rule.condition(event)) {
alerts.push({
ruleName: rule.name,
severity: rule.severity,
event: event,
message: rule.message(event),
timestamp: new Date().toISOString()
});
}
});
});
if (alerts.length === 0) {
return callback(null, []);
}
console.log("ALERT: " + alerts.length + " suspicious events detected");
// Send to Slack
self.notifySlack(alerts, function(err) {
if (err) console.error("Slack notification failed:", err.message);
callback(null, alerts);
});
};
AuditAlertEngine.prototype.notifySlack = function(alerts, callback) {
if (!this.webhookUrl) return callback(null);
var criticalAlerts = alerts.filter(function(a) {
return a.severity === "CRITICAL";
});
var highAlerts = alerts.filter(function(a) {
return a.severity === "HIGH";
});
var text = ":rotating_light: *Azure DevOps Security Alert*\n\n" +
"Critical: " + criticalAlerts.length + " | High: " + highAlerts.length + "\n\n";
alerts.slice(0, 10).forEach(function(alert) {
text += "*[" + alert.severity + "]* " + alert.ruleName + "\n" +
"Actor: " + alert.event.actorDisplayName + "\n" +
"Action: " + alert.event.actionId + "\n" +
"Details: " + alert.message + "\n" +
"Time: " + alert.event.timestamp + "\n" +
"IP: " + (alert.event.ipAddress || "N/A") + "\n\n";
});
var payload = JSON.stringify({ text: text });
var parsed = require("url").parse(this.webhookUrl);
var options = {
hostname: parsed.hostname,
path: parsed.path,
method: "POST",
headers: {
"Content-Type": "application/json",
"Content-Length": Buffer.byteLength(payload)
}
};
var req = https.request(options, function(res) {
var body = "";
res.on("data", function(chunk) { body += chunk; });
res.on("end", function() {
callback(res.statusCode === 200 ? null : new Error("Slack: " + res.statusCode));
});
});
req.on("error", callback);
req.write(payload);
req.end();
};
// Define alert rules
var alertEngine = new AuditAlertEngine({
slackWebhookUrl: process.env.SLACK_WEBHOOK_URL
});
alertEngine.addRule({
name: "Branch Policy Removed",
severity: "CRITICAL",
condition: function(e) { return e.actionId === "Policy.PolicyConfigRemoved"; },
message: function(e) {
return "Branch policy removed in " + (e.projectName || "unknown project") +
" by " + e.actorDisplayName;
}
});
alertEngine.addRule({
name: "Repository Deleted",
severity: "CRITICAL",
condition: function(e) { return e.actionId === "Git.RepositoryDeleted"; },
message: function(e) {
return "Repository deleted by " + e.actorDisplayName + ": " + e.details;
}
});
alertEngine.addRule({
name: "Permission Escalation",
severity: "HIGH",
condition: function(e) {
return e.actionId === "Security.ModifyPermission" &&
e.details && e.details.indexOf("Allow") !== -1;
},
message: function(e) {
return "Permission granted: " + e.details;
}
});
alertEngine.addRule({
name: "After-Hours PAT Creation",
severity: "HIGH",
condition: function(e) {
if (e.actionId !== "Token.PatCreateEvent") return false;
var hour = new Date(e.timestamp).getUTCHours();
return hour < 6 || hour > 22;
},
message: function(e) {
return "PAT created outside business hours by " + e.actorDisplayName +
" at " + e.timestamp;
}
});
alertEngine.addRule({
name: "Pipeline Definition Modified",
severity: "HIGH",
condition: function(e) { return e.actionId === "Pipelines.PipelineModified"; },
message: function(e) {
return "Pipeline modified by " + e.actorDisplayName + " in " +
(e.projectName || "unknown project");
}
});
alertEngine.addRule({
name: "Bulk Permission Changes",
severity: "CRITICAL",
condition: function(e) {
return e.actionId === "Security.ModifyAccessControlLists";
},
message: function(e) {
return "ACL bulk modification by " + e.actorDisplayName;
}
});
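With the rules registered, the engine is wired to collected events as shown below; events is whatever batch your collector just wrote.
// Evaluate the latest batch of collected events against all registered rules.
alertEngine.evaluate(events, function(err, alerts) {
  if (err) return console.error("Alert evaluation failed:", err.message);
  alerts.forEach(function(alert) {
    console.log("[" + alert.severity + "] " + alert.ruleName + ": " + alert.message);
  });
});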
Custom Audit Dashboards
For teams that do not have access to Splunk or a full SIEM, you can build a simple dashboard data API:
var express = require("express");
var fs = require("fs");
var path = require("path");
var app = express();
function loadEvents(daysBack) {
var events = [];
var logDir = path.join(__dirname, "audit-logs");
if (!fs.existsSync(logDir)) return events;
var files = fs.readdirSync(logDir).filter(function(f) {
return f.endsWith(".json");
});
var cutoff = new Date(Date.now() - daysBack * 24 * 60 * 60 * 1000);
files.forEach(function(file) {
try {
var raw = fs.readFileSync(path.join(logDir, file), "utf8");
var fileEvents = JSON.parse(raw);
fileEvents.forEach(function(e) {
if (new Date(e.timestamp) >= cutoff) {
events.push(e);
}
});
} catch (err) {
console.error("Error loading " + file + ":", err.message);
}
});
return events;
}
app.get("/api/dashboard/summary", function(req, res) {
var days = parseInt(req.query.days) || 7;
var events = loadEvents(days);
var query = new AuditQueryEngine(events);
var summary = {
period: days + " days",
totalEvents: events.length,
byArea: {},
topActors: {},
suspiciousCount: query.suspicious().count(),
afterHoursCount: query.afterHours().count()
};
events.forEach(function(e) {
summary.byArea[e.area] = (summary.byArea[e.area] || 0) + 1;
var actor = e.actorDisplayName || "Unknown";
summary.topActors[actor] = (summary.topActors[actor] || 0) + 1;
});
// Sort actors by count, take top 10
var sortedActors = Object.keys(summary.topActors).sort(function(a, b) {
return summary.topActors[b] - summary.topActors[a];
}).slice(0, 10);
var topActors = {};
sortedActors.forEach(function(actor) {
topActors[actor] = summary.topActors[actor];
});
summary.topActors = topActors;
res.json(summary);
});
app.get("/api/dashboard/timeline", function(req, res) {
var days = parseInt(req.query.days) || 7;
var events = loadEvents(days);
var timeline = {};
events.forEach(function(e) {
var dateKey = e.timestamp.split("T")[0];
if (!timeline[dateKey]) {
timeline[dateKey] = { total: 0, security: 0, git: 0, pipelines: 0, policy: 0 };
}
timeline[dateKey].total++;
var area = e.area ? e.area.toLowerCase() : "other";
if (timeline[dateKey][area] !== undefined) {
timeline[dateKey][area]++;
}
});
res.json(timeline);
});
app.listen(3000, function() {
console.log("Audit dashboard API running on port 3000");
});
Sample response from /api/dashboard/summary?days=7:
{
"period": "7 days",
"totalEvents": 2341,
"byArea": {
"Git": 892,
"Pipelines": 634,
"Security": 187,
"Policy": 43,
"Organization": 312,
"Token": 89,
"Licensing": 184
},
"topActors": {
"Build Service (contoso)": 634,
"Jane Smith": 412,
"John Doe": 387,
"Release Pipeline": 298
},
"suspiciousCount": 23,
"afterHoursCount": 156
}
Regulatory Compliance Frameworks
Beyond SOC 2 and ISO 27001, here is how Azure DevOps audit logging maps to other frameworks:
HIPAA (Health Insurance Portability and Accountability Act):
- 164.312(b) — Audit controls: All audit log events satisfy this requirement
- 164.312(d) — Person or entity authentication: Token and authentication events
- 164.308(a)(1)(ii)(D) — Information system activity review: Requires regular review of audit logs, which your collector and reporter automate
PCI DSS (Payment Card Industry Data Security Standard):
- Requirement 10.1 — Implement audit trails: Full audit log collection
- Requirement 10.2 — Automated audit trails for specific events: Permission changes, access changes
- Requirement 10.5 — Secure audit trails: Archive with SHA-256 checksums
- Requirement 10.7 — Retain audit trail history for at least one year: Archiver with configurable retention
FedRAMP:
- AU-2 — Auditable Events: All audit event categories
- AU-3 — Content of Audit Records: The audit API returns actor, timestamp, action, and outcome
- AU-6 — Audit Review, Analysis, and Reporting: Automated compliance reporting
- AU-9 — Protection of Audit Information: Checksums and secure storage
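When more than one framework applies, retention should follow the strictest requirement. The values below come straight from the retention table earlier in this article; expressing them in code keeps the archiver configuration auditable. This is a sketch, not a compliance determination.
// Minimum retention in days per framework (from the retention table above).
var RETENTION_DAYS = {
  "SOC2": 365,
  "ISO27001": 1095,
  "HIPAA": 2190,
  "PCI-DSS": 365,
  "FedRAMP": 1095
};
// Pick the longest retention among the frameworks that apply to you.
function requiredRetention(frameworks) {
  return frameworks.reduce(function(max, f) {
    return Math.max(max, RETENTION_DAYS[f] || 0);
  }, 0);
}
var archiver = new AuditArchiver({
  archiveDir: "/data/audit-archives",
  retentionDays: requiredRetention(["SOC2", "HIPAA"]) // 2190 days (6 years)
});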
Complete Working Example
Here is the full application that ties everything together — collection, compliance reporting, alerting, and archival:
#!/usr/bin/env node
var https = require("https");
var fs = require("fs");
var path = require("path");
var crypto = require("crypto");
var zlib = require("zlib");
var url = require("url");
// ---- Configuration ----
var CONFIG = {
org: process.env.AZURE_DEVOPS_ORG,
pat: process.env.AZURE_DEVOPS_PAT,
slackWebhookUrl: process.env.SLACK_WEBHOOK_URL,
stateFile: path.join(__dirname, "state.json"),
outputDir: path.join(__dirname, "audit-data"),
reportDir: path.join(__dirname, "reports"),
archiveDir: path.join(__dirname, "archives"),
retentionDays: 1095
};
// ---- Validate config ----
if (!CONFIG.org || !CONFIG.pat) {
console.error("ERROR: AZURE_DEVOPS_ORG and AZURE_DEVOPS_PAT must be set");
process.exit(1);
}
// ---- State management ----
function loadState() {
try {
return JSON.parse(fs.readFileSync(CONFIG.stateFile, "utf8"));
} catch (e) {
return { lastRun: null, totalCollected: 0, runs: [] };
}
}
function saveState(state) {
fs.writeFileSync(CONFIG.stateFile, JSON.stringify(state, null, 2));
}
// ---- API client ----
function fetchAuditPage(startTime, endTime, token, callback) {
var apiUrl = "https://auditservice.dev.azure.com/" + CONFIG.org +
"/_apis/audit/auditlog?api-version=7.1-preview.1" +
"&startTime=" + encodeURIComponent(startTime) +
"&endTime=" + encodeURIComponent(endTime);
if (token) {
apiUrl += "&continuationToken=" + encodeURIComponent(token);
}
var parsed = url.parse(apiUrl);
var auth = Buffer.from(":" + CONFIG.pat).toString("base64");
var req = https.request({
hostname: parsed.hostname,
path: parsed.path,
method: "GET",
headers: {
"Authorization": "Basic " + auth,
"Content-Type": "application/json"
}
}, function(res) {
var body = "";
res.on("data", function(chunk) { body += chunk; });
res.on("end", function() {
if (res.statusCode === 429) {
var wait = parseInt(res.headers["retry-after"] || "10", 10);
console.log("Rate limited — waiting " + wait + "s");
return setTimeout(function() {
fetchAuditPage(startTime, endTime, token, callback);
}, wait * 1000);
}
if (res.statusCode !== 200) {
return callback(new Error("API " + res.statusCode + ": " + body));
}
callback(null, JSON.parse(body));
});
});
req.on("error", callback);
req.end();
}
function fetchAllPages(startTime, endTime, callback) {
var allEvents = [];
function next(token) {
fetchAuditPage(startTime, endTime, token, function(err, data) {
if (err) return callback(err);
var entries = data.decoratedAuditLogEntries || [];
allEvents = allEvents.concat(entries);
process.stdout.write("\r Collected " + allEvents.length + " events...");
if (data.hasMore && data.continuationToken) {
setTimeout(function() { next(data.continuationToken); }, 500);
} else {
console.log("");
callback(null, allEvents);
}
});
}
next(null);
}
// ---- Alert evaluation ----
function evaluateAlerts(events) {
var criticalActions = [
"Policy.PolicyConfigRemoved",
"Git.RepositoryDeleted",
"Security.ModifyAccessControlLists"
];
var highActions = [
"Security.ModifyPermission",
"Pipelines.PipelineModified",
"Token.PatCreateEvent",
"Group.UpdateGroupMembership.Add"
];
var alerts = [];
events.forEach(function(event) {
var severity = null;
if (criticalActions.indexOf(event.actionId) !== -1) {
severity = "CRITICAL";
} else if (highActions.indexOf(event.actionId) !== -1) {
var hour = new Date(event.timestamp).getUTCHours();
if (hour < 6 || hour > 22) {
severity = "HIGH";
}
}
if (severity) {
alerts.push({
severity: severity,
actionId: event.actionId,
actor: event.actorDisplayName,
details: event.details,
timestamp: event.timestamp,
ip: event.ipAddress
});
}
});
return alerts;
}
// ---- Compliance report generation ----
function generateComplianceReport(events, startDate, endDate) {
var areas = {};
var actors = {};
var actionCounts = {};
events.forEach(function(e) {
areas[e.area] = (areas[e.area] || 0) + 1;
actors[e.actorDisplayName] = (actors[e.actorDisplayName] || 0) + 1;
actionCounts[e.actionId] = (actionCounts[e.actionId] || 0) + 1;
});
var securityEvents = events.filter(function(e) { return e.area === "Security"; });
var policyEvents = events.filter(function(e) { return e.area === "Policy"; });
var tokenEvents = events.filter(function(e) {
return e.actionId && e.actionId.indexOf("Token.") === 0;
});
return {
meta: {
title: "Azure DevOps Compliance Report",
org: CONFIG.org,
period: { start: startDate, end: endDate },
generated: new Date().toISOString()
},
summary: {
totalEvents: events.length,
uniqueActors: Object.keys(actors).length,
eventsByArea: areas,
topActions: actionCounts
},
soc2: {
"CC6.1_access_control": {
permissionChanges: securityEvents.length,
tokenEvents: tokenEvents.length,
finding: securityEvents.length > 0 ? "EVENTS_RECORDED" : "NO_EVENTS"
},
"CC8.1_change_management": {
policyChanges: policyEvents.length,
finding: policyEvents.length > 0 ? "EVENTS_RECORDED" : "NO_EVENTS"
}
},
iso27001: {
"A.9_access_control": securityEvents.length + " events",
"A.12_operations_security": areas["Git"] || 0 + " events",
"A.14_secure_development": (areas["Pipelines"] || 0) +
(areas["Policy"] || 0) + " events"
},
highRiskEvents: events.filter(function(e) {
return e.actionId === "Policy.PolicyConfigRemoved" ||
e.actionId === "Git.RepositoryDeleted" ||
e.actionId === "Security.ModifyAccessControlLists";
}).map(function(e) {
return {
timestamp: e.timestamp,
actor: e.actorDisplayName,
action: e.actionId,
details: e.details
};
})
};
}
// ---- Main execution ----
function main() {
var command = process.argv[2] || "collect";
// Ensure directories exist
[CONFIG.outputDir, CONFIG.reportDir, CONFIG.archiveDir].forEach(function(dir) {
if (!fs.existsSync(dir)) {
fs.mkdirSync(dir, { recursive: true });
}
});
var state = loadState();
if (command === "collect") {
var endTime = new Date().toISOString();
var startTime = state.lastRun ||
new Date(Date.now() - 24 * 60 * 60 * 1000).toISOString();
console.log("=== Azure DevOps Audit Log Collector ===");
console.log("Organization: " + CONFIG.org);
console.log("Collecting from " + startTime + " to " + endTime);
fetchAllPages(startTime, endTime, function(err, events) {
if (err) {
console.error("Collection failed:", err.message);
process.exit(1);
}
if (events.length === 0) {
console.log("No new events.");
return;
}
// Save events
var filename = "audit-" + new Date().toISOString().split("T")[0] +
"-" + Date.now() + ".json";
var filepath = path.join(CONFIG.outputDir, filename);
fs.writeFileSync(filepath, JSON.stringify(events, null, 2));
console.log("Saved " + events.length + " events to " + filepath);
// Evaluate alerts
var alerts = evaluateAlerts(events);
if (alerts.length > 0) {
console.log("\n!!! " + alerts.length + " ALERTS DETECTED !!!");
alerts.forEach(function(a) {
console.log(" [" + a.severity + "] " + a.actionId +
" by " + a.actor + " at " + a.timestamp);
});
}
// Update state
state.lastRun = endTime;
state.totalCollected += events.length;
state.runs.push({
timestamp: endTime,
eventsCollected: events.length,
alertsTriggered: alerts.length
});
// Keep only last 100 run records
if (state.runs.length > 100) {
state.runs = state.runs.slice(-100);
}
saveState(state);
console.log("Total events collected to date: " + state.totalCollected);
});
} else if (command === "report") {
var startDate = process.argv[3] || new Date(
Date.now() - 30 * 24 * 60 * 60 * 1000).toISOString().split("T")[0];
var endDate = process.argv[4] || new Date().toISOString().split("T")[0];
console.log("=== Generating Compliance Report ===");
console.log("Period: " + startDate + " to " + endDate);
// Load all events from files
var allEvents = [];
var files = fs.readdirSync(CONFIG.outputDir).filter(function(f) {
return f.endsWith(".json");
});
files.forEach(function(f) {
try {
var raw = fs.readFileSync(path.join(CONFIG.outputDir, f), "utf8");
allEvents = allEvents.concat(JSON.parse(raw));
} catch (e) {
console.error("Error loading " + f);
}
});
// Filter to date range
var filtered = allEvents.filter(function(e) {
var ts = e.timestamp.split("T")[0];
return ts >= startDate && ts <= endDate;
});
var report = generateComplianceReport(filtered, startDate, endDate);
var reportFile = path.join(CONFIG.reportDir,
"compliance-" + startDate + "-to-" + endDate + ".json");
fs.writeFileSync(reportFile, JSON.stringify(report, null, 2));
console.log("Report saved to " + reportFile);
console.log("Total events analyzed: " + report.summary.totalEvents);
console.log("High-risk events: " + report.highRiskEvents.length);
} else if (command === "archive") {
console.log("=== Archiving Old Audit Logs ===");
var cutoff = new Date(Date.now() - 7 * 24 * 60 * 60 * 1000);
var logFiles = fs.readdirSync(CONFIG.outputDir).filter(function(f) {
return f.endsWith(".json");
});
var archivedCount = 0;
logFiles.forEach(function(f) {
var filePath = path.join(CONFIG.outputDir, f);
var stat = fs.statSync(filePath);
if (stat.mtime < cutoff) {
var content = fs.readFileSync(filePath);
var compressed = zlib.gzipSync(content, { level: 9 });
var yearMonth = f.match(/audit-(\d{4}-\d{2})/);
var archiveSubdir = yearMonth ? yearMonth[1] : "unknown";
var targetDir = path.join(CONFIG.archiveDir, archiveSubdir);
if (!fs.existsSync(targetDir)) {
fs.mkdirSync(targetDir, { recursive: true });
}
var archivePath = path.join(targetDir, f + ".gz");
fs.writeFileSync(archivePath, compressed);
// Write checksum
var hash = crypto.createHash("sha256").update(compressed).digest("hex");
fs.writeFileSync(archivePath + ".sha256", hash + " " + f + ".gz\n");
fs.unlinkSync(filePath);
archivedCount++;
var ratio = ((1 - compressed.length / content.length) * 100).toFixed(1);
console.log(" Archived: " + f + " (" + ratio + "% compression)");
}
});
console.log("Archived " + archivedCount + " files");
} else {
console.log("Usage: node audit-compliance.js [collect|report|archive]");
console.log(" collect Collect new audit events");
console.log(" report [start] [end] Generate compliance report");
console.log(" archive Archive old log files");
}
}
main();
Running the collector:
$ node audit-compliance.js collect
=== Azure DevOps Audit Log Collector ===
Organization: contoso
Collecting from 2026-02-12T00:00:00.000Z to 2026-02-13T14:30:00.000Z
Collected 347 events...
Saved 347 events to /app/audit-data/audit-2026-02-13-1739453400000.json
!!! 4 ALERTS DETECTED !!!
[CRITICAL] Policy.PolicyConfigRemoved by Jane Smith at 2026-02-12T23:15:00.000Z
[HIGH] Security.ModifyPermission by [email protected] at 2026-02-13T03:45:00.000Z
[HIGH] Token.PatCreateEvent by [email protected] at 2026-02-13T02:10:00.000Z
[HIGH] Pipelines.PipelineModified by release-bot at 2026-02-13T01:30:00.000Z
Total events collected to date: 14827
Generating a compliance report:
$ node audit-compliance.js report 2026-01-01 2026-02-13
=== Generating Compliance Report ===
Period: 2026-01-01 to 2026-02-13
Report saved to /app/reports/compliance-2026-01-01-to-2026-02-13.json
Total events analyzed: 14827
High-risk events: 12
Common Issues and Troubleshooting
1. Authentication Failure with 401
Error: API 401: {"message":"The resource cannot be accessed with the current permissions."}
This happens when your PAT does not have the Audit Log: Read scope. Generate a new PAT at https://dev.azure.com/{org}/_usersSettings/tokens and explicitly select the Audit Log scope under "Read" permissions. Organization-level PATs are required — project-scoped PATs will not work for the audit API.
2. Empty Results Despite Known Activity
Retrieved 0 events
The audit log API endpoint uses a different base URL than the standard API. You must use auditservice.dev.azure.com, not dev.azure.com. Also verify your time range — the API uses UTC timestamps. If your local timezone is UTC-8, events at 4 PM local time are recorded at midnight UTC the next day.
3. Continuation Token Expiration
Error: API 400: {"message":"The continuation token is invalid or has expired."}
Continuation tokens expire after approximately 15 minutes. If your collection process is slow (large time ranges, rate limiting), the token can expire between pages. The fix is to narrow your time windows. Instead of collecting a full day at once, collect in 4-hour windows:
// Break large time ranges into 4-hour windows and fetch each one independently
var windowMs = 4 * 60 * 60 * 1000;
var current = new Date(startTime).getTime();
var end = new Date(endTime).getTime();
var windows = [];
while (current < end) {
  var windowEnd = Math.min(current + windowMs, end);
  windows.push({
    start: new Date(current).toISOString(),
    end: new Date(windowEnd).toISOString()
  });
  current = windowEnd;
}
// Pass each window to fetchAllPages(window.start, window.end, callback) in turn
4. Rate Limiting (429 Too Many Requests)
Error: API 429: {"message":"Rate limit exceeded"}
Retry-After: 30
The audit log API has stricter rate limits than most Azure DevOps APIs — approximately 50 requests per minute. Always check for the Retry-After header and implement exponential backoff. The collector in this article handles this automatically, but if you are building your own, do not ignore 429 responses.
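A small retry helper that honors Retry-After and falls back to exponential delays might look like this; the retry limit and delay values are illustrative. It assumes the same convention as makeRequest above, where a rate-limited response is signaled with a { rateLimited, retryAfter } object.
// Retry a request function with Retry-After support and exponential backoff.
// requestFn(callback) should call back with (err, data), where data may be
// { rateLimited: true, retryAfter: seconds } on a 429 response.
function withBackoff(requestFn, maxRetries, callback) {
  var attempt = 0;
  function tryOnce() {
    requestFn(function(err, data) {
      if (err) return callback(err);
      if (data && data.rateLimited) {
        if (attempt >= maxRetries) {
          return callback(new Error("Rate limited after " + maxRetries + " retries"));
        }
        var delayMs = data.retryAfter
          ? data.retryAfter * 1000
          : Math.pow(2, attempt) * 1000; // 1s, 2s, 4s, ...
        attempt++;
        console.log("429 received, retrying in " + (delayMs / 1000) + "s");
        return setTimeout(tryOnce, delayMs);
      }
      callback(null, data);
    });
  }
  tryOnce();
}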
5. Missing Events in Specific Areas
Expected pipeline events but area filter returns empty
Not all Azure DevOps features generate audit events in all plans. Some audit event categories are only available in Azure DevOps Services (cloud), not Azure DevOps Server (on-premises). Additionally, certain events (like detailed Git push events) require the "Auditing" organization policy to be explicitly enabled under Organization Settings > Policies > Auditing.
Best Practices
Collect at least hourly. The 90-day retention window means you have a limited window to capture events. Running the collector every hour minimizes the risk of data loss if a collection run fails.
Store raw events immutably. Never modify collected audit logs. Write them to append-only storage (or at minimum, generate checksums). Auditors need to verify that logs have not been tampered with.
Separate collection from analysis. Your collector should do one thing: reliably capture events. Build reporting, alerting, and dashboards as separate processes that consume the collected data. This separation makes each component easier to test and maintain.
Alert on policy removals, not just additions. Adding a branch policy is good. Removing one is a potential compliance violation. Your alert rules should focus on destructive actions: policy removals, permission escalations, repository deletions, and PAT creation outside business hours.
Automate compliance report generation on a schedule. Do not wait for an auditor to ask for a report. Generate monthly or quarterly reports automatically and store them alongside your audit archives. When the auditor arrives, you hand them a stack of pre-generated reports.
Use dedicated service accounts for collection. Do not use a personal PAT for your audit log collector. Create a dedicated service account with minimal permissions (Audit Log: Read only). This limits blast radius and provides a clear audit trail of the collection process itself.
Test your retention and archival pipeline end-to-end. Verify that you can actually restore and read archived logs. Compression, checksums, and date-based directory structures are useless if you cannot decompress and query the data when an auditor asks for events from 18 months ago.
Map events to controls before you need to. Do not wait for audit season to figure out which events map to which SOC 2 or ISO 27001 controls. Build the mapping into your report generator from day one. The control mapping tables in this article are a starting point.
References
- Azure DevOps Audit Log REST API — official API documentation
- Azure DevOps Auditing — overview of the auditing feature
- SOC 2 Trust Services Criteria — AICPA SOC 2 framework
- ISO/IEC 27001:2022 — information security management standard
- Azure DevOps REST API Reference — general API documentation
- Splunk HTTP Event Collector — HEC configuration guide
- Elasticsearch Bulk API — bulk indexing documentation