Dependency Vulnerability Scanning
Implement comprehensive dependency vulnerability scanning for Node.js projects in Azure DevOps with automated remediation and supply chain protection
Modern Node.js applications pull in hundreds of transitive dependencies, and every single one of them is an attack surface. Dependency vulnerability scanning is the practice of automatically identifying known security flaws, license risks, and supply chain threats in your project's dependency tree. If you ship production software without scanning, you are flying blind into a minefield that gets new entries daily.
Prerequisites
- Node.js 18+ and npm 9+ installed
- An Azure DevOps organization with a project and pipeline configured
- Basic familiarity with package.json and package-lock.json
- A Snyk account (free tier works for the examples here)
- Git installed and configured
npm Audit Deep Dive
The npm audit command is the first line of defense. It compares every package in your dependency tree against the npm Advisory Database and reports known vulnerabilities grouped by severity.
Severity Levels
npm audit classifies findings into four severity levels:
| Severity | CVSS Score | Action Required |
|---|---|---|
| Critical | 9.0 - 10.0 | Immediate fix. Block deployments. |
| High | 7.0 - 8.9 | Fix within 24-48 hours. |
| Moderate | 4.0 - 6.9 | Fix within current sprint. |
| Low | 0.1 - 3.9 | Track and fix when convenient. |
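These bands map directly onto CVSS base scores, so normalizing findings from scanners that report raw scores is mechanical. A small sketch (the function name severityFromCvss is ours):

```javascript
// Map a CVSS base score onto the npm audit severity bands from the table.
// Scores below 0.1 fall outside the bands and are treated as informational.
function severityFromCvss(score) {
  var bands = [
    { min: 9.0, level: "critical" },
    { min: 7.0, level: "high" },
    { min: 4.0, level: "moderate" },
    { min: 0.1, level: "low" }
  ];
  for (var i = 0; i < bands.length; i++) {
    if (score >= bands[i].min) return bands[i].level;
  }
  return "info";
}

console.log(severityFromCvss(9.1)); // critical
```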
Run a basic audit and capture structured output:
npm audit --json > audit-report.json
The JSON output gives you machine-readable data you can feed into downstream tooling:
{
"auditReportVersion": 2,
"vulnerabilities": {
"lodash": {
"name": "lodash",
"severity": "critical",
"isDirect": false,
"via": [
{
"source": 1523,
"name": "lodash",
"dependency": "lodash",
"title": "Prototype Pollution",
"url": "https://github.com/advisories/GHSA-jf85-cpcp-j695",
"severity": "critical",
"cwe": ["CWE-1321"],
"cvss": {
"score": 9.1,
"vectorString": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:N"
},
"range": "<4.17.21"
}
],
"effects": ["express-utils", "config-loader"],
"range": "<=4.17.20",
"nodes": ["node_modules/lodash"],
"fixAvailable": {
"name": "lodash",
"version": "4.17.21",
"isSemVerMajor": false
}
}
},
"metadata": {
"vulnerabilities": {
"info": 0,
"low": 2,
"moderate": 5,
"high": 3,
"critical": 1,
"total": 11
}
}
}
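When consuming this report in a script, you usually only need the worst severity present and the set of findings without an automatic fix. A minimal sketch (summarizeAudit is our name) that reduces the vulnerabilities map above:

```javascript
// Reduce an npm audit v2 report to the facts a build gate usually needs:
// how many vulnerable packages, the worst severity, and which packages
// cannot be fixed automatically.
function summarizeAudit(audit) {
  var order = ["critical", "high", "moderate", "low", "info"];
  var vulns = audit.vulnerabilities || {};
  var worst = null;
  var unfixable = [];
  Object.keys(vulns).forEach(function (name) {
    var v = vulns[name];
    if (worst === null || order.indexOf(v.severity) < order.indexOf(worst)) {
      worst = v.severity;
    }
    if (!v.fixAvailable) unfixable.push(name);
  });
  return { total: Object.keys(vulns).length, worst: worst, unfixable: unfixable };
}
```

Feed it the parsed file, e.g. summarizeAudit(JSON.parse(fs.readFileSync("audit-report.json", "utf8"))).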
Fix Strategies
There are three approaches to fixing audit findings, and you should try them in this order:
1. Automatic fix with npm audit fix:
npm audit fix
This applies semver-compatible updates. It is safe to run in most cases because it stays within the version ranges declared in your package.json. However, it will not resolve issues that require a major version bump.
2. Forced fix with npm audit fix --force:
npm audit fix --force
This will install breaking changes if needed. Never run this blindly in CI. Always review the changes it proposes first, and run your test suite afterwards.
3. Manual resolution via overrides:
When a vulnerability lives in a transitive dependency and the direct dependency has not released a fix, use npm overrides in package.json:
{
"overrides": {
"lodash": "4.17.21",
"nth-check": ">=2.0.1",
"json5": ">=2.2.2"
}
}
Overrides force the entire dependency tree to use a specific version. This can break things, so test thoroughly.
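An easy mistake with overrides is assuming they took effect. This sketch (checkOverrides is a hypothetical helper) compares exact-version overrides against the lockfileVersion 2+ "packages" map; range overrides like ">=2.0.1" would need a semver library, so it skips them:

```javascript
// Verify that exact-version overrides from package.json actually landed in
// package-lock.json. Only exact pins are checked in this sketch; entries
// that start with a range operator are ignored.
function checkOverrides(overrides, lockPackages) {
  var mismatches = [];
  Object.keys(overrides).forEach(function (name) {
    var wanted = overrides[name];
    if (!/^\d/.test(wanted)) return; // skip ranges in this sketch
    Object.keys(lockPackages).forEach(function (key) {
      // Match both top-level and nested installs of the package
      if (key === "node_modules/" + name || key.endsWith("/node_modules/" + name)) {
        if (lockPackages[key].version !== wanted) {
          mismatches.push({ path: key, wanted: wanted, actual: lockPackages[key].version });
        }
      }
    });
  });
  return mismatches;
}
```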
Audit Signatures
npm 9+ supports registry signatures to verify package integrity. You can validate that packages were published through the official registry and have not been tampered with:
npm audit signatures
Expected output for a clean project:
audited 847 packages in 3s
847 packages have verified registry signatures
If you see unsigned packages, investigate immediately. A missing signature on a well-known package could indicate a compromised registry mirror or a supply chain attack.
Snyk for Node.js Dependency Scanning
Snyk goes beyond npm audit by maintaining its own vulnerability database with broader coverage and faster disclosure times. Install the CLI globally:
npm install -g snyk
snyk auth
Run a scan against your project:
snyk test --json > snyk-report.json
Snyk provides richer remediation guidance than npm audit. It tells you the exact upgrade path, whether a patch is available, and the exploit maturity level:
snyk test --severity-threshold=high
Output:
Testing /home/project...
Tested 312 dependencies for known issues, found 4 issues, 2 critical.
Issues to fix by upgrading:
Upgrade express@4.17.1 to express@4.18.2
- Prototype Pollution [Critical Severity][https://snyk.io/vuln/SNYK-JS-EXPRESS-1234567]
introduced through express@4.17.1
Upgrade jsonwebtoken@8.5.1 to jsonwebtoken@9.0.0
- Insecure Default Algorithm [High Severity][https://snyk.io/vuln/SNYK-JS-JSONWEBTOKEN-7654321]
introduced through jsonwebtoken@8.5.1
For continuous monitoring, connect your repository to Snyk's dashboard. It will open pull requests automatically when new vulnerabilities are disclosed against your dependencies.
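If you consume snyk-report.json in custom tooling, note that Snyk names its bands low/medium/high/critical, not npm's "moderate". A sketch that tallies findings, assuming the snyk test --json shape with a top-level vulnerabilities array whose entries carry a severity field (verify against your CLI version):

```javascript
// Tally Snyk findings by severity band. Snyk says "medium" where
// npm audit says "moderate" -- easy to trip over when merging reports.
function tallySnykSeverities(report) {
  var counts = { critical: 0, high: 0, medium: 0, low: 0 };
  (report.vulnerabilities || []).forEach(function (v) {
    if (counts.hasOwnProperty(v.severity)) counts[v.severity]++;
  });
  return counts;
}
```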
GitHub Advisory Database Integration
The GitHub Advisory Database (GHSA) is the upstream source for npm audit findings. You can query it directly via the GraphQL API for custom tooling:
var https = require("https");
function queryGitHubAdvisories(ecosystem, packageName, callback) {
var query = JSON.stringify({
query: '{ securityVulnerabilities(ecosystem: ' + ecosystem + ', package: "' + packageName + '", first: 10) { nodes { advisory { summary severity ghsaId publishedAt } vulnerableVersionRange firstPatchedVersion { identifier } } } }'
});
var options = {
hostname: "api.github.com",
path: "/graphql",
method: "POST",
headers: {
"Authorization": "Bearer " + process.env.GITHUB_TOKEN,
"Content-Type": "application/json",
"User-Agent": "dependency-scanner"
}
};
var req = https.request(options, function(res) {
var data = "";
res.on("data", function(chunk) { data += chunk; });
res.on("end", function() {
callback(null, JSON.parse(data));
});
});
req.on("error", callback);
req.write(query);
req.end();
}
queryGitHubAdvisories("NPM", "lodash", function(err, result) {
if (err) {
console.error("Failed to query advisories:", err.message);
return;
}
var vulnerabilities = result.data.securityVulnerabilities.nodes;
vulnerabilities.forEach(function(vuln) {
console.log("[%s] %s - Fix: %s",
vuln.advisory.severity,
vuln.advisory.summary,
vuln.firstPatchedVersion ? vuln.firstPatchedVersion.identifier : "No fix available"
);
});
});
OWASP Dependency-Check
OWASP Dependency-Check is a language-agnostic scanner that cross-references the National Vulnerability Database (NVD). It works with Node.js projects by analyzing package-lock.json:
# Install via Homebrew or download the CLI release
dependency-check --project "my-app" \
--scan ./package-lock.json \
--format JSON \
--out ./reports/ \
--enableExperimental
The advantage of OWASP Dependency-Check over npm audit is that it uses NVD data with CPE matching, which catches vulnerabilities that the npm advisory database might miss. The downside is a higher false-positive rate because CPE matching can be imprecise for npm packages.
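Dependency-Check's JSON report nests findings under each scanned file. A hedged sketch of flattening them for triage — the dependencies[].vulnerabilities[] shape here matches recent report versions, but verify it against the report your CLI actually emits:

```javascript
// Flatten a Dependency-Check JSON report into (file, advisory id, severity)
// triples. The report shape is assumed; confirm against your CLI version.
function extractFindings(report) {
  var findings = [];
  (report.dependencies || []).forEach(function (dep) {
    (dep.vulnerabilities || []).forEach(function (vuln) {
      findings.push({
        file: dep.fileName,
        id: vuln.name,
        severity: (vuln.severity || "UNKNOWN").toUpperCase()
      });
    });
  });
  return findings;
}
```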
Scanning in Azure Pipelines
Integrate dependency scanning directly into your CI/CD pipeline so vulnerabilities block the build before reaching production.
Pipeline YAML Configuration
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

variables:
  SNYK_TOKEN: $(SnykApiToken)
  FAIL_ON_SEVERITY: 'high'

stages:
- stage: SecurityScan
  displayName: 'Dependency Security Scanning'
  jobs:
  - job: DependencyScan
    displayName: 'Scan Dependencies'
    steps:
    - task: NodeTool@0
      inputs:
        versionSpec: '20.x'
      displayName: 'Install Node.js'

    - script: npm ci
      displayName: 'Install dependencies'

    - script: |
        npm audit --json > $(Build.ArtifactStagingDirectory)/npm-audit.json
        AUDIT_EXIT=$?
        if [ $AUDIT_EXIT -gt 0 ]; then
          echo "##vso[task.logissue type=warning]npm audit found vulnerabilities"
        fi
        npm audit --audit-level=$(FAIL_ON_SEVERITY)
      displayName: 'Run npm audit'
      continueOnError: true

    - script: |
        npm install -g snyk
        snyk auth $(SNYK_TOKEN)
        snyk test --severity-threshold=$(FAIL_ON_SEVERITY) --json > $(Build.ArtifactStagingDirectory)/snyk-report.json || true
        snyk test --severity-threshold=$(FAIL_ON_SEVERITY)
      displayName: 'Run Snyk scan'

    - script: node scripts/verify-lockfile.js
      displayName: 'Verify lock file integrity'

    - script: node scripts/check-supply-chain.js
      displayName: 'Supply chain risk check'

    - task: PublishBuildArtifacts@1
      inputs:
        pathToPublish: '$(Build.ArtifactStagingDirectory)'
        artifactName: 'security-reports'
      displayName: 'Publish security reports'
      condition: always()
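The FAIL_ON_SEVERITY gate boils down to one rule: any finding at or above the threshold band blocks the build. A sketch of that decision (shouldFailBuild is our name), useful when you gate on a parsed report rather than on npm audit's exit code:

```javascript
// Mirror the FAIL_ON_SEVERITY gate: fail when any severity count at or
// above the threshold band is nonzero.
function shouldFailBuild(counts, threshold) {
  var order = ["low", "moderate", "high", "critical"];
  var min = order.indexOf(threshold);
  if (min === -1) throw new Error("Unknown threshold: " + threshold);
  return order.slice(min).some(function (sev) {
    return (counts[sev] || 0) > 0;
  });
}
```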
Using the Snyk Azure DevOps Extension
Azure DevOps has a first-party Snyk extension you can install from the marketplace. Add this task to your pipeline:
- task: SnykSecurityScan@1
  inputs:
    serviceConnectionEndpoint: 'SnykConnection'
    testType: 'app'
    monitorWhen: 'always'
    failOnIssues: true
    severityThreshold: 'high'
    additionalArguments: '--all-projects'
Lock File Integrity Verification
The package-lock.json file pins exact dependency versions and records integrity hashes. If someone tampers with it, your build could pull in malicious code. Always verify lock file integrity in CI:
// scripts/verify-lockfile.js
var fs = require("fs");
var path = require("path");
function verifyLockFile() {
var lockfilePath = path.join(process.cwd(), "package-lock.json");
var packagePath = path.join(process.cwd(), "package.json");
if (!fs.existsSync(lockfilePath)) {
console.error("ERROR: package-lock.json not found. Run npm install first.");
process.exit(1);
}
var lockfile = JSON.parse(fs.readFileSync(lockfilePath, "utf8"));
var packageJson = JSON.parse(fs.readFileSync(packagePath, "utf8"));
// Check lockfile version
if (lockfile.lockfileVersion < 2) {
console.warn("WARNING: Lock file version %d is outdated. npm 9+ writes lockfileVersion 3.", lockfile.lockfileVersion);
}
// Verify all packages have integrity hashes
var packages = lockfile.packages || {};
var missingIntegrity = [];
var suspiciousUrls = [];
Object.keys(packages).forEach(function(pkgPath) {
var pkg = packages[pkgPath];
if (pkgPath === "") return; // root package
if (!pkg.integrity) {
missingIntegrity.push(pkgPath);
}
// Check for non-registry resolved URLs
if (pkg.resolved && !pkg.resolved.startsWith("https://registry.npmjs.org/")) {
suspiciousUrls.push({ path: pkgPath, url: pkg.resolved });
}
});
if (missingIntegrity.length > 0) {
console.error("ERROR: %d packages missing integrity hashes:", missingIntegrity.length);
missingIntegrity.forEach(function(p) { console.error(" - %s", p); });
process.exit(1);
}
if (suspiciousUrls.length > 0) {
console.warn("WARNING: %d packages resolved from non-standard registries:", suspiciousUrls.length);
suspiciousUrls.forEach(function(item) {
console.warn(" - %s -> %s", item.path, item.url);
});
}
// Check that package.json dependencies align with lockfile
var allDeps = Object.assign({},
packageJson.dependencies || {},
packageJson.devDependencies || {}
);
var missingFromLock = [];
Object.keys(allDeps).forEach(function(dep) {
var lockKey = "node_modules/" + dep;
if (!packages[lockKey]) {
missingFromLock.push(dep);
}
});
if (missingFromLock.length > 0) {
console.error("ERROR: Dependencies in package.json missing from lockfile:");
missingFromLock.forEach(function(d) { console.error(" - %s", d); });
process.exit(1);
}
console.log("Lock file integrity verified. %d packages checked.", Object.keys(packages).length - 1);
}
verifyLockFile();
Private Registry Vulnerability Feeds
If you use a private npm registry like Artifactory or Azure Artifacts, you need to make sure vulnerability data still flows through. npm sends audit requests to the audit endpoint of whichever registry is configured in your .npmrc:
registry=https://pkgs.dev.azure.com/myorg/_packaging/myfeed/npm/registry/
Many private feeds do not implement or proxy the audit endpoint, in which case npm audit fails or silently returns no results. The workaround is to run the audit against the public registry explicitly:
npm audit --registry=https://registry.npmjs.org/
Artifactory can be configured to proxy audit requests upstream; check your registry's documentation before relying on either behavior.
Transitive Dependency Analysis
Direct dependencies are easy to manage. The real danger is transitive dependencies, the packages your dependencies depend on, sometimes nested five or six levels deep. Here is a script that maps your full dependency tree and flags vulnerabilities at every level:
// scripts/transitive-analysis.js
var childProcess = require("child_process");
function analyzeTransitiveDeps() {
// Get the full dependency tree; "|| true" keeps execSync from throwing when
// npm ls exits nonzero over peer-dependency or invalid-tree warnings
var treeOutput = childProcess.execSync("npm ls --all --json 2>/dev/null || true", {
encoding: "utf8",
maxBuffer: 50 * 1024 * 1024
});
var tree = JSON.parse(treeOutput);
// Get audit data; npm audit exits 1 whenever vulnerabilities are found,
// so "|| true" is needed here as well
var auditOutput = childProcess.execSync("npm audit --json 2>/dev/null || true", {
encoding: "utf8",
maxBuffer: 10 * 1024 * 1024
});
var audit = JSON.parse(auditOutput);
var vulnerabilities = audit.vulnerabilities || {};
// Walk the tree and find vulnerable transitive deps
var findings = [];
function walkTree(node, chain) {
var deps = node.dependencies || {};
Object.keys(deps).forEach(function(name) {
var dep = deps[name];
var currentChain = chain.concat([name + "@" + dep.version]);
if (vulnerabilities[name]) {
findings.push({
package: name,
version: dep.version,
severity: vulnerabilities[name].severity,
depth: currentChain.length,
chain: currentChain.join(" -> "),
isDirect: currentChain.length === 1
});
}
walkTree(dep, currentChain);
});
}
walkTree(tree, []);
// Sort by severity then depth
var severityOrder = { critical: 0, high: 1, moderate: 2, low: 3 };
findings.sort(function(a, b) {
var sevDiff = (severityOrder[a.severity] || 4) - (severityOrder[b.severity] || 4);
if (sevDiff !== 0) return sevDiff;
return a.depth - b.depth;
});
console.log("=== Transitive Dependency Vulnerability Report ===\n");
console.log("Total vulnerable packages: %d", findings.length);
console.log("Direct: %d", findings.filter(function(f) { return f.isDirect; }).length);
console.log("Transitive: %d\n", findings.filter(function(f) { return !f.isDirect; }).length);
findings.forEach(function(f) {
console.log("[%s] %s@%s (depth: %d)", f.severity.toUpperCase(), f.package, f.version, f.depth);
console.log(" Chain: %s\n", f.chain);
});
return findings;
}
analyzeTransitiveDeps();
Automated PR Creation for Vulnerable Dependencies
Automate the fix process by creating pull requests when vulnerabilities are detected. This script works with Azure DevOps Repos:
// scripts/auto-fix-pr.js
var childProcess = require("child_process");
var https = require("https");
var ORG = process.env.AZURE_DEVOPS_ORG;
var PROJECT = process.env.AZURE_DEVOPS_PROJECT;
var REPO = process.env.AZURE_DEVOPS_REPO;
var PAT = process.env.AZURE_DEVOPS_PAT;
function createSecurityFixPR(packageName, fromVersion, toVersion, severity) {
var branchName = "security/fix-" + packageName + "-" + Date.now();
// Create branch and apply fix
childProcess.execSync("git checkout -b " + branchName);
childProcess.execSync("npm install " + packageName + "@" + toVersion + " --save");
childProcess.execSync('git add package.json package-lock.json');
childProcess.execSync('git commit -m "fix: upgrade ' + packageName + ' from ' + fromVersion + ' to ' + toVersion + ' (security)"');
childProcess.execSync("git push origin " + branchName);
// Create PR via Azure DevOps REST API
var prBody = JSON.stringify({
sourceRefName: "refs/heads/" + branchName,
targetRefName: "refs/heads/master",
title: "[Security] Upgrade " + packageName + " to fix " + severity + " vulnerability",
description: "## Automated Security Fix\\n\\n" +
"**Package:** " + packageName + "\\n" +
"**From:** " + fromVersion + "\\n" +
"**To:** " + toVersion + "\\n" +
"**Severity:** " + severity + "\\n\\n" +
"This PR was automatically generated by the dependency security scanner."
});
var options = {
hostname: "dev.azure.com",
path: "/" + ORG + "/" + PROJECT + "/_apis/git/repositories/" + REPO + "/pullrequests?api-version=7.1",
method: "POST",
headers: {
"Content-Type": "application/json",
"Authorization": "Basic " + Buffer.from(":" + PAT).toString("base64"),
"Content-Length": Buffer.byteLength(prBody)
}
};
var req = https.request(options, function(res) {
var data = "";
res.on("data", function(chunk) { data += chunk; });
res.on("end", function() {
var pr = JSON.parse(data);
console.log("Created PR #%d: %s", pr.pullRequestId, pr.title);
});
});
req.on("error", function(err) {
console.error("Failed to create PR:", err.message);
});
req.write(prBody);
req.end();
}
Vulnerability Database Comparison
Not all vulnerability databases are created equal. Here is how the major databases compare for Node.js:
| Database | Coverage | Speed | False Positives | Best For |
|---|---|---|---|---|
| NVD (NIST) | Broadest, all ecosystems | Slow (days delay) | Higher (CPE matching) | Compliance, government |
| GitHub Advisory (GHSA) | npm/PyPI/RubyGems focus | Fast (hours) | Low | Open source projects |
| Snyk DB | npm deep coverage | Fastest (real-time) | Very low | Enterprise Node.js |
| OSV | Aggregates multiple sources | Fast | Low | Multi-ecosystem projects |
| npm Advisory DB | npm only | Fast | Low | Quick npm-specific checks |
For maximum coverage, cross-reference at least two databases. The NVD has the broadest coverage but the worst latency, sometimes taking weeks to add advisories that GHSA and Snyk have known about for days.
Query OSV for Node.js packages programmatically:
var https = require("https");
function queryOSV(packageName, version, callback) {
var body = JSON.stringify({
package: { name: packageName, ecosystem: "npm" },
version: version
});
var options = {
hostname: "api.osv.dev",
path: "/v1/query",
method: "POST",
headers: {
"Content-Type": "application/json",
"Content-Length": Buffer.byteLength(body)
}
};
var req = https.request(options, function(res) {
var data = "";
res.on("data", function(chunk) { data += chunk; });
res.on("end", function() {
var result = JSON.parse(data);
callback(null, result.vulns || []);
});
});
req.on("error", callback);
req.write(body);
req.end();
}
queryOSV("jsonwebtoken", "8.5.1", function(err, vulns) {
if (err) { console.error(err); return; }
console.log("Found %d vulnerabilities for jsonwebtoken@8.5.1:", vulns.length);
vulns.forEach(function(v) {
console.log(" - %s: %s", v.id, v.summary);
});
});
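When cross-referencing two databases, the same advisory often appears under different identifiers; OSV records list the equivalents in their aliases field (a GHSA id aliased to a CVE id, for example). A sketch of deduplicating merged findings on that basis (dedupeFindings is our name):

```javascript
// Merge findings from multiple databases, dropping entries that refer to an
// advisory already seen under any of its identifiers or aliases.
function dedupeFindings(findings) {
  var seen = {};
  var unique = [];
  findings.forEach(function (f) {
    var ids = [f.id].concat(f.aliases || []);
    var dup = ids.some(function (id) { return seen[id]; });
    ids.forEach(function (id) { seen[id] = true; });
    if (!dup) unique.push(f);
  });
  return unique;
}
```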
License Risk Scanning
Vulnerabilities are not the only risk in your dependency tree. License conflicts can create legal exposure. Scan for risky licenses alongside vulnerabilities:
// scripts/license-scan.js
var childProcess = require("child_process");
var RISKY_LICENSES = [
"GPL-2.0", "GPL-3.0", "AGPL-3.0", "LGPL-2.1", "LGPL-3.0",
"SSPL-1.0", "BSL-1.1", "UNLICENSED", "UNKNOWN"
];
var COPYLEFT_LICENSES = ["GPL-2.0", "GPL-3.0", "AGPL-3.0"];
function scanLicenses() {
// Make sure the license-checker CLI is available
try {
childProcess.execSync("license-checker --version", { stdio: "ignore" });
} catch (e) {
childProcess.execSync("npm install -g license-checker");
}
var output = childProcess.execSync("license-checker --json --production", {
encoding: "utf8"
});
var packages = JSON.parse(output);
var findings = [];
Object.keys(packages).forEach(function(pkgKey) {
var pkg = packages[pkgKey];
var license = pkg.licenses || "UNKNOWN";
var isRisky = RISKY_LICENSES.some(function(l) {
return license.indexOf(l) !== -1;
});
var isCopyleft = COPYLEFT_LICENSES.some(function(l) {
return license.indexOf(l) !== -1;
});
if (isRisky) {
findings.push({
package: pkgKey,
license: license,
copyleft: isCopyleft,
risk: isCopyleft ? "HIGH" : "MEDIUM",
repository: pkg.repository || "unknown"
});
}
});
console.log("=== License Risk Report ===\n");
console.log("Total packages scanned: %d", Object.keys(packages).length);
console.log("Risky licenses found: %d\n", findings.length);
findings.forEach(function(f) {
console.log("[%s] %s - License: %s%s",
f.risk, f.package, f.license,
f.copyleft ? " (COPYLEFT - may require source disclosure)" : ""
);
});
return findings;
}
scanLicenses();
Supply Chain Attack Detection
Supply chain attacks on npm are not theoretical. Typosquatting, account hijacking, and malicious postinstall scripts are actively exploited in the wild. Build detection into your pipeline:
// scripts/check-supply-chain.js
var fs = require("fs");
var path = require("path");
var childProcess = require("child_process");
var KNOWN_TYPOSQUATS = {
"loadsh": "lodash",
"coffe-script": "coffee-script",
"crossenv": "cross-env",
"mongose": "mongoose",
"expresss": "express",
"bable-cli": "babel-cli",
"eslint-config-aribnb": "eslint-config-airbnb"
};
function checkSupplyChain() {
var lockfilePath = path.join(process.cwd(), "package-lock.json");
var lockfile = JSON.parse(fs.readFileSync(lockfilePath, "utf8"));
var packages = lockfile.packages || {};
var issues = [];
Object.keys(packages).forEach(function(pkgPath) {
if (pkgPath === "") return;
var pkg = packages[pkgPath];
var name = pkgPath.replace("node_modules/", "").split("node_modules/").pop();
// Check for known typosquats
if (KNOWN_TYPOSQUATS[name]) {
issues.push({
type: "TYPOSQUAT",
severity: "critical",
package: name,
message: "Possible typosquat of '" + KNOWN_TYPOSQUATS[name] + "'"
});
}
// Check for suspicious install scripts
var pkgJsonPath = path.join(process.cwd(), pkgPath, "package.json");
if (fs.existsSync(pkgJsonPath)) {
var pkgJson = JSON.parse(fs.readFileSync(pkgJsonPath, "utf8"));
var scripts = pkgJson.scripts || {};
var suspiciousScripts = ["preinstall", "postinstall", "preuninstall"];
suspiciousScripts.forEach(function(hook) {
if (scripts[hook]) {
var script = scripts[hook];
// Flag scripts that download external resources or execute encoded strings
var redFlags = ["curl ", "wget ", "eval(", "Buffer.from(", "http://", "powershell"];
redFlags.forEach(function(flag) {
if (script.indexOf(flag) !== -1) {
issues.push({
type: "SUSPICIOUS_SCRIPT",
severity: "high",
package: name,
message: "Install script contains '" + flag + "': " + script.substring(0, 100)
});
}
});
}
});
}
// Very new packages with few downloads carry a higher hijack risk; checking
// that would require npm registry API queries, which this script does not make
});
// Check for packages resolved outside an explicit registry allowlist
var ALLOWED_REGISTRY_HOSTS = [
"registry.npmjs.org",
"pkgs.dev.azure.com",
"npm.pkg.github.com"
];
Object.keys(packages).forEach(function(pkgPath) {
var pkg = packages[pkgPath];
if (!pkg.resolved) return;
var host = "";
try { host = new URL(pkg.resolved).hostname; } catch (e) { /* not a URL */ }
if (ALLOWED_REGISTRY_HOSTS.indexOf(host) === -1) {
issues.push({
type: "UNKNOWN_REGISTRY",
severity: "high",
package: pkgPath,
message: "Resolved from non-standard registry: " + pkg.resolved
});
}
});
console.log("=== Supply Chain Risk Report ===\n");
if (issues.length === 0) {
console.log("No supply chain risks detected.");
return issues;
}
console.log("Found %d potential supply chain risks:\n", issues.length);
issues.forEach(function(issue) {
console.log("[%s][%s] %s", issue.severity.toUpperCase(), issue.type, issue.package);
console.log(" %s\n", issue.message);
});
var criticalCount = issues.filter(function(i) { return i.severity === "critical"; }).length;
if (criticalCount > 0) {
console.error("BLOCKING: %d critical supply chain issues detected.", criticalCount);
process.exit(1);
}
return issues;
}
checkSupplyChain();
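A static typosquat table only catches names someone already reported. As a complement, you can flag any installed name within edit distance 1 of a popular package. A sketch with a deliberately tiny illustrative popularity list (a real check would use download-count data):

```javascript
// Levenshtein edit distance via the standard dynamic-programming table.
function editDistance(a, b) {
  var dp = [];
  for (var i = 0; i <= a.length; i++) dp[i] = [i];
  for (var j = 0; j <= b.length; j++) dp[0][j] = j;
  for (i = 1; i <= a.length; i++) {
    for (j = 1; j <= b.length; j++) {
      var cost = a[i - 1] === b[j - 1] ? 0 : 1;
      dp[i][j] = Math.min(dp[i - 1][j] + 1, dp[i][j - 1] + 1, dp[i - 1][j - 1] + cost);
    }
  }
  return dp[a.length][b.length];
}

// Illustrative sample only; a real list would come from download statistics.
var POPULAR = ["lodash", "express", "react", "mongoose", "axios"];

// Flag installed names one edit away from a popular package (exact matches
// are the real package and are not flagged).
function findLikelyTyposquats(installedNames) {
  var flagged = [];
  installedNames.forEach(function (name) {
    POPULAR.forEach(function (real) {
      var d = editDistance(name, real);
      if (d === 1) flagged.push({ installed: name, lookalike: real });
    });
  });
  return flagged;
}
```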
Complete Working Example: Dependency Security Scanner
This is a complete Node.js tool that orchestrates npm audit, cross-references the OSV database, generates a compliance report, and creates Azure DevOps work items for critical findings:
// dependency-scanner.js
var childProcess = require("child_process");
var fs = require("fs");
var https = require("https");
var path = require("path");
var CONFIG = {
azureOrg: process.env.AZURE_DEVOPS_ORG,
azureProject: process.env.AZURE_DEVOPS_PROJECT,
azurePat: process.env.AZURE_DEVOPS_PAT,
severityThreshold: process.env.SEVERITY_THRESHOLD || "high",
outputDir: process.env.REPORT_DIR || "./security-reports",
createWorkItems: process.env.CREATE_WORK_ITEMS === "true"
};
// ---- npm audit ----
function runNpmAudit(callback) {
console.log("[1/4] Running npm audit...");
childProcess.exec("npm audit --json", { maxBuffer: 10 * 1024 * 1024 }, function(err, stdout) {
var audit;
try {
audit = JSON.parse(stdout);
} catch (e) {
return callback(new Error("Failed to parse npm audit output"));
}
var findings = [];
var vulns = audit.vulnerabilities || {};
Object.keys(vulns).forEach(function(name) {
var vuln = vulns[name];
findings.push({
source: "npm-audit",
package: name,
severity: vuln.severity,
fixAvailable: !!vuln.fixAvailable,
range: vuln.range,
via: Array.isArray(vuln.via) ? vuln.via.filter(function(v) { return typeof v === "object"; }) : []
});
});
console.log(" Found %d vulnerabilities via npm audit.", findings.length);
callback(null, findings);
});
}
// ---- OSV cross-reference ----
function queryOSVBatch(packages, callback) {
console.log("[2/4] Cross-referencing OSV database for %d packages...", packages.length);
var body = JSON.stringify({
queries: packages.map(function(pkg) {
return { package: { name: pkg.name, ecosystem: "npm" }, version: pkg.version };
})
});
var options = {
hostname: "api.osv.dev",
path: "/v1/querybatch",
method: "POST",
headers: {
"Content-Type": "application/json",
"Content-Length": Buffer.byteLength(body)
}
};
var req = https.request(options, function(res) {
var data = "";
res.on("data", function(chunk) { data += chunk; });
res.on("end", function() {
try {
var result = JSON.parse(data);
var findings = [];
(result.results || []).forEach(function(r, idx) {
(r.vulns || []).forEach(function(v) {
findings.push({
source: "osv",
package: packages[idx].name,
id: v.id,
summary: v.summary || "No summary",
severity: extractSeverity(v),
aliases: v.aliases || []
});
});
});
console.log(" Found %d additional findings via OSV.", findings.length);
callback(null, findings);
} catch (e) {
callback(new Error("Failed to parse OSV response"));
}
});
});
req.on("error", callback);
req.write(body);
req.end();
}
function extractSeverity(vuln) {
if (vuln.database_specific && vuln.database_specific.severity) {
return vuln.database_specific.severity.toLowerCase();
}
// OSV "severity" entries carry a CVSS vector string rather than a named band.
// Deriving a band from the vector requires a CVSS scoring library, so this
// sketch falls back to "unknown" instead of guessing.
return "unknown";
}
// ---- Get installed packages ----
function getInstalledPackages() {
var lockfilePath = path.join(process.cwd(), "package-lock.json");
var lockfile = JSON.parse(fs.readFileSync(lockfilePath, "utf8"));
var packages = lockfile.packages || {};
var result = [];
Object.keys(packages).forEach(function(key) {
if (key === "") return;
var name = key.replace(/^node_modules\//, "").split("node_modules/").pop();
result.push({ name: name, version: packages[key].version });
});
return result;
}
// ---- Generate compliance report ----
function generateReport(npmFindings, osvFindings) {
console.log("[3/4] Generating compliance report...");
var allFindings = npmFindings.concat(osvFindings);
// Deduplicate by package name
var byPackage = {};
allFindings.forEach(function(f) {
if (!byPackage[f.package]) {
byPackage[f.package] = { sources: [], severity: f.severity };
}
byPackage[f.package].sources.push(f.source);
// Upgrade severity if OSV says it is worse
var order = { critical: 0, high: 1, moderate: 2, low: 3, unknown: 4 };
if ((order[f.severity] || 4) < (order[byPackage[f.package].severity] || 4)) {
byPackage[f.package].severity = f.severity;
}
});
var report = {
generatedAt: new Date().toISOString(),
projectPath: process.cwd(),
summary: {
totalPackages: getInstalledPackages().length,
totalVulnerabilities: Object.keys(byPackage).length,
critical: 0,
high: 0,
moderate: 0,
low: 0
},
findings: [],
complianceStatus: "PASS"
};
Object.keys(byPackage).forEach(function(pkg) {
var entry = byPackage[pkg];
report.summary[entry.severity] = (report.summary[entry.severity] || 0) + 1;
report.findings.push({
package: pkg,
severity: entry.severity,
detectedBy: entry.sources.filter(function(s, i, arr) { return arr.indexOf(s) === i; })
});
});
// Determine compliance status
var thresholds = { critical: 0, high: 1, moderate: 2, low: 3 };
// Guard with hasOwnProperty: thresholds.critical is 0, which "|| 1" would clobber
var threshold = thresholds.hasOwnProperty(CONFIG.severityThreshold) ? thresholds[CONFIG.severityThreshold] : 1;
if (report.summary.critical > 0 && threshold >= 0) report.complianceStatus = "FAIL";
if (report.summary.high > 0 && threshold >= 1) report.complianceStatus = "FAIL";
if (report.summary.moderate > 0 && threshold >= 2) report.complianceStatus = "FAIL";
// Write report
if (!fs.existsSync(CONFIG.outputDir)) {
fs.mkdirSync(CONFIG.outputDir, { recursive: true });
}
var reportPath = path.join(CONFIG.outputDir, "security-report-" + Date.now() + ".json");
fs.writeFileSync(reportPath, JSON.stringify(report, null, 2));
console.log(" Report written to %s", reportPath);
console.log(" Compliance status: %s", report.complianceStatus);
return report;
}
// ---- Create Azure DevOps work items ----
function createWorkItem(finding, callback) {
if (!CONFIG.azureOrg || !CONFIG.azurePat) {
console.log(" Skipping work item creation (Azure DevOps not configured).");
return callback(null);
}
var patchBody = JSON.stringify([
{ op: "add", path: "/fields/System.Title", value: "[Security] Vulnerable dependency: " + finding.package },
{ op: "add", path: "/fields/System.Description", value:
"<h3>Dependency Vulnerability Detected</h3>" +
"<p><strong>Package:</strong> " + finding.package + "</p>" +
"<p><strong>Severity:</strong> " + finding.severity.toUpperCase() + "</p>" +
"<p><strong>Detected by:</strong> " + finding.detectedBy.join(", ") + "</p>" +
"<p>Run <code>npm audit</code> and update the affected package to resolve this vulnerability.</p>"
},
{ op: "add", path: "/fields/Microsoft.VSTS.Common.Priority", value: finding.severity === "critical" ? 1 : 2 },
{ op: "add", path: "/fields/System.Tags", value: "security; automated; dependency-scan" }
]);
var options = {
hostname: "dev.azure.com",
path: "/" + CONFIG.azureOrg + "/" + CONFIG.azureProject + "/_apis/wit/workitems/$Bug?api-version=7.1",
method: "POST",
headers: {
"Content-Type": "application/json-patch+json",
"Authorization": "Basic " + Buffer.from(":" + CONFIG.azurePat).toString("base64")
}
};
var req = https.request(options, function(res) {
var data = "";
res.on("data", function(chunk) { data += chunk; });
res.on("end", function() {
try {
var wi = JSON.parse(data);
console.log(" Created work item #%d: %s", wi.id, wi.fields["System.Title"]);
} catch (e) {
console.error(" Failed to create work item for %s", finding.package);
}
callback(null);
});
});
req.on("error", function(err) {
console.error(" API error: %s", err.message);
callback(null);
});
req.write(patchBody);
req.end();
}
// ---- Main orchestrator ----
function main() {
console.log("=== Dependency Security Scanner ===\n");
console.log("Severity threshold: %s", CONFIG.severityThreshold);
console.log("Create work items: %s\n", CONFIG.createWorkItems);
var installedPackages = getInstalledPackages();
runNpmAudit(function(err, npmFindings) {
if (err) {
console.error("npm audit failed: %s", err.message);
npmFindings = [];
}
// Take a sample of packages for OSV (batch API has limits)
var osvSample = installedPackages.slice(0, 1000);
queryOSVBatch(osvSample, function(err, osvFindings) {
if (err) {
console.error("OSV query failed: %s", err.message);
osvFindings = [];
}
var report = generateReport(npmFindings, osvFindings);
// Create work items for critical/high findings
if (CONFIG.createWorkItems) {
console.log("\n[4/4] Creating Azure DevOps work items...");
var criticalFindings = report.findings.filter(function(f) {
return f.severity === "critical" || f.severity === "high";
});
var pending = criticalFindings.length;
if (pending === 0) {
console.log(" No critical/high findings. Skipping work item creation.");
finish(report);
return;
}
criticalFindings.forEach(function(finding) {
createWorkItem(finding, function() {
pending--;
if (pending === 0) finish(report);
});
});
} else {
console.log("\n[4/4] Work item creation disabled. Skipping.");
finish(report);
}
});
});
}
function finish(report) {
console.log("\n=== Scan Complete ===");
console.log("Critical: %d | High: %d | Moderate: %d | Low: %d",
report.summary.critical, report.summary.high,
report.summary.moderate, report.summary.low);
console.log("Compliance: %s\n", report.complianceStatus);
if (report.complianceStatus === "FAIL") {
process.exit(1);
}
}
main();
Run it locally:
node dependency-scanner.js
Or in an Azure Pipeline:
- script: |
    node dependency-scanner.js
  displayName: 'Run dependency security scanner'
  env:
    AZURE_DEVOPS_ORG: $(System.CollectionUri)
    AZURE_DEVOPS_PROJECT: $(System.TeamProject)
    AZURE_DEVOPS_PAT: $(System.AccessToken)
    SEVERITY_THRESHOLD: 'high'
    CREATE_WORK_ITEMS: 'true'
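Because `finish()` exits with code 1 on a FAIL status, this step blocks the pipeline by default. While you burn down an existing backlog, you may prefer the scan to report without blocking; a minimal sketch using the standard Azure Pipelines `continueOnError` property (display name is illustrative):

```yaml
- script: node dependency-scanner.js
  displayName: 'Run dependency security scanner (advisory mode)'
  # Non-zero exit marks the step "succeeded with issues" instead of failing the run.
  continueOnError: true
  env:
    AZURE_DEVOPS_PAT: $(System.AccessToken)
    SEVERITY_THRESHOLD: 'high'
```

Treat advisory mode as temporary; switch back to a hard gate once the backlog is cleared.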
Common Issues & Troubleshooting
1. npm audit returns ENOLOCK error:
npm ERR! audit endpoint returned an error
npm ERR! code ENOLOCK
npm ERR! audit This command requires an existing lockfile.
This happens when `package-lock.json` is missing or gitignored. Fix it by running `npm install` first, and commit your lockfile to source control. Never gitignore `package-lock.json` in application projects.
2. npm audit fix --force introduces breaking changes:
npm WARN using --force Recommended protections disabled.
npm WARN audit Updating express to 5.0.0, which is a SemVer major change.
After force-fixing, your application fails because Express 5 has a different API surface. Instead of `--force`, use `overrides` to pin the specific transitive dependency that is vulnerable. After editing `package.json`, run `npm install` to regenerate the lockfile, then `npm ls qs` to confirm every instance resolves to the pinned range:
{
"overrides": {
"qs": ">=6.11.0"
}
}
3. Snyk scan fails in Azure Pipeline with authentication error:
MissingApiTokenError:
`snyk` requires an authenticated account. Please run `snyk auth` and try again.
You need to set the `SNYK_TOKEN` environment variable. Create a Snyk API token from your Snyk account settings and add it as a secret pipeline variable:
- script: snyk test
  env:
    SNYK_TOKEN: $(SnykApiToken)
4. npm audit signatures reports unsigned packages:
audited 847 packages in 3s
834 packages have verified registry signatures
13 packages have missing registry signatures:
  @company/internal-utils@2.1.0
  @company/design-tokens@1.4.3
Packages from private registries will not have npm registry signatures. This is expected. Keep your private scopes mapped to their registry in `.npmrc`, and maintain an allowlist of known private packages in your verification script so only unexpected unsigned packages fail the build.
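The allowlist approach can be sketched in a few lines. The `@company/` scope, file name, and `splitUnsigned` helper are invented for illustration; feed it the unsigned package names reported by `npm audit signatures`:

```javascript
// filter-unsigned.js -- separate expected unsigned packages (trusted private
// scopes) from genuinely suspicious ones before deciding to fail the build.
const TRUSTED_PRIVATE_SCOPES = ["@company/"];

function splitUnsigned(unsignedNames) {
  const expected = [];
  const suspicious = [];
  for (const name of unsignedNames) {
    const isPrivate = TRUSTED_PRIVATE_SCOPES.some((scope) => name.startsWith(scope));
    (isPrivate ? expected : suspicious).push(name);
  }
  return { expected, suspicious };
}
```

Only `suspicious` should gate the build; log `expected` for visibility so a compromised private scope does not go entirely unwatched.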
5. OWASP Dependency-Check produces false positives:
[WARNING] lodash (pkg:npm/lodash@4.17.21) identified as cpe:2.3:a:lodash:lodash
CVE-2018-3721 - CVSS: 6.5 (MEDIUM)
This CVE was fixed in lodash 4.17.5, but OWASP Dependency-Check can still flag later versions because CPE matching does not account for version fixes correctly in some cases. Use a suppression file to exclude known false positives and pass it to the scanner with `--suppression dependency-check-suppression.xml`:
<!-- dependency-check-suppression.xml -->
<suppressions xmlns="https://jeremylong.github.io/DependencyCheck/dependency-suppression.1.3.xsd">
<suppress>
<notes>False positive - fixed in lodash 4.17.5</notes>
<packageUrl regex="true">^pkg:npm/lodash@.*$</packageUrl>
<cve>CVE-2018-3721</cve>
</suppress>
</suppressions>
Best Practices
- Run `npm audit` on every CI build. Make it a non-negotiable gate. If your team ignores audit output because it runs somewhere no one checks, pipe it into a dashboard or Slack channel.
- Use `npm ci` instead of `npm install` in pipelines. The `ci` command installs exactly what the lockfile specifies and will fail if the lockfile is out of sync with `package.json`. This prevents lockfile drift and supply chain manipulation.
- Cross-reference multiple vulnerability databases. No single database has complete coverage. At minimum, combine npm audit (GHSA) with either Snyk or OSV to catch advisories that one source might miss or delay.
- Pin your transitive dependencies with overrides when patches are unavailable. Waiting for a maintainer to release a fix upstream can take weeks. Overrides let you force a safe version immediately while you wait.
- Automate remediation PRs and track them in work items. Humans forget. Automated PRs with clear descriptions get reviewed and merged faster than a report that sits in a shared drive.
- Scan licenses alongside vulnerabilities. A GPL-3.0 dependency in your proprietary SaaS product is a legal vulnerability, not a technical one, but it is just as dangerous. Catch it in the same pipeline.
- Monitor for supply chain attacks, not just known CVEs. Typosquatting and postinstall script attacks are not in any CVE database. Build custom checks into your pipeline that look for suspicious patterns.
- Treat vulnerability scanning like tests: fail the build, not just warn. A warning that nobody acts on is worse than no warning at all because it creates a false sense of security. Set a severity threshold and enforce it.
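One such custom supply chain check can be sketched in a few lines: flag any dependency whose manifest declares install-time lifecycle scripts, a common vector for malicious packages. This is an illustrative heuristic, not a complete defense; the file name and `findInstallScripts` helper are hypothetical:

```javascript
// install-script-check.js -- flag dependencies that declare install-time
// lifecycle scripts. Pass in parsed package.json objects gathered from
// node_modules/*/package.json.
const LIFECYCLE_HOOKS = ["preinstall", "install", "postinstall"];

function findInstallScripts(manifests) {
  return manifests
    .map((pkg) => ({
      name: pkg.name,
      hooks: LIFECYCLE_HOOKS.filter((h) => pkg.scripts && h in pkg.scripts),
    }))
    .filter((entry) => entry.hooks.length > 0);
}
```

For packages you have not vetted, pair a check like this with `npm install --ignore-scripts`, which disables lifecycle scripts entirely during install.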