Integrating SonarQube with Azure Pipelines
A comprehensive guide to integrating SonarQube with Azure DevOps Pipelines for continuous code quality analysis, covering server setup, pipeline configuration, quality gates, branch analysis, pull request decoration, and custom quality profiles for multiple languages.
Overview
SonarQube is the code quality platform I recommend most often for teams running Azure DevOps. It catches bugs, security vulnerabilities, code smells, and duplication that unit tests miss entirely. Integrating it into your Azure Pipeline means every commit gets analyzed before it reaches production, and quality gates can block merges that introduce too much technical debt. I have deployed SonarQube across organizations with hundreds of projects and the return on investment is obvious within the first week — developers start fixing issues they never knew existed.
Prerequisites
- SonarQube server (Community, Developer, Enterprise, or Data Center edition) version 9.x or later
- Azure DevOps organization with Pipelines enabled
- SonarQube Scanner for Azure DevOps extension installed from the Marketplace
- SonarQube project token for authentication
- Service connection configured in Azure DevOps for your SonarQube server
- At least one project with a supported language (Java, C#, JavaScript, TypeScript, Python, Go, etc.)
Setting Up the SonarQube Connection
Install the Marketplace Extension
Search for "SonarQube" in the Azure DevOps Marketplace and install the official extension by SonarSource. This adds three pipeline tasks:
- Prepare Analysis Configuration — configures the scanner with server URL, project key, and analysis parameters
- Run Code Analysis — executes the scanner against your source code
- Publish Quality Gate Result — waits for SonarQube to process results and publishes the quality gate status
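Wired into a pipeline, the three tasks bracket your build: Prepare runs first, Analyze runs after compilation and tests, and Publish waits for the server's verdict. A minimal sketch, assuming a service connection named `SonarQube-Production` (used throughout this guide) and placeholder project values:

```yaml
steps:
  - task: SonarQubePrepare@6          # configure the scanner before any build step
    inputs:
      SonarQube: "SonarQube-Production"   # service connection name (assumed)
      scannerMode: "CLI"
      configMode: "manual"
      cliProjectKey: "myorg_myapp"        # placeholder project key
      cliSources: "src"

  # ... build and test steps go here ...

  - task: SonarQubeAnalyze@6          # run the scanner against the sources
  - task: SonarQubePublish@6          # poll the server for the quality gate result
    inputs:
      pollingTimeoutSec: 300
```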
Create a Service Connection
Navigate to Project Settings > Service connections > New service connection > SonarQube:
- Server URL: `https://sonarqube.yourcompany.com`
- Token: Generate a project analysis token in SonarQube (User > My Account > Security > Generate Token)
- Name: `SonarQube-Production`
Use a service account token rather than a personal token. If the person who generated the token leaves, all pipelines using that connection break.
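If you provision service accounts by script, tokens can also be generated through the web API. A hedged one-off sketch for an administrator shell; the `ADMIN_TOKEN` variable, the `svc-azure-pipelines` login, and the project key are placeholders, and the `type`/`projectKey` parameters require SonarQube 9.6 or later:

```bash
# Run once as an administrator; creates a project-scoped analysis token
# for the service account rather than for your personal user.
curl -s -u "$ADMIN_TOKEN:" -X POST \
  "https://sonarqube.yourcompany.com/api/user_tokens/generate" \
  -d "name=azure-pipelines" \
  -d "login=svc-azure-pipelines" \
  -d "type=PROJECT_ANALYSIS_TOKEN" \
  -d "projectKey=myorg_myapp"
```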
Basic Pipeline Integration
.NET Project Analysis
```yaml
# azure-pipelines-sonar-dotnet.yml
trigger:
  branches:
    include:
      - main
      - develop
      - feature/*

pool:
  vmImage: "ubuntu-latest"

variables:
  buildConfiguration: "Release"
  sonarProjectKey: "myorg_myapp"
  sonarProjectName: "My Application"

steps:
  - task: SonarQubePrepare@6
    displayName: "Prepare SonarQube Analysis"
    inputs:
      SonarQube: "SonarQube-Production"
      scannerMode: "MSBuild"
      projectKey: $(sonarProjectKey)
      projectName: $(sonarProjectName)
      extraProperties: |
        sonar.cs.opencover.reportsPaths=$(Agent.TempDirectory)/**/coverage.opencover.xml
        sonar.cs.vstest.reportsPaths=$(Agent.TempDirectory)/**/*.trx

  - task: DotNetCoreCLI@2
    displayName: "Build"
    inputs:
      command: build
      projects: "**/*.csproj"
      arguments: "--configuration $(buildConfiguration)"

  - task: DotNetCoreCLI@2
    displayName: "Run Tests with Coverage"
    inputs:
      command: test
      projects: "**/*Tests.csproj"
      arguments: '--configuration $(buildConfiguration) --collect:"XPlat Code Coverage" -- DataCollectionRunSettings.DataCollectors.DataCollector.Configuration.Format=opencover'

  - task: SonarQubeAnalyze@6
    displayName: "Run SonarQube Analysis"

  - task: SonarQubePublish@6
    displayName: "Publish Quality Gate Result"
    inputs:
      pollingTimeoutSec: 300
```
JavaScript/TypeScript Project Analysis
```yaml
# azure-pipelines-sonar-js.yml
trigger:
  branches:
    include:
      - main
      - develop

pool:
  vmImage: "ubuntu-latest"

steps:
  - task: NodeTool@0
    inputs:
      versionSpec: "20.x"

  - script: npm ci
    displayName: "Install dependencies"

  - script: npm test -- --coverage --coverageReporters=lcov
    displayName: "Run tests with coverage"

  - task: SonarQubePrepare@6
    displayName: "Prepare SonarQube Analysis"
    inputs:
      SonarQube: "SonarQube-Production"
      scannerMode: "CLI"
      configMode: "manual"
      cliProjectKey: "myorg_frontend"
      cliProjectName: "Frontend Application"
      cliSources: "src"
      extraProperties: |
        sonar.javascript.lcov.reportPaths=coverage/lcov.info
        sonar.exclusions=**/*.test.js,**/*.spec.ts,**/node_modules/**,**/dist/**
        sonar.test.inclusions=**/*.test.js,**/*.spec.ts
        sonar.tests=src
        sonar.testExecutionReportPaths=test-report.xml

  - task: SonarQubeAnalyze@6
    displayName: "Run SonarQube Analysis"

  - task: SonarQubePublish@6
    displayName: "Publish Quality Gate Result"
    inputs:
      pollingTimeoutSec: 300
```
Java (Maven) Project Analysis
```yaml
# azure-pipelines-sonar-java.yml
steps:
  - task: SonarQubePrepare@6
    inputs:
      SonarQube: "SonarQube-Production"
      scannerMode: "Other"
      extraProperties: |
        sonar.projectKey=myorg_backend
        sonar.projectName=Backend Service

  - task: Maven@4
    displayName: "Build and Test"
    inputs:
      mavenPomFile: "pom.xml"
      goals: "clean verify sonar:sonar"
      publishJUnitResults: true
      testResultsFiles: "**/surefire-reports/TEST-*.xml"
      javaHomeOption: "JDKVersion"
      jdkVersionOption: "1.17"

  - task: SonarQubePublish@6
    inputs:
      pollingTimeoutSec: 300
```

In `Other` scanner mode, the Prepare task exposes the analysis settings, including the server URL and token from the service connection, through the `SONARQUBE_SCANNER_PARAMS` environment variable, which the Maven scanner picks up automatically. You do not need to pass `-Dsonar.host.url` or `-Dsonar.token` on the command line.
Quality Gates
Quality gates are the enforcement mechanism. A quality gate defines conditions that code must meet — if any condition fails, the gate fails, and your pipeline can block the merge.
Default Quality Gate Conditions
SonarQube's built-in "Sonar way" quality gate checks new code only; it fails when any of these conditions is violated:
- New bugs: must be 0
- New vulnerabilities: must be 0
- Technical debt ratio on new code: must not exceed 5%
- Coverage on new code: must be at least 80%
- Duplicated lines on new code: must not exceed 3%
Custom Quality Gate
Create custom quality gates in SonarQube when you need stricter or more lenient thresholds, either through the UI or the web API (a scripted sketch follows this list):
- Navigate to Quality Gates in SonarQube
- Click Create
- Add conditions:
  - Reliability Rating on New Code worse than A
  - Security Rating on New Code worse than A
  - Maintainability Rating on New Code worse than A
  - Coverage on New Code less than 70%
  - Duplicated Lines on New Code greater than 5%
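The same gate can be created by script against the web API, which helps when you manage several servers or want gates under version control. A hedged sketch, assuming `SONAR_URL` and `SONAR_TOKEN` pipeline variables and recent-version parameter names such as `gateName`:

```yaml
- script: |
    BASE="$SONAR_HOST_URL/api/qualitygates"
    curl -s -u "$SONAR_TOKEN:" -X POST "$BASE/create" -d "name=TeamGate"
    # Rating conditions are numeric: 1=A, 2=B, ... so error=1 fails anything worse than A
    curl -s -u "$SONAR_TOKEN:" -X POST "$BASE/create_condition" \
      -d "gateName=TeamGate" -d "metric=new_reliability_rating" -d "op=GT" -d "error=1"
    curl -s -u "$SONAR_TOKEN:" -X POST "$BASE/create_condition" \
      -d "gateName=TeamGate" -d "metric=new_coverage" -d "op=LT" -d "error=70"
    curl -s -u "$SONAR_TOKEN:" -X POST "$BASE/create_condition" \
      -d "gateName=TeamGate" -d "metric=new_duplicated_lines_density" -d "op=GT" -d "error=5"
  displayName: "Create quality gate via API"
  env:
    SONAR_HOST_URL: $(SONAR_URL)
    SONAR_TOKEN: $(SONAR_TOKEN)
```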
Failing the Pipeline on Quality Gate Failure
The SonarQubePublish task sets a pipeline variable based on the quality gate result. Use it to fail the pipeline:
```yaml
- task: SonarQubePublish@6
  displayName: "Publish Quality Gate Result"
  inputs:
    pollingTimeoutSec: 300

# The task sets the variable SONARQUBE_QUALITYGATE_STATUS
- script: |
    echo "Quality Gate Status: $(SONARQUBE_QUALITYGATE_STATUS)"
    if [ "$(SONARQUBE_QUALITYGATE_STATUS)" != "OK" ]; then
      echo "##vso[task.logissue type=error]Quality Gate FAILED"
      echo "##vso[task.complete result=Failed;]Quality Gate did not pass"
    fi
  displayName: "Check Quality Gate"
```
Pull Request Decoration
SonarQube Developer Edition and above can post analysis results directly on Azure DevOps pull requests — inline comments on new issues, overall quality gate status, and coverage metrics.
Configure PR Decoration in SonarQube
- In SonarQube, navigate to Administration > Configuration > General Settings > ALM Integrations > Azure DevOps
- Set the Azure DevOps URL: `https://dev.azure.com/your-org`
- Create a PAT in Azure DevOps with Code (Read & Write) scope
- Enter the PAT in SonarQube
Then at the project level:
- Project Settings > General Settings > Azure DevOps
- Set the project name and repository slug
Pipeline Configuration for PR Analysis
```yaml
# Branch analysis for PRs
trigger:
  branches:
    include:
      - main
      - develop

pr:
  branches:
    include:
      - main
      - develop

steps:
  - task: SonarQubePrepare@6
    inputs:
      SonarQube: "SonarQube-Production"
      scannerMode: "CLI"
      configMode: "manual"
      cliProjectKey: "myorg_myapp"
      cliSources: "src"
      extraProperties: |
        sonar.pullrequest.key=$(System.PullRequest.PullRequestId)
        sonar.pullrequest.branch=$(System.PullRequest.SourceBranch)
        sonar.pullrequest.base=$(System.PullRequest.TargetBranch)
    condition: eq(variables['Build.Reason'], 'PullRequest')

  - task: SonarQubePrepare@6
    inputs:
      SonarQube: "SonarQube-Production"
      scannerMode: "CLI"
      configMode: "manual"
      cliProjectKey: "myorg_myapp"
      cliSources: "src"
      extraProperties: |
        sonar.branch.name=$(Build.SourceBranchName)
    condition: ne(variables['Build.Reason'], 'PullRequest')
```
This configuration sends PR analysis parameters when the build is triggered by a PR, and branch analysis parameters otherwise. SonarQube uses these to correctly associate results with the PR and post inline comments.
Multi-Project Monorepo Analysis
For monorepos with multiple services, scan each project separately:
```yaml
# azure-pipelines-monorepo-sonar.yml
trigger:
  branches:
    include:
      - main

pool:
  vmImage: "ubuntu-latest"

strategy:
  matrix:
    api-service:
      projectKey: "myorg_api"
      projectName: "API Service"
      sources: "services/api/src"
      tests: "services/api/tests"
    worker-service:
      projectKey: "myorg_worker"
      projectName: "Worker Service"
      sources: "services/worker/src"
      tests: "services/worker/tests"
    shared-lib:
      projectKey: "myorg_shared"
      projectName: "Shared Library"
      sources: "packages/shared/src"
      tests: "packages/shared/tests"

steps:
  - task: NodeTool@0
    inputs:
      versionSpec: "20.x"

  - script: npm ci
    displayName: "Install dependencies"

  # rootDir scopes jest to the service directory; coverageDirectory is resolved
  # relative to rootDir, so coverage lands at $(sources)/../coverage
  - script: npx jest --coverage --rootDir=$(sources)/.. --coverageDirectory=coverage
    displayName: "Run tests: $(projectName)"

  - task: SonarQubePrepare@6
    inputs:
      SonarQube: "SonarQube-Production"
      scannerMode: "CLI"
      configMode: "manual"
      cliProjectKey: $(projectKey)
      cliProjectName: $(projectName)
      cliSources: $(sources)
      extraProperties: |
        sonar.tests=$(tests)
        sonar.javascript.lcov.reportPaths=$(sources)/../coverage/lcov.info

  - task: SonarQubeAnalyze@6

  - task: SonarQubePublish@6
```
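If the matrix keeps growing, the three Sonar tasks can be factored into a steps template so each service supplies only its keys. A sketch; the template path and parameter names are illustrative:

```yaml
# templates/sonar-steps.yml (illustrative path)
parameters:
  - name: projectKey
    type: string
  - name: projectName
    type: string
  - name: sources
    type: string

steps:
  - task: SonarQubePrepare@6
    inputs:
      SonarQube: "SonarQube-Production"
      scannerMode: "CLI"
      configMode: "manual"
      cliProjectKey: ${{ parameters.projectKey }}
      cliProjectName: ${{ parameters.projectName }}
      cliSources: ${{ parameters.sources }}
  - task: SonarQubeAnalyze@6
  - task: SonarQubePublish@6
    inputs:
      pollingTimeoutSec: 300
```

Each job then replaces the three tasks with `- template: templates/sonar-steps.yml` and a `parameters:` block passing `$(projectKey)`, `$(projectName)`, and `$(sources)`.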
Complete Working Example: Quality Gate Enforcement Script
This Node.js script queries SonarQube directly to check quality gate status and generate a detailed report, useful for custom pipeline steps or standalone quality checks:
```javascript
// sonar-quality-check.js
const https = require("https");
const http = require("http");

const SONAR_URL = process.env.SONAR_HOST_URL || "http://localhost:9000";
const SONAR_TOKEN = process.env.SONAR_TOKEN;
const PROJECT_KEY = process.argv[2];

if (!PROJECT_KEY) {
  console.error("Usage: node sonar-quality-check.js <project-key>");
  process.exit(1);
}
if (!SONAR_TOKEN) {
  console.error("SONAR_TOKEN environment variable must be set");
  process.exit(1);
}

// Authenticated GET against the SonarQube web API. Tokens are sent as the
// basic-auth username with an empty password.
function sonarRequest(path, callback) {
  const parsed = new URL(SONAR_URL);
  const protocol = parsed.protocol === "https:" ? https : http;
  const auth = Buffer.from(SONAR_TOKEN + ":").toString("base64");
  const options = {
    hostname: parsed.hostname,
    port: parsed.port || undefined, // fall back to the protocol default (80/443)
    path: "/api" + path,
    method: "GET",
    headers: {
      "Authorization": "Basic " + auth,
      "Accept": "application/json"
    }
  };
  const req = protocol.request(options, function (res) {
    let data = "";
    res.on("data", function (chunk) { data += chunk; });
    res.on("end", function () {
      try { callback(null, JSON.parse(data)); }
      catch (e) { callback(new Error("Parse error: " + data.substring(0, 200))); }
    });
  });
  req.on("error", callback);
  req.end();
}

function getQualityGate(callback) {
  sonarRequest("/qualitygates/project_status?projectKey=" + encodeURIComponent(PROJECT_KEY), callback);
}

function getIssues(severities, callback) {
  sonarRequest("/issues/search?componentKeys=" + encodeURIComponent(PROJECT_KEY) +
    "&severities=" + severities + "&resolved=false&ps=500", callback);
}

function getMeasures(callback) {
  const metrics = [
    "bugs", "vulnerabilities", "code_smells", "coverage",
    "duplicated_lines_density", "ncloc", "sqale_index",
    "reliability_rating", "security_rating", "sqale_rating"
  ].join(",");
  sonarRequest("/measures/component?component=" + encodeURIComponent(PROJECT_KEY) +
    "&metricKeys=" + metrics, callback);
}

console.log("=== SonarQube Quality Report ===");
console.log("Project: " + PROJECT_KEY);
console.log("Server: " + SONAR_URL);
console.log("");

let pending = 3;
const report = {};

// Runs after every API response; prints the report once all three calls finish.
function checkDone() {
  pending--;
  if (pending > 0) { return; }

  // Print measures
  if (report.measures) {
    console.log("--- Metrics ---");
    const measures = {};
    report.measures.forEach(function (m) {
      measures[m.metric] = m.value;
    });
    console.log("  Lines of Code: " + (measures.ncloc || "N/A"));
    console.log("  Bugs: " + (measures.bugs || "0"));
    console.log("  Vulnerabilities: " + (measures.vulnerabilities || "0"));
    console.log("  Code Smells: " + (measures.code_smells || "0"));
    console.log("  Coverage: " + (measures.coverage || "N/A") + "%");
    console.log("  Duplications: " + (measures.duplicated_lines_density || "N/A") + "%");
    console.log("  Technical Debt: " + formatDebt(measures.sqale_index));
    console.log("");
  }

  // Print quality gate
  if (report.qualityGate) {
    const gate = report.qualityGate;
    const passed = gate.status === "OK";
    console.log("--- Quality Gate: " + (passed ? "PASSED" : "FAILED") + " ---");
    if (gate.conditions) {
      gate.conditions.forEach(function (condition) {
        const icon = condition.status === "OK" ? "PASS" : "FAIL";
        console.log("  [" + icon + "] " + condition.metricKey +
          ": " + condition.actualValue +
          " (threshold: " + condition.errorThreshold + ")");
      });
    }
    console.log("");
  }

  // Print critical issues
  if (report.criticalIssues && report.criticalIssues.length > 0) {
    console.log("--- Critical/Blocker Issues (" + report.criticalIssues.length + ") ---");
    report.criticalIssues.slice(0, 10).forEach(function (issue) {
      console.log("  [" + issue.severity + "] " + issue.message);
      console.log("    " + issue.component.split(":").pop() + ":" + issue.line);
    });
    if (report.criticalIssues.length > 10) {
      console.log("  ... and " + (report.criticalIssues.length - 10) + " more");
    }
    console.log("");
  }

  // Exit code based on quality gate
  if (report.qualityGate && report.qualityGate.status !== "OK") {
    console.log("Quality gate FAILED. Exiting with code 1.");
    process.exit(1);
  }
  console.log("Quality gate PASSED.");
  process.exit(0);
}

// Convert SonarQube's debt minutes into days/hours/minutes (8-hour working days).
function formatDebt(minutes) {
  if (!minutes) { return "N/A"; }
  const m = parseInt(minutes, 10);
  if (m < 60) { return m + " min"; }
  const hours = Math.floor(m / 60);
  if (hours < 8) { return hours + "h " + (m % 60) + "min"; }
  const days = Math.floor(hours / 8);
  return days + "d " + (hours % 8) + "h";
}

getQualityGate(function (err, data) {
  if (err) { console.error("Quality gate error:", err.message); }
  else { report.qualityGate = data.projectStatus; }
  checkDone();
});

getMeasures(function (err, data) {
  if (err) { console.error("Measures error:", err.message); }
  else { report.measures = data.component ? data.component.measures : []; }
  checkDone();
});

getIssues("BLOCKER,CRITICAL", function (err, data) {
  if (err) { console.error("Issues error:", err.message); }
  else { report.criticalIssues = data.issues || []; }
  checkDone();
});
```
Run it from a pipeline step:
```yaml
- script: node scripts/sonar-quality-check.js myorg_myapp
  displayName: "SonarQube Quality Report"
  env:
    SONAR_HOST_URL: $(SONAR_URL)
    SONAR_TOKEN: $(SONAR_TOKEN)
```
Python Project Analysis
Python analysis uses the CLI scanner mode with coverage from pytest:
```yaml
# azure-pipelines-sonar-python.yml
steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: "3.11"

  - script: |
      pip install -r requirements.txt
      pip install pytest pytest-cov
    displayName: "Install dependencies"

  - script: |
      pytest --cov=src --cov-report=xml:coverage.xml --junitxml=test-results.xml
    displayName: "Run tests with coverage"

  - task: SonarQubePrepare@6
    inputs:
      SonarQube: "SonarQube-Production"
      scannerMode: "CLI"
      configMode: "manual"
      cliProjectKey: "myorg_python_api"
      cliProjectName: "Python API"
      cliSources: "src"
      extraProperties: |
        sonar.python.coverage.reportPaths=coverage.xml
        sonar.python.xunit.reportPath=test-results.xml
        sonar.exclusions=**/migrations/**,**/tests/**,**/venv/**
        sonar.tests=tests

  - task: SonarQubeAnalyze@6

  - task: SonarQubePublish@6
    inputs:
      pollingTimeoutSec: 300
```
The key property for Python coverage is `sonar.python.coverage.reportPaths`; it expects Cobertura-style XML, which is what pytest-cov produces with `--cov-report=xml`. If you use coverage.py directly, run `coverage xml` to generate the report before the SonarQube analysis step.
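A hedged sketch of that alternative, assuming a recent coverage.py and pytest on the agent:

```yaml
- script: |
    coverage run -m pytest --junitxml=test-results.xml
    coverage xml -o coverage.xml   # writes the XML report SonarQube reads
  displayName: "Run tests with coverage.py"
```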
Common Issues and Troubleshooting
Analysis fails with "Not authorized. Please check the properties sonar.login and sonar.password"
```
ERROR: Not authorized. Analyzing this project requires authentication.
```
The token in your service connection is invalid or expired. Generate a new token in SonarQube under User > My Account > Security > Tokens. Use a "Project Analysis" token type scoped to the specific project rather than a global token. Update the service connection in Azure DevOps with the new token.
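To confirm whether a token is actually valid before rotating anything, you can call the authentication endpoint directly; a quick sketch, assuming `SONAR_URL` and `SONAR_TOKEN` pipeline variables:

```yaml
- script: |
    # Returns {"valid":true} for a working token, {"valid":false} otherwise
    curl -s -u "$SONAR_TOKEN:" "$SONAR_HOST_URL/api/authentication/validate"
  displayName: "Validate SonarQube token"
  env:
    SONAR_HOST_URL: $(SONAR_URL)
    SONAR_TOKEN: $(SONAR_TOKEN)
```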
Quality gate status shows "NONE" or never resolves
```
Quality Gate Status: NONE
```
The SonarQubePublish task polls SonarQube for the analysis result. If the SonarQube server is under heavy load or the compute engine queue is backed up, the analysis may not complete within the polling timeout. Increase pollingTimeoutSec to 600 or higher. Also verify that the SonarQube Compute Engine is running and processing tasks — check Administration > System > Background Tasks.
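You can also inspect the Compute Engine queue for a single project from a pipeline step; a hedged sketch (the `component` parameter name is correct for recent versions but has varied across releases):

```yaml
- script: |
    # Shows pending/in-progress tasks and the last executed analysis for the project
    curl -s -u "$SONAR_TOKEN:" "$SONAR_HOST_URL/api/ce/component?component=myorg_myapp"
  displayName: "Check SonarQube background tasks"
  env:
    SONAR_HOST_URL: $(SONAR_URL)
    SONAR_TOKEN: $(SONAR_TOKEN)
```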
Scanner fails with OutOfMemoryError on large projects
```
java.lang.OutOfMemoryError: Java heap space
```
The SonarQube scanner runs in a JVM with default memory limits. For large codebases, increase the heap size in your pipeline:
```yaml
- task: SonarQubeAnalyze@6
  env:
    SONAR_SCANNER_OPTS: "-Xmx4096m"
```
Code coverage shows 0% despite tests running
The scanner cannot find coverage reports. Verify that the `sonar.javascript.lcov.reportPaths` or `sonar.cs.opencover.reportsPaths` property points to the correct file path; the path is relative to the working directory where the scanner runs. Run `find . -name "lcov.info"` in a debug step to locate the actual coverage file, as in the sketch below.
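A disposable debug step along these lines (the paths and patterns are illustrative):

```yaml
- script: |
    echo "Searching for coverage reports..."
    find "$(Build.SourcesDirectory)" "$(Agent.TempDirectory)" \
      \( -name "lcov.info" -o -name "coverage*.xml" -o -name "*.opencover.xml" \) -print
  displayName: "Debug: locate coverage reports"
  condition: always()
```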
PR decoration not posting comments
PR decoration requires SonarQube Developer Edition or higher. Community Edition does not support it. Also verify: (1) the ALM integration is configured in SonarQube admin, (2) the PAT has Code (Read & Write) permission, (3) the project-level Azure DevOps settings are configured with the correct repository slug.
Best Practices
Analyze on every PR and every merge to main. PR analysis catches issues before merge. Main branch analysis updates the overall project metrics and detects issues that slip through PR checks.
Focus quality gates on new code only. Analyzing the entire codebase with strict quality gates on a legacy project produces thousands of pre-existing issues that nobody will fix. Gate on new code to keep quality improving without blocking current work.
Integrate coverage reports from your test framework. SonarQube's value multiplies when it combines static analysis with code coverage. Without coverage data, it cannot identify which critical code paths are untested.
Exclude generated code and third-party files. Use `sonar.exclusions` to skip auto-generated code, migration files, vendor directories, and build output (example patterns below); analyzing generated code produces noise that obscures real issues.
Set up quality profiles per language. Customize which rules are active for each language in your stack. Disable rules that create too much noise or conflict with your team's coding standards. Enable strict security rules for backend code handling user input.
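A hedged starting point for the exclusion patterns on a mixed .NET/JavaScript repository; the globs are illustrative, not canonical:

```yaml
extraProperties: |
  sonar.exclusions=**/bin/**,**/obj/**,**/dist/**,**/node_modules/**,**/*.generated.cs,**/migrations/**
  sonar.coverage.exclusions=**/Program.cs,**/*.stories.tsx
```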
Review SonarQube issues during code review. Make it a team habit to check the SonarQube PR decoration alongside the code diff. Issues flagged by SonarQube are conversation starters, not automatic rejections.
Track technical debt trends over time. SonarQube's Activity page shows how metrics change across analyses. Use this in sprint retrospectives to discuss whether technical debt is growing or shrinking.
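The same history is available from the web API if you want to pull it into a dashboard or retrospective report; a sketch, assuming `SONAR_URL` and `SONAR_TOKEN` pipeline variables:

```yaml
- script: |
    # Last 12 recorded values for coverage, technical debt, and bug count
    curl -s -u "$SONAR_TOKEN:" \
      "$SONAR_HOST_URL/api/measures/search_history?component=myorg_myapp&metrics=coverage,sqale_index,bugs&ps=12"
  displayName: "Fetch SonarQube metric history"
  env:
    SONAR_HOST_URL: $(SONAR_URL)
    SONAR_TOKEN: $(SONAR_TOKEN)
```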