
Integrating SonarQube with Azure Pipelines

Run SonarQube code analysis in Azure Pipelines with quality gates, PR decoration, and Jest coverage integration for Node.js


SonarQube is one of the most effective tools for catching bugs, vulnerabilities, and code smells before they reach production. When you wire it into Azure Pipelines, every commit and pull request gets automatic static analysis with enforceable quality gates. This article walks through the full integration for Node.js projects — from service connections and scanner configuration to coverage reporting, PR decoration, and troubleshooting the issues you will inevitably hit.

Prerequisites

Before starting, make sure you have the following in place:

  • A running SonarQube instance (Community Edition 9.9 or later) accessible from your Azure DevOps agents
  • An Azure DevOps project with a Node.js repository
  • The SonarQube extension installed from the Azure DevOps Marketplace
  • A SonarQube user token generated under My Account > Security > Tokens
  • Node.js 18+ installed on your build agents
  • Jest (or another test framework that produces LCOV coverage reports)

SonarQube Overview for Node.js Projects

SonarQube performs static analysis on your source code without executing it. For JavaScript and TypeScript projects, it ships its own analyzer, implemented on ESLint's parsing infrastructure, with hundreds of SonarQube-specific checks on top. It detects:

  • Bugs — logic errors, null dereferences, unreachable code
  • Vulnerabilities — SQL injection, XSS, hardcoded credentials, insecure crypto
  • Code smells — overly complex functions, duplicated blocks, poor naming
  • Security hotspots — code that needs manual review for security implications
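To make those categories concrete, here is an illustrative sketch (my own examples, not SonarQube's) of the first two in plain JavaScript:

```javascript
// Illustrative examples of issues a static analyzer can flag without
// executing the code (function names are invented for this sketch).

// Bug: possible undefined dereference. Array.prototype.find returns
// undefined when nothing matches, so match.id can throw at runtime.
function findUserId(users, name) {
  var match = users.find(function (u) { return u.name === name; });
  return match.id;
}

// Vulnerability: string-concatenated SQL enables injection; a
// parameterized query would not be flagged, but this pattern is.
function buildQuery(userInput) {
  return "SELECT * FROM users WHERE name = '" + userInput + "'";
}

console.log(findUserId([{ name: 'bob', id: 1 }], 'bob')); // 1
```

Neither snippet needs to run for the problems to be visible, which is exactly the kind of reasoning the analyzer performs.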

For Node.js specifically, SonarQube analyzes .js, .ts, .jsx, and .tsx files. It also reads your test coverage reports (LCOV format) to track how much of your code is covered by tests. The key thing to understand is that SonarQube does not run your tests — it only consumes the coverage output that your test runner produces.
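For reference, LCOV is a plain-text format. An abridged report for one file looks roughly like this (SF is the source file path, each DA line maps a line number to a hit count, LF/LH are lines found and lines hit):

```
SF:src/utils/validator.js
DA:1,1
DA:4,1
DA:5,0
LF:3
LH:2
end_of_record
```

SonarQube reads these records from the path you configure and computes coverage from them.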

Setting Up the SonarQube Service Connection

The service connection tells Azure DevOps how to talk to your SonarQube instance. This is a one-time setup per project.

  1. Go to Project Settings > Service connections > New service connection
  2. Select SonarQube
  3. Enter your SonarQube server URL (e.g., https://sonar.yourcompany.com)
  4. Paste your user token in the Token field
  5. Name the connection something descriptive like SonarQube-Production
  6. Check Grant access permission to all pipelines if you want all pipelines in the project to use it

The token should belong to a user with the Execute Analysis permission on the target SonarQube project. I recommend creating a dedicated service account rather than using a personal token — when someone leaves the team and their account gets deactivated, you do not want your pipelines breaking.

SonarQube Scanner for Azure DevOps

The SonarQube extension adds three pipeline tasks that work together:

  • SonarQubePrepare — configures the scanner, sets the project key, and defines analysis parameters
  • SonarQubeAnalyze — runs the actual analysis against your source code
  • SonarQubePublish — waits for the SonarQube server to process results and publishes the quality gate status back to the pipeline

These three tasks must run in that exact order. The Prepare task must run before any build or test steps that produce coverage data, while Analyze and Publish run after.

Pipeline Configuration

Here is a basic pipeline that integrates SonarQube analysis into a Node.js project:

trigger:
  branches:
    include:
      - main
      - develop

pr:
  branches:
    include:
      - main

pool:
  vmImage: 'ubuntu-latest'

steps:
  - task: NodeTool@0
    inputs:
      versionSpec: '20.x'
    displayName: 'Install Node.js'

  - task: SonarQubePrepare@6
    inputs:
      SonarQube: 'SonarQube-Production'
      scannerMode: 'CLI'
      configMode: 'manual'
      cliProjectKey: 'my-node-project'
      cliProjectName: 'My Node Project'
      cliSources: 'src'
      extraProperties: |
        sonar.javascript.lcov.reportPaths=coverage/lcov.info
        sonar.exclusions=**/node_modules/**,**/coverage/**,**/dist/**
        sonar.tests=test
        sonar.test.inclusions=**/*.test.js,**/*.spec.js
    displayName: 'Prepare SonarQube Analysis'

  - script: npm ci
    displayName: 'Install Dependencies'

  - script: npm test -- --coverage --coverageReporters=lcov --coverageReporters=text
    displayName: 'Run Tests with Coverage'

  - task: SonarQubeAnalyze@6
    displayName: 'Run SonarQube Analysis'

  - task: SonarQubePublish@6
    inputs:
      pollingTimeoutSec: '300'
    displayName: 'Publish Quality Gate Result'

The key detail here is ordering. The Prepare task sets up the scanner configuration before tests run, so the scanner knows where to find coverage data. Then after tests produce the LCOV file, the Analyze task picks it up and sends everything to the server.

Quality Gates as Pipeline Gates

A quality gate in SonarQube is a set of conditions that code must meet to be considered acceptable. The default "Sonar way" quality gate requires:

  • No new bugs
  • No new vulnerabilities
  • Code coverage on new code >= 80%
  • Duplication on new code < 3%

To make the pipeline fail when the quality gate fails, add a condition check after the Publish task:

  - task: SonarQubePublish@6
    inputs:
      pollingTimeoutSec: '300'
    displayName: 'Publish Quality Gate Result'

  - script: |
      if [ "$(Agent.JobStatus)" != "Succeeded" ]; then
        echo "##vso[task.logissue type=error]Quality Gate failed. Check SonarQube for details."
        exit 1
      fi
    displayName: 'Check Quality Gate Status'
    condition: always()

However, the cleaner approach is to configure this in SonarQube itself. Under Administration > Configuration > Webhooks, add a webhook pointing to your Azure DevOps organization. The SonarQubePublish task polls for the quality gate result, and if it fails, the task itself fails — which fails the pipeline. No extra script needed if the task is configured with the correct timeout.

In practice, I set the polling timeout to 300 seconds. On a busy SonarQube server with a large project queue, analysis can take a while. If you see timeouts, bump this value up or look into upgrading your SonarQube server resources.
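If you ever need to inspect the gate outside the pipeline, the server exposes it through the web API at /api/qualitygates/project_status?projectKey=<key>. Here is a sketch of picking the failing conditions out of the response; the JSON shape below is abridged from the real API, and in a real script you would fetch it with your token:

```javascript
// Extract failing conditions from an (abridged) project_status response.
function failedConditions(projectStatus) {
  return projectStatus.conditions.filter(function (c) {
    return c.status === 'ERROR';
  });
}

// Abridged example of what the endpoint returns:
var sampleResponse = {
  projectStatus: {
    status: 'ERROR',
    conditions: [
      { metricKey: 'new_coverage', status: 'ERROR', actualValue: '62.0', errorThreshold: '80' },
      { metricKey: 'new_duplicated_lines_density', status: 'OK', actualValue: '1.2', errorThreshold: '3' }
    ]
  }
};

var failing = failedConditions(sampleResponse.projectStatus);
console.log(failing.map(function (c) { return c.metricKey; })); // [ 'new_coverage' ]
```

This is handy for dashboards or chat notifications that go beyond what the pipeline shows.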

PR Decoration with Analysis Results

PR decoration puts SonarQube analysis results directly in your pull request comments. This is incredibly useful because developers see issues without leaving Azure DevOps.

To configure PR decoration:

  1. In SonarQube, go to Administration > Configuration > General Settings > DevOps Platform Integrations
  2. Select Azure DevOps
  3. Enter your Azure DevOps organization URL and a Personal Access Token with Code > Read & Write permissions
  4. In your SonarQube project settings, go to General Settings > DevOps Platform Integration and bind the project to your Azure DevOps repository

Once configured, every PR analysis will post a summary comment showing new issues, coverage changes, and the quality gate status. Individual issues also appear as inline comments on the affected lines of code.

For PR analysis to work, SonarQube needs to know it is analyzing a pull request. Azure Pipelines automatically sets the right environment variables when the pipeline runs on a PR trigger, and the SonarQube scanner picks them up. Just make sure your pipeline has a pr trigger defined.
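Under the hood, the mapping looks roughly like this. The environment variable names are Azure Pipelines predefined variables and the sonar.pullrequest.* properties are real scanner properties, but the function itself is purely an illustration; the SonarQube tasks do this for you:

```javascript
// Illustration of how the scanner derives PR analysis parameters from
// Azure Pipelines predefined variables. You never write this yourself.
function sonarPullRequestParams(env) {
  return {
    'sonar.pullrequest.key': env.SYSTEM_PULLREQUEST_PULLREQUESTID,
    'sonar.pullrequest.branch': env.SYSTEM_PULLREQUEST_SOURCEBRANCH,
    'sonar.pullrequest.base': env.SYSTEM_PULLREQUEST_TARGETBRANCH
  };
}

var params = sonarPullRequestParams({
  SYSTEM_PULLREQUEST_PULLREQUESTID: '42',
  SYSTEM_PULLREQUEST_SOURCEBRANCH: 'refs/heads/feature/login',
  SYSTEM_PULLREQUEST_TARGETBRANCH: 'refs/heads/main'
});
console.log(params['sonar.pullrequest.key']); // '42'
```

If PR decoration mysteriously does not work, checking that these variables are populated in your build logs is a good first diagnostic.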

sonar-project.properties Configuration

Instead of putting all your SonarQube configuration in the pipeline YAML, you can use a sonar-project.properties file at the root of your repository. This keeps analysis configuration version-controlled alongside your code:

sonar.projectKey=my-node-project
sonar.projectName=My Node Project
sonar.sources=src
sonar.tests=test
sonar.test.inclusions=**/*.test.js,**/*.spec.js

# Coverage
sonar.javascript.lcov.reportPaths=coverage/lcov.info

# Exclusions
sonar.exclusions=**/node_modules/**,**/coverage/**,**/dist/**,**/*.config.js

# Encoding
sonar.sourceEncoding=UTF-8

# Duplicate detection
sonar.cpd.javascript.minimumTokens=50
sonar.cpd.javascript.minimumLines=5

When you use this approach, change the Prepare task's configMode to file:

  - task: SonarQubePrepare@6
    inputs:
      SonarQube: 'SonarQube-Production'
      scannerMode: 'CLI'
      configMode: 'file'
    displayName: 'Prepare SonarQube Analysis'

I prefer the properties file approach for most projects. It means any developer can see and modify the analysis configuration without touching the pipeline YAML. It also makes it easier to keep configuration consistent if you run SonarQube locally during development.

Code Coverage Integration

SonarQube does not generate coverage data — it reads it. For Node.js projects using Jest, you need to configure Jest to output LCOV reports:

// jest.config.js
module.exports = {
  testEnvironment: 'node',
  roots: ['<rootDir>/test'],
  collectCoverageFrom: [
    'src/**/*.js',
    '!src/**/*.config.js',
    '!src/**/index.js'
  ],
  coverageDirectory: 'coverage',
  coverageReporters: ['lcov', 'text', 'text-summary'],
  testMatch: ['**/*.test.js', '**/*.spec.js']
};

The critical piece is the lcov reporter. SonarQube reads the coverage/lcov.info file to calculate line and branch coverage. The text and text-summary reporters are optional but useful for seeing coverage output in the pipeline logs.

If you have a monorepo or multiple coverage files, you can specify multiple paths:

sonar.javascript.lcov.reportPaths=packages/api/coverage/lcov.info,packages/web/coverage/lcov.info

One gotcha: make sure your coverage paths in sonar-project.properties are relative to the repository root, not the working directory of the test runner. If your paths are wrong, SonarQube will silently report 0% coverage without any error — a common source of confusion.
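A quick way to catch this locally is to pull the SF (source file) entries out of the report and check that they resolve from the repository root. A minimal sketch:

```javascript
// Extract the SF: (source file) entries from an LCOV report. SonarQube
// resolves these paths against the project root; if they do not resolve,
// coverage silently shows as 0%.
function lcovSourceFiles(lcovText) {
  return lcovText
    .split('\n')
    .filter(function (line) { return line.indexOf('SF:') === 0; })
    .map(function (line) { return line.slice(3).trim(); });
}

// Point it at the real report in your pipeline, e.g.:
// var fs = require('fs');
// lcovSourceFiles(fs.readFileSync('coverage/lcov.info', 'utf8'))
//   .filter(function (f) { return !fs.existsSync(f); });
var sample = 'SF:src/utils/validator.js\nDA:1,1\nDA:2,0\nend_of_record\n';
console.log(lcovSourceFiles(sample)); // [ 'src/utils/validator.js' ]
```

Any path that does not exist relative to the repo root is a path SonarQube will not match to a source file.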

Custom Quality Profiles for JavaScript

SonarQube comes with a built-in quality profile for JavaScript, but you can customize it to match your team's standards. Go to Quality Profiles > JavaScript and create a copy of the "Sonar way" profile.

Common customizations for Node.js projects:

  • Disable browser-specific rules if you are building a backend service. Rules about document and window usage are irrelevant for server-side code.
  • Increase cognitive complexity thresholds from the default 15 to 20 or 25 for complex business logic modules. I find the default too aggressive for data transformation functions.
  • Enable require statement rules if you are using CommonJS. The default profile leans toward ES modules.
  • Add security rules for Express.js patterns — checking for missing helmet, CSRF protection, and rate limiting.

To assign your custom profile to a project, go to Project Settings > Quality Profiles and select your custom profile for JavaScript.

A word of advice: do not activate every available rule. Start with the defaults, run analysis on your existing codebase, and add rules incrementally. If you activate 500 rules on a legacy project with 10,000 issues, nobody will take the results seriously.

Managing False Positives

Every static analysis tool produces false positives. SonarQube gives you several ways to handle them:

In-code suppression — Add a comment to suppress a specific rule on a line:

var password = process.env.DB_PASSWORD; // NOSONAR

Note that NOSONAR suppresses every issue reported on that line, not just the one you intended. The JavaScript analyzer does not support a per-rule inline suppression comment, so for rule-level control use the SonarQube UI or the issue exclusions described below.

In SonarQube UI — Mark individual issues as "Won't Fix" or "False Positive" with an explanation. These dismissals persist across analyses.

Issue exclusions — Under Administration > General Settings > Analysis Scope > Issues, you can exclude specific rules from specific file patterns:

Rule: javascript:S1848
File Pattern: **/migrations/**

My recommendation: use in-code suppression sparingly. If you find yourself suppressing the same rule across many files, either the rule is wrong for your project (disable it in the quality profile) or your code has a pattern that needs refactoring.

SonarQube vs SonarCloud

SonarCloud is the hosted version of SonarQube. Here is a practical comparison:

  • Hosting: SonarQube is self-managed; SonarCloud is SaaS (sonarcloud.io)
  • Cost: SonarQube is free in Community Edition, with paid tiers from Developer Edition up; SonarCloud is free for public repos, paid for private
  • Branch analysis: Developer Edition and above in SonarQube; all SonarCloud plans
  • PR decoration: Developer Edition and above in SonarQube; all SonarCloud plans
  • Maintenance: you handle SonarQube upgrades, backups, and scaling; SonarCloud is managed by SonarSource
  • Customization: SonarQube gives full control over rules, profiles, and plugins; SonarCloud has limited plugin support
  • Data location: your infrastructure for SonarQube; SonarSource's cloud for SonarCloud

For most teams, I recommend SonarCloud if you are using Azure DevOps in the cloud and do not have strict data residency requirements. It eliminates the operational overhead of running a SonarQube server. The Azure DevOps integration is also slightly smoother since SonarCloud was built with cloud CI/CD in mind.

If you need to keep analysis data on-premises, need custom plugins, or have compliance requirements that prohibit sending code metadata to a third party, go with self-hosted SonarQube. The Developer Edition at minimum — the Community Edition lacks branch analysis and PR decoration, which are critical for pipeline integration.

The pipeline configuration is nearly identical for both. The main difference is the task names (SonarCloudPrepare, SonarCloudAnalyze, SonarCloudPublish) and the service connection type.

Branch and PR Analysis

Branch analysis is essential for tracking quality across your branching strategy. SonarQube (Developer Edition+) supports analyzing multiple branches and comparing them against your main branch.

Configure your pipeline to pass branch information:

  - task: SonarQubePrepare@6
    inputs:
      SonarQube: 'SonarQube-Production'
      scannerMode: 'CLI'
      configMode: 'file'
    displayName: 'Prepare SonarQube Analysis'

The scanner automatically detects the branch name from Azure Pipelines environment variables (BUILD_SOURCEBRANCH for CI builds, SYSTEM_PULLREQUEST_SOURCEBRANCH for PR builds). You do not need to manually set sonar.branch.name — the Azure DevOps integration handles it.

For PR analysis, SonarQube performs a differential analysis. It compares the PR branch against the target branch and only flags issues in new or modified code. This is the "new code" concept that SonarQube calls the New Code Period. You can configure this under project settings to use:

  • Previous version — new code since the last version tag
  • Reference branch — new code compared to a specific branch (usually main)
  • Number of days — new code from the last N days

I recommend the Reference branch strategy set to your main branch. It gives the clearest picture of what a PR introduces.

Security Hotspot Review

Security hotspots are different from vulnerabilities. A vulnerability is a confirmed issue; a hotspot is code that might be a security issue and needs human review. Examples include:

  • Using eval() or Function() constructor
  • HTTP endpoints without authentication middleware
  • File system access based on user input
  • Regular expressions vulnerable to ReDoS

When SonarQube flags a security hotspot, someone on the team needs to review it and mark it as either:

  • Fixed — the code was changed to address the concern
  • Safe — the code is intentional and secure in this context
  • Not applicable — the rule does not apply here

I recommend making hotspot review part of your PR process. Assign a security champion on the team to review new hotspots weekly. In your quality gate, you can add a condition that no unreviewed security hotspots are allowed on new code.
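As a concrete example, here is the kind of code that triggers an eval() hotspot, alongside one way to rewrite it so a reviewer can mark the concern addressed. The operation-table approach is one option of mine, not a SonarQube prescription:

```javascript
// Typical security hotspot: dynamic code evaluation. The analyzer cannot
// know whether 'expr' is trusted, so it asks a human to review.
function runFormula(expr) {
  return eval(expr); // flagged for review
}

// One safe alternative: a fixed table of allowed operations instead of eval.
var operations = {
  double: function (x) { return x * 2; },
  square: function (x) { return x * x; }
};

function runOperation(name, value) {
  if (!Object.prototype.hasOwnProperty.call(operations, name)) {
    throw new Error('Unknown operation: ' + name);
  }
  return operations[name](value);
}

console.log(runOperation('square', 4)); // 16
```

After the rewrite, the reviewer can mark the hotspot as Fixed; if eval() genuinely operates only on trusted constants, Safe with a written justification is the right resolution.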

Complete Working Example

Here is a complete Azure Pipeline that runs SonarQube analysis on a Node.js project with Jest coverage, quality gate enforcement, and PR decoration.

First, the project structure:

my-node-api/
  src/
    app.js
    routes/
      users.js
    middleware/
      auth.js
    utils/
      validator.js
  test/
    routes/
      users.test.js
    middleware/
      auth.test.js
  sonar-project.properties
  jest.config.js
  package.json
  azure-pipelines.yml

package.json:

{
  "name": "my-node-api",
  "version": "1.0.0",
  "scripts": {
    "start": "node src/app.js",
    "test": "jest",
    "test:coverage": "jest --coverage",
    "lint": "eslint src/"
  },
  "dependencies": {
    "express": "^4.18.2",
    "helmet": "^7.1.0"
  },
  "devDependencies": {
    "eslint": "^8.56.0",
    "jest": "^29.7.0",
    "jest-junit": "^16.0.0",
    "supertest": "^6.3.3"
  }
}

jest.config.js:

module.exports = {
  testEnvironment: 'node',
  roots: ['<rootDir>/test'],
  collectCoverageFrom: [
    'src/**/*.js',
    '!src/app.js'
  ],
  coverageDirectory: 'coverage',
  // 'cobertura' writes coverage/cobertura-coverage.xml, which the
  // PublishCodeCoverageResults task in the pipeline below expects
  coverageReporters: ['lcov', 'text', 'text-summary', 'cobertura'],
  coverageThreshold: {
    global: {
      branches: 70,
      functions: 80,
      lines: 80,
      statements: 80
    }
  },
  // jest-junit writes coverage/junit.xml for the PublishTestResults task
  reporters: [
    'default',
    ['jest-junit', { outputDirectory: 'coverage', outputName: 'junit.xml' }]
  ],
  testMatch: ['**/*.test.js'],
  verbose: true
};

sonar-project.properties:

sonar.projectKey=my-node-api
sonar.projectName=My Node API
sonar.projectVersion=1.0.0

# Source configuration
sonar.sources=src
sonar.tests=test
sonar.test.inclusions=**/*.test.js

# Coverage
sonar.javascript.lcov.reportPaths=coverage/lcov.info

# Exclusions
sonar.exclusions=**/node_modules/**,**/coverage/**,**/dist/**
sonar.coverage.exclusions=src/app.js,**/*.config.js

# Encoding
sonar.sourceEncoding=UTF-8

# Duplicate detection thresholds
sonar.cpd.javascript.minimumTokens=40
sonar.cpd.javascript.minimumLines=5

src/utils/validator.js:

var validator = {};

validator.isValidEmail = function(email) {
  if (!email || typeof email !== 'string') {
    return false;
  }
  var pattern = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
  return pattern.test(email.trim());
};

validator.sanitizeInput = function(input) {
  if (!input || typeof input !== 'string') {
    return '';
  }
  return input
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#x27;');
};

validator.isValidId = function(id) {
  if (!id) return false;
  var parsed = parseInt(id, 10);
  return !isNaN(parsed) && parsed > 0 && String(parsed) === String(id);
};

module.exports = validator;

test/utils/validator.test.js:

var validator = require('../../src/utils/validator');

describe('validator', function() {
  describe('isValidEmail', function() {
    test('returns true for valid emails', function() {
      expect(validator.isValidEmail('user@example.com')).toBe(true);
      expect(validator.isValidEmail('first.last@sub.example.co')).toBe(true);
    });

    test('returns false for invalid emails', function() {
      expect(validator.isValidEmail('')).toBe(false);
      expect(validator.isValidEmail(null)).toBe(false);
      expect(validator.isValidEmail('notanemail')).toBe(false);
      expect(validator.isValidEmail('@domain.com')).toBe(false);
    });
  });

  describe('sanitizeInput', function() {
    test('escapes HTML characters', function() {
      var result = validator.sanitizeInput('<script>alert("xss")</script>');
      expect(result).toBe('&lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;');
    });

    test('handles empty input', function() {
      expect(validator.sanitizeInput('')).toBe('');
      expect(validator.sanitizeInput(null)).toBe('');
    });
  });

  describe('isValidId', function() {
    test('returns true for valid positive integers', function() {
      expect(validator.isValidId('1')).toBe(true);
      expect(validator.isValidId('999')).toBe(true);
    });

    test('returns false for invalid ids', function() {
      expect(validator.isValidId('0')).toBe(false);
      expect(validator.isValidId('-1')).toBe(false);
      expect(validator.isValidId('abc')).toBe(false);
      expect(validator.isValidId(null)).toBe(false);
    });
  });
});

azure-pipelines.yml:

trigger:
  branches:
    include:
      - main
      - develop
      - release/*

pr:
  branches:
    include:
      - main
      - develop

pool:
  vmImage: 'ubuntu-latest'

variables:
  - name: nodeVersion
    value: '20.x'
  - name: sonarProjectKey
    value: 'my-node-api'

stages:
  - stage: Build
    displayName: 'Build & Test'
    jobs:
      - job: BuildAndAnalyze
        displayName: 'Build, Test, and Analyze'
        steps:
          - task: NodeTool@0
            inputs:
              versionSpec: '$(nodeVersion)'
            displayName: 'Install Node.js $(nodeVersion)'

          - task: SonarQubePrepare@6
            inputs:
              SonarQube: 'SonarQube-Production'
              scannerMode: 'CLI'
              configMode: 'file'
            displayName: 'Prepare SonarQube Analysis'

          - script: npm ci
            displayName: 'Install Dependencies'

          - script: npm run lint
            displayName: 'Run ESLint'
            continueOnError: true

          - script: npm run test:coverage
            displayName: 'Run Tests with Coverage'

          - task: PublishTestResults@2
            inputs:
              testResultsFormat: 'JUnit'
              testResultsFiles: 'coverage/junit.xml'
              mergeTestResults: true
            displayName: 'Publish Test Results'
            condition: succeededOrFailed()

          - task: PublishCodeCoverageResults@2
            inputs:
              summaryFileLocation: 'coverage/cobertura-coverage.xml'
            displayName: 'Publish Coverage to Azure DevOps'
            condition: succeededOrFailed()

          - task: SonarQubeAnalyze@6
            displayName: 'Run SonarQube Analysis'

          - task: SonarQubePublish@6
            inputs:
              pollingTimeoutSec: '300'
            displayName: 'Publish Quality Gate Result'

This pipeline does the following in order:

  1. Installs Node.js 20
  2. Prepares the SonarQube scanner (reads sonar-project.properties)
  3. Installs npm dependencies
  4. Runs ESLint (continues even if linting fails, since SonarQube will also catch style issues)
  5. Runs Jest with coverage enabled
  6. Publishes test results and coverage to Azure DevOps (for the Tests and Coverage tabs)
  7. Runs the SonarQube analysis (sends source code and coverage data to the server)
  8. Publishes the quality gate result (fails the pipeline if the gate fails)

Common Issues and Troubleshooting

Coverage Shows 0% Despite Tests Passing

This is the number one issue I see. The cause is almost always a path mismatch. SonarQube looks for the LCOV file at the path specified in sonar.javascript.lcov.reportPaths, relative to the project root. Verify the file exists after tests run:

  - script: |
      echo "Checking coverage output..."
      ls -la coverage/
      head -20 coverage/lcov.info
    displayName: 'Debug Coverage Output'
    condition: always()

Also check that the file paths inside lcov.info match your source file paths. If Jest is configured with rootDir or module aliases, the paths in the LCOV file might not match what SonarQube expects.

Scanner Fails with "Not Authorized"

The user token you configured in the service connection needs the Execute Analysis permission on the SonarQube project. Go to SonarQube > Project Settings > Permissions and make sure the token's user has the right access. Also verify the token has not expired — SonarQube tokens can be configured with expiration dates.

Analysis Takes Too Long or Times Out

Large Node.js projects with thousands of files can take 10+ minutes to analyze. Solutions:

  • Exclude generated files, build output, and vendor directories using sonar.exclusions
  • Increase the pollingTimeoutSec on the Publish task
  • If you are on SonarQube Community Edition, note that the Compute Engine processes one analysis at a time, so analyses queue up behind each other. Parallel Compute Engine workers require Enterprise Edition.
  • Check your SonarQube server's Compute Engine logs for bottlenecks

Quality Gate Status Stays "Pending"

If the Publish task keeps polling but never gets a result, the most common cause is a missing or misconfigured webhook. SonarQube needs a webhook to notify Azure DevOps when analysis is complete. Go to Administration > Configuration > Webhooks and verify the webhook URL is correct and reachable from the SonarQube server.

Another cause: the SonarQube Compute Engine queue is backed up. Check Administration > Compute Engine to see if your analysis is stuck in the queue.

PR Decoration Not Appearing

Check these in order:

  1. The SonarQube project is bound to the correct Azure DevOps repository under project settings
  2. The Personal Access Token in the DevOps Platform Integration has Code > Read & Write scope
  3. The PAT has not expired
  4. The pipeline is running on a PR trigger (not just a CI trigger)
  5. You are using Developer Edition or higher (Community Edition does not support PR decoration)

"Project Not Found" During Analysis

The project key in your pipeline or sonar-project.properties must match an existing project in SonarQube. Either create the project manually in SonarQube first, or enable automatic project provisioning under Administration > Configuration > General Settings > DevOps Platform Integrations.

Best Practices

  • Use the properties file over inline YAML configuration. Keeping sonar-project.properties in your repository means analysis configuration is versioned, reviewable, and portable. Developers can also run local analysis with the same settings.

  • Set the New Code Period to "Reference Branch" against main. This gives the most meaningful quality gate results for pull requests. You want to know if the PR introduces issues, not whether the entire codebase meets the bar.

  • Do not skip the quality gate on failure. It is tempting to add continueOnError: true to the SonarQube tasks so builds do not block. Resist this. A quality gate that does not block anything is just a dashboard nobody checks.

  • Exclude test files from source analysis but include them in test analysis. Use sonar.sources for production code and sonar.tests for test code. SonarQube applies different rule sets to each. You do not want test helper functions flagged for missing error handling.

  • Run SonarQube analysis on every PR, not just main branch builds. Catching issues before merge is the entire point. If you only analyze on main, you are doing a post-mortem instead of a prevention.

  • Review security hotspots weekly. They pile up fast if ignored. Assign a rotating security reviewer and make it part of your sprint process. Unreviewed hotspots sitting at 200+ send the message that nobody cares about security.

  • Tune the quality profile before rolling out to the team. Run an initial analysis, review the results, and disable rules that are not relevant to your stack. A wall of irrelevant warnings trains developers to ignore all warnings.

  • Cache npm dependencies in your pipeline. SonarQube analysis itself is not slow, but the full pipeline including npm ci can be. Use Azure Pipelines caching to speed up dependency installation:

  - task: Cache@2
    inputs:
      key: 'npm | "$(Agent.OS)" | package-lock.json'
      restoreKeys: |
        npm | "$(Agent.OS)"
      path: '$(Pipeline.Workspace)/.npm'
    displayName: 'Cache npm packages'

  Note that the cache only helps if npm actually writes to that directory, so point the install step at it with npm ci --cache $(Pipeline.Workspace)/.npm --prefer-offline.

  • Monitor your SonarQube server's health. If the Compute Engine queue grows, pipelines will time out waiting for quality gate results. Set up alerting on queue depth and analysis duration.

  • Keep SonarQube updated. Each release improves JavaScript/TypeScript analysis significantly. The rules get smarter, false positives decrease, and new security checks are added. Budget time for quarterly upgrades.
