
GitHub and Azure DevOps: Hybrid Workflows

Build hybrid workflows combining GitHub repos with Azure Pipelines, Azure Boards, and cross-platform automation


Overview

Most teams treat GitHub and Azure DevOps as competing platforms. They are not. The strongest engineering workflows I have built in production combine GitHub as the source control and collaboration layer with Azure DevOps for enterprise CI/CD, work item tracking, and release management. This article covers the concrete patterns, automation code, and configuration you need to run a hybrid workflow that leverages the best of both platforms.

Prerequisites

  • An Azure DevOps organization with an active project
  • A GitHub account with at least one repository
  • Node.js 18+ installed locally
  • Basic familiarity with YAML pipelines and GitHub Actions
  • Azure CLI installed (az command available)
  • A Personal Access Token (PAT) for both GitHub and Azure DevOps

Why Use Both GitHub and Azure DevOps

The honest answer is that each platform is better at different things. GitHub has the developer community, open source ecosystem, pull request experience, and Copilot integration. Azure DevOps has enterprise-grade pipelines with deployment gates, approval workflows, variable groups, environments with checks, and Azure Boards for work item tracking that integrates directly with Azure resources.

In organizations I have worked with, the pattern emerges naturally. Developers want to work in GitHub. Project managers want Azure Boards. The ops team wants Azure Pipelines with its release management capabilities. Fighting this is a waste of energy. Building a hybrid workflow that respects each group's preferences produces better outcomes.

The key scenarios where hybrid makes sense:

  • Open source projects that need enterprise CI/CD behind the scenes
  • Regulated industries where Azure DevOps audit trails are required but developers prefer GitHub
  • Migration periods where you are moving from one platform to the other incrementally
  • Multi-team organizations where some teams use GitHub and others use Azure DevOps
  • Enterprise Azure shops that acquired companies running on GitHub

Azure Pipelines with GitHub Repos

The most common hybrid pattern is using Azure Pipelines to build and deploy code that lives in GitHub. Azure Pipelines has first-class GitHub integration through the Azure Pipelines GitHub App.

Setting Up the Connection

First, install the Azure Pipelines app in your GitHub organization. Then create a service connection in Azure DevOps:

  1. Go to Project Settings > Service connections
  2. Select "New service connection" > GitHub
  3. Choose "Azure Pipelines" as the authentication method
  4. Authorize and select your repositories

Now create an azure-pipelines.yml in the root of your GitHub repo:

trigger:
  branches:
    include:
      - main
      - release/*
  paths:
    exclude:
      - '*.md'
      - docs/*

pr:
  branches:
    include:
      - main

pool:
  vmImage: 'ubuntu-latest'

variables:
  - group: production-secrets
  - name: NODE_VERSION
    value: '20.x'

stages:
  - stage: Build
    displayName: 'Build and Test'
    jobs:
      - job: BuildJob
        steps:
          - task: NodeTool@0
            inputs:
              versionSpec: '$(NODE_VERSION)'
            displayName: 'Install Node.js'

          - script: |
              npm ci
              npm run lint
              npm test
            displayName: 'Install, Lint, and Test'

          - task: PublishTestResults@2
            inputs:
              testResultsFormat: 'JUnit'
              testResultsFiles: '**/test-results.xml'
            condition: always()

          - task: PublishCodeCoverageResults@2
            inputs:
              summaryFileLocation: 'coverage/cobertura-coverage.xml'

  - stage: Deploy
    displayName: 'Deploy to Production'
    dependsOn: Build
    condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
    jobs:
      - deployment: DeployJob
        environment: 'production'
        strategy:
          runOnce:
            deploy:
              steps:
                - script: |
                    npm ci --production
                    npm run deploy
                  displayName: 'Deploy Application'

The critical advantage here over GitHub Actions is the environment resource with its approval gates. You can require manual approval, business hours checks, and artifact validation before any deployment proceeds.

GitHub Actions with Azure Boards

The reverse pattern works too. If your CI runs in GitHub Actions but you track work in Azure Boards, you can automate the connection. Here is a GitHub Actions workflow that updates Azure Boards work items based on commit messages:

name: Update Azure Boards
on:
  push:
    branches: [main]
  pull_request:
    types: [opened, closed]

jobs:
  update-boards:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'

      - name: Link Commits to Work Items
        env:
          AZURE_DEVOPS_PAT: ${{ secrets.AZURE_DEVOPS_PAT }}
          AZURE_DEVOPS_ORG: ${{ secrets.AZURE_DEVOPS_ORG }}
          AZURE_DEVOPS_PROJECT: ${{ secrets.AZURE_DEVOPS_PROJECT }}
        run: node .github/scripts/update-work-items.js

The companion Node.js script that parses commit messages and updates Azure Boards:

var https = require("https");

var org = process.env.AZURE_DEVOPS_ORG;
var project = process.env.AZURE_DEVOPS_PROJECT;
var pat = process.env.AZURE_DEVOPS_PAT;

function extractWorkItemIds(commitMessage) {
    var pattern = /AB#(\d+)/g;
    var ids = [];
    var match;
    while ((match = pattern.exec(commitMessage)) !== null) {
        ids.push(parseInt(match[1], 10));
    }
    return ids;
}

function updateWorkItem(workItemId, commitSha, commitMessage, callback) {
    var token = Buffer.from(":" + pat).toString("base64");
    var patchBody = JSON.stringify([
        {
            op: "add",
            path: "/fields/System.History",
            value: "Commit " + commitSha.substring(0, 7) + ": " + commitMessage
        },
        {
            op: "add",
            path: "/relations/-",
            value: {
                rel: "ArtifactLink",
                // NOTE: a fully resolvable vstfs Git commit link takes the form
                // "{projectId}%2F{repositoryId}%2F{commitSha}"; the bare SHA here
                // is a simplification you will need to expand for your repo.
                url: "vstfs:///Git/Commit/" + encodeURIComponent(commitSha),
                attributes: {
                    name: "Fixed in Commit"
                }
            }
        }
    ]);

    var options = {
        hostname: "dev.azure.com",
        path: "/" + org + "/" + project + "/_apis/wit/workitems/" + workItemId + "?api-version=7.1",
        method: "PATCH",
        headers: {
            "Content-Type": "application/json-patch+json",
            "Authorization": "Basic " + token,
            "Content-Length": Buffer.byteLength(patchBody)
        }
    };

    var req = https.request(options, function(res) {
        var body = "";
        res.on("data", function(chunk) { body += chunk; });
        res.on("end", function() {
            if (res.statusCode >= 200 && res.statusCode < 300) {
                console.log("Updated work item #" + workItemId);
                callback(null);
            } else {
                callback(new Error("Failed to update work item #" + workItemId + ": " + res.statusCode));
            }
        });
    });

    req.on("error", callback);
    req.write(patchBody);
    req.end();
}

function processCommits() {
    var eventPath = process.env.GITHUB_EVENT_PATH;
    if (!eventPath) {
        console.log("No GitHub event data found");
        return;
    }

    var fs = require("fs");
    var event = JSON.parse(fs.readFileSync(eventPath, "utf8"));
    var commits = event.commits || [];

    console.log("Processing " + commits.length + " commits");

    var pending = 0;
    var errors = [];

    commits.forEach(function(commit) {
        var ids = extractWorkItemIds(commit.message);
        ids.forEach(function(id) {
            pending++;
            updateWorkItem(id, commit.id, commit.message, function(err) {
                if (err) errors.push(err);
                pending--;
                if (pending === 0) {
                    if (errors.length > 0) {
                        console.error("Errors:", errors.map(function(e) { return e.message; }));
                        process.exit(1);
                    }
                    console.log("All work items updated successfully");
                }
            });
        });
    });

    if (pending === 0) {
        console.log("No work item references found in commits");
    }
}

processCommits();

Use the AB#123 syntax in your commit messages. Azure DevOps recognizes this pattern natively when the GitHub integration is configured, but this script gives you additional control over what happens during the update.

Cross-Platform Branch Policies

One challenge in hybrid setups is enforcing consistent branch policies. GitHub has branch protection rules. Azure DevOps has branch policies. You need both configured if pipelines run on Azure but code reviews happen on GitHub.

On the GitHub side, configure branch protection for main:

  • Require pull request reviews (at least 1 approval)
  • Require status checks to pass (include your Azure Pipeline status)
  • Require branches to be up to date before merging
  • Require signed commits if your organization mandates it

On the Azure DevOps side, branch policies only apply to Azure Repos, so for a GitHub-hosted repo the equivalent lever is the pr trigger in your pipeline: it runs a validation build on every pull request, and the pipeline status reports back to GitHub as a commit status check that branch protection can then require.

Here is a Node.js utility that audits branch policies across both platforms:

var https = require("https");

var config = {
    github: {
        token: process.env.GITHUB_TOKEN,
        owner: process.env.GITHUB_OWNER,
        repo: process.env.GITHUB_REPO
    },
    azureDevOps: {
        pat: process.env.AZURE_DEVOPS_PAT,
        org: process.env.AZURE_DEVOPS_ORG,
        project: process.env.AZURE_DEVOPS_PROJECT
    }
};

function githubRequest(path, callback) {
    var options = {
        hostname: "api.github.com",
        path: path,
        method: "GET",
        headers: {
            "Authorization": "token " + config.github.token,
            "User-Agent": "branch-policy-auditor",
            "Accept": "application/vnd.github.v3+json"
        }
    };

    var req = https.request(options, function(res) {
        var body = "";
        res.on("data", function(chunk) { body += chunk; });
        res.on("end", function() {
            callback(null, JSON.parse(body));
        });
    });
    req.on("error", callback);
    req.end();
}

function azureDevOpsRequest(path, callback) {
    var token = Buffer.from(":" + config.azureDevOps.pat).toString("base64");
    var options = {
        hostname: "dev.azure.com",
        path: "/" + config.azureDevOps.org + "/" + config.azureDevOps.project + path,
        method: "GET",
        headers: {
            "Authorization": "Basic " + token,
            "Accept": "application/json"
        }
    };

    var req = https.request(options, function(res) {
        var body = "";
        res.on("data", function(chunk) { body += chunk; });
        res.on("end", function() {
            callback(null, JSON.parse(body));
        });
    });
    req.on("error", callback);
    req.end();
}

function auditBranchPolicies() {
    var owner = config.github.owner;
    var repo = config.github.repo;

    githubRequest("/repos/" + owner + "/" + repo + "/branches/main/protection", function(err, ghPolicy) {
        if (err) {
            console.error("Failed to fetch GitHub branch protection:", err.message);
            return;
        }

        console.log("=== GitHub Branch Protection ===");
        console.log("PR reviews required:", ghPolicy.required_pull_request_reviews ? "Yes" : "No");
        console.log("Status checks required:", ghPolicy.required_status_checks ? "Yes" : "No");
        console.log("Up to date required:", ghPolicy.required_status_checks ? ghPolicy.required_status_checks.strict : "N/A");

        if (ghPolicy.required_status_checks && ghPolicy.required_status_checks.contexts) {
            console.log("Required status checks:", ghPolicy.required_status_checks.contexts.join(", "));
        }

        azureDevOpsRequest("/_apis/policy/configurations?api-version=7.1", function(err, azPolicy) {
            if (err) {
                console.error("Failed to fetch Azure DevOps policies:", err.message);
                return;
            }

            console.log("\n=== Azure DevOps Branch Policies ===");
            var policies = azPolicy.value || [];
            policies.forEach(function(policy) {
                if (policy.isEnabled) {
                    console.log("Policy:", policy.type.displayName, "- Blocking:", policy.isBlocking);
                }
            });

            console.log("\n=== Compliance Summary ===");
            var issues = [];
            if (!ghPolicy.required_pull_request_reviews) {
                issues.push("GitHub: PR reviews not required on main");
            }
            if (!ghPolicy.required_status_checks) {
                issues.push("GitHub: Status checks not required on main");
            }
            if (policies.filter(function(p) { return p.isEnabled && p.isBlocking; }).length === 0) {
                issues.push("Azure DevOps: No blocking policies configured");
            }

            if (issues.length === 0) {
                console.log("All branch policies are properly configured.");
            } else {
                console.log("Issues found:");
                issues.forEach(function(issue) { console.log("  - " + issue); });
            }
        });
    });
}

auditBranchPolicies();

PR Mirroring Strategies

Some organizations need pull requests visible in both platforms. This is common during migrations or when different teams use different tools. There are two approaches:

Webhook-based mirroring creates a shadow PR in Azure DevOps whenever a GitHub PR is opened. The shadow PR exists purely for visibility in Azure Boards and dashboards. It does not accept code changes.

Status synchronization is the lighter approach. Instead of mirroring the entire PR, you synchronize statuses. When a GitHub PR is approved, a webhook updates the associated Azure Boards work item. When an Azure Pipeline completes, it posts back to GitHub as a commit status.

I recommend status synchronization. Full PR mirroring adds complexity with minimal benefit, and it creates confusion about which platform is the source of truth.
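
Most of the work in status synchronization is translating one platform's vocabulary into the other's. A minimal sketch of the mapping from Azure Pipelines run results to GitHub commit status states (treating partial success as passing is my assumption, not a rule):

```javascript
// Map an Azure Pipelines run result onto one of the four GitHub
// commit status states: "error", "failure", "pending", "success".
function toGitHubStatus(pipelineResult) {
    switch (pipelineResult) {
        case "succeeded":
            return "success";
        case "partiallySucceeded":
            return "success"; // assumption: count partial success as passing
        case "failed":
            return "failure";
        case "canceled":
            return "error";
        default:
            return "pending"; // in-progress or unknown results
    }
}
```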

Work Item Linking from GitHub Commits

The AB# syntax is the foundation of cross-platform traceability. When you include AB#123 in a commit message pushed to a GitHub repo connected to Azure DevOps, the work item automatically gets a link to that commit.

The convention I enforce on teams:

feat: add user authentication AB#456

Implements OAuth2 flow with PKCE for the SPA client.
Resolves AB#456, related to AB#400

You can also use keywords for state transitions:

  • Fixes AB#123 moves the work item to Done/Closed
  • Fix AB#123 and Fixed AB#123 behave the same way (Azure Boards recognizes the fix, fixes, and fixed keywords; Resolves is a GitHub Issues keyword and is not recognized)
  • AB#123 on its own creates a link without a state change
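
These rules are easy to encode if you process commit messages yourself, as the earlier scripts do. A small helper that classifies each mention (the return shape here is my own convention, not part of the integration):

```javascript
// Extract AB# mentions and whether a fix keyword precedes them.
// "Fixes AB#123" / "Fixed AB#123" / "Fix AB#123" -> transition the item;
// a bare "AB#123" -> link only.
function classifyMentions(message) {
    var pattern = /(\b[Ff]ix(?:es|ed)?\s+)?AB#(\d+)/g;
    var results = [];
    var match;
    while ((match = pattern.exec(message)) !== null) {
        results.push({
            id: parseInt(match[2], 10),
            transition: match[1] ? "close" : "link"
        });
    }
    return results;
}
```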

Configure this in Azure DevOps under Project Settings > GitHub Connections. Select your connected repos and enable work item mention resolution.

Azure Boards GitHub App

The Azure Boards app for GitHub is the official integration point. Install it from the GitHub Marketplace and connect it to your Azure DevOps organization.

What it enables:

  • Automatic linking: AB# mentions in commits, PRs, and branches create links
  • Status badges: Work item status shows in GitHub PR descriptions
  • Transition triggers: Merging a PR with Fixes AB#123 moves the work item to Done
  • PR annotations: The bot adds a comment to GitHub PRs showing linked work items

The setup requires admin access to both the GitHub organization and the Azure DevOps project. After installation, go to Azure DevOps > Project Settings > GitHub Connections and add your repositories.

One thing the documentation does not emphasize enough: the Azure Boards app only works with GitHub.com, not GitHub Enterprise Server, unless you are on Azure DevOps Server 2020 or later with specific network configurations.

Unified CI/CD Across Platforms

For teams that use both GitHub Actions and Azure Pipelines, you need a strategy to avoid duplicating pipeline definitions. My preferred approach is a shared Node.js build script that both platforms call:

// scripts/build.js
var childProcess = require("child_process");
var fs = require("fs");
var path = require("path");

var BUILD_DIR = path.join(__dirname, "..", "dist");
var PLATFORM = process.env.CI_PLATFORM || "local";

function exec(command) {
    console.log("[" + PLATFORM + "] Running: " + command);
    try {
        childProcess.execSync(command, { stdio: "inherit", cwd: path.join(__dirname, "..") });
    } catch (err) {
        console.error("Command failed: " + command);
        process.exit(1);
    }
}

function cleanBuildDir() {
    if (fs.existsSync(BUILD_DIR)) {
        fs.rmSync(BUILD_DIR, { recursive: true });
    }
    fs.mkdirSync(BUILD_DIR, { recursive: true });
}

function runTests() {
    exec("npm test -- --reporter mocha-junit-reporter --reporter-options mochaFile=test-results.xml");
}

function buildApp() {
    exec("npm run build");
}

function generateArtifacts() {
    var manifest = {
        version: require("../package.json").version,
        platform: PLATFORM,
        buildTime: new Date().toISOString(),
        commit: process.env.GITHUB_SHA || process.env.BUILD_SOURCEVERSION || "local"
    };

    fs.writeFileSync(
        path.join(BUILD_DIR, "build-manifest.json"),
        JSON.stringify(manifest, null, 2)
    );

    console.log("Build manifest:", JSON.stringify(manifest, null, 2));
}

function main() {
    console.log("Starting build on platform: " + PLATFORM);
    cleanBuildDir();
    runTests();
    buildApp();
    generateArtifacts();
    console.log("Build complete");
}

main();

Then both pipeline definitions call the same script:

azure-pipelines.yml:

steps:
  - script: node scripts/build.js
    env:
      CI_PLATFORM: azure-pipelines

GitHub Actions:

steps:
  - run: node scripts/build.js
    env:
      CI_PLATFORM: github-actions

This pattern ensures identical build behavior regardless of which CI platform runs the job.

Repository Sync Patterns

If you need to maintain mirrors between GitHub and Azure Repos, there are a few approaches:

One-way sync (GitHub to Azure Repos): Use Azure Pipelines to push on every commit:

trigger:
  branches:
    include:
      - '*'

pool:
  vmImage: 'ubuntu-latest'

steps:
  - checkout: self
    persistCredentials: true

  - script: |
      git remote add azure-mirror https://$(AZURE_REPOS_PAT)@dev.azure.com/$(ORG)/$(PROJECT)/_git/$(REPO)
      git push azure-mirror --all --force
      git push azure-mirror --tags --force
    displayName: 'Sync to Azure Repos'

Bidirectional sync is harder and I generally advise against it. Merge conflicts between platforms are painful. If you absolutely need it, use a dedicated sync service or a cron job that detects divergence and raises alerts rather than auto-merging.

Event-driven sync with webhooks is the cleanest approach for selective sync:

// sync-webhook-server.js
var http = require("http");
var crypto = require("crypto");
var childProcess = require("child_process");

var WEBHOOK_SECRET = process.env.WEBHOOK_SECRET;
var SYNC_DIR = process.env.SYNC_DIR || "/tmp/repo-sync";

function verifySignature(payload, signature) {
    var expected = "sha256=" + crypto
        .createHmac("sha256", WEBHOOK_SECRET)
        .update(payload)
        .digest("hex");
    // timingSafeEqual throws on length mismatch, so guard first
    var sigBuf = Buffer.from(signature);
    var expBuf = Buffer.from(expected);
    return sigBuf.length === expBuf.length && crypto.timingSafeEqual(sigBuf, expBuf);
}

function syncRepository(repoUrl, targetUrl) {
    // Remove leftovers from a previous failed run before cloning,
    // otherwise the bare clone will fail if the directory exists
    childProcess.execSync("rm -rf " + SYNC_DIR);

    var commands = [
        "git clone --bare " + repoUrl + " " + SYNC_DIR,
        "cd " + SYNC_DIR + " && git push --mirror " + targetUrl
    ];

    commands.forEach(function(cmd) {
        try {
            childProcess.execSync(cmd, { stdio: "inherit" });
        } catch (err) {
            console.error("Sync failed for command: " + cmd);
            throw err;
        }
    });

    childProcess.execSync("rm -rf " + SYNC_DIR);
}

var server = http.createServer(function(req, res) {
    if (req.method !== "POST" || req.url !== "/webhook") {
        res.writeHead(404);
        res.end("Not Found");
        return;
    }

    var body = "";
    req.on("data", function(chunk) { body += chunk; });
    req.on("end", function() {
        var signature = req.headers["x-hub-signature-256"];
        if (!signature || !verifySignature(body, signature)) {
            res.writeHead(401);
            res.end("Unauthorized");
            return;
        }

        var event = JSON.parse(body);
        console.log("Received push event for " + event.repository.full_name);

        try {
            syncRepository(
                event.repository.clone_url,
                process.env.AZURE_REPOS_URL,
                event.ref
            );
            res.writeHead(200);
            res.end("Synced");
        } catch (err) {
            res.writeHead(500);
            res.end("Sync failed: " + err.message);
        }
    });
});

server.listen(process.env.PORT || 3000, function() {
    console.log("Sync webhook server listening on port " + (process.env.PORT || 3000));
});

Choosing GitHub vs Azure Repos

Here is my honest comparison after running both in production:

| Aspect | GitHub | Azure Repos |
| --- | --- | --- |
| PR experience | Superior: better UI, better review tools, Copilot integration | Functional but dated |
| Open source | The standard, no contest | Not designed for it |
| Branch policies | Good with branch protection rules | More granular with path-based policies |
| CI integration | GitHub Actions is tightly integrated | Azure Pipelines is tightly integrated |
| Enterprise audit | GitHub Enterprise has good audit logs | Azure DevOps audit logs are more comprehensive |
| Code search | Excellent with GitHub code search | Basic search only |
| TFVC support | None | Full support for legacy repos |
| Pricing | Per-user for private repos | Included with Azure DevOps subscription |

My recommendation: use GitHub for source control unless you have a specific regulatory requirement that mandates Azure Repos, or you are already deeply invested in TFVC.

Migration Strategies

If you are migrating from Azure Repos to GitHub (the more common direction), plan it in phases:

Phase 1 - Parallel operation (2-4 weeks): Set up GitHub repos, mirror from Azure Repos, and run both pipelines. This validates that nothing breaks.

Phase 2 - Primary switch (1 week): Make GitHub the primary source. Developers push to GitHub. Sync still runs to Azure Repos for any remaining dependencies.

Phase 3 - Decommission (2-4 weeks): Remove Azure Repos dependencies. Update all webhook integrations. Archive old repos.

For work item migration, export Azure Boards items using the REST API and import into GitHub Issues if you are also switching project management tools. If you are keeping Azure Boards (which I recommend for enterprise teams), no migration is needed for work items.
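
If you do go the export route, the shape of the calls is: a WIQL query for IDs, batched fetches for full items, then issue creation on the GitHub side. A sketch of the payloads involved (the query text, field selection, and label mapping are illustrative assumptions):

```javascript
// Body for POST {org}/{project}/_apis/wit/wiql?api-version=7.1,
// which returns matching work item IDs.
function buildWiqlQuery(workItemType) {
    return {
        query: "SELECT [System.Id] FROM WorkItems WHERE [System.WorkItemType] = '" +
            workItemType + "' ORDER BY [System.Id]"
    };
}

// The WIQL response contains IDs only; full items are fetched in
// batches (the work items API caps a single request at 200 IDs).
function batchIds(ids, batchSize) {
    var batches = [];
    for (var i = 0; i < ids.length; i += batchSize) {
        batches.push(ids.slice(i, i + batchSize));
    }
    return batches;
}

// Convert an Azure Boards work item into a payload for
// POST /repos/{owner}/{repo}/issues. Label mapping is a simplification.
function toGitHubIssue(workItem) {
    var fields = workItem.fields || {};
    return {
        title: fields["System.Title"] || "(untitled)",
        body: (fields["System.Description"] || "") +
            "\n\nMigrated from Azure Boards work item #" + workItem.id,
        labels: [String(fields["System.WorkItemType"] || "imported").toLowerCase()]
    };
}
```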

Authentication Between Platforms with Node.js

Managing authentication across both platforms is a common pain point. Here is a utility module that handles both:

// lib/platform-auth.js
var https = require("https");

function AzureDevOpsClient(org, project, pat) {
    this.org = org;
    this.project = project;
    this.token = Buffer.from(":" + pat).toString("base64");
    this.baseUrl = "dev.azure.com";
}

AzureDevOpsClient.prototype.request = function(method, path, body, callback) {
    var self = this;
    var fullPath = "/" + self.org + "/" + self.project + path;

    if (fullPath.indexOf("api-version") === -1) {
        fullPath += (fullPath.indexOf("?") === -1 ? "?" : "&") + "api-version=7.1";
    }

    var options = {
        hostname: self.baseUrl,
        path: fullPath,
        method: method,
        headers: {
            "Authorization": "Basic " + self.token,
            "Content-Type": "application/json",
            "Accept": "application/json"
        }
    };

    if (body) {
        var bodyStr = JSON.stringify(body);
        options.headers["Content-Length"] = Buffer.byteLength(bodyStr);
    }

    var req = https.request(options, function(res) {
        var data = "";
        res.on("data", function(chunk) { data += chunk; });
        res.on("end", function() {
            if (res.statusCode >= 200 && res.statusCode < 300) {
                callback(null, JSON.parse(data || "{}"));
            } else {
                callback(new Error("Azure DevOps API error " + res.statusCode + ": " + data));
            }
        });
    });

    req.on("error", callback);
    if (body) req.write(JSON.stringify(body));
    req.end();
};

function GitHubClient(token) {
    this.token = token;
    this.baseUrl = "api.github.com";
}

GitHubClient.prototype.request = function(method, path, body, callback) {
    var options = {
        hostname: this.baseUrl,
        path: path,
        method: method,
        headers: {
            "Authorization": "token " + this.token,
            "User-Agent": "hybrid-workflow-client",
            "Accept": "application/vnd.github.v3+json",
            "Content-Type": "application/json"
        }
    };

    if (body) {
        var bodyStr = JSON.stringify(body);
        options.headers["Content-Length"] = Buffer.byteLength(bodyStr);
    }

    var req = https.request(options, function(res) {
        var data = "";
        res.on("data", function(chunk) { data += chunk; });
        res.on("end", function() {
            if (res.statusCode >= 200 && res.statusCode < 300) {
                callback(null, JSON.parse(data || "{}"));
            } else {
                callback(new Error("GitHub API error " + res.statusCode + ": " + data));
            }
        });
    });

    req.on("error", callback);
    if (body) req.write(JSON.stringify(body));
    req.end();
};

module.exports = {
    AzureDevOpsClient: AzureDevOpsClient,
    GitHubClient: GitHubClient
};

Complete Working Example

This is the full hybrid workflow: GitHub as the source repo, Azure Pipelines for CI/CD, Azure Boards for project management, and a Node.js orchestrator that ties everything together.

The Orchestrator

// hybrid-workflow.js
var platformAuth = require("./lib/platform-auth");
var http = require("http");
var crypto = require("crypto");

var GITHUB_WEBHOOK_SECRET = process.env.GITHUB_WEBHOOK_SECRET;
var AZURE_ORG = process.env.AZURE_DEVOPS_ORG;
var AZURE_PROJECT = process.env.AZURE_DEVOPS_PROJECT;
var AZURE_PAT = process.env.AZURE_DEVOPS_PAT;
var GITHUB_TOKEN = process.env.GITHUB_TOKEN;

var azdo = new platformAuth.AzureDevOpsClient(AZURE_ORG, AZURE_PROJECT, AZURE_PAT);
var github = new platformAuth.GitHubClient(GITHUB_TOKEN);

function verifyGitHubWebhook(payload, signature) {
    var expected = "sha256=" + crypto
        .createHmac("sha256", GITHUB_WEBHOOK_SECRET)
        .update(payload)
        .digest("hex");
    var sigBuf = Buffer.from(signature || "");
    var expBuf = Buffer.from(expected);
    // timingSafeEqual throws on length mismatch, so guard first
    // instead of crashing the server on a malformed header
    return sigBuf.length === expBuf.length && crypto.timingSafeEqual(sigBuf, expBuf);
}

function handlePushEvent(event) {
    var commits = event.commits || [];
    var repo = event.repository.full_name;

    console.log("Push to " + repo + " with " + commits.length + " commits");

    commits.forEach(function(commit) {
        var workItemPattern = /AB#(\d+)/g;
        var match;
        while ((match = workItemPattern.exec(commit.message)) !== null) {
            var workItemId = parseInt(match[1], 10);
            linkCommitToWorkItem(workItemId, commit, repo);
        }
    });
}

function handlePREvent(event) {
    var action = event.action;
    var pr = event.pull_request;

    console.log("PR #" + pr.number + " " + action + " in " + event.repository.full_name);

    if (action === "opened" || action === "edited") {
        var workItemPattern = /AB#(\d+)/g;
        var match;
        var body = pr.body || "";
        while ((match = workItemPattern.exec(body)) !== null) {
            var workItemId = parseInt(match[1], 10);
            updateWorkItemWithPR(workItemId, pr);
        }
    }

    if (action === "closed" && pr.merged) {
        var fixPattern = /[Ff]ixes AB#(\d+)/g;
        var fixMatch;
        var prBody = pr.body || "";
        while ((fixMatch = fixPattern.exec(prBody)) !== null) {
            var fixId = parseInt(fixMatch[1], 10);
            closeWorkItem(fixId, pr);
        }
    }
}

function linkCommitToWorkItem(workItemId, commit, repo) {
    var patchBody = [
        {
            op: "add",
            path: "/fields/System.History",
            value: "<a href=\"" + commit.url + "\">" + commit.id.substring(0, 7) + "</a> by " +
                commit.author.name + ": " + commit.message.split("\n")[0]
        }
    ];

    azdo.request("PATCH", "/_apis/wit/workitems/" + workItemId, patchBody, function(err) {
        if (err) {
            console.error("Failed to link commit to work item #" + workItemId + ":", err.message);
        } else {
            console.log("Linked commit " + commit.id.substring(0, 7) + " to AB#" + workItemId);
        }
    });
}

function updateWorkItemWithPR(workItemId, pr) {
    var patchBody = [
        {
            op: "add",
            path: "/fields/System.History",
            value: "PR <a href=\"" + pr.html_url + "\">#" + pr.number + "</a>: " + pr.title
        },
        {
            op: "replace",
            path: "/fields/System.State",
            value: "Active"
        }
    ];

    azdo.request("PATCH", "/_apis/wit/workitems/" + workItemId, patchBody, function(err) {
        if (err) {
            console.error("Failed to update work item #" + workItemId + ":", err.message);
        } else {
            console.log("Updated AB#" + workItemId + " with PR #" + pr.number);
        }
    });
}

function closeWorkItem(workItemId, pr) {
    var patchBody = [
        {
            op: "replace",
            path: "/fields/System.State",
            value: "Closed"
        },
        {
            op: "add",
            path: "/fields/System.History",
            value: "Resolved by merging PR <a href=\"" + pr.html_url + "\">#" + pr.number + "</a>"
        },
        {
            op: "add",
            path: "/fields/Microsoft.VSTS.Common.ResolvedReason",
            value: "Fixed"
        }
    ];

    azdo.request("PATCH", "/_apis/wit/workitems/" + workItemId, patchBody, function(err) {
        if (err) {
            console.error("Failed to close work item #" + workItemId + ":", err.message);
        } else {
            console.log("Closed AB#" + workItemId + " via PR #" + pr.number);
        }
    });
}

function postStatusToGitHub(owner, repo, sha, state, description, targetUrl) {
    var statusBody = {
        state: state,
        target_url: targetUrl,
        description: description,
        context: "azure-devops/work-items"
    };

    github.request("POST", "/repos/" + owner + "/" + repo + "/statuses/" + sha, statusBody, function(err) {
        if (err) {
            console.error("Failed to post GitHub status:", err.message);
        } else {
            console.log("Posted status to GitHub: " + state + " - " + description);
        }
    });
}

var server = http.createServer(function(req, res) {
    if (req.method !== "POST") {
        res.writeHead(405);
        res.end("Method Not Allowed");
        return;
    }

    var body = "";
    req.on("data", function(chunk) { body += chunk; });
    req.on("end", function() {
        var signature = req.headers["x-hub-signature-256"];
        if (!verifyGitHubWebhook(body, signature)) {
            res.writeHead(401);
            res.end("Invalid signature");
            return;
        }

        var event;
        try {
            event = JSON.parse(body);
        } catch (err) {
            res.writeHead(400);
            res.end("Invalid JSON");
            return;
        }

        var eventType = req.headers["x-github-event"];

        switch (eventType) {
            case "push":
                handlePushEvent(event);
                break;
            case "pull_request":
                handlePREvent(event);
                break;
            default:
                console.log("Ignoring event type: " + eventType);
        }

        res.writeHead(200);
        res.end("OK");
    });
});

var PORT = process.env.PORT || 3001;
server.listen(PORT, function() {
    console.log("Hybrid workflow orchestrator listening on port " + PORT);
    console.log("Azure DevOps org: " + AZURE_ORG + "/" + AZURE_PROJECT);
});

Azure Pipelines Configuration

The Azure Pipelines YAML that builds from the GitHub repo and reports status back to GitHub as commit checks:

# azure-pipelines.yml
trigger:
  branches:
    include:
      - main
      - feature/*

pr:
  branches:
    include:
      - main

pool:
  vmImage: 'ubuntu-latest'

variables:
  - group: production-config
  - name: isMain
    value: $[eq(variables['Build.SourceBranch'], 'refs/heads/main')]

stages:
  - stage: Validate
    displayName: 'Validate'
    jobs:
      - job: Lint
        steps:
          - task: NodeTool@0
            inputs:
              versionSpec: '20.x'
          - script: npm ci && npm run lint
            displayName: 'Lint'

      - job: Test
        steps:
          - task: NodeTool@0
            inputs:
              versionSpec: '20.x'
          - script: npm ci && npm test
            displayName: 'Test'
          - task: PublishTestResults@2
            inputs:
              testResultsFormat: 'JUnit'
              testResultsFiles: '**/test-results.xml'
            condition: always()

  - stage: Build
    displayName: 'Build'
    dependsOn: Validate
    jobs:
      - job: BuildApp
        steps:
          - task: NodeTool@0
            inputs:
              versionSpec: '20.x'
          - script: node scripts/build.js
            env:
              CI_PLATFORM: azure-pipelines
          - publish: $(System.DefaultWorkingDirectory)/dist
            artifact: app-dist

  - stage: DeployStaging
    displayName: 'Deploy to Staging'
    dependsOn: Build
    condition: eq(variables.isMain, true)
    jobs:
      - deployment: StagingDeploy
        environment: 'staging'
        strategy:
          runOnce:
            deploy:
              steps:
                - download: current
                  artifact: app-dist
                - script: |
                    echo "Deploying to staging..."
                    # Your staging deployment commands here
                  displayName: 'Deploy to Staging'

  - stage: DeployProduction
    displayName: 'Deploy to Production'
    dependsOn: DeployStaging
    condition: eq(variables.isMain, true)
    jobs:
      - deployment: ProductionDeploy
        environment: 'production'
        strategy:
          runOnce:
            deploy:
              steps:
                - download: current
                  artifact: app-dist
                - script: |
                    echo "Deploying to production..."
                    # Your production deployment commands here
                  displayName: 'Deploy to Production'

GitHub Webhook Setup

Configure the webhook on your GitHub repo (Settings > Webhooks) pointing to your orchestrator:

  • Payload URL: https://your-orchestrator.example.com/
  • Content type: application/json
  • Secret: Your GITHUB_WEBHOOK_SECRET value
  • Events: Push, Pull requests

The full workflow then operates as follows:

  1. Developer pushes code to GitHub with AB#123 in the commit message
  2. GitHub webhook fires to the orchestrator, which links the commit to Azure Boards work item #123
  3. Azure Pipelines triggers on the push (via the Azure Pipelines GitHub App), runs build and tests
  4. Pipeline status reports back to GitHub as a commit status check
  5. When a PR is merged with Fixes AB#123, the orchestrator closes the Azure Boards work item
  6. The pipeline deploys through staging and production with approval gates

Common Issues and Troubleshooting

Issue 1: Azure Pipelines status checks not appearing on GitHub PRs

This happens when the service connection between Azure DevOps and GitHub loses authorization. Go to Project Settings > Service Connections, find the GitHub connection, and click "Verify." If verification fails, re-authorize. Also check that the pipeline YAML file exists in the branch that the PR targets.

Issue 2: AB# links not creating connections in Azure Boards

The Azure Boards GitHub app must be installed and the specific repository must be connected in Azure DevOps under Project Settings > GitHub Connections. A common mistake is installing the app but not adding the repository in the Azure DevOps project. Also verify the commit is on a branch that Azure DevOps is monitoring.

Issue 3: Personal Access Tokens expiring mid-pipeline

PATs expire: Azure DevOps caps PAT lifetime at one year, and GitHub's fine-grained tokens carry expiration dates as well. Set calendar reminders. Better yet, use managed identities or OAuth apps for service-to-service authentication. For Azure Pipelines specifically, use the built-in service connection rather than PATs wherever possible. Store PAT expiration dates in your team's shared documentation.

Issue 4: Webhook delivery failures causing missed work item updates

GitHub shows webhook delivery history under Settings > Webhooks > Recent Deliveries. If your orchestrator was down, you can redeliver failed webhooks manually. For production setups, put a message queue (Azure Service Bus or SQS) in front of your orchestrator so that webhook payloads are persisted even if the processor is temporarily unavailable.
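The decoupling can be sketched without any cloud service: acknowledge the webhook immediately, park the payload, and process it separately. In production the array below would be Azure Service Bus or SQS (the function names are illustrative):

```javascript
// In-memory sketch of queue-based webhook handling: the HTTP handler
// only enqueues, so GitHub gets its 200 even when processing is slow or
// failing. An in-memory array does not survive restarts -- a real
// deployment would use Azure Service Bus or SQS here.
const pending = [];

function enqueueWebhook(eventType, payload) {
    pending.push({ eventType: eventType, payload: payload, attempts: 0 });
}

function drainQueue(processEvent) {
    while (pending.length > 0) {
        const item = pending.shift();
        try {
            processEvent(item);
        } catch (err) {
            item.attempts += 1;
            if (item.attempts < 3) pending.push(item);  // bounded retries, then drop
        }
    }
}

enqueueWebhook("push", { ref: "refs/heads/main" });
drainQueue(function (item) { console.log("processing " + item.eventType); });
```

The bounded retry matters: without it, one poison payload would loop forever and block everything behind it in the queue.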

Issue 5: Branch policy conflicts between platforms

When both GitHub branch protection and Azure DevOps branch policies enforce status checks, you can end up in a deadlock where each platform waits for the other. The fix is to make one platform authoritative for policies. Typically, GitHub branch protection is authoritative since it controls the merge button, and Azure Pipelines reports into it as a status check. Do not duplicate the same policy on both sides.

Issue 6: Rate limiting on cross-platform API calls

Both GitHub and Azure DevOps APIs enforce rate limits. GitHub allows 5,000 requests per hour for authenticated requests. Azure DevOps does not publish a simple request quota; it throttles by resource consumption over a sliding five-minute window, so bursts of API calls can be delayed or rejected. Implement exponential backoff in your orchestrator and batch API calls where possible. The orchestrator code above fires one API call per commit with no throttling, which could hit limits on large pushes. Add a queue for production use.
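A backoff wrapper matching the callback style of the orchestrator above might look like this — a sketch; the retry count and delay constants are arbitrary choices:

```javascript
// Retry a callback-style API call with exponential backoff and jitter.
// Wraps anything with the (callback(err, result)) shape used by the
// orchestrator's github.request / azdo.request helpers.
function withBackoff(fn, done, attempt) {
    attempt = attempt || 0;
    fn(function (err, result) {
        if (!err) return done(null, result);
        if (attempt >= 5) return done(err);                    // give up after 5 retries
        const delay = Math.min(30000, 500 * Math.pow(2, attempt))
            + Math.floor(Math.random() * 250);                 // jitter spreads retries out
        setTimeout(function () { withBackoff(fn, done, attempt + 1); }, delay);
    });
}

// Example: an API call that succeeds on the third attempt.
let attempts = 0;
withBackoff(
    function (cb) { attempts += 1; cb(attempts < 3 ? new Error("429 Too Many Requests") : null, "ok"); },
    function (err, result) { console.log("succeeded after " + attempts + " attempts: " + result); }
);
```

In practice, wrap each cross-platform call, e.g. withBackoff(function (cb) { github.request("POST", url, body, cb); }, done). For GitHub specifically, honoring a Retry-After response header when present beats a blind delay.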

Best Practices

  • Single source of truth for code: Pick GitHub or Azure Repos and make it authoritative. Never allow direct pushes to both. Mirror if you must, but one platform owns the code.

  • Use the AB# convention religiously: Train your team to include work item references in every commit. Set up commit message linting in a pre-commit hook to enforce this. The traceability it provides is worth the minor inconvenience.

  • Manage secrets in one place: Do not spread secrets across GitHub Secrets, Azure DevOps Variable Groups, and your orchestrator's environment. Pick one source of truth. Azure Key Vault integrated with both platforms is the cleanest approach.

  • Monitor webhook health: Set up alerts for webhook delivery failures. A silent failure in the integration layer means work items fall out of sync, which undermines the entire hybrid workflow.

  • Prefer service connections over PATs: Personal Access Tokens are tied to individual accounts and expire. Service connections, managed identities, and OAuth apps are more reliable for automated workflows. When you must use PATs, create them under a service account, not a personal account.

  • Document the workflow visually: Create a diagram showing how code flows from GitHub through Azure Pipelines to deployment, and how work items in Azure Boards connect to GitHub PRs and commits. New team members need this context to understand why two platforms are involved.

  • Keep pipeline definitions in the repo: Store azure-pipelines.yml in the GitHub repo, not defined purely in the Azure DevOps UI. This makes pipeline changes go through code review and keeps the pipeline versioned with the code.

  • Test the integration in a sandbox first: Set up a test Azure DevOps project and a test GitHub repo. Validate the entire workflow end to end before rolling it out to your production project. Integration failures between platforms are disruptive and visible.

  • Plan for migration: If you are using hybrid as a transition strategy, set clear milestones and deadlines. Hybrid workflows are powerful but they add operational complexity. Have a target end state, even if that end state is intentionally hybrid.
