GitHub and Azure DevOps: Hybrid Workflows
A comprehensive guide to building hybrid workflows that leverage both GitHub and Azure DevOps, including repository mirroring, pipeline triggering across platforms, and unified work item tracking for teams that operate in both ecosystems.
Overview
Most organizations do not live on a single platform. You inherit GitHub repositories from an acquisition, your open-source work lives on GitHub while your enterprise CI/CD runs on Azure DevOps, or your developers prefer GitHub pull requests while your project managers insist on Azure Boards. Whatever the reason, running both platforms side by side is common, and building workflows that bridge them is necessary. I have set up these hybrid configurations for multiple teams, and the key is treating the integration points as first-class infrastructure rather than afterthoughts.
Prerequisites
- An Azure DevOps organization with a project and at least one repository
- A GitHub account with at least one repository
- Azure DevOps Personal Access Token (PAT) with Code (Read & Write), Build (Read & Execute), and Work Items (Read & Write) scopes
- GitHub Personal Access Token (classic) with repo, workflow, and admin:repo_hook scopes
- Node.js 16 or later for scripting examples
- Basic familiarity with both Azure Pipelines YAML and GitHub Actions workflow syntax
Why Hybrid Workflows Exist
The question is never whether to use GitHub or Azure DevOps. The question is which parts of each platform make sense for your team. GitHub has superior pull request workflows, a massive open-source ecosystem, Copilot integration, and GitHub Actions for straightforward CI. Azure DevOps has richer enterprise features — Azure Boards for work tracking, Azure Artifacts for package management, deployment gates, approval flows, and deep integration with Azure cloud resources.
Teams end up in hybrid configurations for several reasons. Acquisitions bring GitHub repositories into an Azure DevOps shop. Open-source projects live on GitHub but internal builds and deployments run through Azure Pipelines for compliance. Mobile teams prefer GitHub while backend teams are on Azure Repos. Rather than forcing a migration, bridging the platforms is often the pragmatic choice.
Repository Mirroring
The most fundamental hybrid workflow is keeping a repository synchronized between GitHub and Azure Repos. This lets developers push to their preferred platform while builds and deployments trigger from the other.
One-Way Mirror: GitHub to Azure Repos
The simplest approach mirrors GitHub as the source of truth, with Azure Repos receiving pushes automatically. Azure Pipelines then trigger from the Azure Repos copy.
Create a pipeline in Azure DevOps that runs on a schedule or webhook:
# azure-pipelines-mirror.yml
trigger: none

schedules:
  - cron: "*/10 * * * *"
    displayName: "Mirror from GitHub every 10 minutes"
    branches:
      include:
        - main
    always: true

pool:
  vmImage: "ubuntu-latest"

steps:
  - checkout: none
  - script: |
      git clone --mirror https://$(GITHUB_PAT)@github.com/your-org/your-repo.git mirror-repo
      cd mirror-repo
      git remote add azure https://$(AZURE_PAT)@dev.azure.com/your-org/your-project/_git/your-repo
      git push azure --mirror
    displayName: "Mirror GitHub to Azure Repos"
    env:
      GITHUB_PAT: $(GitHubPAT)
      AZURE_PAT: $(AzureDevOpsPAT)
Store both PATs as secret pipeline variables. The --mirror flag ensures all branches, tags, and other refs are synchronized, and it also deletes refs on the target that no longer exist on GitHub. Note that a mirror clone of a GitHub repository includes read-only refs such as refs/pull/*; if the target rejects pushes to those, push explicit refspecs for refs/heads/* and refs/tags/* instead of using --mirror.
Bidirectional Sync with Git Hooks
For teams that push to both platforms, bidirectional sync is trickier. You need a script that handles conflicts gracefully:
// sync-repos.js
var exec = require("child_process").execSync;
var path = require("path");
var fs = require("fs");
var config = {
workDir: path.join(__dirname, ".sync-workspace"),
github: {
url: process.env.GITHUB_REPO_URL,
remoteName: "github"
},
azure: {
url: process.env.AZURE_REPO_URL,
remoteName: "azure"
},
branches: ["main", "develop", "release/*"]
};
function ensureWorkspace() {
if (!fs.existsSync(config.workDir)) {
console.log("Initializing sync workspace...");
exec("git init --bare " + config.workDir);
exec("git -C " + config.workDir + " remote add " + config.github.remoteName + " " + config.github.url);
exec("git -C " + config.workDir + " remote add " + config.azure.remoteName + " " + config.azure.url);
}
}
function fetchAll() {
console.log("Fetching from GitHub...");
exec("git -C " + config.workDir + " fetch " + config.github.remoteName + " --prune", { stdio: "inherit" });
console.log("Fetching from Azure Repos...");
exec("git -C " + config.workDir + " fetch " + config.azure.remoteName + " --prune", { stdio: "inherit" });
}
function syncBranch(branch, source, target) {
var sourceRef = source + "/" + branch;
var targetRef = target + "/" + branch;
try {
var sourceHash = exec("git -C " + config.workDir + " rev-parse " + sourceRef).toString().trim();
var targetHash;
try {
targetHash = exec("git -C " + config.workDir + " rev-parse " + targetRef).toString().trim();
} catch (err) {
console.log("Branch " + branch + " does not exist on " + target + ", pushing...");
exec("git -C " + config.workDir + " push " + target + " " + sourceRef + ":refs/heads/" + branch);
return;
}
if (sourceHash === targetHash) {
console.log("Branch " + branch + " is in sync.");
return;
}
// Check if source is ahead of target (fast-forward possible)
try {
exec("git -C " + config.workDir + " merge-base --is-ancestor " + targetHash + " " + sourceHash);
console.log("Branch " + branch + ": " + source + " is ahead, pushing to " + target);
exec("git -C " + config.workDir + " push " + target + " " + sourceRef + ":refs/heads/" + branch);
} catch (err) {
console.log("WARNING: Branch " + branch + " has diverged between " + source + " and " + target);
console.log(" " + source + ": " + sourceHash);
console.log(" " + target + ": " + targetHash);
console.log(" Skipping — manual resolution required.");
}
} catch (err) {
console.error("Error syncing branch " + branch + ": " + err.message);
}
}
function run() {
ensureWorkspace();
fetchAll();
config.branches.forEach(function (pattern) {
if (pattern.indexOf("*") !== -1) {
// Glob pattern — list matching branches from both remotes
var output = exec("git -C " + config.workDir + " branch -r").toString();
var lines = output.split("\n");
var prefix = pattern.replace("/*", "/");
lines.forEach(function (line) {
var ref = line.trim();
if (ref.indexOf(config.github.remoteName + "/" + prefix) === 0) {
var branchName = ref.replace(config.github.remoteName + "/", "");
syncBranch(branchName, config.github.remoteName, config.azure.remoteName);
}
if (ref.indexOf(config.azure.remoteName + "/" + prefix) === 0) {
var branchName2 = ref.replace(config.azure.remoteName + "/", "");
syncBranch(branchName2, config.azure.remoteName, config.github.remoteName);
}
});
} else {
syncBranch(pattern, config.github.remoteName, config.azure.remoteName);
syncBranch(pattern, config.azure.remoteName, config.github.remoteName);
}
});
console.log("Sync complete.");
}
run();
Run this on a schedule or trigger it from webhooks on both platforms. The script only does fast-forward pushes — if branches diverge, it logs a warning and requires manual resolution. This prevents accidental force-pushes that destroy work.
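The fast-forward rule the script enforces reduces to a small pure function, which makes the policy easy to unit-test without touching git. The function name and return values below are illustrative, not part of the script above; isAncestor stands in for a call to git merge-base --is-ancestor:

```javascript
// Illustrative helper: decide what the sync script should do for one branch.
// isAncestor(a, b) must answer whether commit a is an ancestor of commit b.
function decideSyncAction(sourceHash, targetHash, isAncestor) {
  if (targetHash === null) { return "push"; }          // branch missing on target
  if (sourceHash === targetHash) { return "in-sync"; } // nothing to do
  if (isAncestor(targetHash, sourceHash)) { return "push"; } // fast-forward
  return "diverged"; // never force-push; leave for manual resolution
}
```

Plugging in a real isAncestor backed by git merge-base --is-ancestor recovers the behavior of syncBranch.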
Cross-Platform Pipeline Triggering
The most powerful hybrid pattern is triggering builds on one platform from events on the other.
Triggering Azure Pipelines from GitHub Actions
When code lives on GitHub but deployment pipelines live on Azure DevOps:
# .github/workflows/trigger-azure-pipeline.yml
name: Trigger Azure Pipeline

on:
  push:
    branches: [main]
  pull_request:
    types: [closed]

jobs:
  trigger-azure:
    if: github.event_name == 'push' || github.event.pull_request.merged == true
    runs-on: ubuntu-latest
    steps:
      - name: Trigger Azure DevOps Pipeline
        run: |
          curl -s -X POST \
            "https://dev.azure.com/${AZURE_ORG}/${AZURE_PROJECT}/_apis/pipelines/${PIPELINE_ID}/runs?api-version=7.1" \
            -H "Authorization: Basic $(echo -n ":${AZURE_PAT}" | base64)" \
            -H "Content-Type: application/json" \
            -d '{
              "resources": {
                "repositories": {
                  "self": {
                    "refName": "refs/heads/main"
                  }
                }
              },
              "templateParameters": {
                "githubSha": "'"${GITHUB_SHA}"'",
                "githubRef": "'"${GITHUB_REF}"'",
                "triggeredBy": "github-actions"
              }
            }'
        env:
          AZURE_ORG: your-org
          AZURE_PROJECT: your-project
          PIPELINE_ID: "42"
          AZURE_PAT: ${{ secrets.AZURE_DEVOPS_PAT }}
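If more than one workflow triggers the same pipeline, it can help to generate the request body with a shared helper rather than inline JSON in curl. A Node sketch follows; the templateParameters names mirror the workflow above and are assumptions about your pipeline, not a fixed API:

```javascript
// Build the JSON body for POST .../pipelines/{id}/runs?api-version=7.1.
function buildPipelineRunBody(sha, ref, triggeredBy) {
  return {
    resources: {
      repositories: {
        self: { refName: ref } // ref the pipeline run should use
      }
    },
    templateParameters: {
      githubSha: sha,
      githubRef: ref,
      triggeredBy: triggeredBy || "github-actions"
    }
  };
}
```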
Triggering GitHub Actions from Azure Pipelines
The reverse — triggering a GitHub Actions workflow from an Azure Pipeline — uses the GitHub repository dispatch API:
# azure-pipelines.yml
steps:
  - script: |
      curl -s -X POST \
        "https://api.github.com/repos/$(GITHUB_ORG)/$(GITHUB_REPO)/dispatches" \
        -H "Authorization: token $(GITHUB_PAT)" \
        -H "Accept: application/vnd.github.v3+json" \
        -d '{
          "event_type": "azure-pipeline-complete",
          "client_payload": {
            "buildId": "$(Build.BuildId)",
            "buildNumber": "$(Build.BuildNumber)",
            "sourceBranch": "$(Build.SourceBranch)",
            "status": "succeeded"
          }
        }'
    displayName: "Trigger GitHub Actions workflow"
    env:
      GITHUB_PAT: $(GitHubPAT)
On the GitHub side, define a workflow that handles the dispatch:
# .github/workflows/on-azure-complete.yml
name: Post Azure Pipeline

on:
  repository_dispatch:
    types: [azure-pipeline-complete]

jobs:
  post-deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Log trigger info
        run: |
          echo "Azure Build ID: ${{ github.event.client_payload.buildId }}"
          echo "Build Number: ${{ github.event.client_payload.buildNumber }}"
          echo "Branch: ${{ github.event.client_payload.sourceBranch }}"
      - name: Run post-deployment tests
        run: |
          # Run smoke tests, notify Slack, update status page, etc.
          echo "Running post-deployment verification..."
Unified Work Item Tracking
Teams split between platforms need a single source of truth for work items. Azure Boards has more features for sprint planning and reporting, so it often becomes the primary tracker even when development happens on GitHub.
Connecting Azure Boards to GitHub
Azure DevOps has a built-in GitHub connection. Navigate to Project Settings > GitHub connections and authenticate. Once connected, you can:
- Link GitHub commits to Azure Boards work items using AB#1234 syntax in commit messages
- See GitHub commit and PR info directly on work items
- Use the AB# prefix in PR descriptions to auto-link
This is the simplest integration and requires zero code. However, it only links — it does not sync status.
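If you automate around these links, you will need to pull the referenced work item IDs out of commit messages or PR bodies yourself. A minimal Node extractor for the AB#1234 pattern (a sketch; it de-duplicates repeated references):

```javascript
// Extract Azure Boards work item IDs (AB#1234) from free-form text.
function extractWorkItemIds(text) {
  var ids = [];
  var re = /AB#(\d+)/g;
  var match;
  while ((match = re.exec(text || "")) !== null) {
    var id = parseInt(match[1], 10);
    if (ids.indexOf(id) === -1) { ids.push(id); } // de-duplicate
  }
  return ids;
}
```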
Automated Status Sync
For bidirectional status updates, build a small middleware that listens to webhooks from both platforms:
// work-item-sync.js
var express = require("express");
var https = require("https");
var app = express();
app.use(express.json());
var AZURE_ORG = process.env.AZURE_ORG;
var AZURE_PROJECT = process.env.AZURE_PROJECT;
var AZURE_PAT = process.env.AZURE_PAT;
var GITHUB_TOKEN = process.env.GITHUB_TOKEN;
function azureRequest(method, path, body, callback) {
var auth = Buffer.from(":" + AZURE_PAT).toString("base64");
var options = {
hostname: "dev.azure.com",
path: "/" + AZURE_ORG + "/" + AZURE_PROJECT + "/_apis" + path + "?api-version=7.1",
method: method,
headers: {
"Authorization": "Basic " + auth,
"Content-Type": "application/json-patch+json"
}
};
var req = https.request(options, function (res) {
var data = "";
res.on("data", function (chunk) { data += chunk; });
res.on("end", function () {
// Guard against empty or non-JSON error responses (e.g. HTML error pages)
var parsed;
try { parsed = JSON.parse(data); } catch (e) { return callback(new Error("Non-JSON response (" + res.statusCode + ")")); }
callback(null, parsed);
});
});
req.on("error", callback);
if (body) { req.write(JSON.stringify(body)); }
req.end();
}
function githubRequest(method, path, body, callback) {
var options = {
hostname: "api.github.com",
path: path,
method: method,
headers: {
"Authorization": "token " + GITHUB_TOKEN,
"Accept": "application/vnd.github.v3+json",
"User-Agent": "work-item-sync"
}
};
var req = https.request(options, function (res) {
var data = "";
res.on("data", function (chunk) { data += chunk; });
res.on("end", function () {
// Same guard as azureRequest: do not crash on non-JSON responses
var parsed;
try { parsed = JSON.parse(data); } catch (e) { return callback(new Error("Non-JSON response (" + res.statusCode + ")")); }
callback(null, parsed);
});
});
req.on("error", callback);
if (body) { req.write(JSON.stringify(body)); }
req.end();
}
// GitHub issue opened -> Create Azure DevOps work item
app.post("/webhooks/github/issues", function (req, res) {
var payload = req.body;
if (payload.action !== "opened") {
return res.status(200).json({ message: "Ignored action: " + payload.action });
}
var issue = payload.issue;
var labels = issue.labels.map(function (l) { return l.name; });
var workItemType = labels.indexOf("bug") !== -1 ? "Bug" : "User Story";
var patchDoc = [
{ op: "add", path: "/fields/System.Title", value: issue.title },
{ op: "add", path: "/fields/System.Description", value: issue.body || "" },
{ op: "add", path: "/fields/System.Tags", value: "github-synced" },
{
op: "add",
path: "/fields/System.History",
value: "Created from GitHub issue <a href=\"" + issue.html_url + "\">#" + issue.number + "</a>"
}
];
// Encode the type: "User Story" contains a space, which https.request rejects in a URL path
azureRequest("POST", "/wit/workitems/$" + encodeURIComponent(workItemType), patchDoc, function (err, result) {
if (err) {
console.error("Failed to create work item:", err);
return res.status(500).json({ error: err.message });
}
console.log("Created work item " + result.id + " from GitHub issue #" + issue.number);
res.status(200).json({ workItemId: result.id });
});
});
// Azure DevOps work item updated -> Update GitHub issue
app.post("/webhooks/azure/workitems", function (req, res) {
var payload = req.body;
if (payload.eventType !== "workitem.updated") {
return res.status(200).json({ message: "Ignored event: " + payload.eventType });
}
var workItem = payload.resource;
var fields = workItem.fields;
var tags = (fields["System.Tags"] || "").split(";").map(function (t) { return t.trim(); });
if (tags.indexOf("github-synced") === -1) {
return res.status(200).json({ message: "Not a GitHub-synced work item" });
}
// Extract GitHub issue number from history or a custom field
var githubRef = fields["Custom.GitHubIssueUrl"] || "";
var issueMatch = githubRef.match(/github\.com\/(.+?)\/(.+?)\/issues\/(\d+)/);
if (!issueMatch) {
return res.status(200).json({ message: "No GitHub issue linked" });
}
var owner = issueMatch[1];
var repo = issueMatch[2];
var issueNumber = issueMatch[3];
var state = fields["System.State"];
var githubState = (state === "Closed" || state === "Done" || state === "Resolved") ? "closed" : "open";
var updateBody = {
state: githubState,
labels: [state.toLowerCase().replace(/ /g, "-")]
};
githubRequest("PATCH", "/repos/" + owner + "/" + repo + "/issues/" + issueNumber, updateBody, function (err, result) {
if (err) {
console.error("Failed to update GitHub issue:", err);
return res.status(500).json({ error: err.message });
}
console.log("Updated GitHub issue #" + issueNumber + " state to " + githubState);
res.status(200).json({ updated: true });
});
});
var PORT = process.env.PORT || 3500;
app.listen(PORT, function () {
console.log("Work item sync service listening on port " + PORT);
});
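The Azure-to-GitHub state mapping embedded in the handler above is worth extracting into its own helper so it can be unit-tested and extended as your process gains states. This is the same mapping the service applies, pulled out for clarity:

```javascript
// Map an Azure Boards state to a GitHub issue state ("open" or "closed").
function toGithubState(azureState) {
  var closedStates = ["Closed", "Done", "Resolved"];
  return closedStates.indexOf(azureState) !== -1 ? "closed" : "open";
}
```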
Pull Request Cross-Linking
When a developer opens a PR on GitHub that relates to Azure Boards work, you want the PR to appear on the work item and vice versa. Beyond the built-in AB# linking, you can add richer context with a GitHub Action:
# .github/workflows/link-pr-to-azure.yml
name: Link PR to Azure Boards

on:
  pull_request:
    types: [opened, edited]

jobs:
  link:
    runs-on: ubuntu-latest
    steps:
      - name: Extract work item IDs
        id: extract
        run: |
          BODY="${{ github.event.pull_request.body }}"
          # Match AB#1234 patterns
          IDS=$(echo "$BODY" | grep -oP 'AB#\K\d+' | sort -u | tr '\n' ',' | sed 's/,$//')
          echo "work_item_ids=$IDS" >> $GITHUB_OUTPUT
      - name: Add comment to Azure DevOps work items
        if: steps.extract.outputs.work_item_ids != ''
        run: |
          IFS=',' read -ra IDS <<< "${{ steps.extract.outputs.work_item_ids }}"
          for ID in "${IDS[@]}"; do
            # Updating a work item is a PATCH with a json-patch document
            curl -s -X PATCH \
              "https://dev.azure.com/${AZURE_ORG}/${AZURE_PROJECT}/_apis/wit/workitems/${ID}?api-version=7.1" \
              -H "Authorization: Basic $(echo -n ":${AZURE_PAT}" | base64)" \
              -H "Content-Type: application/json-patch+json" \
              -d '[
                {
                  "op": "add",
                  "path": "/fields/System.History",
                  "value": "GitHub PR <a href=\"'"${{ github.event.pull_request.html_url }}"'\">'"${{ github.event.pull_request.title }}"'</a> ('"${{ github.event.pull_request.state }}"')"
                }
              ]'
            echo "Linked PR to work item $ID"
          done
        env:
          AZURE_ORG: your-org
          AZURE_PROJECT: your-project
          AZURE_PAT: ${{ secrets.AZURE_DEVOPS_PAT }}
Complete Working Example: Hybrid CI/CD Bridge
This example builds a complete bridge service that synchronizes GitHub and Azure DevOps. It handles repository events, pipeline coordination, and status reporting across both platforms.
// hybrid-bridge.js
var express = require("express");
var crypto = require("crypto");
var https = require("https");
var app = express();
// Raw body for webhook signature verification
app.use(function (req, res, next) {
var chunks = [];
req.on("data", function (chunk) { chunks.push(chunk); });
req.on("end", function () {
req.rawBody = Buffer.concat(chunks);
try {
req.body = JSON.parse(req.rawBody.toString());
} catch (e) {
req.body = {};
}
next();
});
});
var config = {
github: {
token: process.env.GITHUB_TOKEN,
webhookSecret: process.env.GITHUB_WEBHOOK_SECRET,
org: process.env.GITHUB_ORG,
repo: process.env.GITHUB_REPO
},
azure: {
pat: process.env.AZURE_PAT,
org: process.env.AZURE_ORG,
project: process.env.AZURE_PROJECT,
pipelineId: parseInt(process.env.AZURE_PIPELINE_ID, 10)
},
port: parseInt(process.env.PORT, 10) || 4000
};
// --- HTTP helpers ---
function makeRequest(hostname, path, method, headers, body, callback) {
var options = {
hostname: hostname,
path: path,
method: method,
headers: headers
};
var req = https.request(options, function (res) {
var data = "";
res.on("data", function (chunk) { data += chunk; });
res.on("end", function () {
var parsed;
try { parsed = JSON.parse(data); } catch (e) { parsed = data; }
callback(null, res.statusCode, parsed);
});
});
req.on("error", callback);
if (body) { req.write(typeof body === "string" ? body : JSON.stringify(body)); }
req.end();
}
function githubApi(method, path, body, callback) {
var headers = {
"Authorization": "token " + config.github.token,
"Accept": "application/vnd.github.v3+json",
"User-Agent": "hybrid-bridge",
"Content-Type": "application/json"
};
makeRequest("api.github.com", path, method, headers, body, callback);
}
function azureApi(method, path, body, callback) {
var auth = Buffer.from(":" + config.azure.pat).toString("base64");
var fullPath = "/" + config.azure.org + "/" + config.azure.project + "/_apis" + path;
var separator = fullPath.indexOf("?") === -1 ? "?" : "&";
fullPath += separator + "api-version=7.1";
var contentType = "application/json";
if (method === "PATCH" && fullPath.indexOf("/wit/") !== -1) {
contentType = "application/json-patch+json";
}
var headers = {
"Authorization": "Basic " + auth,
"Content-Type": contentType
};
makeRequest("dev.azure.com", fullPath, method, headers, body, callback);
}
// --- Webhook verification ---
function verifyGithubSignature(req) {
if (!config.github.webhookSecret) { return true; }
var signature = req.headers["x-hub-signature-256"];
if (!signature) { return false; }
var expected = "sha256=" + crypto.createHmac("sha256", config.github.webhookSecret)
.update(req.rawBody)
.digest("hex");
var sigBuf = Buffer.from(signature);
var expBuf = Buffer.from(expected);
// timingSafeEqual throws if the buffers differ in length, so check that first
if (sigBuf.length !== expBuf.length) { return false; }
return crypto.timingSafeEqual(sigBuf, expBuf);
}
// --- Event handlers ---
function handleGithubPush(payload) {
var branch = payload.ref.replace("refs/heads/", "");
var commitCount = payload.commits.length;
var headCommit = payload.head_commit;
console.log("GitHub push: " + commitCount + " commit(s) to " + branch);
console.log(" Head commit: " + headCommit.id.substring(0, 8) + " - " + headCommit.message.split("\n")[0]);
// Trigger Azure Pipeline
var runBody = {
resources: {
repositories: {
self: {
refName: payload.ref
}
}
},
templateParameters: {
githubSha: headCommit.id,
githubBranch: branch,
githubPusher: payload.pusher.name,
source: "github-webhook"
}
};
azureApi("POST", "/pipelines/" + config.azure.pipelineId + "/runs", runBody, function (err, status, result) {
if (err) {
console.error("Failed to trigger Azure Pipeline:", err.message);
return;
}
if (status >= 200 && status < 300) {
console.log("Triggered Azure Pipeline run #" + result.id);
// Set commit status on GitHub
var statusBody = {
state: "pending",
target_url: result._links.web.href,
description: "Azure Pipeline #" + result.id + " triggered",
context: "azure-pipelines"
};
githubApi("POST", "/repos/" + config.github.org + "/" + config.github.repo + "/statuses/" + headCommit.id, statusBody, function (err2) {
if (err2) { console.error("Failed to set GitHub status:", err2.message); }
else { console.log("Set pending status on commit " + headCommit.id.substring(0, 8)); }
});
} else {
console.error("Azure Pipeline trigger failed (" + status + "):", JSON.stringify(result));
}
});
}
function handleGithubPullRequest(payload) {
var pr = payload.pull_request;
var action = payload.action;
console.log("GitHub PR #" + pr.number + " " + action + ": " + pr.title);
if (action === "opened" || action === "synchronize") {
// Trigger Azure Pipeline for PR validation
var runBody = {
resources: {
repositories: {
self: {
refName: "refs/pull/" + pr.number + "/merge"
}
}
},
templateParameters: {
githubPR: pr.number.toString(),
githubSha: pr.head.sha,
source: "github-pr"
}
};
azureApi("POST", "/pipelines/" + config.azure.pipelineId + "/runs", runBody, function (err, status, result) {
if (err || status >= 300) {
console.error("Failed to trigger PR validation pipeline");
return;
}
console.log("Triggered PR validation pipeline run #" + result.id);
});
}
}
function handleAzureBuildComplete(payload) {
var resource = payload.resource;
var buildNumber = resource.buildNumber;
var buildStatus = resource.status;
var buildResult = resource.result;
console.log("Azure Build #" + buildNumber + ": " + buildStatus + " (" + buildResult + ")");
// Extract GitHub SHA from build parameters
var params = resource.templateParameters || {};
var githubSha = params.githubSha;
var githubPR = params.githubPR;
if (!githubSha) {
console.log("No GitHub SHA in build parameters, skipping status update");
return;
}
var state;
var description;
switch (buildResult) {
case "succeeded":
state = "success";
description = "Azure Pipeline #" + buildNumber + " succeeded";
break;
case "failed":
state = "failure";
description = "Azure Pipeline #" + buildNumber + " failed";
break;
case "canceled":
state = "error";
description = "Azure Pipeline #" + buildNumber + " was canceled";
break;
default:
state = "pending";
description = "Azure Pipeline #" + buildNumber + " is " + buildResult;
}
var statusBody = {
state: state,
target_url: resource._links.web.href,
description: description,
context: "azure-pipelines"
};
githubApi("POST", "/repos/" + config.github.org + "/" + config.github.repo + "/statuses/" + githubSha, statusBody, function (err) {
if (err) { console.error("Failed to update GitHub status:", err.message); }
else { console.log("Updated GitHub commit status to " + state); }
});
// Add comment on PR if applicable
if (githubPR) {
var emoji = state === "success" ? "white_check_mark" : (state === "failure" ? "x" : "warning");
var comment = {
body: ":" + emoji + ": Azure Pipeline [#" + buildNumber + "](" + resource._links.web.href + ") " + buildResult + "\n\n" +
"**Duration:** " + calculateDuration(resource.startTime, resource.finishTime) + "\n" +
"**Triggered by:** " + (params.source || "unknown")
};
githubApi("POST", "/repos/" + config.github.org + "/" + config.github.repo + "/issues/" + githubPR + "/comments", comment, function (err) {
if (err) { console.error("Failed to add PR comment:", err.message); }
else { console.log("Added build result comment to PR #" + githubPR); }
});
}
}
function calculateDuration(startTime, finishTime) {
if (!startTime || !finishTime) { return "unknown"; }
var ms = new Date(finishTime).getTime() - new Date(startTime).getTime();
var seconds = Math.floor(ms / 1000);
var minutes = Math.floor(seconds / 60);
seconds = seconds % 60;
if (minutes > 0) { return minutes + "m " + seconds + "s"; }
return seconds + "s";
}
// --- Routes ---
app.post("/webhooks/github", function (req, res) {
if (!verifyGithubSignature(req)) {
console.error("Invalid GitHub webhook signature");
return res.status(401).json({ error: "Invalid signature" });
}
var event = req.headers["x-github-event"];
console.log("\nReceived GitHub event: " + event);
switch (event) {
case "push":
handleGithubPush(req.body);
break;
case "pull_request":
handleGithubPullRequest(req.body);
break;
case "ping":
console.log("GitHub webhook ping received");
break;
default:
console.log("Unhandled GitHub event: " + event);
}
res.status(200).json({ received: true });
});
app.post("/webhooks/azure", function (req, res) {
var eventType = req.body.eventType;
console.log("\nReceived Azure DevOps event: " + eventType);
switch (eventType) {
case "build.complete":
handleAzureBuildComplete(req.body);
break;
default:
console.log("Unhandled Azure event: " + eventType);
}
res.status(200).json({ received: true });
});
app.get("/health", function (req, res) {
res.json({
status: "ok",
uptime: process.uptime(),
github: { org: config.github.org, repo: config.github.repo },
azure: { org: config.azure.org, project: config.azure.project }
});
});
// --- Start ---
app.listen(config.port, function () {
console.log("Hybrid CI/CD Bridge running on port " + config.port);
console.log("GitHub: " + config.github.org + "/" + config.github.repo);
console.log("Azure DevOps: " + config.azure.org + "/" + config.azure.project);
console.log("\nEndpoints:");
console.log(" POST /webhooks/github - GitHub webhook receiver");
console.log(" POST /webhooks/azure - Azure DevOps webhook receiver");
console.log(" GET /health - Health check");
});
Deploy this as a small service (Azure App Service, Container Instance, or any Node.js host). Configure webhooks on both platforms to point to the appropriate endpoints. The bridge handles the event translation, status synchronization, and cross-platform notifications automatically.
Common Issues and Troubleshooting
PAT token scope errors when triggering pipelines
HTTP 403: The user does not have permission to queue builds
Your Azure DevOps PAT needs the Build: Read & execute scope, and GitHub tokens need repo and workflow for dispatching Actions. Both platforms let you edit the scopes of an existing token (Azure DevOps under User settings > Personal access tokens; GitHub in the classic token's settings page), so update the scopes or regenerate the token with the correct ones.
Repository mirror push fails with "remote rejected"
! [remote rejected] main -> main (TF402455: Pushes to this branch are not permitted)
Azure Repos branch policies block direct pushes. Either add the mirror service account as a bypass identity in branch policies, or use a service account that has "Bypass policies when pushing" permission.
GitHub webhook signature validation fails after secret rotation
Invalid GitHub webhook signature
When you rotate the webhook secret on GitHub, you must also update the GITHUB_WEBHOOK_SECRET environment variable on your bridge service. GitHub signs each delivery with whichever secret is configured at the moment of delivery, so there is no grace period with both secrets. Change the service and the GitHub setting as close together as possible, then redeliver anything that failed in between from the webhook's Recent Deliveries page, or temporarily disable signature verification during the transition.
Azure DevOps service hooks stop firing after project rename
Service hooks are tied to the project URL. If you rename your Azure DevOps project, existing service hook subscriptions become orphaned. Navigate to Project Settings > Service hooks and verify each subscription still shows a valid target URL. You may need to delete and recreate them.
Cross-platform commit status shows as "pending" indefinitely
azure-pipelines — Pending — Waiting for status to be reported
This happens when the Azure Pipeline run completes but the callback to GitHub fails — typically because the bridge service was down, the GitHub token expired, or the commit SHA in the pipeline parameters was wrong. Check the bridge service logs and verify the githubSha template parameter is being passed correctly through the pipeline.
Best Practices
Choose a single source of truth for code. Pick either GitHub or Azure Repos as the primary and mirror to the other. Bidirectional sync sounds appealing but creates merge conflicts and confusion about which platform has the latest changes.
Use service accounts for cross-platform tokens. Never use personal PATs for production integrations. Create dedicated service accounts on both platforms with only the permissions needed. This prevents token invalidation when someone leaves the team.
Implement webhook signature verification. Both GitHub and Azure DevOps support webhook secrets. Always verify signatures in production to prevent spoofed events from triggering pipelines or creating work items.
Log all cross-platform events. When something goes wrong in a hybrid workflow, debugging spans two platforms. Log every inbound webhook and outbound API call with timestamps, correlation IDs, and response codes so you can trace the full flow.
Set up monitoring for the bridge service. Your bridge is a single point of failure. If it goes down, cross-platform triggers stop working. Add health checks, uptime monitoring, and alerts. Consider running redundant instances behind a load balancer.
Version your integration configuration. Store webhook URLs, pipeline IDs, and mapping rules in configuration files checked into version control, not in portal settings. This makes the integration reproducible and auditable.
Test cross-platform flows end to end before going live. Create a test project on both platforms and verify the full loop: push on GitHub triggers Azure Pipeline, build result updates GitHub commit status, work items sync correctly. Do not test in production.
Handle API rate limits gracefully. Both GitHub and Azure DevOps throttle API traffic. GitHub allows 5,000 requests per hour for authenticated users; Azure DevOps throttles on consumed resources rather than a fixed request count, delaying or rejecting requests once a user exceeds its usage threshold. Add retry logic with exponential backoff to your bridge service, and honor Retry-After headers when you receive them.
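The schedule for that backoff can be precomputed and unit-tested separately from the request loop. A sketch (the base delay and cap values are illustrative):

```javascript
// Compute an exponential backoff schedule in milliseconds, capped at capMs.
function backoffDelays(maxAttempts, baseMs, capMs) {
  var delays = [];
  for (var i = 0; i < maxAttempts; i++) {
    delays.push(Math.min(baseMs * Math.pow(2, i), capMs)); // baseMs, 2x, 4x, ...
  }
  return delays;
}
```

Before each retry, sleep for delays[attempt] (plus jitter, in production) and stop once the schedule is exhausted.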