Bitbucket Pipelines for Node.js Projects
A practical guide to configuring Bitbucket Pipelines for Node.js including caching, parallel steps, deployment environments, Docker builds, and pipeline optimization.
Bitbucket Pipelines is the built-in CI/CD service for Bitbucket repositories. Every repository gets Pipelines out of the box: no external service to wire up, no plugins to install. It runs builds in Docker containers, supports parallel steps, caching, and deployment environments, and integrates directly with Jira for issue tracking.
I have set up Pipelines for dozens of Node.js projects. The configuration is straightforward once you understand the structure. This guide covers everything from basic test pipelines to production deployment workflows.
Prerequisites
- A Bitbucket repository
- Pipelines enabled in repository settings
- Basic YAML knowledge
- A Node.js project
Pipeline Basics
# bitbucket-pipelines.yml
image: node:20

pipelines:
  default:
    - step:
        name: Install and Test
        caches:
          - node
        script:
          - npm ci
          - npm test
The default pipeline runs on every push to any branch. The image sets the Docker image for all steps.
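The top-level image can also be overridden on an individual step, which is handy when one step needs a different toolchain. A minimal sketch (the image tags here are illustrative):

```yaml
image: node:20

pipelines:
  default:
    - step:
        name: Test # runs in node:20, the default image
        script:
          - npm test
    - step:
        name: Audit
        image: node:18 # this step overrides the default image
        script:
          - npm audit
```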
Pipeline Structure
Multiple Steps
image: node:20

pipelines:
  default:
    - step:
        name: Install
        caches:
          - node
        script:
          - npm ci
        artifacts:
          - node_modules/**
    - step:
        name: Lint
        script:
          - npx eslint src/
    - step:
        name: Test
        script:
          - npm test
    - step:
        name: Build
        script:
          - npm run build
        artifacts:
          - dist/**
Steps run sequentially by default. Each step starts in a fresh Docker container. Artifacts pass files between steps.
Parallel Steps
pipelines:
  default:
    - step:
        name: Install
        caches:
          - node
        script:
          - npm ci
        artifacts:
          - node_modules/**
    - parallel:
        - step:
            name: Lint
            script:
              - npx eslint src/ --format compact
        - step:
            name: Unit Tests
            script:
              - npx jest --ci --coverage
        - step:
            name: Integration Tests
            script:
              - npm run test:integration
    - step:
        name: Build
        script:
          - npm run build
        artifacts:
          - dist/**
Parallel steps run simultaneously, reducing pipeline time. All parallel steps must pass before the pipeline continues to the next stage.
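Parallel groups also support a fail-fast option that cancels the remaining parallel steps as soon as one fails, saving build minutes on doomed runs. A sketch, assuming your workspace is on a Pipelines version that supports the newer parallel syntax with a steps key (check the current Bitbucket docs):

```yaml
- parallel:
    fail-fast: true # stop sibling steps when one fails
    steps:
      - step:
          name: Lint
          script:
            - npx eslint src/
      - step:
          name: Unit Tests
          script:
            - npx jest --ci
```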
Branch-Specific Pipelines
image: node:20

pipelines:
  default:
    - step:
        name: Test
        caches:
          - node
        script:
          - npm ci
          - npm test
  branches:
    main:
      - step:
          name: Test
          caches:
            - node
          script:
            - npm ci
            - npm test
      - step:
          name: Build
          script:
            - npm run build
          artifacts:
            - dist/**
      - step:
          name: Deploy to Staging
          deployment: staging
          script:
            - pipe: atlassian/rsync-deploy:0.12.0
              variables:
                USER: deploy
                SERVER: staging.myapp.com
                REMOTE_PATH: /var/www/myapp/
                LOCAL_PATH: dist/
    'release/*':
      - step:
          name: Test and Build
          caches:
            - node
          script:
            - npm ci
            - npm test
            - npm run build
          artifacts:
            - dist/**
      - step:
          name: Deploy to Production
          deployment: production
          trigger: manual
          script:
            - echo "Deploying to production"
Pull Request Pipelines
pipelines:
  pull-requests:
    '**': # All pull requests
      - step:
          name: Test
          caches:
            - node
          script:
            - npm ci
            - npm test
      - step:
          name: Lint
          script:
            - npx eslint src/
    'feature/*': # PRs from feature branches only
      - step:
          name: Full Check
          caches:
            - node
          script:
            - npm ci
            - npm test
            - npm run lint
            - npm run build
Tag Pipelines
pipelines:
  tags:
    'v*':
      - step:
          name: Build Release
          caches:
            - node
          script:
            - npm ci
            - npm run build
          artifacts:
            - dist/**
      - step:
          name: Publish
          deployment: production
          script:
            # npm publish needs auth; NPM_TOKEN is a secured repository variable
            - echo "//registry.npmjs.org/:_authToken=${NPM_TOKEN}" > .npmrc
            - npm publish
Caching
Built-in Caches
pipelines:
  default:
    - step:
        caches:
          - node # Caches ~/.npm (npm cache directory)
        script:
          - npm ci
          - npm test
Custom Caches
definitions:
  caches:
    nodemodules: node_modules
    nextcache: .next/cache
    jestcache: /tmp/jest_cache

pipelines:
  default:
    - step:
        caches:
          - nodemodules
          - nextcache
        script:
          - npm ci
          - npm run build
          - npm test
Cache Behavior
- Caches are branch-specific by default
- Cache is populated on first run and reused on subsequent runs
- Caches expire after 7 days
- Clear caches in Bitbucket Settings → Pipelines → Caches
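For node_modules, a file-based cache key sidesteps the stale-cache problem: the cache is invalidated automatically whenever the lock file changes instead of waiting for the 7-day expiry. A sketch of the keyed-cache syntax (verify against the current Pipelines caching docs):

```yaml
definitions:
  caches:
    nodemodules:
      key:
        files:
          - package-lock.json # cache is rebuilt when this file changes
      path: node_modules
```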
Services (Sidecar Containers)
definitions:
  services:
    postgres:
      image: postgres:15
      variables:
        POSTGRES_DB: test_db
        POSTGRES_USER: test
        POSTGRES_PASSWORD: test
      memory: 512
    redis:
      image: redis:7-alpine
      memory: 256

pipelines:
  default:
    - step:
        name: Integration Tests
        services:
          - postgres
          - redis
        caches:
          - node
        script:
          # Steps don't take an env-var map, so export connection strings in the script
          - export DATABASE_URL="postgres://test:test@localhost:5432/test_db"
          - export REDIS_URL="redis://localhost:6379"
          - npm ci
          - npm run test:integration
Services run as separate Docker containers accessible via localhost.
Deployments and Environments
Deployment Environments
pipelines:
  branches:
    main:
      - step:
          name: Test
          script:
            - npm ci && npm test
      - step:
          name: Deploy Staging
          deployment: staging
          script:
            - npm run deploy:staging
      - step:
          name: Deploy Production
          deployment: production
          trigger: manual
          script:
            - npm run deploy:production
Configure environments in Bitbucket Settings → Pipelines → Deployments:
- Test — automatic deploys
- Staging — automatic or manual
- Production — manual approval with optional IP restrictions
Deployment Variables
Environment-specific variables are set in the Bitbucket UI:
# Variables set per-environment in UI:
#   Staging:    DEPLOY_URL, API_KEY, SSH_KEY
#   Production: DEPLOY_URL, API_KEY, SSH_KEY

pipelines:
  branches:
    main:
      - step:
          deployment: staging
          script:
            - echo "Deploying to $DEPLOY_URL"
            - curl -X POST "$DEPLOY_URL/deploy" -H "Authorization: $API_KEY"
Pipes
Pipes are pre-built integration steps:
pipelines:
  branches:
    main:
      - step:
          name: Build
          script:
            - npm ci && npm run build
          artifacts:
            - dist/**
      - step:
          name: Deploy to S3
          deployment: production
          script:
            - pipe: atlassian/aws-s3-deploy:1.1.0
              variables:
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                AWS_DEFAULT_REGION: us-east-1
                S3_BUCKET: my-app-bucket
                LOCAL_PATH: dist/
      - step:
          name: Notify Slack
          script:
            - pipe: atlassian/slack-notify:2.1.0
              variables:
                WEBHOOK_URL: $SLACK_WEBHOOK
                MESSAGE: "Deployed version ${BITBUCKET_BUILD_NUMBER} to production"
Common Pipes
# SSH deploy
- pipe: atlassian/ssh-run:0.7.1
  variables:
    SSH_USER: deploy
    SERVER: myapp.com
    SSH_KEY: $SSH_KEY
    COMMAND: "cd /var/www/myapp && git pull && npm install && pm2 restart all"

# Docker push to AWS ECR
- pipe: atlassian/aws-ecr-push-image:2.3.0
  variables:
    AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
    AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
    AWS_DEFAULT_REGION: us-east-1
    IMAGE_NAME: myapp
    TAGS: "${BITBUCKET_BUILD_NUMBER} latest"

# Trigger another pipeline
- pipe: atlassian/trigger-pipeline:5.3.0
  variables:
    BITBUCKET_USERNAME: $BITBUCKET_USERNAME
    BITBUCKET_APP_PASSWORD: $BITBUCKET_APP_PASSWORD
    REPOSITORY: myorg/other-repo
    REF_TYPE: branch
    REF_NAME: main
Docker Builds
Building and Pushing Images
image: atlassian/default-image:4

pipelines:
  branches:
    main:
      - step:
          name: Build and Push Docker Image
          services:
            - docker
          caches:
            - docker
          script:
            - export IMAGE_NAME="${DOCKER_REGISTRY}/${BITBUCKET_REPO_SLUG}"
            - export TAG="${BITBUCKET_BUILD_NUMBER}"
            - docker build -t ${IMAGE_NAME}:${TAG} -t ${IMAGE_NAME}:latest .
            - echo ${DOCKER_PASSWORD} | docker login -u ${DOCKER_USERNAME} --password-stdin ${DOCKER_REGISTRY}
            - docker push ${IMAGE_NAME}:${TAG}
            - docker push ${IMAGE_NAME}:latest

definitions:
  services:
    docker:
      memory: 2048
Pipeline Variables
Predefined Variables
script:
  - echo "Repository: $BITBUCKET_REPO_SLUG"
  - echo "Branch: $BITBUCKET_BRANCH"
  - echo "Commit: $BITBUCKET_COMMIT"
  - echo "Build: $BITBUCKET_BUILD_NUMBER"
  - echo "Tag: $BITBUCKET_TAG"
  - echo "PR ID: $BITBUCKET_PR_ID"
  - echo "Workspace: $BITBUCKET_WORKSPACE"
  - echo "Clone Dir: $BITBUCKET_CLONE_DIR"
Custom Variables
Set in the Bitbucket UI (Settings → Repository Variables):
- Repository variables — available in all pipelines
- Deployment variables — environment-specific, override repository variables
- Secured variables — masked in logs, not accessible in pull requests from forks
script:
  - echo "Using API at $API_URL" # Custom variable
  - curl -H "Authorization: Bearer $API_TOKEN" $API_URL/health
Complete Working Example: Full Node.js Pipeline
# bitbucket-pipelines.yml
image: node:20

definitions:
  caches:
    nodemodules: node_modules
  services:
    postgres:
      image: postgres:15
      variables:
        POSTGRES_DB: test
        POSTGRES_USER: test
        POSTGRES_PASSWORD: test
      memory: 512
    redis:
      image: redis:7-alpine
      memory: 256
  steps:
    - step: &install
        name: Install Dependencies
        caches:
          - nodemodules
          - node
        script:
          - npm ci
        artifacts:
          - node_modules/**
    - step: &lint
        name: Lint
        script:
          - npx eslint src/ --format compact
          - npx prettier --check "src/**/*.{js,json}"
    - step: &unit-test
        name: Unit Tests
        script:
          # One run, two reporters: text-summary for the log, cobertura for tooling
          - npx jest --ci --coverage --coverageReporters=text-summary --coverageReporters=cobertura
        artifacts:
          - coverage/**
    - step: &integration-test
        name: Integration Tests
        services:
          - postgres
          - redis
        script:
          - export DATABASE_URL="postgres://test:test@localhost:5432/test"
          - export REDIS_URL="redis://localhost:6379"
          - npm run test:integration
    - step: &build
        name: Build
        script:
          - npm run build
          - echo "Build complete"
        artifacts:
          - dist/**

pipelines:
  # === Default: Run on every push ===
  default:
    - step: *install
    - parallel:
        - step: *lint
        - step: *unit-test

  # === Pull Requests ===
  pull-requests:
    '**':
      - step: *install
      - parallel:
          - step: *lint
          - step: *unit-test
          - step: *integration-test

  # === Branch-specific ===
  branches:
    main:
      - step: *install
      - parallel:
          - step: *lint
          - step: *unit-test
          - step: *integration-test
      - step: *build
      - step:
          name: Deploy to Staging
          deployment: staging
          script:
            - apt-get update && apt-get install -y rsync
            - echo "Deploying build ${BITBUCKET_BUILD_NUMBER} to staging"
            - rsync -avz --delete dist/ [email protected]:/var/www/myapp/
          after-script:
            - |
              if [ $BITBUCKET_EXIT_CODE -eq 0 ]; then
                echo "Deployment successful"
              else
                echo "Deployment failed"
              fi
      - step:
          name: Deploy to Production
          deployment: production
          trigger: manual
          script:
            - apt-get update && apt-get install -y rsync
            - echo "Deploying build ${BITBUCKET_BUILD_NUMBER} to production"
            - rsync -avz --delete dist/ [email protected]:/var/www/myapp/

  # === Tag releases ===
  tags:
    'v*':
      - step: *install
      - step: *unit-test
      - step: *build
      - step:
          name: Publish to npm
          deployment: production
          script:
            - echo "//registry.npmjs.org/:_authToken=${NPM_TOKEN}" > .npmrc
            - npm publish
Pipeline Optimization
Reducing Build Minutes
# Skip CI for documentation changes
pipelines:
  default:
    - step:
        name: Check Changes
        script:
          - |
            CHANGED=$(git diff --name-only HEAD~1)
            if echo "$CHANGED" | grep -qvE '\.(md|txt)$'; then
              echo "Code changes detected, running tests"
            else
              echo "Documentation only, skipping tests"
              exit 0
            fi
          - npm ci
          - npm test
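The filter works because grep -v inverts the match: it prints only changed files that do NOT end in .md or .txt, and -q turns that into an exit code. A standalone demonstration with a hypothetical file list (in the pipeline, CHANGED comes from git diff):

```shell
# Hypothetical docs-only change set, one path per line
CHANGED="README.md
docs/notes.txt"

# grep -v keeps lines that are NOT .md/.txt; -q succeeds only if any survive
if echo "$CHANGED" | grep -qvE '\.(md|txt)$'; then
  echo "code changes detected"
else
  echo "docs-only"
fi
```

Note that exit 0 in the pipeline step only skips the rest of that step's script; it still consumes the minutes already spent cloning and starting the container.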
Step Size Limits
- Free plan: 50 build minutes/month
- Standard: 2,500 build minutes/month
- Premium: 3,500 build minutes/month
- Memory per step: 4GB (default), 8GB (2x), 16GB (4x), 32GB (8x)
- Max step time: 120 minutes
- Max steps per pipeline: 100
# Use double memory for heavy builds
- step:
    name: Build
    size: 2x # 8GB memory instead of 4GB
    script:
      - npm ci
      - npm run build
Common Issues and Troubleshooting
Pipeline fails with "npm ci" error
The package-lock.json is out of sync with package.json:
Fix: Run npm install locally to regenerate the lock file, commit it, and push. Ensure the lock file is committed to the repository.
Cache is stale and causing test failures
Old cached node_modules conflicts with updated dependencies:
Fix: Clear the cache in Bitbucket Settings → Pipelines → Caches → Delete All. Or change the cache name to force a fresh install.
Service container (PostgreSQL) is not ready when tests start
The service takes a few seconds to initialize:
Fix: Add a wait script before running tests:
script:
  - npm ci
  # pg_isready is not in the node:20 image; install the Postgres client first
  - apt-get update && apt-get install -y postgresql-client
  - |
    for i in $(seq 1 30); do
      pg_isready -h localhost -p 5432 && break
      echo "Waiting for PostgreSQL..."
      sleep 1
    done
  - npm run test:integration
Pipeline runs but deployment step is skipped
The deployment step has trigger: manual and requires clicking in the Bitbucket UI:
Fix: Check the pipeline results in Bitbucket. Manual steps show a "Run" button. Click it to trigger the deployment. Remove trigger: manual for automatic deployments.
Build exceeds memory limit
The step crashes with an out-of-memory error:
Fix: Use size: 2x to double the memory allocation. Optimize your build — avoid running the full application during build. Use --max-old-space-size for Node.js: NODE_OPTIONS=--max-old-space-size=3072 npm run build.
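The two fixes can be combined. A sketch where the step gets 8GB and the Node heap cap is raised below the container limit (6144 is an assumed value; leave headroom for other processes in the container):

```yaml
- step:
    name: Build
    size: 2x # 8GB container for this step
    script:
      - npm ci
      # Cap the Node heap below the container limit to avoid OOM kills
      - NODE_OPTIONS=--max-old-space-size=6144 npm run build
```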
Best Practices
- Use YAML anchors for step reuse. Define steps in definitions and reference them with *anchor. This prevents duplication and ensures consistency.
- Run tests in parallel. Lint, unit tests, and integration tests are independent; run them simultaneously to cut pipeline time.
- Cache node_modules and the npm cache. Caching both the installed modules and the npm download cache makes npm ci near-instant on subsequent runs.
- Use deployment environments. Bitbucket tracks which build is deployed where. This provides rollback capability and deployment history.
- Set trigger: manual for production deployments. Automatic production deploys are risky. Manual approval adds a safety gate.
- Use after-script for cleanup and notifications. The after-script block runs regardless of whether the main script succeeded or failed, making it ideal for cleanup and alerting.
- Pin image versions. Use node:20.11 instead of node:20. Pinning prevents unexpected breakage when images are updated.