Monorepo Management with npm Workspaces

A practical guide to managing monorepos with npm workspaces, covering workspace setup, dependency sharing, cross-package scripts, CI/CD strategies, and a comparison with alternatives.

If you have spent any meaningful time shipping JavaScript at scale, you have hit the point where splitting code across multiple repositories becomes a coordination nightmare. Version mismatches between your shared library and your API server. PRs that span three repos just to change a validation function. Dependency drift that nobody catches until production breaks. Monorepos solve these problems, and npm workspaces — built directly into npm since version 7 — give you a zero-dependency way to manage them.

This guide covers everything you need to set up, operate, and ship a production monorepo with npm workspaces. No extra tooling required.

Prerequisites

  • Node.js 16+ (npm 7+ required for workspace support; Node 18+ recommended)
  • Familiarity with npm and package.json configuration
  • Basic understanding of Node.js module resolution
  • A terminal and a text editor

What Monorepos Are and When to Use Them

A monorepo is a single repository that contains multiple distinct packages or applications. Instead of my-api, my-utils, and my-cli living in three separate Git repos, they all live under one roof.

Use a monorepo when:

  • Multiple packages share common dependencies or configuration
  • Changes frequently span more than one package
  • You want atomic commits across package boundaries
  • Your team works across multiple packages regularly
  • You need consistent tooling (linting, testing, formatting) everywhere

Do not use a monorepo when:

  • Packages are owned by completely independent teams with different release cycles
  • You have a single application with no shared code
  • Your CI infrastructure cannot handle selective builds

I have found that most teams of 2-15 engineers benefit enormously from a monorepo. The overhead is minimal, and the coordination savings are real.

npm Workspaces Setup

npm workspaces are declared in the root package.json using the workspaces field. This tells npm which directories contain packages.

{
  "name": "my-monorepo",
  "version": "1.0.0",
  "private": true,
  "workspaces": [
    "packages/*"
  ],
  "scripts": {
    "build": "npm run build --workspaces --if-present",
    "test": "npm run test --workspaces --if-present",
    "lint": "npm run lint --workspaces --if-present"
  }
}

The "private": true field is important. You do not want to accidentally publish your root package to npm. The workspaces array accepts glob patterns, so "packages/*" means every directory under packages/ is a workspace.

You can also list workspaces explicitly:

{
  "workspaces": [
    "packages/utils",
    "packages/api",
    "packages/cli"
  ]
}

I prefer the glob pattern. It means adding a new workspace is just creating a directory — no root package.json edit required.
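
In recent npm versions, npm init can even scaffold the new workspace for you. The package path below is just an illustration; any directory matching the glob works:

# Create packages/notifications/package.json; the packages/* glob picks it up
# on the next npm install
npm init --yes -w packages/notifications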

Workspace Directory Structure

Here is the structure I use for most projects:

my-monorepo/
├── package.json              # Root package.json with workspaces field
├── package-lock.json         # Single lockfile for everything
├── node_modules/             # Hoisted dependencies
├── .eslintrc.json            # Shared ESLint config
├── .prettierrc               # Shared Prettier config
├── packages/
│   ├── utils/
│   │   ├── package.json
│   │   ├── index.js
│   │   └── lib/
│   ├── api/
│   │   ├── package.json
│   │   ├── server.js
│   │   └── routes/
│   └── cli/
│       ├── package.json
│       └── bin/
│           └── cli.js

Key things to notice: there is one package-lock.json at the root and one node_modules at the root. npm hoists shared dependencies to the top level. Each workspace has its own package.json with its own name, version, and dependencies.

Installing Dependencies Across Workspaces

When you run npm install from the root, npm installs dependencies for all workspaces at once. Shared dependencies get hoisted to the root node_modules, and workspace-specific dependencies that conflict go into the workspace's own node_modules.

To add a dependency to a specific workspace:

# Add express to the api workspace
npm install express --workspace=packages/api

# Add commander to the cli workspace
npm install commander --workspace=packages/cli

# Add a dev dependency to a specific workspace
npm install mocha --save-dev --workspace=packages/utils

To add a dependency to the root (shared tooling like linters):

# Add ESLint at the root level
npm install eslint --save-dev

Never run npm install from inside a workspace directory. Always run it from the root. This ensures the lockfile stays consistent and dependency hoisting works correctly.
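
Removing a dependency works the same way; run it from the root and point it at the workspace:

# Remove express from the api workspace (run from the root)
npm uninstall express --workspace=packages/api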

Running Scripts Across Workspaces

npm gives you two flags for running scripts across workspaces:

# Run "test" in a specific workspace
npm run test --workspace=packages/utils

# Run "test" in all workspaces that have a test script
npm run test --workspaces --if-present

# Run "build" only in utils and api
npm run build --workspace=packages/utils --workspace=packages/api

The --if-present flag is critical. Without it, npm will throw an error if any workspace is missing the specified script. With it, npm silently skips workspaces that do not define that script.

You can also use the short form -w and -ws:

npm run test -w packages/utils
npm run build -ws --if-present
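
The workspace flag also accepts package names instead of paths, which I find easier to remember once there are more than a handful of workspaces:

# Reference the workspace by its package name instead of its directory
npm run test -w @myorg/utils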

Sharing Dependencies (Hoisting)

npm workspaces hoist dependencies to the root node_modules by default. If both packages/api and packages/cli depend on the same version of lodash, only one copy gets installed at the root.

This mostly just works. But there are cases where hoisting causes problems — typically when a tool assumes its dependencies live in the package's own node_modules or a bundler does not understand hoisted modules. You can fall back to a nested install layout with --install-strategy=nested (npm 9+), but I rarely need to.

The hoisting behavior is what makes monorepos efficient. Instead of three copies of Express, three copies of Lodash, and three copies of everything else, you get one shared node_modules tree.
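
If you want to confirm what got hoisted, npm ls from the root shows where a dependency landed and which workspaces pull it in (shared copies are marked as deduped):

# Inspect where express is installed and which packages depend on it
npm ls express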

Internal Package References

This is where workspaces really shine. You can reference one workspace from another using the workspace name and a version specifier:

{
  "name": "@myorg/api",
  "version": "1.0.0",
  "dependencies": {
    "@myorg/utils": "^1.0.0",
    "express": "^4.18.0"
  }
}

When you run npm install, npm creates a symlink from node_modules/@myorg/utils to packages/utils. You get live access to the source code. No publishing step. No manual linking. Changes in packages/utils are immediately visible in packages/api.

In your code, you just require the package normally:

var utils = require("@myorg/utils");
var result = utils.slugify("Hello World");

I use the @myorg/ scope prefix for all internal packages. It prevents naming collisions and makes it obvious which packages are internal versus external.

Workspace Versioning Strategies

There are two approaches I have seen work:

Fixed versioning — all workspaces share the same version number. When you bump, you bump everything. This is simpler and works well for tightly coupled packages.

Independent versioning — each workspace has its own version. You only bump what changed. This requires more discipline but makes more sense when packages have different consumers.

For most teams, I recommend fixed versioning until you have a specific reason to go independent. npm does not provide built-in version orchestration, so if you need independent versioning with automatic changelog generation, look at Changesets or Lerna's version command.
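
For the fixed strategy, newer npm releases also accept the workspace flags on npm version, so a single command can bump everything. Verify with npm version --help on your installed npm before relying on it:

# Bump every workspace and the root together; confirm your npm supports these flags
npm version minor --workspaces --include-workspace-root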

Workspace-Specific vs Root Dependencies

The rule is straightforward:

  • Root dependencies: Tooling that applies to the whole repo — ESLint, Prettier, testing frameworks if shared, TypeScript compiler
  • Workspace dependencies: Runtime dependencies specific to that package — Express for the API, Commander for the CLI

// Root package.json
{
  "devDependencies": {
    "eslint": "^8.50.0",
    "prettier": "^3.0.0"
  }
}

// packages/api/package.json
{
  "dependencies": {
    "express": "^4.18.0",
    "@myorg/utils": "^1.0.0"
  },
  "devDependencies": {
    "supertest": "^6.3.0"
  }
}

When you deploy packages/api independently, its package.json must list everything it needs at runtime. Root devDependencies are for development only.

Building Packages in Dependency Order

If packages/api depends on packages/utils, you need to build utils first. npm workspaces do not automatically resolve build order — scripts run in directory order by default.

You can enforce order manually:

{
  "scripts": {
    "build": "npm run build -w packages/utils && npm run build -w packages/api && npm run build -w packages/cli"
  }
}

Or you can write a small build orchestrator:

// scripts/build-all.js
var execSync = require("child_process").execSync;

var buildOrder = [
  "packages/utils",
  "packages/api",
  "packages/cli"
];

buildOrder.forEach(function (workspace) {
  console.log("Building " + workspace + "...");
  try {
    execSync("npm run build -w " + workspace, { stdio: "inherit" });
  } catch (err) {
    console.error("Build failed for " + workspace);
    process.exit(1);
  }
});

node scripts/build-all.js

If you need topological sorting (automatic dependency-order resolution), that is where tools like Nx or Turborepo add value. For most projects with 3-10 workspaces, an explicit build order in a script is perfectly fine; if you want the order derived from the workspace manifests instead of hard-coded, see the sketch below.
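
Here is a rough sketch (not built-in npm behavior) that reads each packages/*/package.json, walks internal dependencies depth-first, and builds in that order. It assumes the packages/* layout used in this guide and no circular dependencies:

// scripts/build-topo.js
var fs = require("fs");
var path = require("path");
var execSync = require("child_process").execSync;

var packagesDir = path.join(__dirname, "..", "packages");
var workspaces = {};

// Read every packages/*/package.json and record its internal dependencies
fs.readdirSync(packagesDir).forEach(function (dir) {
  var pkgPath = path.join(packagesDir, dir, "package.json");
  if (!fs.existsSync(pkgPath)) return;
  var pkg = JSON.parse(fs.readFileSync(pkgPath, "utf8"));
  workspaces[pkg.name] = {
    dir: path.join("packages", dir),
    deps: Object.keys(pkg.dependencies || {})
  };
});

// Depth-first walk: build a workspace's internal dependencies before the workspace itself.
// No cycle detection -- this sketch assumes an acyclic dependency graph.
var order = [];
var visited = {};

function visit(name) {
  if (visited[name]) return;
  visited[name] = true;
  workspaces[name].deps.forEach(function (dep) {
    if (workspaces[dep]) visit(dep); // only follow internal packages
  });
  order.push(name);
}

Object.keys(workspaces).forEach(visit);

order.forEach(function (name) {
  console.log("Building " + name + "...");
  execSync("npm run build -w " + workspaces[name].dir + " --if-present", {
    stdio: "inherit"
  });
});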

Testing Across Workspaces

Run all tests from the root:

npm run test --workspaces --if-present

Each workspace defines its own test script:

// packages/utils/package.json
{
  "scripts": {
    "test": "mocha test/**/*.test.js"
  }
}

// packages/api/package.json
{
  "scripts": {
    "test": "mocha test/**/*.test.js --timeout 10000"
  }
}

Because internal packages are symlinked, your tests in packages/api can require @myorg/utils and get the actual source — not a stale published version. This is a huge advantage. You catch integration breakages immediately.

npm Workspaces vs Lerna vs Nx vs Turborepo

Here is my honest take after using all four in production:

Feature                   npm Workspaces         Lerna                 Nx                 Turborepo
Dependency management     Built-in               Uses npm/yarn         Built-in           Uses npm/yarn/pnpm
Task running              Basic (--workspaces)   Sequential/parallel   Parallel + cache   Parallel + cache
Dependency-order builds   Manual                 Yes                   Yes                Yes
Remote caching            No                     Via Nx                Yes                Yes
Affected detection        No                     Via Nx                Yes                Yes
Setup complexity          Zero                   Low                   Medium             Low
Extra dependencies        None                   lerna                 nx, @nx/*          turbo
Version management        Manual                 Built-in              Via plugins        Via Changesets

npm Workspaces: Start here. It is free, built-in, and handles 80% of what most teams need. Dependency hoisting, workspace linking, and cross-workspace script execution.

Lerna: Lerna is now maintained by Nrwl, the team behind Nx. Its main value is version management and publishing orchestration. If you need to publish multiple packages to npm with coordinated versions, Lerna is still useful. But for task running, it now delegates to Nx.

Nx: The most powerful option. Task graph, computation caching, affected-only builds, remote caching. Worth the complexity when you have 20+ packages or builds that take more than a few minutes. Overkill for small repos.

Turborepo: Simpler than Nx with most of the same caching benefits. If you want build caching without the learning curve of Nx, Turborepo is the sweet spot. Acquired by Vercel, so it integrates well with Next.js projects.

My recommendation: start with npm workspaces. If you outgrow it, add Turborepo. If you need advanced orchestration, evaluate Nx.

CI/CD for Monorepos

The biggest CI/CD challenge with monorepos is avoiding full rebuilds when only one package changed. Here is a straightforward approach using Git diff:

// scripts/detect-changes.js
var execSync = require("child_process").execSync;

var baseBranch = process.env.BASE_BRANCH || "main";

function getChangedFiles() {
  var output = execSync(
    "git diff --name-only origin/" + baseBranch + "...HEAD"
  ).toString().trim();

  if (!output) return [];
  return output.split("\n");
}

function getAffectedWorkspaces(changedFiles) {
  var workspaces = {};

  changedFiles.forEach(function (file) {
    var match = file.match(/^packages\/([^/]+)\//);
    if (match) {
      workspaces[match[1]] = true;
    }
  });

  return Object.keys(workspaces);
}

var changedFiles = getChangedFiles();
var affected = getAffectedWorkspaces(changedFiles);

console.log("Changed files: " + changedFiles.length);
console.log("Affected workspaces: " + affected.join(", "));

// Output for CI consumption. The old ::set-output syntax is deprecated;
// GitHub Actions now reads job outputs from the file named by GITHUB_OUTPUT.
var fs = require("fs");
if (process.env.GITHUB_OUTPUT) {
  affected.forEach(function (workspace) {
    fs.appendFileSync(process.env.GITHUB_OUTPUT, workspace + "=true\n");
  });
}

In a GitHub Actions workflow:

name: CI
on:
  pull_request:
    branches: [main]

jobs:
  detect-changes:
    runs-on: ubuntu-latest
    outputs:
      utils: ${{ steps.changes.outputs.utils }}
      api: ${{ steps.changes.outputs.api }}
      cli: ${{ steps.changes.outputs.cli }}
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - uses: dorny/paths-filter@v2
        id: changes
        with:
          filters: |
            utils:
              - 'packages/utils/**'
            api:
              - 'packages/api/**'
              - 'packages/utils/**'
            cli:
              - 'packages/cli/**'
              - 'packages/utils/**'

  test-utils:
    needs: detect-changes
    if: needs.detect-changes.outputs.utils == 'true'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
          cache: 'npm'
      - run: npm ci
      - run: npm run test -w packages/utils

  test-api:
    needs: detect-changes
    if: needs.detect-changes.outputs.api == 'true'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
          cache: 'npm'
      - run: npm ci
      - run: npm run test -w packages/api

Notice that api and cli are triggered when utils changes too, because they depend on it. This dependency awareness is something you have to configure manually with npm workspaces. Tools like Nx and Turborepo detect this automatically.

Shared Configuration

One of the biggest wins of a monorepo is shared tooling configuration.

ESLint

Create a root .eslintrc.json:

{
  "env": {
    "node": true,
    "es2020": true
  },
  "rules": {
    "no-unused-vars": ["error", { "argsIgnorePattern": "^_" }],
    "no-console": "off",
    "semi": ["error", "always"],
    "quotes": ["error", "double"]
  }
}

Workspaces inherit this automatically. If a workspace needs overrides, it can have its own .eslintrc.json that extends the root:

{
  "extends": "../../.eslintrc.json",
  "rules": {
    "no-console": "warn"
  }
}

Prettier

A single .prettierrc at the root:

{
  "semi": true,
  "singleQuote": false,
  "tabWidth": 2,
  "trailingComma": "none",
  "printWidth": 100
}

TypeScript (if applicable)

A root tsconfig.base.json with shared settings, and workspace-specific tsconfig.json files that extend it:

// tsconfig.base.json
{
  "compilerOptions": {
    "target": "ES2020",
    "module": "commonjs",
    "strict": true,
    "esModuleInterop": true,
    "declaration": true,
    "declarationMap": true,
    "sourceMap": true
  }
}

// packages/utils/tsconfig.json
{
  "extends": "../../tsconfig.base.json",
  "compilerOptions": {
    "outDir": "./dist",
    "rootDir": "./src"
  },
  "include": ["src/**/*"]
}

Workspace Linking and Resolution

When npm installs workspaces, it creates symlinks in the root node_modules. You can verify this:

$ ls -la node_modules/@myorg/
total 0
lrwxr-xr-x  1 user  staff  22 Jan 10 14:30 utils -> ../../packages/utils
lrwxr-xr-x  1 user  staff  19 Jan 10 14:30 api -> ../../packages/api
lrwxr-xr-x  1 user  staff  19 Jan 10 14:30 cli -> ../../packages/cli

This means require("@myorg/utils") resolves to the actual source code in packages/utils, not a published copy. Node.js follows the symlink and executes the real files.

One subtlety: __dirname inside a symlinked package resolves to the real path, not the symlink path. If your code uses __dirname to find relative files, it works correctly. But if something resolves paths relative to node_modules, you might see unexpected behavior.
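
A quick way to see this in action is to ask Node where the package resolves from; run it from the repo root (using the @myorg/utils example from above):

# Prints the real path under packages/utils, not a node_modules copy
node -e "console.log(require.resolve('@myorg/utils'))"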

Publishing Workspace Packages

When you are ready to publish workspace packages to npm:

# Publish a single workspace
npm publish --workspace=packages/utils

# Publish all workspaces
npm publish --workspaces

# Dry run first
npm publish --workspace=packages/utils --dry-run

Before publishing, make sure each workspace's package.json has the correct main, files, and version fields:

{
  "name": "@myorg/utils",
  "version": "1.2.0",
  "main": "index.js",
  "files": [
    "index.js",
    "lib/"
  ]
}

The files array is your allowlist of what gets published. Without it, everything except what is in .npmignore gets included — which might mean publishing your test files, docs, and other junk.
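
Before publishing for real, I like to preview the tarball contents. Recent npm versions accept the workspace flag on npm pack as well:

# List exactly which files would end up in the published package
npm pack --dry-run --workspace=packages/utils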

Complete Working Example

Here is a full monorepo with three workspaces: a shared utility library, an Express API server, and a CLI tool.

Root package.json

{
  "name": "acme-monorepo",
  "version": "1.0.0",
  "private": true,
  "workspaces": [
    "packages/*"
  ],
  "scripts": {
    "build": "npm run build -w packages/utils && npm run build --workspaces --if-present",
    "test": "npm run test --workspaces --if-present",
    "lint": "eslint packages/*/lib/**/*.js packages/*/index.js packages/*/bin/**/*.js --ignore-pattern node_modules",
    "start:api": "npm run start -w packages/api"
  },
  "devDependencies": {
    "eslint": "^8.50.0",
    "mocha": "^10.2.0",
    "prettier": "^3.0.0"
  }
}

packages/utils/package.json

{
  "name": "@acme/utils",
  "version": "1.0.0",
  "main": "index.js",
  "files": ["index.js", "lib/"],
  "scripts": {
    "test": "mocha test/**/*.test.js",
    "lint": "eslint lib/**/*.js index.js"
  }
}

packages/utils/index.js

var slugify = require("./lib/slugify");
var validate = require("./lib/validate");
var format = require("./lib/format");

module.exports = {
  slugify: slugify,
  validate: validate,
  format: format
};

packages/utils/lib/slugify.js

function slugify(text) {
  if (typeof text !== "string") {
    throw new TypeError("slugify expects a string, got " + typeof text);
  }

  return text
    .toLowerCase()
    .trim()
    .replace(/[^\w\s-]/g, "")
    .replace(/[\s_]+/g, "-")
    .replace(/-+/g, "-")
    .replace(/^-+|-+$/g, "");
}

module.exports = slugify;

packages/utils/lib/validate.js

function isEmail(value) {
  var pattern = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
  return pattern.test(value);
}

function isNonEmpty(value) {
  return typeof value === "string" && value.trim().length > 0;
}

function isPositiveInt(value) {
  var num = parseInt(value, 10);
  return !isNaN(num) && num > 0 && String(num) === String(value);
}

module.exports = {
  isEmail: isEmail,
  isNonEmpty: isNonEmpty,
  isPositiveInt: isPositiveInt
};

packages/utils/lib/format.js

function formatDate(date) {
  var d = date instanceof Date ? date : new Date(date);
  if (isNaN(d.getTime())) {
    throw new Error("Invalid date: " + date);
  }
  // Use UTC getters so the output does not shift with the local timezone
  // (new Date("2025-03-15") parses as UTC midnight)
  var year = d.getUTCFullYear();
  var month = String(d.getUTCMonth() + 1).padStart(2, "0");
  var day = String(d.getUTCDate()).padStart(2, "0");
  return year + "-" + month + "-" + day;
}

function formatBytes(bytes) {
  if (bytes === 0) return "0 B";
  var units = ["B", "KB", "MB", "GB", "TB"];
  var i = Math.floor(Math.log(bytes) / Math.log(1024));
  var value = (bytes / Math.pow(1024, i)).toFixed(2);
  return value + " " + units[i];
}

module.exports = {
  formatDate: formatDate,
  formatBytes: formatBytes
};

packages/utils/test/utils.test.js

var assert = require("assert");
var utils = require("../index");

describe("slugify", function () {
  it("should convert text to a URL slug", function () {
    assert.strictEqual(utils.slugify("Hello World"), "hello-world");
  });

  it("should handle special characters", function () {
    assert.strictEqual(utils.slugify("What's Up?"), "whats-up");
  });

  it("should throw on non-string input", function () {
    assert.throws(function () {
      utils.slugify(42);
    }, TypeError);
  });
});

describe("validate", function () {
  it("should validate email addresses", function () {
    assert.strictEqual(utils.validate.isEmail("user@example.com"), true);
    assert.strictEqual(utils.validate.isEmail("not-an-email"), false);
  });

  it("should check non-empty strings", function () {
    assert.strictEqual(utils.validate.isNonEmpty("hello"), true);
    assert.strictEqual(utils.validate.isNonEmpty("  "), false);
  });
});

describe("format", function () {
  it("should format dates", function () {
    var result = utils.format.formatDate(new Date("2025-03-15"));
    assert.strictEqual(result, "2025-03-15");
  });

  it("should format byte sizes", function () {
    assert.strictEqual(utils.format.formatBytes(1024), "1.00 KB");
    assert.strictEqual(utils.format.formatBytes(1048576), "1.00 MB");
  });
});

packages/api/package.json

{
  "name": "@acme/api",
  "version": "1.0.0",
  "main": "server.js",
  "scripts": {
    "start": "node server.js",
    "test": "mocha test/**/*.test.js --timeout 10000"
  },
  "dependencies": {
    "@acme/utils": "^1.0.0",
    "express": "^4.18.0"
  },
  "devDependencies": {
    "supertest": "^6.3.0"
  }
}

packages/api/server.js

var express = require("express");
var utils = require("@acme/utils");

var app = express();
var PORT = process.env.PORT || 3000;

app.use(express.json());

app.get("/health", function (req, res) {
  res.json({ status: "ok", timestamp: utils.format.formatDate(new Date()) });
});

app.post("/slugify", function (req, res) {
  var text = req.body.text;
  if (!utils.validate.isNonEmpty(text)) {
    return res.status(400).json({ error: "text is required" });
  }
  var slug = utils.slugify(text);
  res.json({ original: text, slug: slug });
});

app.post("/validate-email", function (req, res) {
  var email = req.body.email;
  var valid = utils.validate.isEmail(email || "");
  res.json({ email: email, valid: valid });
});

if (require.main === module) {
  app.listen(PORT, function () {
    console.log("API server running on port " + PORT);
  });
}

module.exports = app;

packages/api/test/api.test.js

var assert = require("assert");
var request = require("supertest");
var app = require("../server");

describe("API", function () {
  describe("GET /health", function () {
    it("should return ok status", function (done) {
      request(app)
        .get("/health")
        .expect(200)
        .end(function (err, res) {
          if (err) return done(err);
          assert.strictEqual(res.body.status, "ok");
          done();
        });
    });
  });

  describe("POST /slugify", function () {
    it("should slugify text", function (done) {
      request(app)
        .post("/slugify")
        .send({ text: "Hello World" })
        .expect(200)
        .end(function (err, res) {
          if (err) return done(err);
          assert.strictEqual(res.body.slug, "hello-world");
          done();
        });
    });

    it("should reject empty text", function (done) {
      request(app)
        .post("/slugify")
        .send({ text: "" })
        .expect(400, done);
    });
  });
});

packages/cli/package.json

{
  "name": "@acme/cli",
  "version": "1.0.0",
  "bin": {
    "acme": "./bin/cli.js"
  },
  "scripts": {
    "test": "mocha test/**/*.test.js"
  },
  "dependencies": {
    "@acme/utils": "^1.0.0",
    "commander": "^11.0.0"
  }
}

packages/cli/bin/cli.js

#!/usr/bin/env node

var program = require("commander").program;
var utils = require("@acme/utils");
var pkg = require("../package.json");

program
  .name("acme")
  .version(pkg.version)
  .description("Acme CLI utilities");

program
  .command("slugify <text>")
  .description("Convert text to a URL-friendly slug")
  .action(function (text) {
    console.log(utils.slugify(text));
  });

program
  .command("validate-email <email>")
  .description("Check if an email address is valid")
  .action(function (email) {
    var valid = utils.validate.isEmail(email);
    console.log(email + " -> " + (valid ? "valid" : "invalid"));
    process.exit(valid ? 0 : 1);
  });

program
  .command("format-date [date]")
  .description("Format a date string as YYYY-MM-DD")
  .action(function (date) {
    var input = date || new Date();
    console.log(utils.format.formatDate(input));
  });

program.parse(process.argv);

Bootstrapping the Monorepo

# From the root
npm install

# Run all tests
npm test

# Start the API
npm run start:api

# Use the CLI (after npm install links the bin)
npx acme slugify "Hello World"
# Output: hello-world

npx acme validate-email "user@example.com"
# Output: user@example.com -> valid

npx acme format-date "2025-06-15"
# Output: 2025-06-15

Common Issues and Troubleshooting

1. "ENOENT: no such file or directory" When Requiring Internal Package

Error: Cannot find module '@acme/utils'

This usually means you ran npm install from within a workspace directory instead of the root. Workspace symlinks only get created when you install from the root. Fix: delete any workspace-level node_modules directories and run npm install from the project root.

rm -rf packages/*/node_modules
npm install

2. Lockfile Conflicts on Merge

When multiple developers modify different workspaces, package-lock.json conflicts are common. Do not try to manually resolve lockfile conflicts. Instead:

git checkout --theirs package-lock.json
npm install
git add package-lock.json

This regenerates the lockfile from the merged package.json files, which is far more reliable than resolving the conflict by hand.

3. Phantom Dependencies from Hoisting

A workspace might accidentally require a package that it does not declare in its own package.json — it works locally because the dependency is hoisted from another workspace. This breaks when the workspace is deployed or published independently.

Example: packages/cli uses lodash but does not list it. It works because packages/api has lodash and it gets hoisted. When you publish @acme/cli alone, it fails.

Fix: always explicitly declare every dependency your workspace uses, even if it is already hoisted. Run npm ls in a workspace to check:

npm ls --workspace=packages/cli

4. "ERESOLVE: unable to resolve dependency tree"

npm ERR! ERESOLVE unable to resolve dependency tree
npm ERR! While resolving: @acme/api@1.0.0
npm ERR! Found: some-package@1.0.0
npm ERR! node_modules/some-package

This happens when two workspaces require incompatible versions of the same dependency. npm cannot hoist both. Options:

  • Align the version across workspaces (preferred)
  • Use overrides in the root package.json to force a version
  • As a last resort, use --install-strategy=nested for that dependency

{
  "overrides": {
    "some-package": "^2.0.0"
  }
}

5. Scripts Run In Wrong Order

npm workspaces run scripts in filesystem order, not dependency order. If packages/api depends on packages/utils being built first, you need to either chain them explicitly or use a build script that respects the dependency graph.

# Wrong: might run api before utils
npm run build --workspaces

# Right: explicit ordering
npm run build -w packages/utils && npm run build -w packages/api

Best Practices

  1. Always run npm commands from the root. Never cd into a workspace to run npm install. This is the single most common source of problems. The root is your command center.

  2. Use scoped package names for internal packages. Prefix all workspace package names with @yourorg/. This avoids collisions with public npm packages and makes it clear which packages are internal.

  3. Keep the root package.json minimal. Only shared dev tooling belongs at the root. Runtime dependencies should always be in the workspace that uses them.

  4. Declare all dependencies explicitly in each workspace. Do not rely on hoisting to provide dependencies you forgot to list. Your workspace should work correctly in isolation.

  5. Use --if-present when running scripts across workspaces. Not every workspace needs every script. The --if-present flag prevents failures when a script is missing.

  6. Pin internal workspace references to actual versions. Use "@acme/utils": "^1.0.0" rather than "*". This documents the minimum version requirement and prevents issues when publishing.

  7. Set up a single CI pipeline with change detection. Use path-based triggers or a change detection script so you only build and test what changed. This keeps CI fast as the monorepo grows.

  8. Commit the lockfile. The single package-lock.json at the root is critical for reproducible builds. Always commit it. Resolve conflicts by regenerating rather than manual editing.

  9. Use a consistent directory naming convention. Whether it is packages/, apps/, or separate libs/ and services/ directories, pick a convention and stick with it. I prefer a flat packages/ for simplicity.
