Setting Up CI/CD Pipelines for My TypeScript Monorepo

When I first decided to organize my TypeScript projects into a monorepo, I thought the hardest part would be managing dependencies. I was wrong. The real challenge came when I needed to set up continuous integration and deployment pipelines that could handle multiple applications and libraries living in the same repository.

Let me walk you through how I solved this problem and built a robust CI/CD system that actually works.


Why I Chose a Monorepo Structure

Before diving into the CI/CD setup, let me explain why I went with a monorepo. I was working on three related TypeScript projects: a React frontend, a Node.js API, and a shared utility library. Managing three separate repositories became a nightmare when I needed to make changes across all three projects.

With a monorepo, I could make atomic commits that updated all affected packages simultaneously. No more "oops, forgot to update the API when I changed the shared types" moments.

The Challenge of CI/CD in Monorepos

Traditional CI/CD pipelines assume one application per repository. They run tests, build the project, and deploy. Simple. But in a monorepo, you might have five different applications, each with different build requirements, test suites, and deployment targets.

I needed a system that could:

  • Detect which packages changed in each commit
  • Run tests only for affected packages
  • Build and deploy only what actually changed
  • Handle dependencies between packages correctly

My Project Structure

Here's how I organized my monorepo:

my-monorepo/
├── packages/
│   ├── frontend/          # React app
│   ├── api/               # Node.js API
│   ├── shared-utils/      # Shared TypeScript library
│   └── admin-dashboard/   # Another React app
├── package.json           # Root package.json
└── .github/workflows/     # CI/CD workflows

Each package has its own package.json, tsconfig.json, and build scripts. The root package.json uses workspaces to manage everything together.
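
For illustration, a package-level manifest looks roughly like this (the name, versions, and tooling here are placeholders; the important part is that the frontend declares a dependency on shared-utils so the workspace links them together):

{
  "name": "frontend",
  "version": "1.0.0",
  "dependencies": {
    "shared-utils": "*"
  },
  "scripts": {
    "build": "tsc",
    "test": "jest",
    "lint": "eslint ."
  }
}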

Setting Up the Foundation

I started by configuring the root package.json with workspace support:

{
  "name": "my-monorepo",
  "private": true,
  "workspaces": [
    "packages/*"
  ],
  "scripts": {
    "build:all": "npm run build --workspaces",
    "test:all": "npm run test --workspaces",
    "lint:all": "npm run lint --workspaces"
  }
}

This gave me the ability to run commands across all packages, but it wasn't smart enough to only build what changed.
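
Workspaces also let me target a single package from the root, which becomes important later in the CI jobs. A quick example, assuming npm 7 or later:

# Run only the API's tests
npm test --workspace=packages/api

# Build only the frontend
npm run build --workspace=packages/frontend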

The Game Changer: Change Detection

The breakthrough came when I discovered how to detect which packages were affected by each commit. I created a simple script that compares the current commit with the previous one and identifies changed packages:

// scripts/get-changed-packages.js
const { execSync } = require('child_process');

function getChangedPackages() {
  try {
    // List the files touched by the latest commit
    // (requires a fetch depth of at least 2 in CI so HEAD~1 exists)
    const changedFiles = execSync('git diff --name-only HEAD~1 HEAD',
      { encoding: 'utf8' }).trim().split('\n');

    const changedPackages = new Set();

    changedFiles.forEach(file => {
      // Map each changed file to its package directory,
      // e.g. packages/api/src/index.ts -> api
      if (file.startsWith('packages/')) {
        const packageName = file.split('/')[1];
        changedPackages.add(packageName);
      }
    });

    return Array.from(changedPackages);
  } catch (error) {
    // Warn on stderr so the JSON on stdout stays clean for CI to capture,
    // then fall back to building everything
    console.error('Could not detect changes, building all packages');
    return ['frontend', 'api', 'shared-utils', 'admin-dashboard'];
  }
}

console.log(JSON.stringify(getChangedPackages()));
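
Running it locally after a commit that touched the frontend and the shared library prints something like:

$ node scripts/get-changed-packages.js
["frontend","shared-utils"]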

This script became the foundation of my smart CI/CD pipeline.

Building the GitHub Actions Workflow

I use GitHub Actions for my CI/CD pipeline. Here's my main workflow file:

name: CI/CD Pipeline

on:
  push:
    branches: [main, develop]
  pull_request:
    branches: [main]

jobs:
  detect-changes:
    runs-on: ubuntu-latest
    outputs:
      changed-packages: ${{ steps.changes.outputs.packages }}
    steps:
      - uses: actions/checkout@v3
        with:
          fetch-depth: 2
      
      - name: Detect changed packages
        id: changes
        run: |
          CHANGED=$(node scripts/get-changed-packages.js)
          echo "packages=$CHANGED" >> $GITHUB_OUTPUT

  test-and-build:
    needs: detect-changes
    runs-on: ubuntu-latest
    if: needs.detect-changes.outputs.changed-packages != '[]'
    strategy:
      matrix:
        package: ${{ fromJSON(needs.detect-changes.outputs.changed-packages) }}
    
    steps:
      - uses: actions/checkout@v3
      
      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '18'
          cache: 'npm'
      
      - name: Install dependencies
        run: npm ci
      
      - name: Run tests
        run: npm test --workspace=packages/${{ matrix.package }}
      
      - name: Build package
        run: npm run build --workspace=packages/${{ matrix.package }}
      
      - name: Upload build artifacts
        uses: actions/upload-artifact@v3
        with:
          name: build-${{ matrix.package }}
          path: packages/${{ matrix.package }}/dist

This workflow does something clever: it runs the change detection first, then creates a matrix job that only processes the packages that actually changed.

Handling Dependencies Between Packages

One tricky part was handling internal dependencies. When I changed my shared-utils package, I also needed to rebuild and test any packages that depended on it.

I enhanced my change detection script to understand these relationships:

function getDependentPackages(changedPackages) {
  // Reverse dependency map: for each package, the packages that depend on it.
  // shared-utils is the only package anything else depends on, so one level
  // of propagation is enough here; a deeper graph would need a transitive walk.
  const dependents = {
    'shared-utils': ['frontend', 'api', 'admin-dashboard'],
    'frontend': [],
    'api': [],
    'admin-dashboard': []
  };

  const allAffected = new Set(changedPackages);

  changedPackages.forEach(pkg => {
    if (dependents[pkg]) {
      dependents[pkg].forEach(dependent => allAffected.add(dependent));
    }
  });

  return Array.from(allAffected);
}

Now when I update the shared utilities, all dependent packages automatically get rebuilt and tested.
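
The hardcoded map works fine at four packages, but it will drift as the monorepo grows. Here's a minimal sketch of deriving it from each package's manifest instead, assuming package directory names match the names used in dependencies:

// scripts/build-dependents-map.js
const fs = require('fs');
const path = require('path');

function buildDependentsMap(packagesDir = 'packages') {
  const packages = fs.readdirSync(packagesDir).filter(name =>
    fs.statSync(path.join(packagesDir, name)).isDirectory());

  const dependents = {};
  packages.forEach(pkg => { dependents[pkg] = []; });

  packages.forEach(pkg => {
    const manifest = JSON.parse(
      fs.readFileSync(path.join(packagesDir, pkg, 'package.json'), 'utf8'));
    const deps = { ...manifest.dependencies, ...manifest.devDependencies };

    // If pkg depends on a sibling package, record the reverse edge
    packages.forEach(other => {
      if (other !== pkg && other in deps) {
        dependents[other].push(pkg);
      }
    });
  });

  return dependents;
}

console.log(buildDependentsMap());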

Deployment Strategy

For deployment, I use different strategies based on the package type:

  • Frontend apps: Deploy to Vercel or Netlify
  • API services: Deploy to Railway or similar platforms
  • Shared libraries: Publish to npm (private registry)

Each package has its own deployment job that only runs when that specific package changes:

deploy-frontend:
  needs: [detect-changes, test-and-build]
  if: contains(fromJSON(needs.detect-changes.outputs.changed-packages), 'frontend')
  runs-on: ubuntu-latest
  steps:
    - name: Deploy to Vercel
      run: # deployment commands
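
Fleshed out, that job downloads the build artifact from the earlier stage and hands it to the platform's CLI. The Vercel command and secret name below are placeholders for whatever your platform expects:

deploy-frontend:
  needs: [detect-changes, test-and-build]
  if: contains(fromJSON(needs.detect-changes.outputs.changed-packages), 'frontend')
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v3

    # Reuse the build output from the test-and-build job
    - uses: actions/download-artifact@v3
      with:
        name: build-frontend
        path: packages/frontend/dist

    # Placeholder deploy step; swap in your platform's CLI
    - name: Deploy to Vercel
      run: npx vercel deploy packages/frontend/dist --prod --token=${{ secrets.VERCEL_TOKEN }}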

The Results

This setup transformed my development workflow. Now when I push changes:

  1. Only affected packages get tested and built
  2. Average build times dropped from 15 minutes to 3-5 minutes
  3. Deployments are atomic and predictable
  4. I catch integration issues early

The key insight was that CI/CD for monorepos isn't about running everything all the time; it's about being smart about what actually needs to run.