Testing Serverless Functions Locally

Test serverless functions locally with Jest, SAM CLI, LocalStack, and DynamoDB Local for comprehensive Node.js test coverage

Overview

Testing serverless functions locally is the difference between deploying with confidence and deploying with crossed fingers. Without a local testing strategy, every code change becomes a slow, expensive round-trip to the cloud — you push, wait for deployment, invoke, check CloudWatch logs, fix, and repeat. A proper local testing setup gives you fast feedback loops using tools like Jest for unit tests, AWS SAM CLI for local invocation, and LocalStack for full AWS service emulation, so you catch bugs before they ever touch your cloud account.

Prerequisites

  • Node.js 18 or later installed
  • Basic understanding of AWS Lambda and serverless concepts
  • Familiarity with Jest testing framework
  • Docker installed (required for SAM CLI and LocalStack)
  • AWS SAM CLI installed (brew install aws-sam-cli or equivalent)
  • An AWS account (for generating realistic event payloads)

The Testing Pyramid for Serverless

The classic testing pyramid still applies to serverless, but the layers shift. In a traditional application, unit tests dominate the base, integration tests sit in the middle, and end-to-end tests cap the top. With serverless, the integration layer becomes far more important because your functions are glue between managed services — DynamoDB, SQS, API Gateway, S3. If you only unit test the business logic inside your handler, you miss the integration seams where most bugs actually live.

Here is how I think about the serverless testing pyramid:

Layer 1 — Unit Tests (fastest, most numerous): Test pure business logic functions in isolation. Mock all AWS SDK calls. These run in milliseconds.

Layer 2 — Local Integration Tests (medium speed): Use SAM CLI local invoke, DynamoDB Local, or LocalStack to test your handler against local emulations of AWS services. These run in seconds.

Layer 3 — Cloud Integration Tests (slowest, fewest): Deploy to a dedicated test environment and invoke real functions against real services. These run in minutes and cost money.

The goal is to push as much testing as possible into layers 1 and 2 so your feedback loop stays fast.

Project Structure

Before diving into tests, let us establish a clean project structure. This is a serverless API for managing user records, backed by DynamoDB:

serverless-api/
├── src/
│   ├── handlers/
│   │   ├── getUser.js
│   │   ├── createUser.js
│   │   └── processQueue.js
│   ├── lib/
│   │   ├── userService.js
│   │   └── dynamoClient.js
│   └── utils/
│       └── validation.js
├── tests/
│   ├── unit/
│   │   ├── userService.test.js
│   │   ├── validation.test.js
│   │   └── handlers/
│   │       ├── getUser.test.js
│   │       └── createUser.test.js
│   ├── integration/
│   │   ├── api.test.js
│   │   └── dynamodb.test.js
│   └── fixtures/
│       ├── apiGatewayEvent.json
│       ├── sqsEvent.json
│       └── dynamoStreamEvent.json
├── template.yaml
├── jest.config.js
└── package.json

Unit Testing Lambda Handlers with Jest

The key to unit testing Lambda handlers is separating business logic from AWS plumbing. Your handler should be a thin wrapper that extracts data from the event, calls a service function, and formats the response.

Here is a simple user retrieval handler:

// src/handlers/getUser.js
var userService = require('../lib/userService');

exports.handler = function(event, context) {
  var userId = event.pathParameters && event.pathParameters.id;

  if (!userId) {
    return Promise.resolve({
      statusCode: 400,
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ error: 'Missing user ID' })
    });
  }

  return userService.getUserById(userId)
    .then(function(user) {
      if (!user) {
        return {
          statusCode: 404,
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify({ error: 'User not found' })
        };
      }
      return {
        statusCode: 200,
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(user)
      };
    })
    .catch(function(err) {
      console.error('Error fetching user:', err);
      return {
        statusCode: 500,
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ error: 'Internal server error' })
      };
    });
};

And the Jest test:

// tests/unit/handlers/getUser.test.js
var getUser = require('../../../src/handlers/getUser');
var userService = require('../../../src/lib/userService');

jest.mock('../../../src/lib/userService');

describe('getUser handler', function() {
  beforeEach(function() {
    jest.clearAllMocks();
  });

  it('should return 400 when user ID is missing', function() {
    var event = { pathParameters: null };

    return getUser.handler(event, {}).then(function(result) {
      expect(result.statusCode).toBe(400);
      expect(JSON.parse(result.body).error).toBe('Missing user ID');
    });
  });

  it('should return 200 with user data', function() {
    var mockUser = { id: 'user-123', name: 'Jane Doe', email: '[email protected]' };
    userService.getUserById.mockResolvedValue(mockUser);

    var event = { pathParameters: { id: 'user-123' } };

    return getUser.handler(event, {}).then(function(result) {
      expect(result.statusCode).toBe(200);
      expect(JSON.parse(result.body)).toEqual(mockUser);
      expect(userService.getUserById).toHaveBeenCalledWith('user-123');
    });
  });

  it('should return 404 when user does not exist', function() {
    userService.getUserById.mockResolvedValue(null);

    var event = { pathParameters: { id: 'nonexistent' } };

    return getUser.handler(event, {}).then(function(result) {
      expect(result.statusCode).toBe(404);
    });
  });

  it('should return 500 on service error', function() {
    userService.getUserById.mockRejectedValue(new Error('DynamoDB timeout'));

    var event = { pathParameters: { id: 'user-123' } };

    return getUser.handler(event, {}).then(function(result) {
      expect(result.statusCode).toBe(500);
    });
  });
});
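The three response literals in the handler above share the same shape. In a larger project you might factor them into a tiny helper; this is a sketch, and jsonResponse is not part of the project files listed earlier:

```javascript
// src/lib/response.js (hypothetical helper, not used by the examples above)
// Builds the API Gateway proxy response shape used throughout this article.
function jsonResponse(statusCode, payload) {
  return {
    statusCode: statusCode,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload)
  };
}

module.exports = { jsonResponse: jsonResponse };
```

With a helper like this, the 400 branch in getUser.js collapses to a single line: return Promise.resolve(jsonResponse(400, { error: 'Missing user ID' })). The unit tests stay unchanged because the response shape is identical.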

Mocking AWS SDK Calls

When your service layer calls DynamoDB, S3, or any AWS service directly, you need to mock the SDK. The AWS SDK v3 uses a modular client pattern that is straightforward to mock with Jest.

// src/lib/dynamoClient.js
var DynamoDBClient = require('@aws-sdk/client-dynamodb').DynamoDBClient;
var DynamoDBDocumentClient = require('@aws-sdk/lib-dynamodb').DynamoDBDocumentClient;

var client = new DynamoDBClient({
  region: process.env.AWS_REGION || 'us-east-1'
});

var docClient = DynamoDBDocumentClient.from(client);

module.exports = docClient;

// src/lib/userService.js
var docClient = require('./dynamoClient');
var GetCommand = require('@aws-sdk/lib-dynamodb').GetCommand;
var PutCommand = require('@aws-sdk/lib-dynamodb').PutCommand;

var TABLE_NAME = process.env.USERS_TABLE || 'Users';

function getUserById(userId) {
  var params = {
    TableName: TABLE_NAME,
    Key: { id: userId }
  };

  return docClient.send(new GetCommand(params))
    .then(function(result) {
      return result.Item || null;
    });
}

function createUser(userData) {
  var params = {
    TableName: TABLE_NAME,
    Item: userData,
    ConditionExpression: 'attribute_not_exists(id)'
  };

  return docClient.send(new PutCommand(params))
    .then(function() {
      return userData;
    });
}

module.exports = { getUserById: getUserById, createUser: createUser };

Now mock the SDK at the module level:

// tests/unit/userService.test.js
var mockSend = jest.fn();

jest.mock('../../src/lib/dynamoClient', function() {
  return { send: mockSend };
});

var userService = require('../../src/lib/userService');

describe('userService', function() {
  // USERS_TABLE is read once, when userService is first required, so set it
  // in a Jest setupFile rather than in beforeEach, where the assignment would
  // come too late to take effect.
  beforeEach(function() {
    mockSend.mockReset();
  });

  describe('getUserById', function() {
    it('should return user when found', function() {
      var mockUser = { id: 'user-1', name: 'Alice' };
      mockSend.mockResolvedValue({ Item: mockUser });

      return userService.getUserById('user-1').then(function(result) {
        expect(result).toEqual(mockUser);
        expect(mockSend).toHaveBeenCalledTimes(1);
      });
    });

    it('should return null when user not found', function() {
      mockSend.mockResolvedValue({});

      return userService.getUserById('ghost').then(function(result) {
        expect(result).toBeNull();
      });
    });
  });

  describe('createUser', function() {
    it('should create and return user', function() {
      var newUser = { id: 'user-2', name: 'Bob', email: '[email protected]' };
      mockSend.mockResolvedValue({});

      return userService.createUser(newUser).then(function(result) {
        expect(result).toEqual(newUser);
      });
    });

    it('should throw when user already exists', function() {
      var err = new Error('The conditional request failed');
      err.name = 'ConditionalCheckFailedException';
      mockSend.mockRejectedValue(err);

      var newUser = { id: 'existing', name: 'Duplicate' };

      return userService.createUser(newUser).catch(function(error) {
        expect(error.name).toBe('ConditionalCheckFailedException');
      });
    });
  });
});

SAM CLI Local Invoke and Local API

AWS SAM CLI is the most direct way to run your Lambda functions locally. It uses Docker to simulate the Lambda runtime, giving you a near-identical execution environment.

First, your template.yaml defines the serverless application:

AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Globals:
  Function:
    Runtime: nodejs18.x
    Timeout: 30
    Environment:
      Variables:
        USERS_TABLE: !Ref UsersTable

Resources:
  GetUserFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: src/handlers/getUser.handler
      Policies:
        - DynamoDBReadPolicy:
            TableName: !Ref UsersTable
      Events:
        GetUser:
          Type: Api
          Properties:
            Path: /users/{id}
            Method: get

  CreateUserFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: src/handlers/createUser.handler
      Policies:
        - DynamoDBCrudPolicy:
            TableName: !Ref UsersTable
      Events:
        CreateUser:
          Type: Api
          Properties:
            Path: /users
            Method: post

  UsersTable:
    Type: AWS::DynamoDB::Table
    Properties:
      TableName: Users
      AttributeDefinitions:
        - AttributeName: id
          AttributeType: S
      KeySchema:
        - AttributeName: id
          KeyType: HASH
      BillingMode: PAY_PER_REQUEST

Single Function Invocation

Invoke a single function with a test event:

# Generate a test event (generate-event has no flag for path parameters,
# so edit "pathParameters" in the resulting JSON by hand)
sam local generate-event apigateway aws-proxy \
  --method GET \
  --path users \
  > tests/fixtures/apiGatewayEvent.json

# Invoke the function locally
sam local invoke GetUserFunction \
  --event tests/fixtures/apiGatewayEvent.json \
  --env-vars env.json

The env.json file provides environment variables:

{
  "GetUserFunction": {
    "USERS_TABLE": "Users-dev",
    "AWS_REGION": "us-east-1"
  }
}

Local API Gateway

For a more interactive development experience, SAM can spin up a local API Gateway:

sam local start-api --port 3001 --env-vars env.json

Now you can hit your endpoints with curl:

curl http://localhost:3001/users/user-123
curl -X POST http://localhost:3001/users \
  -H "Content-Type: application/json" \
  -d '{"id": "user-456", "name": "Charlie", "email": "[email protected]"}'

This is particularly valuable for testing request/response transformations, CORS headers, and API Gateway behavior that pure unit tests miss.

LocalStack for Full AWS Emulation

LocalStack emulates dozens of AWS services locally inside Docker. It is the closest thing to running AWS on your laptop, and it is indispensable for integration testing.

Start LocalStack with Docker Compose:

# docker-compose.yml
version: '3.8'
services:
  localstack:
    image: localstack/localstack:latest
    ports:
      - "4566:4566"
    environment:
      - SERVICES=dynamodb,sqs,sns,s3
      - DEFAULT_REGION=us-east-1
      - DOCKER_HOST=unix:///var/run/docker.sock
    volumes:
      - "./localstack-init:/etc/localstack/init/ready.d"
      - "/var/run/docker.sock:/var/run/docker.sock"

Create an initialization script that sets up your resources:

#!/bin/bash
# localstack-init/init.sh (must be executable: chmod +x init.sh)
awslocal dynamodb create-table \
  --table-name Users \
  --attribute-definitions AttributeName=id,AttributeType=S \
  --key-schema AttributeName=id,KeyType=HASH \
  --billing-mode PAY_PER_REQUEST

awslocal sqs create-queue \
  --queue-name user-events

Then configure your tests to point at LocalStack:

// tests/integration/setup.js
var DynamoDBClient = require('@aws-sdk/client-dynamodb').DynamoDBClient;
var DynamoDBDocumentClient = require('@aws-sdk/lib-dynamodb').DynamoDBDocumentClient;

function createLocalClient() {
  var client = new DynamoDBClient({
    region: 'us-east-1',
    endpoint: 'http://localhost:4566',
    credentials: {
      accessKeyId: 'test',
      secretAccessKey: 'test'
    }
  });

  return DynamoDBDocumentClient.from(client);
}

module.exports = { createLocalClient: createLocalClient };

DynamoDB Local for Database Testing

If you only need DynamoDB and do not want the overhead of full LocalStack, DynamoDB Local is a lightweight alternative. It runs as a standalone JAR file or Docker container.

docker run -d -p 8000:8000 amazon/dynamodb-local
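If you prefer docker-compose, an equivalent service entry is a small sketch like the following (the -sharedDb flag makes all clients see the same tables regardless of credentials, and -inMemory skips disk persistence):

```yaml
services:
  dynamodb-local:
    image: amazon/dynamodb-local
    ports:
      - "8000:8000"
    command: "-jar DynamoDBLocal.jar -sharedDb -inMemory"
```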

Here is a full integration test suite that uses DynamoDB Local:

// tests/integration/dynamodb.test.js
var DynamoDBClient = require('@aws-sdk/client-dynamodb').DynamoDBClient;
var DynamoDBDocumentClient = require('@aws-sdk/lib-dynamodb').DynamoDBDocumentClient;
var CreateTableCommand = require('@aws-sdk/client-dynamodb').CreateTableCommand;
var DeleteTableCommand = require('@aws-sdk/client-dynamodb').DeleteTableCommand;
var PutCommand = require('@aws-sdk/lib-dynamodb').PutCommand;
var GetCommand = require('@aws-sdk/lib-dynamodb').GetCommand;

var client = new DynamoDBClient({
  region: 'us-east-1',
  endpoint: 'http://localhost:8000',
  credentials: { accessKeyId: 'fake', secretAccessKey: 'fake' }
});

var docClient = DynamoDBDocumentClient.from(client);
var TABLE_NAME = 'Users-test';

beforeAll(function() {
  var params = {
    TableName: TABLE_NAME,
    AttributeDefinitions: [
      { AttributeName: 'id', AttributeType: 'S' }
    ],
    KeySchema: [
      { AttributeName: 'id', KeyType: 'HASH' }
    ],
    BillingMode: 'PAY_PER_REQUEST'
  };

  return client.send(new CreateTableCommand(params));
});

afterAll(function() {
  return client.send(new DeleteTableCommand({ TableName: TABLE_NAME }));
});

describe('DynamoDB integration', function() {
  it('should write and read a user record', function() {
    var user = { id: 'integration-1', name: 'Test User', email: '[email protected]' };

    return docClient.send(new PutCommand({
      TableName: TABLE_NAME,
      Item: user
    }))
    .then(function() {
      return docClient.send(new GetCommand({
        TableName: TABLE_NAME,
        Key: { id: 'integration-1' }
      }));
    })
    .then(function(result) {
      expect(result.Item).toEqual(user);
    });
  });

  it('should return undefined for missing records', function() {
    return docClient.send(new GetCommand({
      TableName: TABLE_NAME,
      Key: { id: 'nonexistent' }
    }))
    .then(function(result) {
      expect(result.Item).toBeUndefined();
    });
  });
});

Event Fixture Generation

Hardcoding event JSON is fragile and tedious. SAM CLI can generate event fixtures for every AWS service trigger:

# API Gateway proxy event
sam local generate-event apigateway aws-proxy > tests/fixtures/apiGatewayEvent.json

# SQS event
sam local generate-event sqs receive-message > tests/fixtures/sqsEvent.json

# SNS event
sam local generate-event sns notification > tests/fixtures/snsEvent.json

# DynamoDB Streams event
sam local generate-event dynamodb update > tests/fixtures/dynamoStreamEvent.json

# S3 put event
sam local generate-event s3 put > tests/fixtures/s3PutEvent.json

# CloudWatch scheduled event
sam local generate-event cloudwatch scheduled-event > tests/fixtures/scheduledEvent.json

I also recommend building a fixture factory for customizing events in tests:

// tests/fixtures/eventFactory.js
var baseApiEvent = require('./apiGatewayEvent.json');

function createApiGatewayEvent(overrides) {
  overrides = overrides || {};
  var event = JSON.parse(JSON.stringify(baseApiEvent));

  if (overrides.pathParameters) {
    event.pathParameters = overrides.pathParameters;
  }
  if (overrides.body) {
    event.body = typeof overrides.body === 'string'
      ? overrides.body
      : JSON.stringify(overrides.body);
  }
  if (overrides.method) {
    event.httpMethod = overrides.method;
  }
  if (overrides.headers) {
    Object.assign(event.headers, overrides.headers);
  }
  if (overrides.queryStringParameters) {
    event.queryStringParameters = overrides.queryStringParameters;
  }

  return event;
}

function createSqsEvent(bodies) {
  return {
    Records: bodies.map(function(body, index) {
      return {
        messageId: 'msg-' + index,
        receiptHandle: 'handle-' + index,
        body: typeof body === 'string' ? body : JSON.stringify(body),
        attributes: {
          ApproximateReceiveCount: '1',
          SentTimestamp: Date.now().toString()
        },
        eventSource: 'aws:sqs',
        eventSourceARN: 'arn:aws:sqs:us-east-1:123456789:test-queue'
      };
    })
  };
}

module.exports = {
  createApiGatewayEvent: createApiGatewayEvent,
  createSqsEvent: createSqsEvent
};
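To see the shape this factory produces, here is createSqsEvent exercised standalone (the function body is copied verbatim from the factory above so the snippet is self-contained):

```javascript
// Standalone copy of createSqsEvent from tests/fixtures/eventFactory.js
function createSqsEvent(bodies) {
  return {
    Records: bodies.map(function(body, index) {
      return {
        messageId: 'msg-' + index,
        receiptHandle: 'handle-' + index,
        body: typeof body === 'string' ? body : JSON.stringify(body),
        attributes: {
          ApproximateReceiveCount: '1',
          SentTimestamp: Date.now().toString()
        },
        eventSource: 'aws:sqs',
        eventSourceARN: 'arn:aws:sqs:us-east-1:123456789:test-queue'
      };
    })
  };
}

var event = createSqsEvent([{ id: 'u-1' }, 'raw string body']);
console.log(event.Records.length);        // 2
console.log(event.Records[0].messageId);  // 'msg-0'
console.log(event.Records[1].body);       // 'raw string body'
```

Object bodies are serialized for you, while string bodies pass through untouched, which is exactly what the malformed-JSON tests later in this article rely on.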

Testing API Gateway Events

API Gateway events carry a lot of context — headers, query strings, path parameters, authorizer claims. Test all of these dimensions:

// tests/unit/handlers/createUser.test.js
var createUser = require('../../../src/handlers/createUser');
var userService = require('../../../src/lib/userService');
var eventFactory = require('../../fixtures/eventFactory');

jest.mock('../../../src/lib/userService');

describe('createUser handler', function() {
  beforeEach(function() {
    jest.clearAllMocks();
  });

  it('should create a user and return 201', function() {
    var userData = { id: 'u-1', name: 'Alice', email: '[email protected]' };
    userService.createUser.mockResolvedValue(userData);

    var event = eventFactory.createApiGatewayEvent({
      method: 'POST',
      body: userData
    });

    return createUser.handler(event, {}).then(function(result) {
      expect(result.statusCode).toBe(201);
      expect(JSON.parse(result.body)).toEqual(userData);
    });
  });

  it('should return 400 for invalid JSON body', function() {
    var event = eventFactory.createApiGatewayEvent({
      method: 'POST',
      body: 'not valid json {'
    });

    return createUser.handler(event, {}).then(function(result) {
      expect(result.statusCode).toBe(400);
      expect(JSON.parse(result.body).error).toContain('Invalid');
    });
  });

  it('should return 409 when user already exists', function() {
    var err = new Error('Conditional check failed');
    err.name = 'ConditionalCheckFailedException';
    userService.createUser.mockRejectedValue(err);

    var event = eventFactory.createApiGatewayEvent({
      method: 'POST',
      body: { id: 'existing', name: 'Dupe' }
    });

    return createUser.handler(event, {}).then(function(result) {
      expect(result.statusCode).toBe(409);
    });
  });

  it('should set the JSON content type header', function() {
    var event = eventFactory.createApiGatewayEvent({
      method: 'POST',
      body: { id: 'u-2', name: 'Bob' },
      headers: { Origin: 'https://example.com' }
    });

    userService.createUser.mockResolvedValue({ id: 'u-2', name: 'Bob' });

    return createUser.handler(event, {}).then(function(result) {
      expect(result.headers['Content-Type']).toBe('application/json');
    });
  });
});

Testing SQS and SNS Event Handlers

Queue and notification handlers process batches of records. The critical thing to test is partial failure handling — when some records succeed and others fail, you need to return the right batch item failures so SQS knows which messages to retry.

// src/handlers/processQueue.js
var userService = require('../lib/userService');

exports.handler = function(event) {
  var failures = [];

  var promises = event.Records.map(function(record) {
    var body;
    try {
      body = JSON.parse(record.body);
    } catch (e) {
      console.error('Invalid JSON in message:', record.messageId);
      failures.push({ itemIdentifier: record.messageId });
      return Promise.resolve();
    }

    return userService.createUser(body)
      .catch(function(err) {
        console.error('Failed to process message:', record.messageId, err);
        failures.push({ itemIdentifier: record.messageId });
      });
  });

  return Promise.all(promises).then(function() {
    return { batchItemFailures: failures };
  });
};

// tests/unit/handlers/processQueue.test.js
var processQueue = require('../../../src/handlers/processQueue');
var userService = require('../../../src/lib/userService');
var eventFactory = require('../../fixtures/eventFactory');

jest.mock('../../../src/lib/userService');

describe('processQueue handler', function() {
  beforeEach(function() {
    jest.clearAllMocks();
  });

  it('should process all messages successfully', function() {
    userService.createUser.mockResolvedValue({});

    var event = eventFactory.createSqsEvent([
      { id: 'u-1', name: 'Alice' },
      { id: 'u-2', name: 'Bob' }
    ]);

    return processQueue.handler(event).then(function(result) {
      expect(result.batchItemFailures).toHaveLength(0);
      expect(userService.createUser).toHaveBeenCalledTimes(2);
    });
  });

  it('should report partial failures', function() {
    userService.createUser
      .mockResolvedValueOnce({})
      .mockRejectedValueOnce(new Error('DB write failed'));

    var event = eventFactory.createSqsEvent([
      { id: 'u-1', name: 'Alice' },
      { id: 'u-2', name: 'Bob' }
    ]);

    return processQueue.handler(event).then(function(result) {
      expect(result.batchItemFailures).toHaveLength(1);
      expect(result.batchItemFailures[0].itemIdentifier).toBe('msg-1');
    });
  });

  it('should handle malformed JSON in message body', function() {
    var event = {
      Records: [{
        messageId: 'bad-msg',
        receiptHandle: 'handle',
        body: '{ invalid json',
        attributes: {},
        eventSource: 'aws:sqs',
        eventSourceARN: 'arn:aws:sqs:us-east-1:123456789:queue'
      }]
    };

    return processQueue.handler(event).then(function(result) {
      expect(result.batchItemFailures).toHaveLength(1);
      expect(userService.createUser).not.toHaveBeenCalled();
    });
  });
});

Environment Variable Management for Tests

Serverless functions depend heavily on environment variables for table names, queue URLs, feature flags, and service endpoints. Managing these in tests requires discipline.

// jest.config.js
module.exports = {
  testEnvironment: 'node',
  testMatch: ['**/tests/**/*.test.js'],
  collectCoverageFrom: ['src/**/*.js'],
  projects: [
    {
      displayName: 'unit',
      testMatch: ['<rootDir>/tests/unit/**/*.test.js'],
      setupFiles: ['<rootDir>/tests/unit/setup.js']
    },
    {
      displayName: 'integration',
      testMatch: ['<rootDir>/tests/integration/**/*.test.js'],
      setupFiles: ['<rootDir>/tests/integration/setup.js']
    }
  ]
};

// tests/unit/setup.js
process.env.USERS_TABLE = 'Users-test';
process.env.AWS_REGION = 'us-east-1';
process.env.QUEUE_URL = 'https://sqs.us-east-1.amazonaws.com/123456789/test-queue';
process.env.LOG_LEVEL = 'silent';

// tests/integration/setup.js
process.env.USERS_TABLE = 'Users-integration';
process.env.AWS_REGION = 'us-east-1';
process.env.DYNAMODB_ENDPOINT = 'http://localhost:8000';
process.env.LOCALSTACK_ENDPOINT = 'http://localhost:4566';

Never hardcode resource names in your function code. Always read from environment variables, and provide test-specific values in your setup files.
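The same rule applies to endpoints. The integration setup above sets DYNAMODB_ENDPOINT, but the dynamoClient.js shown earlier never reads it; one way to close that gap is a small config builder like this (a sketch, with clientConfigFromEnv as a hypothetical helper name):

```javascript
// Builds DynamoDBClient options from environment variables, so one module
// works against real AWS, DynamoDB Local, or LocalStack without code changes.
function clientConfigFromEnv(env) {
  var config = { region: env.AWS_REGION || 'us-east-1' };

  // When a local endpoint is configured, point the client at it and supply
  // dummy credentials (local emulators accept any credentials).
  if (env.DYNAMODB_ENDPOINT) {
    config.endpoint = env.DYNAMODB_ENDPOINT;
    config.credentials = { accessKeyId: 'test', secretAccessKey: 'test' };
  }

  return config;
}

module.exports = { clientConfigFromEnv: clientConfigFromEnv };
```

dynamoClient.js would then construct its client with new DynamoDBClient(clientConfigFromEnv(process.env)), and integration tests flip the endpoint purely through their setup files.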

CI/CD Integration for Serverless Tests

Your CI pipeline should run unit tests on every push, integration tests that require Docker on pull requests, and cloud integration tests on deployment to staging. Here is a GitHub Actions workflow:

# .github/workflows/test.yml
name: Serverless Tests
on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

jobs:
  unit-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: '18'
      - run: npm ci
      - run: npm run test:unit

  integration-tests:
    runs-on: ubuntu-latest
    services:
      dynamodb-local:
        image: amazon/dynamodb-local
        ports:
          - 8000:8000
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: '18'
      - run: npm ci
      - run: npm run test:integration
        env:
          DYNAMODB_ENDPOINT: http://localhost:8000

  template-validation:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: aws-actions/setup-sam@v2
      - run: sam validate --lint
      - run: sam build

Add corresponding scripts to package.json:

{
  "scripts": {
    "test": "jest",
    "test:unit": "jest --selectProjects unit",
    "test:integration": "jest --selectProjects integration",
    "test:coverage": "jest --selectProjects unit --coverage"
  }
}

Snapshot Testing for CloudFormation Templates

Your template.yaml is code. It deserves tests too. Snapshot testing catches unintended changes to your infrastructure:

// tests/unit/template.test.js
var fs = require('fs');
var path = require('path');
var yaml = require('js-yaml');

describe('SAM Template', function() {
  var template;

  beforeAll(function() {
    var templatePath = path.resolve(__dirname, '../../template.yaml');
    var content = fs.readFileSync(templatePath, 'utf8');
    template = yaml.load(content);
  });

  it('should match infrastructure snapshot', function() {
    var resources = Object.keys(template.Resources).sort();
    expect(resources).toMatchSnapshot();
  });

  it('should set timeout on all functions', function() {
    var globalTimeout = template.Globals.Function.Timeout;
    expect(globalTimeout).toBeLessThanOrEqual(30);
  });

  it('should use PAY_PER_REQUEST billing for DynamoDB', function() {
    var tables = Object.values(template.Resources).filter(function(r) {
      return r.Type === 'AWS::DynamoDB::Table';
    });

    tables.forEach(function(table) {
      expect(table.Properties.BillingMode).toBe('PAY_PER_REQUEST');
    });
  });

  it('should not expose functions publicly without auth', function() {
    var functions = Object.values(template.Resources).filter(function(r) {
      return r.Type === 'AWS::Serverless::Function';
    });

    functions.forEach(function(fn) {
      var events = fn.Properties.Events || {};
      Object.values(events).forEach(function(event) {
        if (event.Type === 'Api') {
          // Ensure POST/PUT/DELETE have auth (customize as needed)
          var method = event.Properties.Method.toUpperCase();
          if (method !== 'GET') {
            // This is where you would check for Auth property
            // expect(event.Properties.Auth).toBeDefined();
          }
        }
      });
    });
  });
});

Complete Working Example

Here is a comprehensive package.json and Jest configuration that ties everything together:

{
  "name": "serverless-api",
  "version": "1.0.0",
  "scripts": {
    "test": "jest",
    "test:unit": "jest --selectProjects unit --verbose",
    "test:integration": "jest --selectProjects integration --verbose",
    "test:coverage": "jest --selectProjects unit --coverage --coverageThreshold='{\"global\":{\"branches\":80,\"functions\":80,\"lines\":80}}'",
    "test:watch": "jest --selectProjects unit --watch",
    "local:api": "sam local start-api --env-vars env.json --port 3001",
    "local:dynamodb": "docker run -d -p 8000:8000 amazon/dynamodb-local",
    "local:localstack": "docker-compose up -d"
  },
  "dependencies": {
    "@aws-sdk/client-dynamodb": "^3.400.0",
    "@aws-sdk/lib-dynamodb": "^3.400.0"
  },
  "devDependencies": {
    "jest": "^29.7.0",
    "js-yaml": "^4.1.0"
  }
}

A complete handler with validation, the service layer, and full test coverage:

// src/handlers/createUser.js
var userService = require('../lib/userService');
var validation = require('../utils/validation');
var crypto = require('crypto');

exports.handler = function(event, context) {
  var body;

  try {
    body = JSON.parse(event.body || '{}');
  } catch (e) {
    return Promise.resolve({
      statusCode: 400,
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ error: 'Invalid JSON body' })
    });
  }

  var errors = validation.validateUser(body);
  if (errors.length > 0) {
    return Promise.resolve({
      statusCode: 400,
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ error: 'Validation failed', details: errors })
    });
  }

  var userData = {
    id: crypto.randomUUID(),
    name: body.name.trim(),
    email: body.email.toLowerCase().trim(),
    createdAt: new Date().toISOString()
  };

  return userService.createUser(userData)
    .then(function(user) {
      return {
        statusCode: 201,
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(user)
      };
    })
    .catch(function(err) {
      if (err.name === 'ConditionalCheckFailedException') {
        return {
          statusCode: 409,
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify({ error: 'User already exists' })
        };
      }
      console.error('Error creating user:', err);
      return {
        statusCode: 500,
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ error: 'Internal server error' })
      };
    });
};

// src/utils/validation.js
function validateUser(data) {
  var errors = [];

  if (!data.name || typeof data.name !== 'string' || data.name.trim().length < 2) {
    errors.push('Name is required and must be at least 2 characters');
  }

  if (!data.email || typeof data.email !== 'string') {
    errors.push('Email is required');
  } else if (!/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(data.email)) {
    errors.push('Email format is invalid');
  }

  return errors;
}

module.exports = { validateUser: validateUser };

// tests/unit/validation.test.js
var validation = require('../../src/utils/validation');

describe('validateUser', function() {
  it('should pass with valid data', function() {
    var errors = validation.validateUser({ name: 'Alice', email: '[email protected]' });
    expect(errors).toHaveLength(0);
  });

  it('should reject missing name', function() {
    var errors = validation.validateUser({ email: '[email protected]' });
    expect(errors).toHaveLength(1);
    expect(errors[0]).toContain('Name');
  });

  it('should reject short name', function() {
    var errors = validation.validateUser({ name: 'A', email: '[email protected]' });
    expect(errors).toHaveLength(1);
  });

  it('should reject invalid email', function() {
    var errors = validation.validateUser({ name: 'Alice', email: 'not-an-email' });
    expect(errors).toHaveLength(1);
    expect(errors[0]).toContain('Email');
  });

  it('should return multiple errors', function() {
    var errors = validation.validateUser({});
    expect(errors.length).toBeGreaterThanOrEqual(2);
  });
});

Common Issues and Troubleshooting

1. SAM CLI Docker Connection Failure

Error: Running AWS SAM projects locally requires Docker. Have you got it installed and running?

This happens when Docker Desktop is not running or the Docker daemon is not accessible. On macOS/Windows, make sure Docker Desktop is launched. On Linux, verify the daemon is running with sudo systemctl status docker. Also check that your user is in the docker group: sudo usermod -aG docker $USER.

2. DynamoDB Local Table Already Exists

ResourceInUseException: Cannot create preexisting table

This fires when your test setup tries to create a table that already exists from a previous test run that did not clean up. Wrap your table creation in a try-catch, or add a beforeAll that deletes the table first:

beforeAll(async () => {
  // DeleteTableCommand and CreateTableCommand come from @aws-sdk/client-dynamodb;
  // client, TABLE_NAME, and tableParams are defined elsewhere in your test setup
  await client
    .send(new DeleteTableCommand({ TableName: TABLE_NAME }))
    .catch(() => { /* table may not exist yet, ignore */ });
  await client.send(new CreateTableCommand(tableParams));
});

3. Module Not Found When Using SAM Local Invoke

{"errorType":"Runtime.ImportModuleError","errorMessage":"Error: Cannot find module '../lib/userService'"}

SAM mounts your code directory into the Docker container. If your node_modules were installed on a different platform (e.g., you are on Windows but Lambda runs on Linux), native modules will fail. Run sam build before sam local invoke to ensure dependencies are installed in a Linux-compatible environment inside the .aws-sam/build directory.
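The usual remedy is a build-then-invoke sequence like the following sketch (the function name HelloFunction and the event path are placeholders, not from a real project):

```shell
# Build inside a Lambda-like Linux container so native modules are
# compiled for the right platform, then invoke the built artifact
sam build --use-container
sam local invoke HelloFunction --event events/event.json
```

After sam build, SAM invokes the code under .aws-sam/build rather than your working directory, so remember to rebuild after every source change.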

4. Jest Mock Not Resetting Between Tests

Expected mock function to have been called 1 time, but it was called 3 times

This is the most common Jest mistake in serverless testing. Mocks accumulate call counts across tests unless you reset them. Always add jest.clearAllMocks() or jest.resetAllMocks() in your beforeEach block. The difference matters: clearAllMocks clears call history but preserves implementations, while resetAllMocks also removes mock implementations, so any mockResolvedValue calls need to be repeated in each test.
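As a sketch of the reset pattern, assuming a service that takes an injected client (the makeUserService factory here is illustrative, not from a real project):

```javascript
// Hypothetical service with an injected client, so the mock is a plain jest.fn()
function makeUserService(client) {
  return {
    getUserName: async (id) => (await client.getUser(id)).name,
  };
}

const getUser = jest.fn();
const service = makeUserService({ getUser });

beforeEach(() => {
  // resetAllMocks wipes call history AND implementations,
  // so mockResolvedValue must be repeated inside each test
  jest.resetAllMocks();
});

it('fetches Alice', async () => {
  getUser.mockResolvedValue({ name: 'Alice' });
  await expect(service.getUserName('1')).resolves.toBe('Alice');
  expect(getUser).toHaveBeenCalledTimes(1);
});

it('does not inherit call counts from the previous test', async () => {
  getUser.mockResolvedValue({ name: 'Bob' });
  await expect(service.getUserName('2')).resolves.toBe('Bob');
  expect(getUser).toHaveBeenCalledTimes(1); // would be 2 without the reset
});
```

If you prefer to keep implementations between tests, switch the beforeEach to jest.clearAllMocks() and set up the stubs once at the top of the file.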

5. Timeout When Connecting to LocalStack Services

TimeoutError: Socket timed out without establishing a connection

LocalStack can take 15-30 seconds to fully initialize all services. In CI, add a health check wait loop before running integration tests:

until curl -s http://localhost:4566/_localstack/health | grep -q '"dynamodb": "running"'; do
  echo "Waiting for LocalStack..."
  sleep 2
done

Best Practices

  • Separate business logic from handler glue. Your handler should parse the event, call a service function, and format the response. The service function should be independently testable without any AWS event structures.

  • Use fixture factories, not hardcoded JSON files. Event fixtures generated by sam local generate-event are a good starting point, but wrap them in factory functions that let you override specific fields per test. This keeps tests readable and maintainable.

  • Run integration tests against DynamoDB Local by default, not LocalStack. LocalStack is powerful but resource-heavy. If your function only touches DynamoDB, use the lighter DynamoDB Local image. Reserve LocalStack for when you genuinely need multi-service emulation.

  • Test the sad paths harder than the happy paths. In serverless, failure modes are more varied and more consequential. Test malformed events, partial batch failures, cold start timeouts, and SDK throttling. These are the bugs that wake you up at 3 AM.

  • Pin your Lambda runtime version in your SAM template. Do not use nodejs18.x and assume it stays stable: when AWS updates the minor runtime version, your locally tested behavior can diverge from production. And when your function talks to other local containers (DynamoDB Local, LocalStack), run sam local invoke with --docker-network so the Lambda container joins the same Docker network those services run on, just as it does in CI.

  • Never test against real AWS services in unit tests. This sounds obvious, but I have seen teams accidentally burn through DynamoDB capacity in CI because a mock was not wired up correctly. Add a Jest setup file that explicitly sets AWS_ACCESS_KEY_ID=fake so any unmocked SDK calls fail fast instead of hitting real infrastructure.

  • Version your event fixtures alongside your code. When API Gateway changes its event format (which happens during version upgrades), your fixtures should be updated in the same commit. Treat event schemas as part of your contract.

  • Use jest --selectProjects to separate fast and slow tests. Unit tests should run in under 5 seconds. Integration tests that spin up Docker containers might take 30 seconds. Do not make developers wait for integration tests on every save — let them opt in with npm run test:integration.
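The first of these practices pays off most in tests, so here is a minimal sketch (module and field names are illustrative, not from a real project):

```javascript
// userService.js -- pure business logic, no AWS event shapes
function greetUser(name) {
  if (!name) {
    throw new Error('name is required');
  }
  return { message: 'Hello, ' + name };
}

// handler.js -- thin glue: parse the event, call the service, format the response
async function handler(event) {
  try {
    const body = JSON.parse(event.body || '{}');
    const result = greetUser(body.name);
    return { statusCode: 200, body: JSON.stringify(result) };
  } catch (err) {
    return { statusCode: 400, body: JSON.stringify({ error: err.message }) };
  }
}

module.exports = { greetUser, handler };
```

With this split, greetUser can be unit tested with no event fixture at all; only the handler tests need an API Gateway-shaped event.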

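The fixture-factory practice above might look like this in code, with the base event trimmed to a few representative API Gateway fields:

```javascript
// A factory around a sam local generate-event style payload;
// tests override only the fields they care about
function makeApiGatewayEvent(overrides = {}) {
  const base = {
    httpMethod: 'POST',
    path: '/users',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ name: 'Alice', email: '[email protected]' }),
  };
  return Object.assign({}, base, overrides);
}

// Each test states only what is different about its scenario
const badJsonEvent = makeApiGatewayEvent({ body: 'not-json' });
const deleteEvent = makeApiGatewayEvent({ httpMethod: 'DELETE', body: null });
```

Because every event flows through one factory, an API Gateway format change means updating one function instead of dozens of hardcoded JSON files.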