
Lab 2.2: Safety Net Construction

Module: 2.2 - Test Generation | Duration: 1 hour | Sample Project: node-express-mongoose-demo

Learning Objectives

By the end of this lab, you will be able to:

  • Generate unit tests for untested functions
  • Create edge case tests (null, empty, boundary conditions)
  • Run tests and verify they pass
  • Build a safety net before refactoring

Prerequisites

  • Completed Lab 2.1 (Plan Mode)
  • Understanding of the testing framework in the project

Why Tests Matter

The Golden Rule

"You can't safely refactor code that doesn't have tests."

Tests are your safety harness:

  • They prove the code works before you change it
  • They catch regressions during refactoring
  • They document expected behavior

Setup

bash
# Navigate to the sample project
cd sample-projects/node-express-mongoose-demo

# Install dependencies (if not done)
npm install

# Verify tests run
npm test

# Start Claude Code
claude

Task 1: Find Untested Code

Time: 10 minutes

Identify functions that need test coverage.

Prompts to try:

What functions in this project have no tests?
Show me functions in app/controllers that aren't covered by tests.
Which critical functions lack test coverage?

Pick a target: Choose a function that:

  • Is used in multiple places (high impact)
  • Has no existing tests
  • Has clear inputs and outputs

Success criteria:

  • [ ] Identified 2-3 untested functions
  • [ ] Selected one critical function to test

Task 2: Generate Basic Tests

Time: 15 minutes

Create a test file for your chosen function.

Prompts to try:

Generate a test file for the create function in app/controllers/articles.js. Use the project's existing test framework.
Write unit tests for the User model's authenticate method.

Review the output:

  • Does it test the happy path?
  • Does it use the correct testing framework?
  • Are assertions meaningful?

Success criteria:

  • [ ] Test file generated
  • [ ] Tests run without errors
  • [ ] Happy path is covered

Task 3: Add Edge Case Tests

Time: 20 minutes

Expand coverage with edge cases.

Prompts to try:

Add edge case tests for:
- null or undefined inputs
- empty strings or arrays
- boundary conditions
- invalid data types
What edge cases should we test for the validateUser function?
Add tests for error scenarios in the article controller.

Common edge cases to test:

| Category | Examples |
| --- | --- |
| Null/undefined | `null`, `undefined`, missing params |
| Empty | `""`, `[]`, `{}` |
| Boundaries | `0`, `-1`, `MAX_INT`, very long strings |
| Types | Wrong type (string instead of number) |
| State | Not logged in, expired session |

Success criteria:

  • [ ] At least 3 edge case tests added
  • [ ] Tests handle error scenarios
  • [ ] All tests pass

Task 4: Run and Verify

Time: 15 minutes

Make sure all tests pass and that coverage has improved.

bash
# Run the test suite
npm test

# If coverage report is available
npm run test:coverage

Verification checklist:

  • [ ] All new tests pass
  • [ ] Existing tests still pass
  • [ ] No false positives (tests that always pass)

Debugging failed tests:

This test is failing. Can you explain why and fix it?

Success criteria:

  • [ ] All tests pass
  • [ ] Understand what each test verifies

Test Quality Checklist

Good tests should be:

  • [ ] Isolated - Each test can run independently
  • [ ] Deterministic - Same result every time
  • [ ] Readable - Clear what's being tested
  • [ ] Fast - Run in milliseconds, not seconds
  • [ ] Meaningful - Test behavior, not implementation

Example Test Structure

javascript
describe('ArticleController', () => {
  describe('create', () => {
    it('should create article with valid data', async () => {
      // Arrange
      const articleData = { title: 'Test', body: 'Content' };

      // Act
      const result = await controller.create(articleData);

      // Assert
      expect(result).toHaveProperty('id');
      expect(result.title).toBe('Test');
    });

    it('should throw error with missing title', async () => {
      // Arrange
      const articleData = { body: 'Content' };

      // Act & Assert
      await expect(controller.create(articleData))
        .rejects.toThrow('Title is required');
    });

    it('should handle null input', async () => {
      await expect(controller.create(null))
        .rejects.toThrow();
    });
  });
});

Tips for Success

  1. Match the project's style - Look at existing tests for patterns
  2. Test behavior, not implementation - Focus on what, not how
  3. One assertion per test - Easier to debug when things fail
  4. Use descriptive names - "should return error when email is invalid"

Troubleshooting

Tests don't run

  • Check the testing framework is installed
  • Verify the test file naming convention
  • Check the test script in package.json

Tests fail unexpectedly

  • Read the error message carefully
  • Check if mocking is required
  • Verify test database setup

Stretch Goals

If you finish early:

  1. Generate integration tests for an API endpoint
  2. Add tests for async error handling
  3. Create a test for a complex multi-step flow

Deliverables

At the end of this lab, you should have:

  1. New test file(s) committed to the project
  2. At least 5 tests (happy path + edge cases)
  3. All tests passing
  4. Confidence to refactor the tested code

Next Steps

After completing this lab, move on to Lab 2.3: Red-Green-Refactor-AI to practice TDD with Claude.

Claude for Coders Training Course