Jest Regression Detection: Guarding Against Unintended Changes

In software development, ensuring that new code doesn't break existing functionality is paramount; that is the job of regression testing. This challenge focuses on building a mechanism within Jest that automatically detects regressions by comparing test results across different versions of your codebase.

Problem Description

Your task is to implement a Jest reporter that detects regressions in your test suite. A regression is defined as a test case that passes in the current run but failed in a previous, baseline run. You will need to create a custom Jest reporter that:

  1. Stores Baseline Results: Captures and stores the results of a "baseline" test run.
  2. Compares Current Results: Compares the results of the current test run against the stored baseline.
  3. Reports Regressions: Identifies and reports any test cases that were failing in the baseline but are now passing.
  4. Exits with an Error Code: If regressions are detected, the Jest process should exit with a non-zero exit code, signaling a failure.
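
For orientation, the sketch below shows one way to lay out these four responsibilities in a custom TypeScript reporter. The class name, the baselinePath option, the key format, and the flat status-map baseline shape are illustrative choices rather than part of the specification; returning an Error from getLastError() is the hook Jest provides for letting a reporter force a non-zero exit (setting process.exitCode is an alternative). Later sections refine the error handling and comparison details.

  import * as fs from 'fs';
  import type { AggregatedResult } from '@jest/test-result';

  type Baseline = Record<string, string>; // unique test key -> "passed" | "failed" | "pending" | ...

  class RegressionReporter {
    private baselinePath: string;
    private regressionError: Error | null = null;

    // Jest constructs reporters with the global config and the per-reporter options object.
    constructor(_globalConfig: unknown, options: { baselinePath?: string } = {}) {
      this.baselinePath = options.baselinePath ?? 'baseline.json'; // configurable baseline location
    }

    onRunComplete(_contexts: unknown, results: AggregatedResult): void {
      // Flatten Jest's aggregated results into a map of test key -> status.
      const current: Baseline = {};
      for (const file of results.testResults) {
        for (const assertion of file.testResults) {
          current[`${file.testFilePath}::${assertion.fullName}`] = assertion.status;
        }
      }

      // 2. Compare against the stored baseline, if one exists (no file means this is the first run).
      let regressions: string[] = [];
      if (fs.existsSync(this.baselinePath)) {
        const baseline: Baseline = JSON.parse(fs.readFileSync(this.baselinePath, 'utf8'));
        regressions = Object.keys(current).filter(
          (key) => baseline[key] === 'failed' && current[key] === 'passed'
        );
      }

      // 3. Report any failing-to-passing transitions and remember them for the exit code.
      if (regressions.length > 0) {
        console.error('Regression detected! The following tests have transitioned from failing to passing:');
        regressions.forEach((key) => console.error(`- ${key}`));
        this.regressionError = new Error(`${regressions.length} regression(s) detected`);
      }

      // 1. Store (or refresh) the baseline with the current run's results.
      fs.writeFileSync(this.baselinePath, JSON.stringify(current, null, 2));
    }

    // 4. Returning an Error from getLastError() makes Jest exit with a non-zero code.
    getLastError(): Error | undefined {
      return this.regressionError ?? undefined;
    }
  }

  export = RegressionReporter;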

Key Requirements:

  • The reporter should handle different test statuses (passing, failing, pending, skipped).
  • It must store the baseline results in a persistent manner (e.g., a JSON file).
  • The comparison logic should accurately identify tests that have transitioned from a "failing" state to a "passing" state.
  • The reporter should be configurable, allowing users to specify the path to the baseline results file.

Expected Behavior:

  • First Run (No Baseline): The reporter should simply capture the current test results and save them as the baseline for future runs. The Jest process should exit with a zero exit code.
  • Subsequent Runs (With Baseline):
    • If no regressions are found, the Jest process should exit with a zero exit code.
    • If regressions are found, the reporter should print a clear message indicating the regressions and the Jest process should exit with a non-zero exit code. The reporter should also update the baseline with the current results.

Edge Cases to Consider:

  • What happens if the baseline file is corrupted or unreadable?
  • How should tests that are newly added or removed be handled (they are not regressions by our definition)?
  • Consider tests that change from passing to failing (this is a standard failure, not a regression by our definition, but the baseline should still be updated).
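
One defensive way to handle the first two points, keeping the naming conventions of the sketch above (the helper names are illustrative): a baseline that is missing, unreadable, or not valid JSON is simply treated as if no baseline existed, and only keys recorded as "failed" in the baseline and "passed" in the current run are reported.

  import * as fs from 'fs';

  type Baseline = Record<string, string>;

  // Load the stored baseline, treating a missing, unreadable, or corrupted file as "no baseline yet".
  function loadBaseline(baselinePath: string): Baseline | null {
    if (!fs.existsSync(baselinePath)) {
      return null; // first run: nothing to compare against
    }
    try {
      const parsed = JSON.parse(fs.readFileSync(baselinePath, 'utf8'));
      // Anything other than a plain object of statuses is considered corrupted.
      return parsed !== null && typeof parsed === 'object' && !Array.isArray(parsed)
        ? (parsed as Baseline)
        : null;
    } catch {
      return null; // unreadable or invalid JSON
    }
  }

  // Tests present in only one of the two runs are ignored; a passing-to-failing change is a
  // normal failure and is likewise not reported here, though it still lands in the new baseline.
  function findRegressions(baseline: Baseline, current: Baseline): string[] {
    return Object.keys(current).filter(
      (key) => baseline[key] === 'failed' && current[key] === 'passed'
    );
  }

In the onRunComplete sketch above, loadBaseline would replace the direct JSON.parse call and findRegressions the inline filter.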

Examples

Example 1: No Baseline Exists

Input: A Jest test suite with several passing and failing tests.

Process:

  1. The custom reporter runs for the first time.
  2. It finds no baseline results file.
  3. It captures the current test results (e.g., 3 passing, 1 failing).
  4. It saves these results to a specified baseline.json file.
  5. Jest process exits with code 0.

Output:

[Standard Jest output for passing/failing tests]
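
If the reporter keys tests by file path and full name as in the sketch above, the freshly written baseline.json for a run with 3 passing tests and 1 failing test could look roughly like this (the paths and test names are invented for illustration):

  {
    "/repo/src/math.test.ts::math adds numbers": "passed",
    "/repo/src/math.test.ts::math subtracts numbers": "passed",
    "/repo/src/string.test.ts::trim removes whitespace": "passed",
    "/repo/src/string.test.ts::trim handles null input": "failed"
  }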

Example 2: Baseline Exists, No Regressions

Input:

  • baseline.json: { "test1": "passed", "test2": "failed" }
  • Current test run results: test1 passes, test2 fails.

Process:

  1. The custom reporter loads baseline.json.
  2. It compares current results:
    • test1: Passed in baseline, passes now. (Not a regression)
    • test2: Failed in baseline, fails now. (Not a regression)
  3. No regressions detected.
  4. The reporter updates baseline.json with the current run's results (identical to the previous baseline in this case; in practice the two runs' results may differ slightly).
  5. Jest process exits with code 0.

Output:

[Standard Jest output for passing/failing tests]

Example 3: Baseline Exists, Regressions Detected

Input:

  • baseline.json: { "test1": "failed", "test2": "passed" }
  • Current test run results: test1 passes, test2 passes.

Process:

  1. The custom reporter loads baseline.json.
  2. It compares current results:
    • test1: Failed in baseline, passes now. (Regression detected!)
    • test2: Passed in baseline, passes now. (Not a regression)
  3. Regressions are detected. The reporter prints a message:
    Regression detected! The following tests have transitioned from failing to passing:
    - test1
    
  4. The reporter updates baseline.json with the current run's results.
  5. Jest process exits with a non-zero exit code (e.g., 1).

Output:

[Standard Jest output for passing tests]
Regression detected! The following tests have transitioned from failing to passing:
- test1
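
Reduced to code, the comparison behind Examples 2 and 3 is a single filter over the test keys. Run against the two baselines above (with the bare keys test1 and test2 standing in for full test identifiers), it behaves as follows; the helper name is an illustrative choice.

  const findRegressions = (baseline: Record<string, string>, current: Record<string, string>) =>
    Object.keys(current).filter((key) => baseline[key] === 'failed' && current[key] === 'passed');

  // Example 2: statuses match the baseline -> no regressions, exit code 0.
  console.log(findRegressions({ test1: 'passed', test2: 'failed' },
                              { test1: 'passed', test2: 'failed' })); // []

  // Example 3: test1 moved from failed to passed -> reported, non-zero exit code.
  console.log(findRegressions({ test1: 'failed', test2: 'passed' },
                              { test1: 'passed', test2: 'passed' })); // [ 'test1' ]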

Constraints

  • The reporter must implement the JestReporter interface.
  • Baseline results should be stored in a JSON file.
  • The path to the baseline file should be configurable via Jest configuration (e.g., jest.config.js).
  • The reporter should not introduce significant overhead to the test execution time.
  • Assume a maximum of 1000 test cases when reasoning about performance.
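
Registration might look like the following, assuming the reporter sketched earlier has been compiled to regression-reporter.js in the project root and reads a baselinePath option (both names are this page's illustrative choices, not a Jest convention):

  // jest.config.ts (a jest.config.js using module.exports works the same way)
  import type { Config } from 'jest';

  const config: Config = {
    reporters: [
      'default', // keep Jest's normal test output alongside the custom reporter
      ['<rootDir>/regression-reporter.js', { baselinePath: '<rootDir>/baseline.json' }],
    ],
  };

  export default config;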

Notes

  • You'll need to understand how Jest's custom reporters work and how to hook into their lifecycle events (e.g., onRunComplete).
  • Think about how to uniquely identify each test case to compare them accurately between runs. The testPath and testName from Jest's TestResult are good candidates.
  • The definition of "regression" here is specific: passing now after failing before. Tests that were passing and are now failing are standard failures and don't trigger the regression exit code, but they will update the baseline.
  • For simplicity in this challenge, assume test names remain consistent across runs. In a real-world scenario, you might need more sophisticated identification.
  • You can use Node.js's fs module for file operations.
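
In the aggregated results passed to onRunComplete (as typed by the @jest/test-result package), each per-file result exposes the file path as testFilePath and each assertion result exposes the concatenated describe/it chain as fullName, so a minimal key-building helper could be the following; the "::" separator is an arbitrary choice.

  import type { AssertionResult } from '@jest/test-result';

  // Combine the test file path with the full test name (the describe/it chain), e.g.
  // '/repo/src/math.test.ts::math adds numbers', so renamed or moved tests simply show up
  // as a removed key plus an added key rather than as false regressions.
  function testKey(testFilePath: string, assertion: AssertionResult): string {
    return `${testFilePath}::${assertion.fullName}`;
  }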