Jest Performance Tracking Challenge
Performance is crucial for any application, and it's equally important to monitor the performance of your tests. This challenge asks you to implement a Jest plugin that tracks the execution time of each test case and provides a summary report at the end of the test run. This will help identify slow tests that might indicate performance bottlenecks in your code or inefficient test implementations.
Problem Description
You need to create a Jest plugin that measures and reports the execution time of each test case within a Jest test suite. The plugin should:
- Measure Execution Time: Record the start and end time of each test case.
- Aggregate Results: Collect the execution time for each test case, grouped by file.
- Generate Report: After all tests have completed, generate a console report summarizing the execution times. The report should include:
- The file name.
- The test case name.
- The execution time in milliseconds (ms).
- An overall average execution time per file.
- Configuration: Allow users to configure whether the performance tracking is enabled (default: enabled).
- No Interference: The plugin should not interfere with the normal execution of tests. It should not cause tests to fail or alter their results.
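The requirements above can be sketched as a custom Jest reporter. The interfaces below are simplified stand-ins for Jest's real reporter types (their field names mirror Jest's, but they are assumptions for illustration), and `PerfReporter` is a hypothetical name; treat this as an outline, not a finished plugin:

```typescript
// Simplified stand-ins for Jest's reporter types (assumed shapes).
interface AssertionResult {
  title: string;
  duration?: number | null; // per-test time in ms, as Jest reports it
}

interface TestFileResult {
  testFilePath: string;
  testResults: AssertionResult[];
}

class PerfReporter {
  private readonly enabled: boolean;
  private readonly byFile = new Map<string, { name: string; ms: number }[]>();

  // Jest constructs reporters with its global config first, then the
  // reporter's own options object.
  constructor(_globalConfig: unknown = {}, options: { performanceTracking?: boolean } = {}) {
    this.enabled = options.performanceTracking ?? true; // default: enabled
  }

  // Jest calls onTestResult after each test file finishes. Passed,
  // failed, and skipped tests all appear in testResults.
  onTestResult(result: TestFileResult): void {
    if (!this.enabled) return;
    this.byFile.set(
      result.testFilePath,
      result.testResults.map((t) => ({ name: t.title, ms: Math.round(t.duration ?? 0) })),
    );
  }

  formatReport(): string[] {
    const lines: string[] = ["Performance Report:"];
    this.byFile.forEach((tests, file) => {
      lines.push(`${file}:`);
      tests.forEach((t) => lines.push(`  ${t.name}: ${t.ms}ms`));
      const total = tests.reduce((sum, t) => sum + t.ms, 0);
      lines.push(`  Average: ${Math.round(total / Math.max(tests.length, 1))}ms`);
    });
    return lines;
  }

  // Jest calls onRunComplete once all test files have run.
  onRunComplete(): void {
    if (!this.enabled) return;
    this.formatReport().forEach((line) => console.log(line));
  }
}

// Simulated run using the data from Example 1 below.
const reporter = new PerfReporter();
reporter.onTestResult({
  testFilePath: "file1.test.ts",
  testResults: [
    { title: "test1", duration: 10 },
    { title: "test2", duration: 20 },
  ],
});
reporter.onTestResult({
  testFilePath: "file2.test.ts",
  testResults: [{ title: "test3", duration: 5 }],
});
reporter.onRunComplete();
```

Because the reporter only observes results Jest has already produced, it cannot alter test outcomes, which satisfies the no-interference requirement by construction.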
Expected Behavior:
- When enabled, the plugin should silently measure the execution time of each test.
- After the test run, a clear and concise report should be printed to the console, showing the execution time of each test and the average per file.
- When disabled, the plugin should have no effect on the test run.
Edge Cases to Consider:
- Tests that pass, fail, or are skipped should all be tracked.
- Tests with very short execution times (e.g., < 1ms) should be handled gracefully (e.g., displayed as "0ms").
- Tests that throw errors should still be tracked.
- Asynchronous tests (using `async`/`await` or `Promise`-returning functions) should be accurately measured.
- The plugin should work correctly with different Jest configurations (e.g., different reporters, test environments).
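The sub-millisecond and missing-duration edge cases can be handled with a single formatting helper. `formatMs` is a hypothetical name; the key points are that skipped tests may carry no duration at all, and fractional times round down to "0ms":

```typescript
// Hypothetical helper: render a (possibly missing) duration as whole ms.
// Skipped tests may report no duration; very fast tests may report
// fractional or zero times. Both cases degrade gracefully.
function formatMs(duration: number | null | undefined): string {
  return `${Math.round(duration ?? 0)}ms`;
}

console.log(formatMs(0.4));  // "0ms"
console.log(formatMs(null)); // "0ms"
console.log(formatMs(10.6)); // "11ms"
```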
Examples
Example 1:
Input: A Jest test suite with three test cases in two files:
- file1.test.ts:
- test1: 10ms
- test2: 20ms
- file2.test.ts:
- test3: 5ms
Output:
Performance Report:
file1.test.ts:
test1: 10ms
test2: 20ms
Average: 15ms
file2.test.ts:
test3: 5ms
Average: 5ms
Example 2:
Input: A Jest test suite with a single test case that fails.
Output:
Performance Report:
file1.test.ts:
test1: 100ms
Average: 100ms
Constraints
- The plugin must be compatible with Jest versions 27 or higher.
- The plugin should be written in TypeScript.
- The report should be printed to the console using `console.table` or a similar method for clear presentation.
- The plugin should not significantly impact the overall test run time (aim for less than 5% overhead).
- The plugin should be configurable via Jest's configuration options (e.g., `jest.config.js`). A configuration option named `performanceTracking` (boolean) should control whether the plugin is enabled.
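One way to satisfy this constraint is to register the reporter in `jest.config.js`. Jest has no top-level `performanceTracking` key, so a natural home for the flag is the reporter's own options object; the file name `perf-reporter.js` is an assumption for illustration:

```javascript
// jest.config.js — hypothetical setup; "perf-reporter.js" is an assumed
// file name for the module implementing the reporter.
module.exports = {
  reporters: [
    "default", // keep Jest's normal output alongside the plugin
    // The second tuple element is passed to the reporter's constructor.
    ["<rootDir>/perf-reporter.js", { performanceTracking: true }],
  ],
};
```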
Notes
- Note that Jest's `transform` hook is for transpiling source files, not for intercepting test execution; the custom reporter API (registered via the `reporters` configuration option) is the intended extension point, and each test result Jest passes to a reporter already includes a `duration` field.
- `console.time` and `console.timeEnd` can measure execution time, but they only print to the console; when you need the measured value programmatically, prefer `performance.now()` or `Date.now()`.
- Think about how to structure your plugin to be modular and extensible.
- Focus on providing a clear and informative report that is easy to understand.
- Remember to handle asynchronous tests correctly.
`console.time` and `console.timeEnd` do work across `await` boundaries, but capturing timestamps with `performance.now()` lets you record the values for the report.
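Timing asynchronous work correctly comes down to taking both timestamps around the `await`. A minimal sketch, where `measure` is a hypothetical helper rather than part of Jest's API:

```typescript
import { performance } from "node:perf_hooks";

// Sketch: time an async function with performance.now(), which (unlike
// console.time/console.timeEnd) returns a value you can store.
async function measure<T>(fn: () => Promise<T>): Promise<{ value: T; ms: number }> {
  const start = performance.now();
  const value = await fn(); // awaiting includes the async work in the timing
  const ms = performance.now() - start;
  return { value, ms };
}

// Usage: time a ~20ms asynchronous task.
measure(() => new Promise<void>((resolve) => setTimeout(resolve, 20))).then(({ ms }) => {
  console.log(`took ~${Math.round(ms)}ms`);
});
```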