
Implementing Custom Performance Metrics in an Angular Application

Monitoring application performance is crucial for identifying bottlenecks and optimizing user experience. This challenge tasks you with creating a reusable Angular service that collects and reports custom performance metrics, allowing you to track specific aspects of your application's behavior beyond standard browser timings. This will enable you to gain deeper insights into your application's performance characteristics.

Problem Description

You need to implement a PerformanceMetricsService in Angular that allows you to define, measure, and report custom performance metrics. The service should provide methods for:

  1. Defining Metrics: A method to register a new metric with a unique name and a description.
  2. Measuring Metrics: A method to start and stop the measurement of a registered metric. This should record the start and end timestamps.
  3. Reporting Metrics: A method to log the collected metrics to a specified reporting endpoint (e.g., a server-side API). For this challenge, the reporting endpoint will be simulated by simply logging to the console.
  4. Retrieving Metrics: A method to retrieve all collected metrics.

The service should handle cases where a metric is not registered before being measured and ensure that metric names are unique. It should also provide a mechanism to clear all collected metrics.
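The structured format mentioned above might look like the record type below. This is only a sketch: the field names mirror the example output later in this challenge, but the exact shape is yours to choose.

```typescript
// Illustrative shape for one collected metric. startTime/endTime/duration
// are optional because a metric can be defined before it is ever measured.
interface PerformanceMetric {
  name: string;        // unique metric name
  description: string; // human-readable description
  startTime?: number;  // ms since epoch, set by startMeasurement
  endTime?: number;    // ms since epoch, set by stopMeasurement
  duration?: number;   // endTime - startTime, in ms
}
```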

Key Requirements:

  • The service must be injectable into Angular components.
  • Metrics should be stored internally in a structured format (e.g., an array of objects).
  • The reporting mechanism should be flexible enough to be easily adapted to send data to a real API.
  • Error handling should be implemented to gracefully handle invalid metric names or other unexpected issues.

Expected Behavior:

  • When a new metric is defined, it should be added to the internal registry.
  • When startMeasurement is called with a registered metric name, a timestamp should be recorded.
  • When stopMeasurement is called with the same metric name, the end timestamp should be recorded, and the duration calculated.
  • reportMetrics should log all collected metrics to the console in a readable format.
  • getMetrics should return all collected metrics.
  • clearMetrics should remove all collected metrics.
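Taken together, the behaviors above can be sketched as a plain TypeScript class. In a real Angular app the class would carry the @Injectable({ providedIn: 'root' }) decorator so it can be injected into components; the decorator is omitted here so the sketch runs without Angular. The method names follow the challenge statement; the Map-based registry and the choice to throw on invalid names are assumptions, not requirements.

```typescript
interface MetricRecord {
  name: string;
  description: string;
  startTime?: number; // ms since epoch
  endTime?: number;   // ms since epoch
  duration?: number;  // ms
}

class PerformanceMetricsService {
  // A Map gives O(1) lookup by name and makes uniqueness easy to enforce.
  private metrics = new Map<string, MetricRecord>();

  defineMetric(name: string, description: string): void {
    if (this.metrics.has(name)) {
      throw new Error(`Metric '${name}' is already registered.`);
    }
    this.metrics.set(name, { name, description });
  }

  startMeasurement(name: string): void {
    const metric = this.metrics.get(name);
    if (!metric) {
      throw new Error(`Metric '${name}' is not registered.`);
    }
    metric.startTime = Date.now();
  }

  stopMeasurement(name: string): void {
    const metric = this.metrics.get(name);
    if (!metric) {
      throw new Error(`Metric '${name}' is not registered.`);
    }
    if (metric.startTime === undefined) {
      throw new Error(`Metric '${name}' has no active measurement.`);
    }
    metric.endTime = Date.now();
    metric.duration = metric.endTime - metric.startTime;
  }

  reportMetrics(): void {
    // Simulated reporting endpoint, per the challenge statement. Swapping
    // this console.log for an HttpClient call adapts it to a real API.
    console.log(JSON.stringify(this.getMetrics(), null, 2));
  }

  getMetrics(): MetricRecord[] {
    return Array.from(this.metrics.values());
  }

  clearMetrics(): void {
    this.metrics.clear();
  }
}
```

Whether invalid operations throw (as here) or log and return is a design choice; either can satisfy the expected behavior as long as the error message is informative.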

Edge Cases to Consider:

  • Attempting to measure a metric that hasn't been defined.
  • Calling stopMeasurement without a corresponding startMeasurement.
  • Defining a metric with a name that already exists.
  • Handling potential errors during metric reporting (e.g., network issues).

Examples

Example 1:

Input:
  - Define metric: "componentLoadTime" with description "Time taken to load a component"
  - Start measurement: "componentLoadTime"
  - Simulate some work (e.g., 500ms)
  - Stop measurement: "componentLoadTime"
  - Report metrics
Output:
  Console log:
  [
    {
      "name": "componentLoadTime",
      "description": "Time taken to load a component",
      "startTime": 1678886400000, // Example timestamp
      "endTime": 1678886400500,   // Example timestamp
      "duration": 500
    }
  ]
Explanation: A single metric is defined, measured, and reported with the correct duration.

Example 2:

Input:
  - Define metric: "apiRequestTime" with description "Time taken for an API request"
  - Start measurement: "apiRequestTime"
  - Simulate some work (e.g., 1000ms)
  - Stop measurement: "apiRequestTime"
  - Define metric: "renderingTime" with description "Time taken for rendering"
  - Start measurement: "renderingTime"
  - Simulate some work (e.g., 250ms)
  - Stop measurement: "renderingTime"
  - Report metrics
Output:
  Console log:
  [
    {
      "name": "apiRequestTime",
      "description": "Time taken for an API request",
      "startTime": 1678886410000, // Example timestamp
      "endTime": 1678886411000,   // Example timestamp
      "duration": 1000
    },
    {
      "name": "renderingTime",
      "description": "Time taken for rendering",
      "startTime": 1678886415000, // Example timestamp
      "endTime": 1678886415250,   // Example timestamp
      "duration": 250
    }
  ]
Explanation: Multiple metrics are defined, measured, and reported.

Example 3: (Edge Case)

Input:
  - Start measurement: "nonExistentMetric"
Output:
  Console log: "Error: Metric 'nonExistentMetric' is not registered."
Explanation: Attempting to measure a non-existent metric results in an error message.
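One way this console output can arise, assuming the service throws on an unregistered metric (a design choice, not mandated by the statement): the caller catches the error and logs its message. The startMeasurement function below is a simplified stand-in for the real service method.

```typescript
// Stand-in for the service method: throws when the metric is unknown.
function startMeasurement(registered: Set<string>, name: string): void {
  if (!registered.has(name)) {
    throw new Error(`Metric '${name}' is not registered.`);
  }
  // ...record Date.now() for the metric here...
}

try {
  startMeasurement(new Set(['componentLoadTime']), 'nonExistentMetric');
} catch (e) {
  console.log(`Error: ${(e as Error).message}`);
  // → Error: Metric 'nonExistentMetric' is not registered.
}
```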

Constraints

  • The service must be written in TypeScript and compatible with Angular 14 or later.
  • Metric names must be strings and should be unique.
  • Timestamps should be represented as numbers (milliseconds since epoch).
  • The reportMetrics function should log the metrics to the console. No external HTTP requests are required for this challenge.
  • The duration of a metric should be calculated in milliseconds.

Notes

  • Consider using a Map to store the registered metrics for efficient lookup.
  • Think about how to handle concurrent measurements of the same metric.
  • The focus is on the core functionality of defining, measuring, and reporting metrics. UI elements or complex reporting dashboards are not required.
  • Error handling is important for robustness. Provide informative error messages when something goes wrong.
  • Ensure your code is well-documented and easy to understand.
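For the concurrent-measurement note above, one hypothetical approach is to key in-flight measurements by a caller-supplied id in addition to the metric name, so two overlapping measurements of the same metric do not overwrite each other. MeasurementTracker and its API are illustrative only, not part of the required service interface.

```typescript
// Tracks overlapping measurements of the same metric by (name, id) pairs.
class MeasurementTracker {
  private inFlight = new Map<string, number>(); // "name:id" -> startTime (ms)

  start(name: string, id: string): void {
    this.inFlight.set(`${name}:${id}`, Date.now());
  }

  // Returns the duration in ms and removes the in-flight entry.
  stop(name: string, id: string): number {
    const key = `${name}:${id}`;
    const startTime = this.inFlight.get(key);
    if (startTime === undefined) {
      throw new Error(`No active measurement for '${key}'.`);
    }
    this.inFlight.delete(key);
    return Date.now() - startTime;
  }
}
```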