
Dashboard

The Dashboard is your go-to place for a quick overview of your test suite’s performance. It provides key metrics on test executions, failures, and trends, helping you spot issues and track progress. You can view the data over one of three time frames: 1 day, 7 days, or 30 days.

Let’s break down the important sections of the dashboard:


1. Summary Section

At the top, you have a quick summary of the overall test performance:


  • Test Summary: Shows the total number of tests run (8513 tests in this case).

  • Failure Rate: Indicates the percentage of tests that failed (47.87% failure rate).
  • Failure Count: The total number of failed tests (4075 failures).
  • Stability: Shows how consistent test results are over time, measured as the percentage of tests that passed during the selected period (currently 48.44%).
  • Flakiness: The percentage of test cases whose flaky rate is 30% or higher (2.03%).
  • Platforms Covered: Icons indicate the platforms used for testing (e.g., Windows, iOS, etc.).

This section gives a high-level snapshot of the key metrics, helping you quickly assess the health of your test suite.
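To see how these percentages relate to the raw counts, here is a minimal sketch in plain Python (not TestReport.io’s own code); the passed-test count is inferred from the stability figure shown above.

```python
# Minimal sketch: how the summary percentages follow from the raw counts.
# Formulas are based on the definitions above, not on TestReport.io's source code.

total_tests = 8513    # Test Summary
failed_tests = 4075   # Failure Count

# Failure Rate: share of all tests that failed.
failure_rate = failed_tests / total_tests * 100
print(f"Failure rate: {failure_rate:.2f}%")   # ~47.87%

# Stability: share of all tests that passed in the selected period.
# The passed count is inferred here from the 48.44% shown on the dashboard.
passed_tests = round(total_tests * 0.4844)    # ~4124, assumed for illustration
stability = passed_tests / total_tests * 100
print(f"Stability: {stability:.2f}%")         # ~48.44%
```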


2. Comparison Graph

This graph allows you to compare test results across different reports:


  • You can select different test reports from the dropdown and compare the results visually.

  • In this case, the bar graph shows test executions for a report named High Level.

3. Build Performance

This section shows the performance of builds over time:


  • Yellow bars: Represent the number of test executions.

  • Green line: Represents the average duration of the tests in minutes.

This chart helps you analyze how many tests are being executed over time and how long they are taking.


4. Build Summary

The Build Summary section displays recent builds and their statuses:


  • Red bars: Failed tests.

  • Green bars: Passed tests.
  • Yellow bars: Skipped tests.
  • Grey bars: Ignored tests.

This gives you a quick glance at the success rate of recent builds.


5. Stability Graph

The Stability Graph shows the overall performance of the tests over time:


  • Blue line: Represents overall test executions.

  • Green line: Passed tests.
  • Red line: Failed tests.

By analyzing this graph, you can see trends in test performance, such as whether the number of passed or failed tests is increasing or decreasing over time.


6. Flakiness Graph

This chart tracks Flakiness, which shows the percentage of tests that behave inconsistently:


  • Average Flakiness: The average flakiness rate is 28.84%, meaning the affected tests may pass sometimes and fail other times.

Monitoring this helps identify unreliable tests that need further investigation.
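The article does not spell out how the flakiness percentage for a single test is calculated. One common approach is to look at how often a test’s outcome flips between consecutive runs; the sketch below illustrates that idea with made-up data, and both the formula and the sample history are assumptions rather than TestReport.io’s documented calculation.

```python
# Illustrative only: one possible way to estimate a per-test flakiness rate.
# This is not TestReport.io's documented formula.

def flakiness(outcomes):
    """Percentage of consecutive runs where the outcome flipped (pass <-> fail)."""
    if len(outcomes) < 2:
        return 0.0
    flips = sum(1 for a, b in zip(outcomes, outcomes[1:]) if a != b)
    return flips / (len(outcomes) - 1) * 100

# Hypothetical run history for one test ("P" = pass, "F" = fail)
runs = ["P", "F", "P", "P", "F", "P", "P", "P"]
print(f"Flakiness: {flakiness(runs):.2f}%")   # 4 flips over 7 transitions ≈ 57.14%
```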


7. Tests by Status

This area provides a visual breakdown of test statuses:


  • Green: Passed tests.

  • Red: Failed tests.
  • Yellow: Skipped tests.

The graph tracks these statuses over time, giving you an idea of how stable your tests have been recently.


8. Failure Categories

This section categorizes the failures into different types:


  • To Be Investigated: Failures that require further investigation.

  • Automation Bug: Failures due to issues in the automation scripts.
  • Environment Issue: Problems caused by the test environment.
  • Product Bug: Failures that are actual bugs in the product.
  • No Defect: Failures that were reviewed and found not to be actual defects.

By categorizing failures, you can prioritize which issues to address first.


9. Top Unstable Tests

This section highlights the most Unstable Tests in your suite:


  • Each test is listed along with its Project, Failure Count, and Failure Rate.
  • The top test here is "Verifying Cross Dataset with Auto Extension WorkFlow 3", which has failed 40 times with a 100% failure rate.


This helps you identify which tests are causing the most instability and need immediate attention.
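If you want to reproduce this kind of ranking from your own raw results, a rough sketch could look like the following; the second test and every run count other than the 40 failures mentioned above are hypothetical.

```python
# Hypothetical sketch: ranking tests by failure count, similar to the
# Top Unstable Tests table. The data below is made up for illustration.

test_runs = {
    "Verifying Cross Dataset with Auto Extension WorkFlow 3": {"failed": 40, "total": 40},
    "Some Other Test": {"failed": 12, "total": 60},
}

ranked = sorted(test_runs.items(), key=lambda item: item[1]["failed"], reverse=True)

for name, counts in ranked:
    rate = counts["failed"] / counts["total"] * 100
    print(f"{name}: {counts['failed']} failures, {rate:.0f}% failure rate")
# The first test fails 40 out of 40 runs, i.e. a 100% failure rate.
```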

If you click on any of these tests, you will be taken to its Build Run. Check it out here.
