Use results
Compare results
An efficient way to highlight performance variations between two sets of test results is to plot the statistics for the same item (page, request, Container) from both tests simultaneously.
This provides a visual comparison of the application behavior under different scenarios, or after the application has been modified (e.g. updated or optimized).
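As a minimal sketch of this comparison, the snippet below computes per-item deltas between two runs. The page names and response times are purely illustrative, not taken from any real test:

```python
# Hypothetical per-page average response times (ms) from two test runs.
# All names and figures are illustrative only.
baseline = {"home": 420.0, "login": 610.0, "search": 980.0}
optimized = {"home": 380.0, "login": 590.0, "search": 450.0}

def compare_runs(run_a, run_b):
    """Return the per-item delta (run_b - run_a) for items present in both runs."""
    return {item: run_b[item] - run_a[item]
            for item in run_a.keys() & run_b.keys()}

for page, delta in sorted(compare_runs(baseline, optimized).items()):
    print(f"{page}: {delta:+.0f} ms")
```

A negative delta indicates the item got faster in the second run; in practice the same comparison is done visually by overlaying the two curves.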
Use filters
An efficient way to pinpoint performance problems is to filter the test results. The aim is to limit the test statistics to the items (request, page, Container, Virtual User, Population, and so on) that exhibit the problems.
For example, the statistics may be narrowed down to a specified time period during the test run; they are then displayed as if the test had been carried out over that period only.
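The effect of such a time filter can be sketched as follows, assuming results are available as (timestamp, response time) samples; the data shown is invented for illustration:

```python
# Hypothetical sketch: restricting statistics to a time window.
# Samples are (elapsed_seconds, response_time_ms) pairs; values are illustrative.
samples = [(5, 120), (30, 140), (65, 900), (95, 880), (130, 150)]

def window_average(samples, start, end):
    """Average response time over samples whose timestamp falls in [start, end]."""
    window = [rt for t, rt in samples if start <= t <= end]
    return sum(window) / len(window) if window else None

print(window_average(samples, 0, 200))   # whole run: 438.0
print(window_average(samples, 60, 100))  # suspect period only: 890.0
```

Recomputing the average over the suspect 60-100 s window isolates the slowdown that the whole-run average dilutes.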
Interpret the advanced statistics
NeoLoad provides two advanced statistics:
- Standard deviation - measures how widely values vary around the average. A high standard deviation indicates that response times vary widely, whereas a low standard deviation is a sign that response times are consistent and even.
- Truncated mean - uses a subset of results with the extreme values removed, so that only the most representative values (those obtained by most end users) are taken into account. The truncated mean is obtained by discarding as many high values as low, or by discarding high values only. The percentage of results to be used and the position of the interval can be set in the Truncated Mean panel, accessed through Edit > Preferences > Global Preferences > Advanced.
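The two statistics above can be sketched in a few lines of Python. This is an illustration of the general technique, not NeoLoad's internal computation; the response times and the 10% discard fraction are assumptions:

```python
import statistics

# Illustrative response times (ms); the last value is an outlier.
times = [105, 110, 112, 115, 118, 120, 122, 125, 130, 2400]

mean = statistics.mean(times)       # 345.7 - pulled up by the outlier
stdev = statistics.pstdev(times)    # large: response times vary widely

def truncated_mean(values, discard_fraction=0.1, symmetric=True):
    """Mean after discarding a fraction of extreme values.
    symmetric=True drops the same share of low and high values;
    symmetric=False drops high values only."""
    ordered = sorted(values)
    k = int(len(ordered) * discard_fraction)
    if k == 0:
        return statistics.mean(ordered)
    kept = ordered[k:len(ordered) - k] if symmetric else ordered[:len(ordered) - k]
    return statistics.mean(kept)

print(round(mean, 1), round(truncated_mean(times), 1))  # 345.7 119.0
```

Discarding one value at each extreme brings the mean from 345.7 ms down to 119.0 ms, which better reflects what most users experienced.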
Correlate statistics and monitors
The statistics that show anomalies, such as a significant rise in response times or the occurrence of errors, can be correlated with the variations in the readings obtained by certain performance Monitors.
These correlations usually provide an explanation for the performance slowdown and give a clue to the root cause of the problem, whether it is simply a server setting or the overload of one of the main resources (memory, for example).
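One simple way to quantify such a correlation is the Pearson coefficient between a statistic and a monitor reading over the same intervals. This sketch uses invented figures (response times against a hypothetical memory-usage monitor):

```python
import math

# Hypothetical per-interval series: response times (ms) and a
# monitor reading (server memory usage, %). Figures are illustrative.
response_ms = [120, 135, 160, 420, 650, 900]
memory_pct = [40, 45, 55, 78, 88, 95]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(round(pearson(response_ms, memory_pct), 2))
```

A coefficient close to 1 suggests the rise in response times tracks the monitor's readings, pointing at that resource as a likely culprit; in the tool this judgment is normally made by eye from the overlaid curves.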
Manage test results
After multiple test runs, the volume of data can become difficult to manage. It is therefore important to add a short description to each test before running it. This description is included in each test summary and in the reports generated.
The Results Manager allows the user to delete the results of a previous test session or use them to generate a report (XML, HTML, or PDF).