Citrix Performance report
The Citrix performance report summarizes the validity of the run, presents the data most significant to the run, shows the response trend of the ten slowest windows in the test, indicates server health based on synchronization results and errors, and graphs the response trend of each window over a specified interval.
Citrix Overall page
The Overall page provides the following information:
- A progress indicator that shows the state of the run.
- The bar chart on the left indicates the overall success of the run with the percentage of window synchronization successes and timeouts. Synchronization success indicates that the expected window events in the test match the actual window events in the test run.
- The bar chart on the right indicates the overall success of the run with the percentage of image synchronization successes and timeouts. Synchronization success indicates that the expected image area or extracted text in the test matches the actual image area or extracted text in the test run.
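The success and timeout percentages in the two bar charts follow directly from the attempt and timeout counts. A minimal sketch of that calculation (the function name is illustrative, not part of the product):

```python
def synchronization_summary(attempts, timeouts):
    """Return (success_count, success_percent, timeout_percent) for a
    set of synchronization attempts, as shown in the Overall bar charts."""
    if attempts == 0:
        return 0, 0.0, 0.0
    successes = attempts - timeouts
    return (successes,
            100.0 * successes / attempts,
            100.0 * timeouts / attempts)

# For example, 3 timeouts out of 50 window synchronization attempts:
count, ok_pct, timeout_pct = synchronization_summary(50, 3)
print(count, ok_pct, timeout_pct)  # 47 94.0 6.0
```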
Performance Summary page
The Summary page summarizes the most important data about the test run, so that you can analyze the final or intermediate results of a test at a glance. The Run Summary table displays the following information:
- The number of virtual users that are active and the number of virtual users that have completed testing. This number is updated during the run.
- The elapsed time. This is the total duration of the run, which is displayed in hours, minutes, and seconds.
- The location and name of the test.
- The computer for which results are displayed (All Hosts by default). To see summary results for an individual computer, click the computer name in the Performance Test Runs view.
- The status of the run. This can be Initializing Computers, Adding Users, Running, Performing Execution History Data Transfer, Stopped, or Complete.
- The total number of virtual users emulated during the test.
The Citrix Summary section displays the following information:
- The average response time for all response time measurements. Response times are determined by measurements that are located under the tests. Response time measurements can be automatically generated between the last input action before a window create event and the window create event. The table does not display values that equal zero.
- The total number of image synchronization attempts.
- The total number of image synchronization successes.
- The total number of image synchronization timeouts. A timeout occurs when the synchronization fails.
- The total number of window synchronization attempts.
- The total number of window synchronization successes.
- The total number of window synchronization timeouts. A timeout occurs when the synchronization fails.
- The maximum response time for all response time measurements. This indicates the highest response time that was measured during the run.
- The minimum response time for all response time measurements. This indicates the lowest response time that was measured during the run.
- The standard deviation response time for all response time measurements.
- Total user actions for run. This indicates the total number of user input actions that were emulated during the run.
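The average, minimum, maximum, and standard deviation values above are standard aggregates over the set of response time measurements. A minimal sketch of how such values could be computed, with zero values omitted as they are in the table (the function name and sample data are illustrative):

```python
import statistics

def response_time_summary(times_ms):
    """Aggregate response time measurements the way the Citrix Summary
    section reports them; values that equal zero are not displayed."""
    samples = [t for t in times_ms if t > 0]
    if not samples:
        return {}
    return {
        "average": statistics.mean(samples),
        "minimum": min(samples),
        "maximum": max(samples),
        # population standard deviation of the measurements
        "std_dev": statistics.pstdev(samples),
    }

summary = response_time_summary([120, 0, 340, 200, 180])
print(summary)
```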
Response Time Results page
The Response Time Results page shows the average response of the window events in the test as the test progresses. With this information, you can evaluate system response during and after the test. Response times are determined by measurements that are located under the tests. Response time measurements can be automatically generated between the last input action before a window create event and the window create event.
The bar chart shows the average response time of each window event. Each bar represents a window that was created during the test. As you run the test, the bar chart changes, because the window response times are updated dynamically during the run. The table under the bar chart provides the following additional information for each window:
- The minimum response time during the run.
- The average response time during the run. This matches the information in the chart.
- The maximum response time during the run.
- The standard deviation response time during the run.
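An automatically generated response time measurement pairs each window create event with the most recent input action that preceded it. A sketch of that pairing over a recorded event stream (the event tuples and names are illustrative, not the product's internal format):

```python
def window_response_times(events):
    """Given (timestamp_ms, kind, name) events in chronological order,
    measure each window create event against the last input action
    that preceded it, grouped by window name."""
    times = {}
    last_input = None
    for ts, kind, name in events:
        if kind == "input":
            last_input = ts
        elif kind == "window_create" and last_input is not None:
            times.setdefault(name, []).append(ts - last_input)
    return times

events = [
    (0, "input", "click Start"),
    (250, "window_create", "Start Menu"),
    (1000, "input", "launch Notepad"),
    (1600, "window_create", "Notepad"),
]
print(window_response_times(events))
# {'Start Menu': [250], 'Notepad': [600]}
```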
Response vs. Time Summary page
The Response vs. Time Summary page shows the average response trend as graphed for a specified interval. You set the Statistics sample interval value as a schedule property. Response times are determined by measurements that are located under the tests. Response time measurements can be automatically generated between the last input action before a window create event and the window create event.
The line graph shows the average response time for all measurements during the run. Each point on the graph is an average of what has occurred during that interval. The table under the graph lists one number: the total average response time for all measurements in the run.
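Each point on the graph is therefore an average over one statistics sample interval. A minimal sketch of that bucketing (the interval length and timestamps are illustrative):

```python
from collections import defaultdict

def response_vs_time(measurements, interval_ms):
    """Bucket (timestamp_ms, response_ms) pairs into sample intervals
    and average each bucket: one point per interval on the line graph."""
    buckets = defaultdict(list)
    for ts, response in measurements:
        buckets[ts // interval_ms].append(response)
    return {k * interval_ms: sum(v) / len(v)
            for k, v in sorted(buckets.items())}

points = response_vs_time(
    [(100, 200), (4500, 400), (6000, 300)], interval_ms=5000)
print(points)  # {0: 300.0, 5000: 300.0}
```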
Response vs. Time Details page
The Response vs. Time Details page shows the response trend as graphed for a specified interval. You set the Statistics sample interval value as a schedule property. Response times are determined by measurements that are located under the tests. Response time measurements can be automatically generated between the last input action before a window create event and the window create event.
The line graph shows the average response time of each measurement for a specified interval. Each measurement is represented by a separate line. The table under the graph provides the following additional information for each response time measurement:
- The minimum response time during the run.
- The average window response time during the run. This is similar to the graph, but the information in the table includes the entire run.
- The maximum window response time during the run.
- The standard deviation window response time during the run.
User Action Throughput page
The User Action Throughput page provides an overview of the rate of user input actions per interval. You set the Statistics sample interval value as a schedule property.
- The line graph on the left shows the user action rate per interval for all windows. This represents the activity of virtual user input actions per second for each interval. The table under the graph lists the user action rate per second for the entire run, and the total number of user actions for the run.
- The line graph on the right shows active users and users that have completed testing, over the course of a run. The summary table under the graph lists the results for the most recent sample interval.
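The user action rate is a count of emulated input actions divided by elapsed time, both per interval and for the run as a whole. A sketch with hypothetical per-interval counts:

```python
def action_rates(actions_per_interval, interval_seconds):
    """Compute the per-interval user action rate (actions/second), the
    overall rate for the run, and the total number of user actions."""
    rates = [n / interval_seconds for n in actions_per_interval]
    total = sum(actions_per_interval)
    overall = total / (interval_seconds * len(actions_per_interval))
    return rates, overall, total

rates, overall, total = action_rates([30, 50, 40], interval_seconds=10)
print(rates, overall, total)  # [3.0, 5.0, 4.0] 4.0 120
```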
Server Health Summary page
The Server Health Summary page provides an overall indication of how well the server has performed. The graph does not display values that equal zero. The bar chart shows the following information:
- The total number of window synchronization attempts.
- The total number of window synchronization successes.
- The total number of window synchronization timeouts.
- The total number of image synchronization attempts.
- The total number of image synchronization successes.
- The total number of image synchronization timeouts.
Server Timeout page
The Server Timeout page shows when the synchronization timeouts and server errors occurred during the run. The graph does not display values that equal zero. The line graph shows the following information:
- Citrix window synchronization timeouts.
- Citrix image synchronization timeouts.
- Citrix server errors or errors encountered during test execution.
Resources page
The Resources page shows all resource counters monitored during the schedule run.
- The line chart shows the values of the resource counters monitored during the schedule run. The chart scales automatically to accommodate the highest resource counter value.
- The summary table under the chart lists the average values of the resource counters monitored during the schedule run. This table is organized by resource monitoring hosts.
Related reference
Citrix Verification Points report