UI reference for performance testing


1. HTTP preferences
1.1. HTTP protocol data view preferences
1.2. HTTP recorder preferences
1.3. HTTP test editor preferences
1.4. HTTP test generation preferences
2. SAP test preferences
2.1. SAP test editor preferences
2.2. SAP recording preferences
2.3. SAP test generation preferences
3. Citrix test preferences
3.1. Citrix recorder preferences
3.2. Citrix test editor preferences
3.3. Citrix test generation preferences
4. Generic Service Client preferences
4.1. Service test editor preferences
4.2. Service message edition preferences
4.3. Service test generation preferences
4.4. Raw transaction data view preferences
4.5. Auto values preferences
4.6. Response time breakdown preferences
4.7. Cookies support preferences
4.8. WSDL Information Preferences
5. Report preferences
5.1. Test report preferences
5.2. Default report preferences
5.3. Export report preferences
5.4. Web report preferences
5.5. Percentile analysis preferences
6. Test editor reference
6.1. HTTP test editor reference
6.1.1. HTTP test details
6.1.2. HTTP page details
6.1.3. HTTP request details
6.1.4. HTTP response data details
6.1.5. HTTP server access configuration details
6.2. SAP test editor reference
6.2.1. SAP test details
6.2.2. SAP connection details
6.2.3. SAP screen details
6.2.4. SAP set details
6.2.5. SAP get details
6.2.6. SAP call details
6.2.7. SAP server request details
6.2.8. SAP batch connection details
6.2.9. SAP batch input transaction details
6.3. Citrix test editor reference
6.3.1. Citrix test details
6.3.2. Citrix session details
6.3.3. Citrix window details
6.3.4. Citrix window event details
6.3.5. Citrix key action details
6.3.6. Citrix mouse action details
6.3.7. Citrix text input details
6.3.8. Citrix mouse sequence details
6.3.9. Citrix screen capture details
6.3.10. Citrix image synchronization details
6.3.11. Citrix logoff details
6.4. Service test editor reference
6.4.1. Service test details
6.4.2. Service call details
6.4.3. XML call details
6.4.4. Binary call details
6.4.5. Text call details
6.4.6. Service message return details
6.4.7. Service verification point details
6.4.8. Service callback details
6.4.9. Service timeout details
6.4.10. Service parallel details
6.4.11. Service receive details
6.5. Service stub editor reference
6.5.1. Stub operation details
6.5.2. Stub case details
6.5.3. Stub response details
6.6. Socket and TN3270 test editor reference
6.6.1. Socket test details
6.6.2. Socket connection details
6.6.3. Socket close details
6.6.4. Socket-secure upgrade details
6.6.5. Socket send details
6.6.6. Socket receive details
6.6.7. Terminal screen details
6.6.8. Terminal input details
6.6.9. Socket content verification point details
6.6.10. Socket size verification point details
6.6.11. Socket custom verification point details
6.6.12. Terminal content verification point details
7. Schedule editor reference
7.1. Schedule properties
7.2. User group properties
8. Citrix monitoring panel reference


1. HTTP preferences

You can control Performance Tester behavior by changing these HTTP-related settings.


1.1. HTTP protocol data view preferences

Preference settings control how protocol data is displayed when tests run.

To access the preference settings for the HTTP protocol data view, click Window > Preferences > Test > HTTP Protocol Data View.

Render binary response data
Typically, you leave this box unchecked, because the data is generally unreadable and can cause temporary high processor usage when converted into text. If enabled, the Response and Browser pages of the Protocol Data view display unrecognized binary data.
Replay delay
During test debugging, when you replay one virtual user after the run is completed, specify the number of seconds that the Protocol Data view pauses between showing each page.
Enable real-time protocol data support for HTTP test
Typically, you leave this box checked and select whether you want to display the Browser tab or the Event Log tab by default; you can switch between these pages during playback.
Show the following page when launching HTTP test
Specifies which page is displayed when an HTTP test runs.

  • Browser: Click to view rendered HTTP pages during playback, thus verifying that a test is behaving as expected. Because the protocol data is used, the Browser page might not render the contents exactly as a web browser would.
  • Event Log: Click to see a line of summary information for each defined page of the currently running test. This summary includes a count of verdicts that did not pass, unexpected response codes, and other items of interest. Click an event to drill down for more detailed information.
Highlight Substitutions in Protocol Data View
This option visibly highlights substituted data in the Request, Response Headers, and Response Content pages of the Protocol Data view when viewing test log or test editor elements that use data correlation.


1.2. HTTP recorder preferences

Preference settings control the behavior of the recording wizard.

To access the preference settings, click Window > Preferences > Test > Recording > HTTP Recording. After changing a setting, click Apply.

Enable the RPT toolbar in browsers
Click to install the annotation toolbar. This enables you to add comments and transactions, and to change page names during recording.
Verify annotation toolbar is installed before recording
Click to verify that the annotation toolbar is installed in the web browser before recording.


1.3. HTTP test editor preferences

The preference settings on the HTTP page of the test editor control how URLs are displayed in a test and how content verification occurs.

To access the preference settings for HTTP test editor, click Window > Preferences > Test > Test Editor > HTTP Test.

You can set the following preferences for the HTTP test editor:

Display decoded URLs whenever possible
Select to decode any encoded element in a URL. Decoding improves readability.
Hide HTTP request/response content larger than (kB)
Select to hide data larger than a specific size. The Content area in a response indicates the size of the hidden data and whether it is binary. To display hidden data, press Ctrl+Shift+Spacebar.
Show in all requests
Select to display the host and port information in every request in the Test Contents area of the test editor. A test often contains many server connections. When you clear this preference, it is easier to read a test.
Show on primary requests only
Select to display the host and port information in only the primary request for each HTTP page in the Test Contents area of the test editor.
Show when different from primary request
Select to display the host and port information in the Test Contents area of the test editor for requests that use a different connection than the primary request.
Skip responses with binary contents
Select to skip binary response data when you enable content verification points in a test. Content verification points verify whether specified strings are present in response data.
Create only in primary responses
Select to limit the creation of content verification points to primary responses when you enable content verification points in a test.


1.4. HTTP test generation preferences

Preference settings control how performance tests are generated, such as how tests will process verification points, data correlation, and generic protocols.


Test generation options

To access the preference settings for test generation options, click Window > Preferences > Test > Test Generation > HTTP Test Generation, and click the Test Generation Options tab.

Do not generate a new page if think time is less than
Enter the shortest time, in milliseconds, that the generator uses as a delay to emulate user think time for an HTTP page. If your tests contain fewer pages than expected, try a shorter interval.
Generate a new page if delay between requests is greater than
Enter the longest delay, in milliseconds, that the generator allows between page requests. If this time is exceeded, a new page is generated. If your tests contain more pages than expected, try a longer interval.
Maximum request delay
Enter the longest delay, in milliseconds, that the generator allows before truncating HTTP requests. The requests are truncated in the generated test. The recorded test still contains the original values, and you can restore them by generating a new test.
Save only the first 4KB of responses larger than
Enter the limit of response data, in KB, that the generator saves. If a response is larger than the specified limit, only the first 4 KB of data is saved.
Suppress NSLookup() and use numeric IPs
Select this option to shorten test generation time. The disadvantage is that numeric IP addresses in a test are less readable than host names (for example, www.example.com).
Disable Page Cache Emulation during test generation
Disable page cache emulation. When page cache emulation is enabled, caching information in server response headers is honored. Additionally, requests are not submitted to the server for content that is confirmed by the client as fresh in the local cache. Page cache emulation is enabled by default.
Use Legacy Test Generator
Select this option if you have been instructed to use the legacy HTTP test generator.
Automatically include verification point of
Click to specify the types of verification points to be automatically included. If a check box for a verification point is selected, the code and edit controls for this type of verification point are generated in all tests. Verification points can also be enabled or disabled within specific tests.
Relaxed
Response codes that are in the same category (for example, 200, 201, 203, 209) are considered equivalent. An error is reported if the response code is not in the same category.
Exact
An error is reported if the response code does not match the recorded value exactly.
Accept sizes for primary request within
If you are automatically generating response size verification points, click to specify the acceptable size range for primary requests. No error is reported if a response is within the specified percentage above or below the expected size. By default, for primary requests, HTTP response size verification points use range matching.
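The relaxed and exact response code checks, and the response size range matching, can be sketched as follows. This is an illustrative sketch only, not product code; the function names and the ±5% default are assumptions.

```python
def response_code_matches(recorded: int, actual: int, relaxed: bool = True) -> bool:
    """Relaxed: codes in the same category (same hundreds digit, e.g. 200
    and 204) are equivalent. Exact: the codes must match exactly."""
    if relaxed:
        return recorded // 100 == actual // 100
    return recorded == actual

def response_size_ok(expected: int, actual: int, percent: float = 5.0) -> bool:
    """Range matching: no error is reported if the response size is within
    the specified percentage above or below the expected size."""
    tolerance = expected * percent / 100.0
    return abs(actual - expected) <= tolerance
```

For example, with relaxed matching a recorded 200 accepts an actual 204 but not a 302, and with a 5% range an expected size of 1000 bytes accepts any actual size between 950 and 1050 bytes.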


Data correlation

To access the preference settings for data correlation, click Window > Preferences > Test > Test Generation > HTTP Test Generation, and click the Data Correlation tab.

Automatically correlate host and port data
By default, host and port data is correlated automatically. If tests in a previous release have significant manual correlations, or you are using proxies, the migration of the replace-host functionality is likely to fail during playback. In this situation, clear the check box. When you reopen your tests, they will not have the automatic correlation feature in them.
Automatically correlate URL pathname if redirected by response
Specifies whether URL path names are correlated if they are redirected by a selected response code. If a check box for a response code is selected, the test generator performs correlations for that response code. This option applies only to responses that are redirects, with a status code between 300 and 399.
Automatically correlate Referers
By default, the Referer field in an HTTP request header is correlated automatically. Clear the check box if you plan to correlate Referers manually. If you run tests against servers that do not require a Referer field, clearing this check box reduces the number of correlations performed when the test runs, and can increase user throughput.
Enable all other data correlation
By default, request and response data is correlated automatically. Clear the check box to disable automatic data correlation of request and response data. Consider clearing the check box if you create your own data correlation rules in the rules editor.
Optimize automatic data correlation for execution
Specifies whether automatic data correlation is optimized for accuracy or for efficiency.

  • With the Accuracy setting (the default), many references with an identical session ID value are created and the value of each session ID is substituted from the nearest previous reference.
  • To make a test run faster by reducing the number of references that are created during automatic data correlation, change the optimization to Efficiency. For example, consider a test where a session ID, which is assigned when a user logs in, is included in every subsequent request in the test. With the Efficiency setting, all session IDs are substituted from a single previous reference. The downside of this setting is that it can result in incorrect correlations. For example, a request containing the Joe Smith string might be incorrectly correlated with a request containing the Joe Brown string.
URL rewriting for execution
Specifies how web addresses (URLs) are rewritten during test execution. When correlating data, the test generator replaces part of a URL request string with a value that the server returned in response to a previous request.

  • Automatic (default): The test generator automatically determines when rewriting the entire URL during substitution will facilitate test execution.
  • On: Select to rewrite URLs in every instance of data correlation. This produces larger tests that take longer to run. Try this setting if your tests fail unexpectedly.
  • Off: Select to manually correlate the instances where URL rewriting is needed. This setting might cause execution errors.

Note: To turn data correlation off entirely or to set whether names are automatically generated for data correlation references, click Window > Preferences > Test > Test Generation > HTTP Test Generation, and click the Data Correlation tab.
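The basic idea behind data correlation is that a value the server returned in one response (a reference) is substituted into later requests in place of the recorded value. A minimal sketch follows; the regex and the JSESSIONID field name are illustrative assumptions, not product internals.

```python
import re

def correlate(response_body: str, later_request: str) -> str:
    """Capture a session ID from a response and substitute it into a
    later request, replacing the value recorded at test creation."""
    match = re.search(r"JSESSIONID=(\w+)", response_body)
    if not match:
        return later_request  # nothing to correlate
    reference = match.group(1)
    return re.sub(r"JSESSIONID=\w+", f"JSESSIONID={reference}", later_request)
```

With the Accuracy setting, each substitution draws from the nearest previous reference; with Efficiency, all substitutions draw from a single reference, which is faster but can correlate the wrong values.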


Data correlation types

To access the preference settings for types of data correlation, click Window > Preferences > Test > Test Generation > HTTP Test Generation, and click the Data Correlation Types tab.

Data Correlation Types
Specify when to generate data correlation constructs. With the Automatic setting, the test generator creates the required constructs where needed. If the test does not contain the required constructs, change the setting to On, which will always perform data correlation. If tests do not require a specific construct, select Off, which has the additional benefit of improving performance on subsequent test generation.
For Jazz Foundation Services, On and Automatic enable data correlation for Jazz applications that use REST storage or query APIs from Jazz Foundation Services. An example of such an application is Rational Requirements Composer. Although data correlation does not typically apply to browser-based Jazz web clients, it may be useful for other HTTP client-server applications that use REST services and the Atom Publishing Protocol for updating web resources.
For Jazz Web Applications, On and Automatic enable data correlation for Jazz web applications that use the Jazz Foundation web UI framework. Examples of these web applications are the web interfaces for Rational Quality Manager and Rational Team Concert. Data correlation can also be useful for other web applications that contain JavaScript that employs JSON for client-server data exchange. This is a common practice with Dojo- and AJAX-based applications.


2. SAP test preferences


2.1. SAP test editor preferences

The SAP test editor preferences control the specific behavior of the test editor with SAP test suites.

To access the SAP test editor preferences, click Window > Preferences, expand Test, expand Test Editor, and click SAP Test Editor. After changing a setting, click Apply.

SAP Protocol Data View
These settings specify how the SAP Protocol Data view is displayed.
SAP GUI object highlight color
This setting specifies the color of the frame that highlights selected objects on the SAP GUI Screen page of the SAP Protocol Data view. By default, the highlight color is red.
Automatically set focus on SAP Protocol Data view
When enabled, this option automatically ensures that the SAP Protocol Data view is displayed each time an element is selected in the test editor. Disable this option if you want to hide the SAP Protocol Data view or remove it from the Performance Test perspective. This option is enabled by default.


2.2. SAP recording preferences

Test recorder preferences control the default settings for recording SAP tests.

To access the SAP Test Recorder preferences, click Window > Preferences, expand Test, expand Recording, and click SAP Recording. After changing a setting, click Apply.

Screen capture options
These settings specify how the test recorder handles the screen captures that are shown in the SAP Protocol Data view.
None
No screen captures are recorded. This saves disk space, but disables the ability to create events or verification points from the SAP Protocol Data view.
On SAP screen entry
Screen captures are recorded each time a new screen is displayed in the SAP GUI. The recorded screen capture shows the initial state of the screen, before user input. This option is enabled by default.
On SAP screen exit
Screen captures are recorded each time a request is sent to the SAP R/3 server. The recorded screen capture shows the final state of the screen, after user input.
Both
Screen captures are recorded when a new screen is displayed in the SAP GUI and when the request is sent to the SAP R/3 server. The SAP Protocol Data view displays the final state on the send request elements and the initial state on all other events.
Select a SAPLOGON configuration file
The saplogon.ini configuration file provides a list of SAP system names displayed in the SAP recorder wizard. Use this setting to change the location of the saplogon.ini file.


2.3. SAP test generation preferences

Test generation preferences control how SAP tests are generated, such as how tests will process verification points, data correlation, and the default settings for generated test elements.

To access the SAP Test Generation preferences, click Window > Preferences, expand Test, expand Test Generation, and click SAP Test Generation. After changing a setting, click Apply.

Automatic Generation
These settings specify test elements that are automatically generated after recording the test.
Use connection by string
When enabled, tests are generated with the connection by string launch method instead of using the SAP Logon program. This option is enabled by default.
Verification points for SAP screen titles
When enabled, this option generates verification points on screen titles with each SAP screen. This option is disabled by default.
Verification points for SAP request response time threshold
When enabled, this option generates verification points on the response time of the SAP R/3 server. If the server response time is above the specified threshold, the test produces a failed verification point. This option is disabled by default.
Calculate threshold from recorded (%)
This specifies the default response time threshold that is calculated when response time verification points are generated. The threshold value is calculated as a percentage of the actual response time that was measured during the recording. By default, the response time threshold is generated with a value of 120% of the recorded response time.
GUI on execution
During test execution, it might not be desirable to display the SAP GUI. Hiding the SAP GUI improves the performance of the virtual users. This setting specifies the default behavior when the test is generated. However, you can change this setting in the test editor by selecting the SAP test element.
Hide GUI during execution
When selected, all instances of the SAP GUI are hidden. In some cases, modal dialog boxes from the SAP GUI can flash briefly on the screen. This is the default setting.

Note: If you run a test in hidden mode and the test fails because of modal dialog boxes or pop-up windows in transactions, you must add the RPT_VMARGS property, with the value set to rptSapForceShowNone=true, in the Location property.

Show GUI for only one virtual user
When selected, the SAP GUI is displayed only for the first virtual user. All other instances of the SAP GUI are hidden. This allows you to monitor the execution.
Show GUI for all virtual users
When selected, the SAP GUI is displayed for all virtual users.
Password prompt
Specifies behavior of the password request.
Prompt me for password when generating test
When enabled, you are prompted for the password at the end of the recording session. The recorder cannot capture the password during recording; therefore, if this option is disabled, the test uses an empty string for the password.
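The response time threshold calculation described under Calculate threshold from recorded (%) is simple percentage arithmetic, sketched below for illustration only; the function name is an assumption.

```python
def response_time_threshold(recorded_ms: float, percent: float = 120.0) -> float:
    """Threshold is a percentage of the response time measured during
    recording. The verification point fails if the SAP R/3 server
    response time exceeds this threshold."""
    return recorded_ms * percent / 100.0
```

For example, a transaction recorded at 500 ms generates a verification point with a 600 ms threshold at the default 120% setting.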


3. Citrix test preferences


3.1. Citrix recorder preferences

Citrix recorder preferences control the behavior of the recording wizard.

To access the Citrix Recorder preferences, click Window > Preferences, expand Test, and click Citrix Recording. After changing a setting, click Apply.

Screen capture options
These settings specify how the test recorder performs screen captures of the Citrix desktop during recording.
No automatic screen capture
Select this option if you do not want the test recorder to record screen captures automatically. When this option is selected, you can still record screen captures manually. This option is selected by default.
Capture screen every
Automatically record a periodic screen capture and specify the time between captures.
Capture screen on window creation
Select this option to record a screen capture each time a window object is created in Citrix.
Exclude tooltips
When Capture screen on window creation is selected, enable this option to prevent creating a screen capture each time a tooltip event is displayed during the recording. If this option is disabled, screen captures are recorded when tooltips are displayed.
Capture screen on image synchronization
Select this option to ensure that a screen capture is recorded each time an image synchronization is recorded.


3.2. Citrix test editor preferences

Citrix test editor preferences control the test editor for Citrix performance tests.

To access the Citrix Test Editor preferences, click Window > Preferences, expand Test, expand Test Editor, and click Citrix Test Editor. After changing settings, click Apply.

Image Previews
These settings specify how screen captures are displayed in the test editor.
Fit screen to visible area
Automatically fit screen captures to the available area in the test editor. If disabled, the screen capture will be the actual size, which might require scrolling. This option is enabled by default.
Draw only last window
Select this option if you want to display only the current window in mouse sequence actions. When disabled, all recorded windows are displayed. This option is disabled by default.
Mouse Sequence
These settings specify how mouse sequences are displayed in the test editor.
Display mouse sequences for
This option specifies whether to display only the current mouse sequence, the previous and current sequences, or all mouse sequences in the test editor.
Current sequence
Only the current mouse sequence is displayed in the test editor. This option is selected by default.
Previous and current sequences
The current mouse sequence is displayed with any previous mouse sequences of the current window.
All sequences
All mouse sequences of the current window are displayed simultaneously.
Current mouse sequence color
This option specifies the color of the currently selected mouse sequence.
Current mouse sequence bold
Select this option if you want to display the current mouse sequence in bold. This option is selected by default.
Mouse move sequence color
This option specifies the color of mouse-move sequences when previous or all sequences are displayed.
Mouse drag sequence color
This option specifies the color of mouse-drag sequences when previous or all sequences are displayed.
Window color (when screen capture is not available)
This option specifies the color of a rectangle that represents the current window if there is no screen capture.


3.3. Citrix test generation preferences

Test generation preferences control how Citrix performance tests are generated, such as how tests will process verification points, data correlation, and options of the generated test elements.

To access the Citrix Test Generation preferences, click Window > Preferences, expand Test, expand Test Generation, and click Citrix Test Generation. After changing a setting, click Apply.

Recording Optimization Options
These settings specify how mouse and window events are interpreted in the generated test.
Window activate recording
Specify whether to record none, only the last, or all window-activate actions when a sequence of similar actions is detected.

  • none disables recording of window-activate events.
  • last records only the last of an uninterrupted sequence of window events. This eliminates redundant window-activate actions from the recording.
  • all records all events of the sequence.
Mouse move recording
This setting specifies which mouse move events are recorded. Relevant is the default setting.

  • All records an uninterrupted sequence of mouse movements in the generated test.
  • Relevant records only the mouse movements that generate a response, such as hover text.
  • First and last records a simplified mouse-move action.
Automatic Generation
These settings specify test elements that are automatically generated after recording the test.
Verification point on every window title change
When enabled, this option generates a window title verification point whenever the caption changes. If this option is disabled, the window title is verified only when a new window is created. This option is disabled by default.
Response times for main windows
When enabled, this option generates response time measurements for all recorded main window-create events. A main window is a window that is created at the top level of the test contents tree and contains user actions. The generated response time measurement starts with the keyboard or mouse action that immediately precedes the window-create event. This option is enabled by default.
Window event synchronization criteria
Use this option to disable window recognition based on the window position, size, or title. Disable any of these criteria if the test produces synchronization timeouts because a window changes its position, size, or title between or during test runs.
Default Test Execution Delays
This page specifies the default keyboard and mouse delays for the test client. Do not change these settings unless you are experiencing problems with events that do not run correctly.
Synchronization timeout delay
This is the delay after which a timeout error is produced when a window event or an image synchronization element is not recognized during test runs. The default value is 15 000 milliseconds. The specified delay is for synchronizations that are set as conditional. Mandatory synchronizations use a delay of three times the specified delay. Optional synchronizations use a fixed delay of 2 seconds.

Note: In the generated test, the Override synchronization timeout for a particular window creation event will be enabled with the corresponding recorded time only if it is greater than what is specified in this preference.

If think time is under x ms, then replace with
The think time is the delay spent by a virtual user before performing an action. If the delay between two events is above the specified limit, it is handled as a think time. If the delay is below the limit, the test generator replaces the think time with one of the following delays. The default limit is 20 000 milliseconds.

Note: In the generated test, the think time for a particular user action will be enabled only when the recorded think time is greater than the value specified for this preference.

Delay between mouse down and mouse up in a click
This is the default delay used to generate a mouse click action using a mouse down and a mouse up action. The default value is 20 milliseconds.
Delay between two mouse clicks in a double click
This is the default delay used to generate a double-click action using two mouse clicks. The default value is 50 milliseconds.
Delay between key down and a key up in a stroke
This is the default delay used to generate a key-stroke action using a key-down and a key-up action. The default value is 20 milliseconds.
Delay between two keyboard strokes in a text input
This is the default delay used to generate a text input action using multiple key stroke actions. The default value is 50 milliseconds.
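The think-time handling described above can be sketched as follows: recorded delays longer than the limit are kept as think times, while shorter delays are replaced by a fixed default delay (such as the delay between two keyboard strokes). This is an illustrative sketch; the constant names and the 50 ms replacement value are assumptions.

```python
THINK_TIME_LIMIT_MS = 20_000   # default "If think time is under x ms" limit
REPLACEMENT_DELAY_MS = 50      # e.g., delay between two keyboard strokes

def generated_delay(recorded_delay_ms: int) -> int:
    """Return the delay that would appear in the generated test for a
    recorded delay between two user actions."""
    if recorded_delay_ms >= THINK_TIME_LIMIT_MS:
        return recorded_delay_ms   # kept as a think time
    return REPLACEMENT_DELAY_MS    # replaced by a fixed default delay
```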
Default OCR settings
This page specifies the settings for text extraction by optical character recognition in image synchronizations. You might need to experiment with various settings to obtain good results. These settings define the default behavior for new image synchronizations. You can change the behavior for individual image synchronization elements by changing the OCR settings in the test editor.
OCR default language
This is the language of the dictionary that is used to recognize words for the application that you are testing. This setting defines the subset of languages that will be available in image synchronization elements in the test editor.
OCR default zoom factor
This is the enlargement factor that is applied to the image. The default setting is medium for standard font sizes. Increase the zoom factor to improve recognition of smaller fonts or decrease for larger fonts.
OCR default brightness
This is the brightness level from 0 to 250 that is applied to the image. The default setting is 70 for text with normal contrast. Increase the brightness setting to improve recognition of darker images or decrease for lighter images.
OCR default recognition rate
This is the rate of recognition that is required for the extracted string to match the expected text. Decrease the recognition rate to tolerate a proportion of mismatching characters in the recognized text. The default is 100%, which means that an exact match is required.
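The recognition rate check can be sketched as the proportion of matching characters between the extracted string and the expected text. This is an illustrative sketch; the position-by-position comparison is an assumption about how the rate is computed.

```python
def recognition_ok(extracted: str, expected: str, rate_percent: float = 100.0) -> bool:
    """True if the OCR-extracted string matches the expected text at or
    above the configured recognition rate (default 100% = exact match)."""
    if not expected:
        return True
    matches = sum(1 for a, b in zip(extracted, expected) if a == b)
    achieved = 100.0 * matches / len(expected)
    return achieved >= rate_percent
```

For example, "hallo" matches the expected text "hello" in 4 of 5 positions (80%), so it passes at an 80% rate but fails at the default 100%.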


4. Generic Service Client preferences

The Generic Service Client preferences define the network proxy and authentication credentials to use for importing files from a URL, as well as the timeout settings.

To access the preferences, click Window > Preferences > Generic Service Client. After changing a setting, click Apply.

Service request timeout
Specify the time allocated for the service request by the generic service client.
URL connection timeout
Specify the time allocated to connect to a URL that has a WSDL or XSD file.
WSDL parser timeout
Specify the time allocated for the WSDL file to be parsed.
Proxy host
Type the name or IP address of the proxy server.
Proxy port
Type the port of the proxy server.
Keystore
Specify the file that includes a certificate to access the server.
Keystore password
Specify the password for authentication.


4.1. Service test editor preferences

The Service test editor preferences control the behavior of the test editor when working with service tests.

To access the web service test editor preferences, click Window > Preferences, expand Test, expand Test Editor, and click Web service Test Editor. After changing a setting, click Apply.

When response content is XML, ask whether to display content as XML or text.
The test editor can display XML content either as text or as XML. Select this option to be asked how to display the data. By default, XML content is displayed as XML.
Add quotes around SOAP actions into WSDL model
This option automatically inserts quotes around SOAP actions when this syntax is required by the WSDL.


4.2. Service message edition preferences

The message edition preferences control the behavior of the test editor when editing message content.

To access the message edition preferences, click Window > Preferences, expand Test, expand Service Testing, and click Message Edition. After changing a setting, click Apply.

Edit XML
These settings specify how XML is displayed in the test editor.
Background color for source errors
This setting specifies the color that marks errors in the XML source view.
Delay used to synchronize source (ms)
This specifies the number of milliseconds after which the display of the XML source is updated.
Use color in source view
This enables colorized XML source on the Source tab of web service call elements.
Create SOAP message with pretty XML serialization
This improves readability by adding indentation and line wrapping to the XML source displayed in the web service protocol data view.
Test Contents Tree Label Formatting
These settings specify how the test elements are displayed in the test contents pane of the test editor.
Maximum number of parameters displayed in message
This specifies the number of parameters displayed in the test editor to identify web service test elements. When the number of parameters exceeds this value, only that many parameters are displayed.
Maximum parameter list displayed in message
This specifies the number of characters displayed for each parameter in the test editor to identify web service test elements. Parameters that exceed this length are truncated to the specified number of characters. The minimum value is 3.
Attachments
These settings specify the default settings for adding attachments to web service test elements.
Default MIME type
This is the default MIME type for new attachments.
Default encoding
This is the default encoding for new attachments.


4.3. Service test generation preferences

The service test generation preferences control how service tests are generated.

To access the web service test generation preferences, click Window > Preferences, expand Test, expand Performance Test, and click Web Service Test Generation. After changing a setting, click Apply.

Time out delay used for call
This is the default time out delay for service calls. If the service does not respond within this period, an error is produced.
Time out delay used for callback
This is the default time out delay for asynchronous callbacks. If the service does not respond within this period, the test runs the timeout element of the callback.
Think time default value
This is the default think time for generated tests.
XML data correlation limit
This is the default maximum number of attributes and text nodes supported for data correlation.
XML Message maximum length for answers
This is the default maximum number of characters for the generated XML.
Text Message maximum length for answers
This is the default maximum number of characters for the generated text.
Use case sensitive URL matching
Select this option to enable the test generation to match URLs from the WSDL with recorded URLs only when their case matches. Disable this option to ignore differences between upper and lower case characters.


4.4. Raw transaction data view preferences

The raw transaction data view preferences control how XML data is displayed in the Raw Transaction Data view.

To access the raw transaction data view preferences, click Window > Preferences, expand Test, and click Raw Transaction Data View. After changing a setting, click Apply.

Enable XML source coloring
Select this option to enable XML syntax coloring.
Coloring styles
This section enables you to set a color for each element type in the XML source.
Coloring mode
This enables you to select the style that is used for start tags, end tags, attributes, and CData tags.


4.5. Auto values preferences

The auto values preferences define the values that are generated by default in SOAP-based service calls.

To access the Auto values preferences, click Window > Preferences, expand Generic Service Client, and click Auto Values. After changing a setting, click Apply.

The table lists the default value used for each primitive type. Click Edit to modify the default values or click Reset to revert to the default settings.


4.6. Response time breakdown preferences

Set these preferences to disable response time breakdown information in service tests. Some server implementations can have problems processing SOAP or HTTP headers that include response time breakdown information.

To access the response time breakdown preferences, click Window > Preferences, expand Test, expand Service Testing, and click Response Time Breakdown. After changing a setting, click Apply.

Include response time breakdown information in SOAP header
Specifies whether response time breakdown information is automatically included in the SOAP header of service requests.
Include response time breakdown information in HTTP header
Specifies whether response time breakdown information is automatically included in the HTTP header of service requests.


4.7. Cookies support preferences

The cookies support preferences define how cookies are managed with HTTP services.

To access the cookies support preferences, click Window > Preferences, expand Test, expand Service Testing, and click Cookies Support. After changing a setting, click Apply.

Enable cookies for web services that use HTTP cookies
Select this option to enable support for HTTP cookies in the generic service client. If the option is not selected, cookies are ignored.
Load persistent cookies
Select this option to reload cookies that were saved from a previous session. If the option is not selected, cookies are reset when the workbench is restarted.
Reload transient cookies
Select this option to reuse saved cookies in existing requests in the generic service client.
Cookies specification
Select the specification that is used for supporting cookies. In most cases, use the best-match default setting.
Release Session Cookies
Click this button to immediately release all saved cookies.


4.8. WSDL Information Preferences

The WSDL Information preferences define the tags, including server, creation, and technology tags, from which certain information is fetched when a WSDL file is imported.

To access the WSDL information preferences, click Window > Preferences, expand Generic Service Client, and click WSDL Information. After changing a setting, click Apply.


5. Report preferences

Report preference settings apply to all protocols.


5.1. Test report preferences

The preference settings for test reports control options such as the typeface, color, and graph style of reports, and whether a Compare report is launched automatically when a staged run completes. You can also choose to display a warning when changing Page Percentile report options would cause data to be lost.

Separate preference pages control the behavior of Performance Test Reports and Legacy Reports. To access the Performance Test Reports preference settings, click Window > Preferences > Test > Performance Test Reports.

You can set the following preferences for test reports:

Remain focused on default time range
Select to show all data from time 0 to the end of the run in a staged run report.
Focus report on active time range
Select to show all data aggregated in real time relative to the start of a user-load plateau, rather than relative to the beginning of the run, in a staged run report (default).
Add active time range to a comparison report for all time ranges
Select to add a user-load plateau to a real-time comparison report of all smart-load time ranges in a staged run report.
Open a new report on the active time range
Select to open a new staged run report when a user-load plateau becomes active.
Launch compare report when staged run completes
When you run a schedule that contains multiple stages, time ranges are automatically created for each stage. Select this option to automatically display a Compare report, which compares the time ranges of each stage, after the run completes.
Warn of lost changes when a dynamic tab is modified
The most common example of losing changes occurs in the Page Percentile report. If you generate a report on a run and then regenerate it with different percentiles, the data from the original percentiles is overwritten. Select this option to display a warning when you modify a report in this way.
Default Result Action
Set the report or log viewer to be displayed after a test or schedule is run, or when a prior result is opened from the Test Navigator view.

To access the Legacy Test Reports preference settings, click Window > Preferences > Test > Performance Test Reports > Legacy Reports.

You can set the following preferences for test reports:

Title color
Click the color button to set the color of the report titles.
Title font
Click Change to set the typeface and the size for the report titles.
Use thin bars on bar chart
Select to display graph bars that do not touch; clear to display graph bars that touch.
Use 3D bars on bar chart and pie chart
Select to display three-dimensional bars.
Use symbols on line chart
If your report is long, with many data points, you typically clear this box, because the symbols obscure the trends and make the data hard to interpret.
Time range markers highlight full areas
Select to apply background highlights to reports.
Use alternating background on tables
Select to give every other table row a gray background. The alternating background helps make the tables easy to read.
Data gradient
The default of None displays reports in neutral colors, which generally will suit your needs. You can also select a brown, gray, or red-green-blue color scheme. Select 256 color supported only if you have problems with the other data gradients.
Marker gradient
When you are modeling a workload over time, select a contrasting color palette for markers to distinguish them from the data gradient that you have selected. Markers separate the time ranges displayed on line charts.


5.2. Default report preferences

Use this page to select the default report that opens during a run. Typically, you select Determine default report based on protocols in test, which determines the protocols that you are testing, and automatically opens the appropriate protocol-specific reports. Select a specific default report to display a customized report or if the default reports do not meet your needs. Note, however, that you will have to change this setting when you record other protocols.

Open the Default Report Preferences page: Click Window > Preferences > Test > Performance Test Reports > Default Report.


5.3. Export report preferences

Use this page to automatically export reports to a comma-separated-values (CSV) file at the end of a run. The CSV file is useful when you run a schedule from the command line because you can automatically export results without opening the workbench. The CSV file contains metadata about the test run, a blank line, and the report counter data. Simple CSV format contains only the last data value in the run. Full CSV format contains all data values for every sample interval during a run.
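As a sketch of how an exported file with this shape might be consumed downstream, the following splits the metadata block from the counter data at the blank line. The sample content is hypothetical; only the two-part structure separated by a blank line is taken from the description above.

```python
def split_export(csv_text):
    """Split an exported report into its run metadata and its counter
    data; the export separates the two blocks with a blank line."""
    metadata, _, counters = csv_text.partition("\n\n")
    return metadata.splitlines(), counters.splitlines()

# Hypothetical export content, for illustration only.
sample = "Run,2024-01-01\nSchedule,MySchedule\n\nCounter,Value\nPage Hit Rate,42"
meta, data = split_export(sample)
```

Full CSV format would simply add one row per sample interval to the counter block; the split logic is unchanged.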

Open the Export Report Preferences page: Click Window > Preferences > Test > Performance Test Reports > Export Reports.


5.4. Web report preferences

These preference settings control access to reports from an external web browser.

To access the preference settings for web reports, click Window > Preferences > Test > Performance Test Reports > Web Reports.

You can set the following preferences for the web reports:

Allow remote access from a web browser
Select this check box to allow access to reports from a web browser.
Allow control of schedule execution from the web browser
Select this check box to control schedule execution from a web browser.
No security is required to access reports
Click this option to allow access to reports without login credentials. Specify a port number.
Security is required to access reports
Click this option to provide an authentication layer for accessing reports. Specify a port number and provide the login credentials.


5.5. Percentile analysis preferences

Use this view to customize the percentiles that are reported in the Page Percentile report or to customize the performance requirements on a percentile response. The defaults, 85, 90, and 95, are sufficient for most purposes. However, if you need to report on a different percentile set, or to set a different percentile requirement, edit the percentiles or add new percentiles.
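As a worked example of what a percentile target means: a 95th-percentile response time is a value that at least 95% of the observed responses do not exceed. The nearest-rank method below is an assumption for illustration; the product's exact interpolation method is not specified here.

```python
import math

def nearest_rank_percentile(samples, pct):
    """Smallest sample value such that at least pct% of the samples
    are at or below it (nearest-rank method)."""
    ordered = sorted(samples)
    rank = math.ceil(pct / 100 * len(ordered))
    return ordered[rank - 1]

# Ten hypothetical page response times, in milliseconds.
times = [120, 150, 180, 200, 250, 300, 320, 400, 450, 900]
targets = {p: nearest_rank_percentile(times, p) for p in (85, 90, 95)}
```

With these samples, the 85th and 90th percentiles both fall on 450 ms, while the 95th is pulled up to 900 ms by the single outlier. This is why percentile targets expose the slow tail that an average would hide.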

Open the Percentile Analysis Targets preference page: Click Window > Preferences > Test > Percentile Analysis Targets.


6.1. HTTP test editor reference

In HTTP testing, the test editor information is divided into five categories. This section describes the fields in each category that can be edited manually.

This section focuses on low-level editing tasks that experienced performance testers perform. For information about the layout of the test editor and the more common high-level editing tasks, see Edit HTTP tests.


6.1.1. HTTP test details

Test detail fields apply to the entire test.


Common options

Datapools
Lists details about each datapool that the test uses: the name of the datapool, the columns that are used, and the location in the test where the datapool column is referenced. Click an item in the Location column to go to that location.
Add Datapool
Click to add a reference to a datapool for the test to use. Clicking this option is the same as clicking Add > Datapool with the test selected.
Delete
Select a datapool reference, and then click to delete the reference from the test. The datapool is still available to other tests.
Show Datapool Candidates
Click to open the Show Datapool Candidates window, where you can review and change data correlation.
Digital Certificates
Lists details about the certificate stores that the test uses. Click Add to add a certificate store for the test to use. HTTP and SOA support digital certificates. Other protocols do not support digital certificates.
Enable response time breakdown
Enables collection of response time breakdown data. With response time breakdown, you can see statistics on any page element. The statistics show how much time was spent in each part of the system under test. You can use response time breakdown to identify code problems. You can see which application on which server is the performance bottleneck, and then drill down further to determine exactly which package, class, or method is causing the problem.

This option is displayed in multiple test elements. Enabling this option in an element also enables it in the element's children. For example, enabling monitoring at the test level also enables monitoring at the page and request levels. You can enable monitoring for a specific page; doing so enables monitoring for the requests of that page, but not for other pages or their requests.

HTTP and SOA support response time breakdown. Other protocols do not support response time breakdown.


Security

Digital Certificates
Lists details about the certificate stores that the test uses. Click Add to add a certificate store for the test to use. Not all protocols support digital certificates.
Enable Kerberos authentication
Select to enable Kerberos authentication. The user ID, password, and realm are supplied when a Kerberos authentication challenge occurs during playback. If you record a test using no authentication, and then enable Kerberos authentication on the system under test, select this check box.
User ID
Type the user principal name. The user principal name format consists of the user name, the "at" sign (@), and a user principal name suffix. Do not use the domain\username format. User IDs are case-sensitive.
Password
Type the password for the User ID. Passwords are case-sensitive.
Client realm
Type the realm of the client application. In Windows environments, the client realm is the Windows domain name for the computer sending the request to the server. Typically, the client realm is all uppercase.
Client KDC
Type the name of the client key distribution center. In Windows environments, the client key distribution center is the hostname of the domain controller for the client realm. By default, the client key distribution center is set to the domain controller of the computer where the test was recorded. Verify the default value with your system administrator.
Server realm
Type the realm of the server under test. The client and server might share the same realm. Type the server realm only if the server realm is different from the client realm. Contact your system administrator for more information about the server realm.
Server KDC
Type the name of the server key distribution center. In Windows environments, the server key distribution center is the hostname of the domain controller for the server domain. Type the server key distribution center only if the server is in a different domain than the client.
Enable response time breakdown
Select to enable the collection of response time breakdown data. You can enable response time breakdown collection at the parent or page level. Not all test elements support response time breakdown data collection.
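The user principal name format described above (user name, @ sign, suffix, never domain\username) can be illustrated with a small shape check. This is a sketch only; the real validation is performed by the Kerberos authentication layer.

```python
def looks_like_upn(user_id):
    """Rough shape check for a user principal name: name@suffix.
    The domain\\username form is explicitly not accepted."""
    if "\\" in user_id:
        return False
    name, sep, suffix = user_id.partition("@")
    return bool(sep and name and suffix)
```

For example, "jsmith@EXAMPLE.COM" passes the check, while "EXAMPLE\jsmith" does not.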


Performance Requirements

Performance Requirements
The table displays the performance requirements that are defined in the test. To edit a requirement definition, double-click a table row. To return to this table, click the root name of the test in the Test Contents area.
Clear
Select one or more requirements and click to remove the definition. The requirement is still available and can be redefined.
Enable response time breakdown
Select to enable the collection of response time breakdown data. You can enable response time breakdown collection at the parent or page level. Not all test elements support response time breakdown data collection.


HTTP options

Timeout action
Specifies what the test does if the primary request for a page does not succeed within the Timeout interval. If you select Log error and continue execution, the test logs the error and proceeds to the next page. If you select Try to reload the page, the test attempts to reload the page one more time. If that attempt fails, the test logs an error and proceeds to the next page.
Timeout
Time threshold for initiating the action that you select for Timeout action.
Clear cookie cache when the test starts
This option resets the cookie cache when looping in the schedule or when a test follows another test in the schedule. By default, the cookie cache for a virtual user is not reset, which is consistent with browser behavior. If you want each loop iteration to behave as a new user, select this option. Otherwise, the cookies in the cache might alter the server responses and verification points might fail. To reset the cookie cache from one loop iteration to the next when looping within a test, add custom code and call an API.
Clear page cache when the test starts
This option deletes the page cache when a test starts. Typically, when a test follows another test in the schedule or when you anticipate an out-of-memory exception due to overload, you can delete the cache.
Disable page cache emulation in this test
This option disables page cache emulation. When page cache emulation is enabled, caching information in server response headers is honored. Additionally, requests are not submitted to the server for content that is confirmed by the client as fresh in the local cache. Page cache emulation is enabled by default.
Playback speed
Move the slider to increase or decrease the speed at which the HTTP requests are sent. You can specify a range from no delays to twice the recorded length. This scale is applied to the Delay field of each request in the test. If you speed playback up dramatically, requests might occur out of order. To fix this problem, reduce playback speed until the test runs correctly again.

Note: To set a maximum request delay, click Window > Preferences > Test > Test Generation > HTTP Test Generation. Click the Protocol tab, and enter a value for Maximum Request Delay.

Secondary request behavior
Click Modify to disable or reenable requests that occur within a page. You can disable all secondary requests, images, host-based or port-based requests, or user-defined requests.
Enable response time breakdown
Select to enable the collection of response time breakdown data. You can enable response time breakdown collection at the parent or page level. Not all test elements support response time breakdown data collection.
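The slider's effect on the recorded delays can be sketched as a scale factor between 0 (no delays) and 2 (twice the recorded length), with the result capped by the Maximum Request Delay preference. The clamping order shown is an assumption for illustration.

```python
def scaled_delay(recorded_delay_ms, speed_factor, max_delay_ms):
    """Apply the playback-speed scale to one recorded request delay.
    A speed_factor of 0.0 plays back with no delays; 2.0 doubles them.
    The result is capped at the configured maximum request delay."""
    return min(recorded_delay_ms * speed_factor, max_delay_ms)
```

For example, a 500 ms recorded delay played back at twice the recorded length against a 600 ms maximum is issued after 600 ms, not 1000 ms.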


6.1.2. HTTP page details

Page detail fields apply to the page that is currently selected.

General tab
Page title
Display name for the page. If the primary request returned a title, the display name for the page is the content between the <title></title> tags. If the primary request returned no title or an empty title, a name for the page is constructed from the first node in the web address for the primary request URL, for example, www.site.com/displayname/.... If two pages have the same page title but are at different web addresses (for their primary request), then a number might be appended to indicate that they are different (for example, displayname {1}, displayname {2}). The pages are included in reports as separate pages, with their unique appended names.
Pages with the same title and web address appear in the test editor with the same page title, and in reports as the same page. Rename any pages that you want to report under a different name. Renaming a page neither changes the value (if any) between the <title></title> tags nor affects how the test runs.
Primary request
Displays a hyperlink to the primary request for the page. This request is highlighted and is the request from which the display name for the page is derived.
Think time
Specifies the programmatically calculated time delay that is observed for each user when this test is run with multiple virtual users. Think time is a statistical emulation of the amount of time actual users spend reading or thinking about a page before requesting another page from the server.
Test data
Summarizes data substitutions and potential matches in the page. Right-click a row, or select a row and then click Options, to perform common operations. Double-click a row to navigate to the location where a substitution or potential match occurs. To associate a datapool candidate with a datapool, click the row, and then click Substitute. To remove a datapool substitution, click the row, and then click Remove Substitution. To find more locations in the test that have the same value as the selected row, click Find More. Click the icons to the left of the preview area to switch between an inline view and a hierarchical view of the selected data.
URL Encode
Indicates whether a test value contains special characters such as spaces or commas. With this option, special characters are encoded when variable data is substituted from a datapool.
Page title verification point
Indicates whether the page title verification point is enabled for this page. If so, Enable verification point is selected.
When Enable verification point is selected, the value between the <title></title> tags, if any, is copied to the Expected page title field on the properties page of the verification point. Click Edit Properties to change the Expected page title. The value between the title tags is different from the display page title (the value in the Page title field) that is used for reporting. Changing the Page title does not change the value between the title tags, and therefore does not affect what is initially copied to the Expected page title field.
If Enable verification point is selected, the test verifies whether the page returns the value in the Expected page title field. An error is reported in the test log if the title returned by the primary request for the page does not contain the expected title. Although the comparison is case-sensitive, it ignores multiple white-space characters (such as spaces, tabs, and carriage returns).
Enable response time breakdown
Enables collection of response time breakdown data. With response time breakdown, you can see statistics on any page element. The statistics show how much time was spent in each part of the system under test. You can use response time breakdown to identify code problems. You can see which application on which server is the performance bottleneck, and then drill down further to determine exactly which package, class, or method is causing the problem.

This option is displayed in multiple test elements. Enabling this option in an element also enables it in the element's children. For example, enabling monitoring at the test level also enables monitoring at the page and request levels. You can enable monitoring for a specific page; doing so enables monitoring for the requests of that page, but not for other pages or their requests.

HTTP and SOA support response time breakdown. Other protocols do not support response time breakdown.
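The page title comparison described above, a case-sensitive containment check that ignores runs of white-space characters, can be sketched as:

```python
import re

def title_matches(expected, actual):
    """Case-sensitive containment check that collapses any run of
    white space (spaces, tabs, carriage returns) to a single space."""
    norm = lambda s: re.sub(r"\s+", " ", s).strip()
    return norm(expected) in norm(actual)
```

So an expected title of "Home  Page" (double space) still matches a returned title of "My Home Page", but a difference in case fails the check.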

Advanced tab
Enable Performance Requirements
Select to enable the use of performance requirements for this test.
Name
Name of this set of enabled performance requirements. By default, it is the URL of the page. Although you can change the name to improve readability, only the Performance Requirements report uses the changed name. Other reports use the default name. Click Use Defaults to reset Name to the default value.
Performance Requirement
All performance requirements are displayed in the table. Shaded requirements indicate that they are undefined. To define a requirement, set an Operator and Value. To apply the defined requirement to multiple pages, select the pages in the test, right-click the requirement row in the table, and click Copy Requirements.
Operator
Click this field to display a list of mathematical operators. Select an operator for this performance requirement.
Value
Click this field to set a numeric time value in milliseconds.
Standard
Select to enable this requirement to be processed by the report as a standard requirement. Standard requirements can cause a test to fail. Performance requirements that are not listed as standard do not cause the test to fail.
Hide Undefined Requirements
Select to prevent undefined performance requirements from appearing in the table. This hides the shaded rows.
Clear
Select one or more requirements and click to remove the definition. The requirement is still available and can be redefined.
Error Handling
Click to open the error condition table. You can use error handling to specify an action to take and a message to log when a specific condition occurs. Conditions include verification point failures, server timeouts, custom code alerts, and data correlation problems. All conditions are displayed in the table, along with the action to take and the message to log when the error occurs. To define an error handler, select a Condition, and then click Edit.
Hide unselected conditions
Click to display only the selected error handlers. Hiding a condition does not deactivate the condition.


6.1.3. HTTP request details

Request detail fields apply to the request that is currently selected.

General tab
Version
Indicates the HTTP version.
Method
Indicates the HTTP request method that was used during recording. Typically, you do not change this value unless you are adding a new request to a test. GET, POST, PUT, HEAD, and DELETE are supported.
Primary request for page
Displayed for the primary request, and cannot be modified. A page can contain only one primary request.
Click to set as primary
Displayed for all secondary requests. Because each page can have only one primary request, if you select this option, the Primary request for page option is moved to this request, and the Click to set as primary option is moved to the original primary request. To undo your change, select Click to set as primary on the original primary request.
Connection
Specifies the connection to the web server. The connection includes the host name, which is typically the fully qualified domain name, and the listener port on the web server. Click the name of the connection to navigate to the server access configuration where the connection is defined. Click Change to change the connection used for this request.
URL
Specifies the path to a resource (such as a page, graphics file, or stylesheet file). When the method is GET, the URL field typically includes query strings that are designated as datapool candidates.
Data
Specifies additional content data that might be needed to clarify the request. When the method is POST, the data frequently includes values that are designated as datapool candidates.
Request Headers
Lists each request header and its value. To change the value of a header, click the row, and then click Modify. To add a new header, click Add. To delete a header, click Remove.
Enable response time breakdown
Select to enable the collection of response time breakdown data. You can enable response time breakdown collection at the parent or page level. Not all test elements support response time breakdown data collection.

Use the Advanced tab to configure performance requirements, error handling, and delay behavior for the request.

Advanced tab
Enable Performance Requirements
Select to enable the use of performance requirements for this test.
Name
Name of this set of defined performance requirements. By default, the name is the URL of the request. Although you can change the name to improve readability, only the Performance Requirements report uses this name. Other reports use the default name. Click Use Defaults to reset Name to the default value.
Performance Requirement
All performance requirements are displayed in the table. Shaded requirements indicate that they are undefined. To define a requirement, provide details in Operator and Value. To apply the defined requirement to multiple requests, select the requests in the test, right-click the requirement row in the table, and click Copy Requirements.
Operator
Click this field to display a list of mathematical operators. Select an operator for the performance requirement.
Value
Click this field to set a value for the requirement.
Standard
Select to enable this requirement to be processed by the report as a standard requirement. Standard requirements can cause a test to fail. Performance requirements that are not listed as standard do not cause the test to fail.
Hide Undefined Requirements
Select to prevent the table from including undefined performance requirement candidates. Selecting this check box hides all shaded rows.
Clear
Select one or more requirements, and click to remove the definition. The requirement is still available and can be redefined.
Error Handling
Click to open the error condition table. You can use error handling to specify an action to take and a message to log when a specific condition occurs. Error conditions include verification point failures, server timeouts, custom code alerts, and data correlation problems. All error conditions are displayed in the table, beside the action to take and the message to log when the error occurs. To define an error handler, select a Condition, and then click Edit.
Hide unselected conditions
Click to display only the selected error handlers. Hiding a condition does not deactivate the condition.
Applied Transform
Indicates the data transformation that is applied to the request. Click Change to select a data transformation to apply to the request.
Character set
Indicates the character set to be used for the page request. Click Change to see the valid character sets.
Wait for
Indicates the associated request that must start or finish before this request is issued. Click Request to select a different request. Click the Clear request association icon to remove the association.
Release when
Select Last Character Received or First Character Received to indicate when this request is issued in relation to the associated request.
Additional delay (ms)
Indicates the additional delay, in milliseconds, to wait before this request is issued. Delays are statistical emulations of user behavior. You can scale this delay at the test level to make a test play back faster (or slower) than it was recorded.
Delay
Previous versions of tests support only waiting for primary requests. Wait for and Release when are not available. The additional delay in previous versions of tests is measured from the first character received of the primary request.
Digital Certificates
Lists details about the certificate stores that the test uses. Click Add to add a certificate store for the test to use. HTTP and SOA support digital certificates. Other protocols do not support digital certificates.
Enable response time breakdown
Enables collection of response time breakdown data. With response time breakdown, you can see statistics on any page element. The statistics show how much time was spent in each part of the system under test. You can use response time breakdown to identify code problems. You can see which application on which server is the performance bottleneck, and then drill down further to determine exactly which package, class, or method is causing the problem.

This option is displayed in multiple test elements. Enabling this option in an element also enables it in the element's children. For example, enabling monitoring at the test level also enables monitoring at the page and request levels. You can enable monitoring for a specific page; doing so enables monitoring for the requests of that page, but not for other pages or their requests.

HTTP and SOA support response time breakdown. Other protocols do not support response time breakdown.
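The parent-to-child propagation described above can be sketched as a simple recursive walk over a toy element tree. The dictionary structure here is illustrative only, not the product's internal model:

```python
def enable_breakdown(element):
    # Mark this element and, recursively, all of its children.
    element["breakdown"] = True
    for child in element.get("children", []):
        enable_breakdown(child)

# A toy structure: a test containing two pages with one request each.
test = {"name": "test", "children": [
    {"name": "page1", "children": [{"name": "req1"}]},
    {"name": "page2", "children": [{"name": "req2"}]},
]}

# Enabling monitoring on page1 also enables it for req1,
# but page2 and req2 are left untouched.
enable_breakdown(test["children"][0])
```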


6.1.4. HTTP response data details

Response data fields apply to the response data that is returned by each page request.

General tab
Status
Indicates the status code for the HTTP response, such as 200, 201, 203, or 302.
Version
Indicates the HTTP version, such as 1.1.
Reason
Indicates the reason phrase for the HTTP response, such as OK, Found, or Not Found.
Response Headers
Lists each response header and its value. To change the value of a header, click the row, and then click Modify. To add a new header, click Add. To delete a header, click Remove.
Content
Shows the content (such as tagged HTML, graphics files, or stylesheet files) that the web server returned, based on the corresponding request.
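The Status, Version, and Reason fields correspond to the three parts of the status line in a raw HTTP response. A minimal sketch of splitting such a line into those fields (illustrative only, not part of the product):

```python
def parse_status_line(line):
    # "HTTP/1.1 302 Found" -> ("1.1", 302, "Found")
    version, status, reason = line.split(" ", 2)
    return version.removeprefix("HTTP/"), int(status), reason

parse_status_line("HTTP/1.1 200 OK")  # ("1.1", 200, "OK")
```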
Advanced tab
Applied transform
Indicates the data transform that is applied to the response. Click Change to select a data transform to apply to the response.
Character set
Indicates the character set to be used for the response. Click Change and select the encoding to change the character set.


6.1.5. HTTP server access configuration details

Server access configurations store HTTP connection information. By default, a connection does not remain open across test boundaries. Several connections can use the same server access configuration, and several requests in the same test can use the same connection. If you change the host, port, or authentication for a server access configuration, those changes apply to all connections in the test that use the configuration.

Configuration name
Name of the server access configuration.
Host
Name of the host for the web server. Usually, this is the fully qualified domain name, but it can be an IP address or other name.
Port
Specifies the listener port on the web server.
Authentication and security
Indicates whether this connection uses the SSL protocol, the NT/LAN Manager (NTLM) authentication protocol from Microsoft, or an HTTP proxy server. A blank field indicates that the connection is unauthenticated and not secure. To add proxy, SSL, or NTLM authentication, expand the request, click the connection, and then click Add.
Connections that use this configuration
Lists the connections that use this configuration.
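The relationship described above, where one configuration is shared by several connections and a host or port change applies to all of them, can be modeled roughly as follows. The class and field names are hypothetical, chosen only for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class ServerAccessConfiguration:
    name: str
    host: str
    port: int = 80
    connections: list = field(default_factory=list)

    def open_connection(self, label):
        # Each connection reads host and port from the shared
        # configuration, so a change there affects all of them.
        conn = {"label": label, "config": self}
        self.connections.append(conn)
        return conn

cfg = ServerAccessConfiguration("intranet", "www.example.com", 443)
first = cfg.open_connection("login")
second = cfg.open_connection("search")
cfg.host = "staging.example.com"  # both connections now target the new host
```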


6.2.1. SAP test details

In the test editor, the test element is the first element in the test suite. These settings apply to the entire test.


SAP options

Display SAP GUI on execution
During test execution, it might not be desirable to display the SAP GUI. Hiding the SAP GUI improves the performance of the virtual users. This setting specifies the behavior for the current test suite. However, you can change the default setting for generated tests in the SAP Test Generation preferences.
Hide
When selected, all instances of the SAP GUI are hidden. In some cases, modal dialog boxes from the SAP GUI can flash briefly on the screen. This is the default setting.
Show
When selected, the SAP GUI is displayed for all virtual users.
Show only first virtual user
When selected, the SAP GUI is displayed only for the first virtual user. All other instances of the SAP GUI are hidden. This allows you to monitor the execution.


Common options

Datapools
Lists details about each datapool used by the test: the name of the datapool, the columns that are used, and the location in the test where the datapool column is referenced. Click the location to navigate there.
Add datapool
Adds a reference to a datapool that you want a test to use. Clicking this option is the same as clicking Add > Datapool with the test selected.
Remove
Removes the selected datapool. This option is not available if the datapool is in use.


6.2.2. SAP connection details

In the test editor, SAP connection elements are at the top of the test suite and describe the connection to the SAP R/3 server. These settings apply to the entire test.

SAP system name
This is the description normally used by SAP Logon to identify the server. If the Connection by string option is selected, this field is ignored.
Connection by string
Select this option to connect to the server by using the connection string that was returned by the server during recording, without referring to the SAP Logon program. This is safer when you deploy the test on remote computers. Advanced users can edit the connection string if necessary. You can use data correlation to substitute this value.
Get SAP GUI session statistics
Select this option to record session statistics from the SAP GUI client in the test results. These results are displayed on the User Load page of the test report.
Use new visual design
Select this option to run tests with a visual design theme when using SAP GUI 7.0 or later. In most cases, it is best to leave this option disabled, which causes tests to run with the default SAP GUI visual design and avoids compatibility issues.
Use recorded visual design theme
If Use new visual design is selected, select this option to use the visual design theme that was used during the recording.
Use other visual design theme
If Use new visual design is selected, select this option to use a specific visual design theme. Ensure that the name is correct and that the visual design theme is installed on the test computer. Unexpected results might occur if you specify a visual design theme name that cannot be located on the test computer.


6.2.3. SAP screen details

In the test editor, SAP screen elements are located in transactions and are the basic performance measurement unit for the test. These settings apply to the selected screen element.

SAP element label
This is the name of the selected SAP test element as it is displayed in the Test Contents. Use this field to rename the test element, or click Restore Default to revert to the default name.

Title
This is the recorded name of the SAP screen. This field is read-only.
Do not measure performance on this screen
Select this option if you do not want to obtain response time results for the current SAP screen. Use this for SAP screens that are not meaningful for your test, such as the logon screen.
Optional screen
Select this option if you do not want to log an error when the current SAP screen is not displayed. Use this for SAP screens that are not always displayed.
Data Table
Summarizes data substitutions and substitution candidates in the SAP screen. Double-click a row to navigate to the location where a substitution or candidate occurs. To associate a datapool candidate with a datapool, click the row and then click Datapool Variable. To remove a datapool substitution, click the row and then click Remove Substitution.


Screen Title Verification Point

Enable Verification Point
When selected, the test verifies whether the SAP screen returns the value shown in the Expected screen title field. An error is reported in the test log if the screen title returned during the test does not match the expected title.
Expected screen title
This field allows you to specify the expected SAP screen title. By default, the expected title is the recorded title. The expected title can optionally be expressed as a regular expression.
Recorded screen title
This field displays the recorded title of the current SAP screen. This field is read-only.
Use Regular Expression
Select this option to express the expected title using the standard regular expression syntax.
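The verification logic described above reduces to either a literal comparison or a whole-string regular expression match. A sketch (illustrative only; the function name and the sample titles are hypothetical):

```python
import re

def verify_screen_title(actual, expected, use_regular_expression=False):
    # When the regular-expression option is set, the expected title is
    # treated as a pattern that must match the whole actual title;
    # otherwise a literal comparison is used.
    if use_regular_expression:
        return re.fullmatch(expected, actual) is not None
    return actual == expected

verify_screen_title("SAP Easy Access", "SAP Easy Access")       # True
verify_screen_title("SAP Easy Access", r"SAP .* Access", True)  # True
```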


6.2.4. SAP set details

In the test editor, SAP sets are located in SAP screen elements and describe a user input action in the SAP GUI client. These settings apply to the selected SAP set.

SAP element label
This is the name of the selected SAP test element as it is displayed in the Test Contents. Use this field to rename the test element, or click Restore Default to revert to the default name.

Think Time
Specifies the programmatically calculated time delay that is observed for each user when this test is run with multiple virtual users. Think time is a statistical emulation of the amount of time actual users spend reading or thinking before performing an action.
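One common way to emulate think time statistically is to vary the recorded delay by a random percentage and cap the result. This is an illustration of the idea, not the product's exact algorithm; the percentage bounds and cap are assumed values:

```python
import random

def emulated_think_time(recorded_ms, lower_pct=50, upper_pct=150, maximum_ms=10000):
    # Scale the recorded think time by a random percentage within
    # [lower_pct, upper_pct], then cap it, so each virtual user
    # pauses a slightly different, realistic amount of time.
    scaled = recorded_ms * random.uniform(lower_pct, upper_pct) / 100.0
    return min(scaled, maximum_ms)

random.seed(1)
delay = emulated_think_time(2000)
# delay falls between 1000 and 3000 ms, never above the 10000 ms cap
```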


SAP Set

Property name
This is the description of the GUI object related to the current SAP set as it appears to the user in the SAP GUI. This field is read-only.
Value
This is the value entered by the user in the current SAP set. You can use data correlation to substitute this value.


SAP GUI Object Information

Name
This is the recorded name of the GUI object related to the current element. This field is read-only.
Type
This is the recorded type of the GUI object related to the current element. This field is read-only.
Identifier
This is the recorded identifier of the GUI object related to the current element. This field is read-only.


6.2.5. SAP get details

In the test editor, SAP get events are located in SAP screen elements and provide a way to retrieve data from a SAP GUI object to implement verification points. These settings apply to the selected get event.

SAP element label
This is the name of the selected SAP test element as it is displayed in the Test Contents. Use this field to rename the test element, or click Restore Default to revert to the default name.

Think Time
Specifies the programmatically calculated time delay that is observed for each user when this test is run with multiple virtual users. Think time is a statistical emulation of the amount of time actual users spend reading or thinking before performing an action.


SAP Get

Property name
This is the description of the GUI object related to the current event as it appears to the user in the SAP GUI client. This field is read-only.
Value
This is the value recorded during the test or during the last execution. You can use data correlation to reference this value. This field is read-only.


Verification Point

Enable Verification Point
When selected, the test verifies whether the screen returns the value specified in Expected Value. An error is reported in the test log if the value returned during the test does not match the expected value.
Expected Value
This field enables you to specify the expected value for the get event. The expected value can optionally be expressed as a regular expression. You can use data correlation to substitute this value.
Use Regular Expression
Select this option to express the expected value using standard regular expression syntax.


SAP GUI Object Information

Name
This is the recorded name of the GUI object related to the current element. This field is read-only.
Type
This is the recorded type of the GUI object related to the current element. This field is read-only.
Identifier
This is the recorded identifier of the GUI object related to the current element. This field is read-only.


6.2.6. SAP call details

In the test editor, SAP call elements are located in SAP screen elements and describe various recorded interactions with the SAP R/3 server. These settings apply to the selected SAP event.

SAP element label
This is the name of the selected SAP test element as it is displayed in the Test Contents. Use this field to rename the test element, or click Restore Default to revert to the default name.

Think Time
Specifies the programmatically calculated time delay that is observed for each user when this test is run with multiple virtual users. Think time is a statistical emulation of the amount of time actual users spend reading or thinking before performing an action.


SAP Call

Method name
This is the internal method call used by the SAP GUI client. This field is read-only.
Parameter
If the method uses parameters, one Parameter line is displayed for each parameter. Advanced users can modify these parameters. Refer to SAP documentation for more information about the parameters used by SAP GUI method calls. You can use data correlation to substitute this value.
Return
If the method returns a value, a Return line is displayed, which can be used for data correlation or for a verification point. The value displayed is not the actual return value, but only represents the type of the parameter, for example, string for a string type or 0 for an integer. Refer to SAP documentation for more information about the parameters used by SAP GUI method calls.


Verification Point

Enable Verification Point
When selected, the test verifies whether the Return value of the SAP call (if any) matches the value specified in Expected Value. An error is reported in the test log if the value returned during the test does not match the expected value.
Expected Value
This field enables you to specify the expected value for the call. The expected value can optionally be expressed as a regular expression. You can use data correlation to substitute this value.
Use Regular Expression
Select this option to express the expected value using standard regular expression syntax.


SAP GUI Object Information

Name
This is the recorded name of the GUI object related to the current element. This field is read-only.
Type
This is the recorded type of the GUI object related to the current element. This field is read-only.
Identifier
This is the recorded identifier of the GUI object related to the current element. This field is read-only.


6.2.7. SAP server request details

In the test editor, server request elements are located at the end of every SAP screen and provide information that the server returns for the selected screen.

SAP element label
This is the name of the selected SAP test element as it is displayed in the Test Contents. Use this field to rename the test element, or click Restore Default to revert to the default name.


SAP Screen

Name
This is the name of the current SAP transaction code. This field is read-only.
Program
This is the name of the SAP source program that is currently running. This field is read-only.
Flushes
This is the number of flushes in the automation queue during server communication. This field is read-only.
Response Time
This is the delay between the moment the SAP GUI client sends the request to the SAP R/3 server and the moment the server response arrives. The units are milliseconds. This field is read-only.
Interpretation Time
This is the delay between the moment the data is received by the SAP GUI client and the moment the screen is updated. It measures interpretation of data by the SAP GUI client, not SAP R/3 server performance. The units are milliseconds. This field is read-only.
Roundtrips
This is the count of token switches between the SAP GUI client and the SAP R/3 server to perform the request. This field is read-only.


Request Time Verification Point

Enable verification point
When selected, the test verifies whether the request time returned by the server is below the specified threshold value. An error is reported in the test log if the measured request time is above the threshold.
Response time threshold (ms)
This is the request time limit above which an error is reported in the test log.
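The verification amounts to a simple threshold comparison: pass when the measured time is at or below the limit, log an error otherwise. A sketch (function and message wording are illustrative):

```python
def check_request_time(measured_ms, threshold_ms):
    # Returns (passed, message); an error is reported only when the
    # measured request time exceeds the threshold.
    if measured_ms > threshold_ms:
        return False, f"request took {measured_ms} ms, threshold is {threshold_ms} ms"
    return True, "verification point passed"

check_request_time(850, 1000)   # passes
check_request_time(1450, 1000)  # fails, message names both values
```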


Request Timeout

Timeout value (ms)
Select this option to change the default timeout value (3 minutes) for very long transactions.
Response time threshold (ms)
The test verifies that the request time returned by the server is below the specified threshold value. An error is reported in the test log if the measured request time is above the threshold.


6.2.8. SAP batch connection details

In SAP batch input tests, SAP batch connections contain the basic connection information for a batch input test to connect to the SAP R/3 server without a SAP GUI. In most cases, these details are the same as those used when you connect manually to SAP R/3 with the SAP GUI.

SAP element label
This is the name of the selected SAP test element as it is displayed in the Test Contents. Use this field to rename the test element, or click Restore Default to revert to the default name.


SAP Batch Input Connection

Client
This is the SAP client number that is used by the batch input test to connect to the SAP R/3 server. You can use data correlation to substitute this value.
User
This is the user name that the batch input test uses to connect to the SAP R/3 server. You can use data correlation to substitute this value.
Password
This is the password that the batch input test uses to connect to the SAP R/3 server. You can use data correlation to substitute this value.
Language
This is the two-letter language code. You can use data correlation to substitute this value.
Host
This is the IP address or computer name of the SAP R/3 server. You can use data correlation to substitute this value.
System Number
This is the system number of the SAP R/3 server. You can use data correlation to substitute this value.
Additional SAP Connection Properties
Use this list to specify advanced SAP Java Connector (JCo) properties, such as those for SAP router setup. Select the JCo property to set in the Property name list, and type the required value in Property value. Click Add to add more properties.
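As an illustration only, such a list holds key/value pairs following the SAP Java Connector naming convention. The property names and sample values below are assumptions; verify them against your JCo documentation before use:

```python
# Hypothetical sketch of the key/value pairs this list might hold.
jco_properties = {
    "jco.client.client": "100",          # SAP client number
    "jco.client.user": "tester",         # user name
    "jco.client.lang": "EN",             # two-letter language code
    "jco.client.ashost": "sapserver01",  # application server host
    "jco.client.sysnr": "00",            # system number
    "jco.client.saprouter": "/H/router.example.com/H/",  # SAP router string
}
```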


6.2.9. SAP batch input transaction details

In SAP batch input tests, SAP batch input transaction elements contain recorded transactions that are run at a low level, without a SAP GUI, to produce load on the SAP R/3 server.

SAP element label
This is the name of the selected SAP test element as it is displayed in the Test Contents. Use this field to rename the test element, or click Restore Default to revert to the default name.


SAP Batch Input Transaction

Code
This is the SAP transaction code of the recorded transaction.
Mode
This is the mode of the batch input transaction as it was recorded in the SAP GUI.
Data table
This is the data table of the batch input transaction as it was recorded in the SAP GUI. See the SAP documentation for details on the contents of the recording.


6.3.1. Citrix test details

In the test editor, the Citrix Test is the first element of a Citrix test. These settings apply to the entire Citrix test.


Citrix options

Synchronization timeout delay
This is the delay after which a timeout error is produced when a window event is not recognized during test execution. The default value is 6000 milliseconds.
Delay between mouse down and mouse up in a click
This is the default delay used to generate a mouse click action using a mouse down and a mouse up action. The default value is 50 milliseconds.
Delay between two mouse clicks in a double click
This is the default delay used to generate a double-click action using two mouse clicks. The default value is 200 milliseconds.
Delay between key down and a key up in a key stroke
This is the default delay used to generate a key stroke action using a key down and a key up action. The default value is 100 milliseconds.
Delay between two keyboard strokes in a text input
This is the default delay used to generate a text input action using multiple key stroke actions. The default value is 500 milliseconds.
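These defaults determine how a recorded high-level action is expanded into low-level events at playback. Using the default values listed above (50 ms between mouse down and mouse up within a click, 200 ms between the two clicks of a double-click), a sketch of the generated timeline:

```python
def double_click_timeline(click_delay_ms=200, down_up_delay_ms=50):
    # Expand a double-click into timestamped mouse down/up events,
    # using the two default delays described above.
    events, t = [], 0
    for _ in range(2):
        events.append((t, "mouse down"))
        t += down_up_delay_ms
        events.append((t, "mouse up"))
        t += click_delay_ms
    return events

double_click_timeline()
# [(0, 'mouse down'), (50, 'mouse up'), (250, 'mouse down'), (300, 'mouse up')]
```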


Common options

Datapools
Lists details about each datapool used by the test: the name of the datapool, the columns that are used, and the location in the test where the datapool column is referenced. Click the location to navigate there.
Add datapool
Adds a reference to a datapool that you want a test to use. Clicking this option is the same as clicking Add > Datapool with the test selected.
Remove
Removes the selected datapool. This option is not available if the datapool is in use.


6.3.2. Citrix session details

In the test editor, the session is located at the top of the Citrix test. Session settings apply to connection with the server.


Session Attributes

Session Title
This is the name of the current session. By default, it is the same as the name of the test.
Server Address
This is the address of the Citrix server. The value can be a host name or an IP address. This value can be linked to a datapool.
Initial Program
This is the name of a published application on the Citrix server. Use this option to manually specify a published application if no Independent Computing Architecture (ICA) file is available. If neither a published application nor an ICA file is specified, the session starts with the Windows desktop.
ICA File
If you recorded the test with an ICA file, this is the location and name of the file. The ICA file contains connection and application information to launch a published application directly with the Citrix XenApp client.
User name and Password
These fields allow you to specify user authentication information. These values can be linked to a datapool.
Color Depth
This is the recorded color depth for the Citrix XenApp client. This value is read-only.
Screen Size
This is the recorded screen resolution for the Citrix XenApp client. This value is read-only.


Response Time Definitions

This table defines the response time measurements that will be performed during the test. By default, response times are automatically generated on main create window events.

Response Time
This is the name of the response time measurement. To change a name, select a response time and click Rename. These names appear in the performance test report.
Started by
This is the user input action that triggers the start of the response time measurement. To navigate to the corresponding user input action in the test editor, click Go to Start.
Stopped by
This is the user window event that stops the response time measurement. To navigate to the corresponding user input action in the test editor, click Go to Stop.
Add, Rename and Delete
These buttons allow you to manually create, rename or delete a response time measurement.
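Conceptually, each row of the table pairs a start action with a stop event, and the measurement is the elapsed time between them. A hypothetical sketch of that pairing:

```python
class ResponseTimeMeasurement:
    # Illustrative model: a measurement is started by a user input
    # action ("Started by") and stopped by a window event ("Stopped by").
    def __init__(self, name):
        self.name = name
        self.start_ms = None
        self.elapsed_ms = None

    def start(self, timestamp_ms):
        self.start_ms = timestamp_ms

    def stop(self, timestamp_ms):
        self.elapsed_ms = timestamp_ms - self.start_ms

m = ResponseTimeMeasurement("open order window")
m.start(1200)   # the user clicks a menu item
m.stop(1950)    # the order window's create event arrives
# m.elapsed_ms is 750 ms, the value reported for this measurement
```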


6.3.3. Citrix window details

In the test editor, the Citrix window elements contain all user input actions and window events. These settings apply to the selected window element.

Window Title
This is the title of the window as displayed in the Citrix session. Some windows do not have titles, and the window ID is used for identification. This field is read-only.
Window ID
This is the window ID number assigned by the Windows operating system when the window is created during the recording session. This number changes each time the test is executed, but the ID remains the same throughout a session. This field is read-only.
Locations
This field displays the X and Y coordinates of the top left corner of the window and the size of the window in pixels. This field is read-only.
Window recognition during execution uses
These options allow you to disable window recognition based on window position or size. Disable either option if the test produces synchronization timeouts because a window changes its position or size between or during test runs.
Parent Window
This is a link to the window element that is the parent of the selected window.
Go to same occurrences of this window
Use these navigation buttons to navigate through the test to other occurrences of this window, for example, when the user switches back and forth between windows during a test, or when the current window is modified in any way.


Styles

Window Styles
This area lists the style properties that are enabled for the current window. These are read-only.
Window Extended Styles
This area lists the extended style properties that are enabled for the current window. These are read-only.


Verification Point

Enable Verification Point
When selected, the test verifies whether the window returns the title shown in the Expected title field. An error is reported in the test log if the title returned during the test does not match the expected title.
Use Regular Expression
Select this option to express the expected title using standard regular expression syntax.
Expected title
This field allows you to specify the expected title for the window. The expected title can optionally be expressed as a regular expression.
Recorded title
This displays the title that was recorded for the current window. This field is read-only.


6.3.4. Citrix window event details

In the test editor, the Citrix window event elements are located inside window elements and describe any changes to the location or size of a window. These settings apply to the selected window event element.

Type of Event
This is the type of window event.
Window ID
This is the window ID number assigned by the Windows operating system when the window is created during the recording session. This number changes each time the test is executed, but the ID remains the same throughout a session. This field is read-only.
Window Title
This is the title of the window as displayed in the Citrix session. Some windows do not have titles, and the window ID is used for identification. This field is read-only. You can click the window title to select the window element in the test contents.
Synchronization state
This describes the behavior of the test if a synchronization timeout occurs on the window event. The base timeout delay is specified in the Citrix test generation preferences; however, the actual delay varies with the level of synchronization.
Conditional
The conditional timeout delay is the base timeout delay as specified in the Citrix test generation preferences. If the synchronization fails, the test tries to resume execution and a timeout is logged in the Citrix performance report and the test log.
Mandatory
The mandatory timeout delay is three times the base timeout delay. If the synchronization fails, the test exits with an error status and a timeout is logged in the test log.
Optional
The optional timeout delay is fixed at 2 seconds. If the synchronization fails, the test ignores the timeout.
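The three levels translate into actual timeout delays as follows, using the default 6000 ms base delay from the Citrix test generation preferences. A worked sketch of the mapping described above:

```python
def synchronization_timeout(level, base_ms=6000):
    # Conditional uses the base delay, mandatory uses three times
    # the base delay, and optional is fixed at 2 seconds.
    return {"conditional": base_ms,
            "mandatory": 3 * base_ms,
            "optional": 2000}[level]

synchronization_timeout("mandatory")  # 18000 ms with the 6000 ms default
```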


Response Time

Stop response time for
Select this option to use the current window event to stop a response time measurement. When you select this option on a window event that is not already linked to a response time, a new response time is created with a default name. If there are response times that do not have a stop action, then these are also listed. Select the response time to link to.
Go to response time definition
Click here to navigate to the session element to view the Response Time Definitions table.


6.3.5. Citrix key action details

Citrix key action fields apply to the selected key action element.

Type
This is the type of key action.

  • Key Down: The key is pressed.
  • Key Up: The key is released.
  • Key Stroke: The key is pressed and released.
Key Code
This is the code of the key as interpreted by the Windows operating system.
Character
This field displays the actual key combination that is interpreted.
Modifiers
These options allow you to specify the standard keyboard modifiers: Control, Shift, Alt, or Extended.


Think Time

Enable Think Time
Select this option to specify a think time for the current user input action.
Think Time
Specifies the programmatically calculated time delay that is observed for each virtual user when this test is run with multiple virtual users. Think time is a statistical emulation of the amount of time actual users spend reading or thinking about an input before performing the action.


Character edition

Enter a character
This area allows you to enter any key combination to produce Unicode characters that are not normally available through single keystrokes. Select the input field and enter the character on your keyboard. The Key Code and Character fields display the corresponding character.

Note: The workbench uses some key combinations as keyboard shortcuts. Such combinations can be intercepted and cause undesirable actions instead of displaying a particular character in the Character field.


Response Time

Start response time for
Select this option to use the current input action to trigger the start of a response time measurement. When you select this option on an input action that is not already linked to a response time, a new response time is created with a default name. If there are response times that do not have a start action, then these are also listed. Select the response time to link to.
Go to response time definition
Click here to navigate to the session element to view the Response Time Definitions table.


6.3.6. Citrix mouse action details

In the test editor, mouse action elements are located in window elements and describe mouse input. These settings apply to the selected mouse action element.

Type of Event
This is the type of mouse action:

  • Mouse Down: The mouse button is pressed.
  • Mouse Up: The mouse button is released.
  • Mouse Click: The mouse button is pressed and released.
  • Mouse Double Click: The mouse button is clicked twice.
  • Mouse Move: The mouse is moved to a new location.
X Position and Y Position
These are the coordinates of the mouse action. In the case of a mouse move action, these are the coordinates at the end of the movement.
Buttons
These are the buttons that are activated, if any.


Think Time

Enable Think Time
Select this option to specify a think time for the current user input action.
Think Time
Specifies the programmatically calculated time delay that is observed for each virtual user when this test is run with multiple virtual users. Think time is a statistical emulation of the amount of time actual users spend reading or thinking about an input before performing the action.


Response Time

Start response time for
Select this option to use the current input action to trigger the start of a response time measurement. When you select this option on an input action that is not already linked to a response time, a new response time is created with a default name. If there are response times that do not have a start action, then these are also listed. Select the response time to link to.
Go to response time definition
Click here to navigate to the session element to view the Response Time Definitions table.


6.3.7. Citrix text input details

In the test editor, text input action elements are located under window events and describe a series of key strokes. These settings apply to the selected text input element.

Value
Specify a string or portion of text that can be entered during the test. You can use references or datapool variables.


Think Time

Enable Think Time
Select this option to specify a think time for the current user input action.
Think Time
Specifies the programmatically calculated time delay that is observed for each virtual user when this test is run with multiple virtual users. Think time is a statistical emulation of the amount of time actual users spend reading or thinking about an input before performing the action.


Response Time

Start response time for
Select this option to use the current input action to trigger the start of a response time measurement. When you select this option on an input action that is not already linked to a response time, a new response time is created with a default name. If there are response times that do not have a start action, then these are also listed. Select the response time to link to.
Go to response time definition
Click here to navigate to the session element to view the Response Time Definitions table.


6.3.8. Citrix mouse sequence details

In the test editor, Citrix mouse sequence elements are located under window elements and describe a series of mouse movements. These settings apply to the selected mouse sequence element.

Display mouse sequences for
This option specifies how you want to display previous, current, or all mouse sequences in the current mouse sequence:
Current sequence
Only the current mouse sequence is displayed in the test editor. This option is selected by default.
Previous and current sequences
The current mouse sequence is displayed with any previous mouse sequences.
All sequences
All mouse sequences are displayed simultaneously.
Fit screen to visible area
Adjust the display of the mouse sequence to fit the available area in the test editor. If this option is disabled, the screen capture is displayed at its actual size, which might require scrolling. This option is enabled by default.
Screen capture area
This area represents the mouse movements on the screen. If a screen capture was recorded, it is displayed in the background. Mouse sequences are displayed as specified.


6.3.9. Citrix screen capture details

In the test editor, screen captures display a graphical overview of the state of the application at a given moment in the test, providing you with a point of reference for navigating through the test.


Session Attributes

Screen captures are obtained by clicking the Capture screen button in the Citrix Recorder Control window during recording.

Locations
These are the screen coordinates and the size of the captured screen area.


Screen Capture Preview

This section displays a view of the screen or screen area that was captured during the recording.

Fit screen to visible area
Select this option to resize the screen capture to the available space in the test editor.


6.3.10. Citrix image synchronization details

In the test editor, Citrix image synchronization elements allow Citrix performance tests to keep track of the contents of a screen area during replay. These settings apply to the selected image synchronization element.


Image synchronization attributes

Locations
These are the coordinates of the top left corner of the image synchronization area, and the size of the image synchronization area in pixels. This field is read-only.
Synchronization state
This describes the behavior of the test if a synchronization timeout occurs on the image. The base timeout delay is specified in the Citrix test generation preferences; however, the actual delay varies with the level of synchronization.
Conditional
The conditional timeout delay is the base timeout delay as specified in the Citrix test generation preferences. If the synchronization fails, the test tries to resume execution and a timeout is logged in the Citrix performance report and the test log.
Mandatory
The mandatory timeout delay is three times the base timeout delay. If the synchronization fails, the test exits with an error status and a timeout is logged in the test log.
Optional
The optional timeout delay is fixed at 2 seconds. If the synchronization fails, the test ignores the timeout.
Image synchronization preview
This is the screen capture of the image synchronization area as it was recorded. Select Fit screen to visible area to limit the size of the screen capture in the test editor.
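The three synchronization levels map to simple timeout arithmetic, which can be sketched as follows. The function name is hypothetical; the rules are the ones stated above (conditional = base delay, mandatory = three times the base delay, optional = fixed 2 seconds).

```python
def synchronization_timeout(level, base_delay_s):
    """Compute the effective timeout delay (seconds) for an image
    synchronization, per the rules described above."""
    if level == "conditional":
        # Conditional: the base timeout delay from the preferences.
        return base_delay_s
    if level == "mandatory":
        # Mandatory: three times the base timeout delay.
        return 3 * base_delay_s
    if level == "optional":
        # Optional: fixed at 2 seconds, timeouts are ignored.
        return 2
    raise ValueError(f"unknown synchronization level: {level}")
```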


Synchronization

Bitmap hash code
This specifies that the synchronization will be evaluated on the bitmap hash code. A hash code is a unique number that is calculated from the image of the selected area. When an image synchronization is encountered during test execution, the test calculates the hash code on the selected area and synchronizes the test if the hash code of the screen area matches the expected hash code before a timeout occurs.
Optical character recognition
This specifies that the synchronization will be evaluated on a recognized text value. Optical character recognition extracts a text string from the selected image area. When an image synchronization is encountered during test execution, the test continually applies text recognition to the selected area and synchronizes the test as soon as the extracted text value matches the expected text value before a timeout occurs.
Value
This page specifies the expected value depending on the specified recognition technique. You can add alternate values by clicking Add so that the image synchronization can succeed in multiple conditions. Alternate values are evaluated in the same way as the main expected value.
Bitmap hash code
When Bitmap hash code is selected, this is the hash code that was calculated on the selected image area during the recording. After executing a test, you can create alternate hash code values by copying the resulting hash codes from the Citrix image synchronization view.
Expected text

When Optical character recognition is selected, this is the expected text value that was extracted by the optical character recognition from the selected image area. Click Extract text to extract a text string from the selected image area.

If the text extraction is unsuccessful, try changing the text recognition settings on the Options page. However, high accuracy of the recognized text is not essential. For the test to synchronize, it is only important that the recognized text is consistent each time the test is executed.

Use regular expression
Select this option to express the expected text string using standard regular expression syntax.
Options
This page specifies the settings for text extraction by optical character recognition. You might need to experiment with various settings to obtain good results. After changing a setting, click the Value tab and click Extract text to see whether the text recognition has improved. Note that because optical character recognition is used for verification purposes, consistency of the results is more important than the accuracy of the extracted text.
Zoom factor
This is the enlargement factor that is applied to the image. The default setting is medium for standard font sizes. Increase the zoom factor to improve recognition of smaller fonts, or decrease it for larger fonts.
Language
This is the language of the dictionary used by the text recognition synchronization. Select the language of the application you are testing. If the language of your application is not available in the list, change the language setting in the Default OCR settings of the Citrix Test Generation preferences.
Brightness
This is the brightness level from 0 to 250 that is applied to the image. The default setting is 70 for normally contrasted text. Increase the brightness setting to improve recognition of darker images, or decrease it for lighter images.
Recognition rate
This is the rate of recognition required for the extracted string to match the expected text. Decrease the recognition rate to tolerate a proportion of mismatching characters in the recognized text. The default is 100%, which means that an exact match is required.
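A minimal sketch of how expected values, alternate values, regular expressions, and a recognition rate could interact is shown below. The positional character comparison used to compute the match rate is an assumption for illustration, not the product's actual OCR matching algorithm, and the function name is hypothetical.

```python
import re

def text_matches(recognized, expected_values, use_regex=False, rate=100):
    """Return True if the recognized text matches any expected value.

    With use_regex, expected values are treated as regular expressions.
    Otherwise a character-level match percentage is computed by simple
    positional comparison (an illustrative stand-in) and compared
    against the required recognition rate; 100 means an exact match.
    Alternate values are evaluated the same way as the main value.
    """
    for expected in expected_values:
        if use_regex:
            if re.fullmatch(expected, recognized):
                return True
        else:
            if not expected:
                continue
            same = sum(a == b for a, b in zip(recognized, expected))
            pct = 100 * same / max(len(expected), len(recognized))
            if pct >= rate:
                return True
    return False
```

Lowering the rate tolerates OCR noise: "Fi1e Edit" matches "File Edit" at an 85% rate but not at the default 100%.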


Verification Point

Enable verification point on synchronized element
When selected, the test verifies whether the image synchronization succeeds. If the synchronization produces a timeout, the verification point returns a fail status in the Citrix performance test report.


Response Time

Stop response time for
Select this option to use the current image synchronization to stop a response time measurement. When you select this option on an image synchronization that is not already linked to a response time, a new response time is created with a default name. If there are response times that do not have a stop action, then these are also listed. Select the response time to link to.
Go to response time definition
Click here to navigate to the session element to view the Response Time Definitions table.


6.3.11. Citrix logoff details

In the test editor, the logoff element is located at the end of the Citrix test. The logoff element is created only when the recording is stopped by clicking Stop Recording in the Citrix Recorder Control window. Other methods of ending a recording, such as closing the Citrix XenApp client or closing the Windows session, do not create a session logoff element in the generated test.


Session Logoff Attributes

Session Title
This is the name of the current session. By default, it is the same as the name of the test.
Type of Event
Select whether the logoff element performs a Logoff or a Disconnect event.


Think Time

Enable Think Time
Select this option to specify a think time for the current user input action.
Think Time
Specifies the programmatically calculated time delay that is observed for each virtual user when this test is run with multiple virtual users. Think time is a statistical emulation of the amount of time actual users spend reading or thinking about an input before performing the action.


6.4.1. Service test details

In the test editor, the test element is the first element in the test suite. The settings in the test element apply to the entire test.


Common options

Datapools
This lists details about each datapool that the test uses: the name of the datapool, the columns that are used, and the location in the test where the datapool column is referenced. Click the location to navigate there.
Add datapool
This adds a reference to a datapool that you want a test to use. Clicking this option is the same as clicking Add > Datapool with the test selected.
Remove
This removes the selected datapool. This option is not available if the datapool is in use.


SSL configuration

Define an SSL configuration for certificate authentication between the client and the server. SSL configurations can be used by any message request in the test. If you use multiple SSL configurations in the test, you must specify the configuration in each message request.

The default SSL configuration always trusts servers, which is equivalent to no authentication.

SSL configuration
Select an existing SSL configuration or create one. You can use the toolbar push buttons to create a New SSL configuration and to Rename or Delete existing SSL configurations. You can also Copy and Paste SSL configurations to and from the SSL editor and the test editor.
Server Authentication
This section describes how the client trusts the server.
Always trust server
Select this option if no authentication is required or to ignore server certificates so that all servers are trusted. If you are using single authentication and you want to accept trusted servers only, then disable this option and specify a truststore containing the trusted server certificates.
Client truststore
When you are using single authentication, the client truststore contains the certificates of all trusted servers. Click Browse to specify a KS, JKS, or JCEKS file containing valid certificates of the trusted servers.
Password
If the client truststore file is encrypted, type the password required to access the file.
Mutual Authentication
This section describes how the server trusts the client in addition to server authentication.
Use client-side certificate
If you are using double authentication, select this option to specify a keystore containing the client certificate. This certificate allows the server to authenticate the client.
Client certificate keystore
Click Browse to specify a KS, JKS, or JCEKS file containing a valid certificate that authenticates the client.
Password
If the client certificate keystore file is encrypted, type the password required to access the file.
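Conceptually, the server and mutual authentication options correspond to standard TLS client settings. The sketch below mirrors them with Python's standard ssl module; note that Python uses PEM files rather than the KS, JKS, or JCEKS keystores described above, so this is an analogy to clarify the options, not the product's mechanism.

```python
import ssl

def make_client_context(trust_all=False, ca_file=None,
                        cert_file=None, key_file=None):
    """Build a client-side TLS context mirroring the options above.

    trust_all  ~ "Always trust server" (no server authentication)
    ca_file    ~ the client truststore (single authentication)
    cert_file/key_file ~ the client-side certificate keystore
                         (mutual/double authentication)
    """
    ctx = ssl.create_default_context(cafile=ca_file)
    if trust_all:
        # Equivalent to no authentication: accept any server.
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
    if cert_file:
        # Mutual authentication: present a certificate so the
        # server can authenticate the client.
        ctx.load_cert_chain(cert_file, key_file)
    return ctx
```

With the defaults, the context requires a valid server certificate, which corresponds to disabling Always trust server and supplying a truststore.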


Protocol Configuration (HTTP)

The HTTP configuration page of the test element specifies the information that your server libraries require to execute the HTTP send and receive functions.

An HTTP configuration can be used by any message call in the test. If you are using multiple protocol configurations in the test, you must specify the configuration for each message call.

Use HTTP Keep Alive
Select this option to keep the HTTP connection open after the request. This option is not available if you are using IBM Rational AppScan.
Use SSL
Select this option to use an SSL configuration. Click Configure SSL to create an SSL configuration or select an existing configuration.
Platform Authentication
In this section, specify the type of authentication that is required to access the service. Select None if no authentication is required.
Basic HTTP authentication
Select this option to specify the User Name and Password that are used for basic authentication.
NTLM authentication
Select this option to use the Microsoft NT LAN Manager (NTLM) authentication protocol. NTLM uses challenge-response authentication. This view lists what is negotiated (supported by the client and requested of the server) and what is authenticated (the client reply to the challenge from the server).
Kerberos authentication
Select this option to use the Kerberos authentication protocol between the client and server.
Connect through proxy server
If the HTTP connection needs to go through a proxy server or a corporate firewall, specify the Address and Port of the proxy server. If the proxy requires authentication, select either Basic proxy authentication or NTLM proxy authentication.
Proxy authentication
In this section, specify the type of authentication that is required to access the proxy. Select None if no authentication is required.
Basic proxy authentication
Select this option to specify the User Name and Password that are used for basic authentication.
NTLM proxy authentication
Select this option to use the Microsoft NT LAN Manager (NTLM) authentication protocol. NTLM uses challenge-response authentication. This view lists what is negotiated (supported by the client and requested of the server) and what is authenticated (the client reply to the challenge from the server).
Custom class
Select this option if the communication protocol requires complex, low-level processing with custom Java code to transform incoming or outgoing messages. Click Browse to select a Java class that uses the corresponding API. This option is not available in IBM Security AppScan.


Protocol Configuration (JMS)

The Java Message Service (JMS) configuration page of the test element specifies the information that your server libraries require to execute the JMS send and receive functions.

A JMS configuration can be used by any message call in the test. If you are using multiple protocol configurations within the test, you must specify the configuration in each message call.

Destination style
This is the style of the JMS destination. Select either Topic or Queue.
End-point address
This is the address of the destination.
Use temporary object
Select this option to send the JMS destination as a temporary object. For a JMS queue, a temporary JMS queue is sent in the message.
Reception point address
If Use temporary object is disabled, specify the JMS address of the destination endpoint.
Basic authentication
Select this option to specify the User Name and Password that are used for basic authentication.
Custom adapter class name
Set up a custom Java Naming and Directory Interface (JNDI) vendor adapter for this configuration. To use a custom adapter, you must write a Java class that extends the Axis class and methods. Specify the name of your custom adapter class in Adapter class name.
Text message
Specify whether the message is a text or a byte message.
Context factory properties
Edit the properties for a context factory. Click Add to add string properties to the context factory configuration.
Connector properties
Edit the properties for a connector. Click Add to add string properties to the connector configuration. The product supports the following connector properties:

  • JMS priority
  • JMS delivery mode
  • JMS time to live


Protocol Configuration (WebSphere MQ)

The WebSphere MQ configuration page of the test element specifies the information that your server libraries require to execute the WebSphere MQ transport send and receive functions.

An MQ configuration can be used by any message call in the test. If you are using multiple protocol configurations in the test, you must specify the configuration for each message call.

Queue Manager
Use this area to specify queue manager options for the service.
Queue manager name
Specify the name of the queue manager to which to send the request.
Use local queue manager
Select this option to use a local queue manager. If you disable this option, specify the following information:
Queue manager address
Specify the IP address or host name of the remote WebSphere MQ server.
Queue manager port
Specify the listener port of the remote WebSphere MQ server.
Client channel
Specify the server-connection mode channel of the remote queue manager.
Queues
Use this area to specify the send queue options for the service.
Send queue name
Specify the name of the queue that the queue manager manages.
Use temporary queue for response
Specifies whether the WebSphere MQ server creates a temporary queue. If selected, the temporary queue is created for the sole purpose of receiving specific messages, and then deleted.
Receive queue name
If Use temporary queue for response is cleared, specify the name of the queue that receives the response. The queue manager that is specified on the Queue manager name line must manage this queue. You can specify multiple queue names by using a semicolon (;) as a separator.
Use RFH2 header
Select whether to use the transport for SOAP over MQ feature that is provided by WebSphere MQ. This feature uses a predetermined MQ message format (RFH2); therefore, when selected, other Message Descriptor options are disabled.
SSL connection
Select this option to use an SSL configuration if a Client Channel setting refers to a secure channel. Click Open SSL Editor to create an SSL configuration or Change to change the SSL configuration that is associated with the current test.

If the WSDL used to create the message request uses a supported JMS URI to point to the WebSphere MQ server, then the SSL configuration is created automatically. If the test generator is unable to create the SSL configuration, you must create a new one manually.

If the WSDL is generated with the WebSphere MQ service (amqwdeployWMService), you must edit the WSDL to change the transport binding from HTTP to JMS to prevent the test generator from producing an HTTP configuration.

Cipher suite
Specify the cipher suite that is used in the channel configuration.
Message Descriptor
Configure the fields of the request. Replace a subset of an MQ message descriptor with a custom format for use with other server types, specifically when using an XML message request. Refer to WebSphere MQ documentation for details about message descriptors.
Use the Message Properties table to specify the following MQ message properties:

  • JMSXDeliveryCount
  • JMSXGroupSeq
  • JMS_IBM_Report_Exception
  • JMS_IBM_Report_Expiration
  • JMS_IBM_Report_COA
  • JMS_IBM_Report_COD
  • JMS_IBM_Report_PAN
  • JMS_IBM_Report_NAN
  • JMS_IBM_Report_Pass_Msg_ID
  • JMS_IBM_Report_Pass_Correl_ID
  • JMS_IBM_Report_Discard_Msg
  • JMS_IBM_MsgType
  • JMS_IBM_Feedback
  • JMS_IBM_PutApplType
  • JMS_IBM_Encoding
  • JMS_IBM_Last_Msg_In_Group
For more information about these properties, refer to the IBM WebSphere MQ documentation.
Target service
When using Microsoft .NET framework with the SOAP over MQ feature of WebSphere MQ, specify the name of the target service for the WSDL.
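The temporary-queue option described above implements the classic request-reply pattern: a throwaway queue is created solely to receive one response, then deleted. The sketch below illustrates the pattern with a toy in-memory queue manager; it is not a WebSphere MQ client, and all names are hypothetical.

```python
import uuid

class QueueManagerSketch:
    """Toy in-memory stand-in for a queue manager, used only to
    illustrate the temporary-reply-queue pattern."""

    def __init__(self):
        self.queues = {}

    def put(self, queue, message):
        self.queues.setdefault(queue, []).append(message)

    def get(self, queue):
        return self.queues.get(queue, []).pop(0)

    def create_temporary_queue(self):
        # Created for the sole purpose of receiving one reply.
        name = f"TMP.{uuid.uuid4().hex[:8]}"
        self.queues[name] = []
        return name

    def delete_queue(self, name):
        self.queues.pop(name, None)


def request_reply(qm, send_queue, payload, server):
    """Send a request carrying a temporary reply-to queue, let the
    server process it, read the reply, then delete the queue."""
    reply_q = qm.create_temporary_queue()
    qm.put(send_queue, {"reply_to": reply_q, "body": payload})
    server(qm, send_queue)
    reply = qm.get(reply_q)
    qm.delete_queue(reply_q)
    return reply
```

After the exchange completes, no temporary queue remains, which is the point of the option: the server never needs a pre-configured receive queue for each client.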


6.4.2. Service call details

Service call elements contain the contents of the call and the transport information for this call. The contents consist of the SOAP envelope. The transport information is the information that is required to send the call and receive an answer, depending on the selected protocol.


Call Settings

Update node name automatically
Automatically rename the request in the Test Contents view.
Do not wait for response
Select this option to skip directly to the next request in the test after the current request is sent.
Operation and WSDL Name
These identify the WSDL name and operation to which the service request is bound.
WSDL Resource
This is the name of the WSDL resource in the workbench. Click the link to edit the WSDL file. If the WSDL file is missing, click the link to bind the request to a WSDL in the workspace or to import a WSDL. You can click the Edit WSDL Security button to edit the security policy for the WSDL or click the WSDL Synchronization button to update an imported WSDL with a remote WSDL.
Time Out (ms)
This is the timeout value in milliseconds. If no response is received after the specified time, an error is produced.
Think Time (ms)
This specifies the programmatically calculated time delay that is observed for each user when this test is run with multiple virtual users. Think time is a statistical emulation of the amount of time actual users spend reading or thinking before performing an action.
Update Response
Click this button to invoke the request with the current settings and to use the response to create a service response element or to update the existing response element.


Message

This page shows the XML content of the request and provides access to data correlation. The same content is presented in three different ways.

Form
This view provides a simplified view of the message that focuses on editing the values of the XML content. Use the Schema menu to enable assistance with editing XML content so that the XML is valid and complies with the XSD specification.

In the Form view, add the XML headers required for standard web service calls. On the Header bar, click Add to create the default XML header structure for WS-Addressing, WS-ReliableMessaging, or WS-Coordination requests, or click More for other standards. You can enable or disable XML header elements and specify the correct values for each XML element. Checks are performed to ensure that the XML content is valid.

Note: To add XML headers to calls in IBM Security AppScan, add a Static XML Headers algorithm on the Request Stack tab of the request.

Tree

This view provides a hierarchical view of the XML structure of the message, including elements, namespaces, and the associated values. You can use Add, Insert, Remove, Up, and Down to edit the XML elements and namespaces in the tree.

Click Filter to hide or show namespace, attribute, or text nodes, depending on your requirements.

Click Allow only valid modifications to enable smart editing, based on a specified XML schema document (XSD). To specify a set of XSD documents for the workbench, in the test navigator, right-click the project and select Properties and Schema Catalog. Disable Allow only valid modifications if you do not have an XSD or if you want to bypass the schema.

You can right-click an XML element to convert it to an XML fragment. This enables you to perform data correlation (use datapools and create references) on the entire XML fragment instead of only on the value.

Source
This view displays the source XML content of the message or plain text content.

Important: In the Source view, do not edit the tags that start with SoaTag. If you delete or change these tags, any references and substitutions in the test will be broken. You cannot recreate these tags after you delete them.


Attachments

This page lists the MIME or DIME attachments that are attached to the request. The contents of this view conform to the Multipurpose Internet Mail Extensions (MIME) or Direct Internet Message Encapsulation (DIME) specification. You can use this page to add workbench resources as MIME or DIME attachments and change properties.

The Content ID is the identifier that the request uses to refer to the attachments. The method for using this identifier depends on your server requirements.

MIME or DIME
Select whether the attachment conforms to the Multipurpose Internet Mail Extensions (MIME) or Direct Internet Message Encapsulation (DIME) specification.
Use MTOM transmission mechanism
By default, the request uses SOAP Messages with Attachments (SwA) to handle attachments. Select this option to handle attachments with the SOAP Message Transmission Optimization Mechanism (MTOM).


Transport

This page covers the transport settings used to send the request. The transport protocol settings apply to a transport configuration, which can be HTTP, Java Message Service (JMS), WebSphere MQ, or Microsoft .NET. You can create several configurations for each protocol so that you can easily switch protocols or variants of protocols.

Note: If you are using IBM Security AppScan, only the HTTP transport protocol is available.

HTTP
Select HTTP to use the HTTP transport for the request. At the request level, you can update a URL or SOAP action and the reference to the global configuration of a test.
Protocol configuration
Click Change to specify a predefined transport configuration or to create a configuration. HTTP transport configurations contain proxy and authentication settings that can be reused.
URL
Specify the URL end point of the service request.
Method and Version
Specify the HTTP method and version to be used to invoke the service request.
Headers
Specify the names and values of any custom HTTP headers required by the service. Click Add, Edit or Remove to modify the headers list.
Cookies
Specify the names and values of any cookies required by the service. Click Add, Edit or Remove to modify the cookies list.
JMS

Select JMS to use the Java Message Service transport for the request. This page enables you to add string properties that are attached to the request for a JMS configuration. These are sent as message properties through JMS.

Protocol configuration
Click Change to specify a predefined transport configuration or to create a configuration. JMS transport configurations contain generic end point, reception point, and adapter settings that can be reused.
Properties
Specify the names and values of any string properties required by the request for the current JMS transport configuration. These are sent as message properties through JMS. Click Add, Edit or Remove to modify the properties list.
WebSphere MQ
Select MQ to use the IBM WebSphere MQ transport for the request. This page enables you to specify the SOAP action and override the settings for the WebSphere MQ configuration selected at the test level.
Protocol configuration
Click Change to specify a predefined transport configuration or to create a configuration. WebSphere MQ transport configurations contain generic queue, header, and SSL settings that can be reused.
SOAP Action
Specifies the SOAP action to be used to invoke the WebSphere MQ request.
Override MQ protocol configuration values
Select this option to configure the fields of the WebSphere MQ message. Replace a subset of an MQ message descriptor with a custom format for use with other server types, specifically when using an XML message request.
Customize message header
Select this option to specify custom headers for the transport for the SOAP over MQ feature that is provided by WebSphere MQ. This feature uses a predetermined MQ message format (RFH2), therefore, when selected, other Message Descriptor options are disabled.
Message descriptor
These settings replace the message descriptor and header settings of the MQ protocol configuration. Refer to WebSphere MQ documentation for information about message descriptors.
Microsoft .NET
Select Microsoft .NET to use the Microsoft .NET Framework transport for requests based on Windows Communication Foundation (WCF). This page enables you to override the settings for the Microsoft .NET configuration selected at the test level.
Item
Click Add to specify the name and value of the WCF actions required by the service. This table is automatically generated when you import a Microsoft .NET WSDL file. Refer to the Microsoft .NET WCF documentation for more information.


6.4.3. XML call details

XML call elements contain the contents of the call and the transport information for this call. The contents consist of plain XML that is transmitted over an HTTP or JMS transport. The transport information is the information that is required to send the call and receive an answer, depending on the selected protocol.

Update node name automatically
When enabled, this option updates the name of the XML call element in the test contents.
One way
This option specifies that no response from the server is expected after the call. This disables the Update Return button.
Time Out (ms)
This is the timeout value in milliseconds. If no response is received after the specified time, an error is produced.
Think Time (ms)
This specifies the programmatically calculated time delay that is observed for each user when this test is run with multiple virtual users. Think time is a statistical emulation of the amount of time actual users spend reading or thinking before performing an action.
Update Return
This opens the Return Preview window. From this window, you can invoke the call from the workbench to create or update the message return that is associated with the call.


Message

These pages present the XML contents of the call and provide access to data correlation in three different forms.

Form
This view provides a simplified view of the message that focuses on editing the values of the XML content. Use the Schema menu to enable assistance with editing XML content so that the XML is valid and complies with the XSD specification.

In the Form view, add the XML headers required for standard web service calls. On the Header bar, click Add to create the default XML header structure for WS-Addressing, WS-ReliableMessaging, or WS-Coordination requests, or click More for other standards. You can enable or disable XML header elements and specify the correct values for each XML element. Checks are performed to ensure that the XML content is valid.

Note: To add XML headers to calls in IBM Security AppScan, add a Static XML Headers algorithm on the Request Stack tab of the request.

Tree

This view provides a hierarchical view of the XML structure of the message, including elements, namespaces, and the associated values. You can use Add, Insert, Remove, Up, and Down to edit the XML elements and namespaces in the tree.

Click Filter to hide or show namespace, attribute, or text nodes, depending on your requirements.

Click Allow only valid modifications to enable smart editing, based on a specified XML schema document (XSD). To specify a set of XSD documents for the workbench, in the test navigator, right-click the project and select Properties and Schema Catalog. Disable Allow only valid modifications if you do not have an XSD or if you want to bypass the schema.

You can right-click an XML element to convert it to an XML fragment. This enables you to perform data correlation (use datapools and create references) on the entire XML fragment instead of only on the value.

Source
This view displays the source XML content of the message or plain text content.

Important: In the Source view, do not edit the tags that start with SoaTag. If you delete or change these tags, any references and substitutions in the test will be broken. You cannot recreate these tags after you delete them.


Attachments

This page lists the MIME attachments that are attached to the call. The contents of this view correspond to the specification of Multipurpose Internet Mail Extensions (MIME). You can use this page to add workbench resources as MIME attachments and change properties.

The Content ID is the identifier that the call uses to refer to the attachments. The method for using this identifier depends on your server requirements.


Protocol

This page covers the protocol that is used to send the call. The protocol can be either HTTP or Java Message Service (JMS), on a message-by-message basis.

HTTP
This page enables you to override the HTTP settings that are attached to the call for a local HTTP configuration.
Method

This option enables you to specify the HTTP method of the XML call from the following list of methods:

  • POST
  • GET
  • PUT
  • DELETE
Version

This option enables you to specify either HTTP 1.0 or HTTP 1.1.

URL

This field enables you to specify the URL of the XML call.

Headers

This section enables you to add headers to your call. Headers must be compatible with the specified HTTP method.

The application manages the following headers, which cannot be added:

  • User Agent
  • Host
  • Connection
  • Cache-Control
  • Pragma
  • Content-Type
  • Content-Length
Cookies

This section enables you to manage cookies. You can add, edit and remove cookies, and create references.

JMS

This page enables you to add string properties that are attached to the call for a local JMS configuration. These will be sent as message properties through JMS.

MQ

This page enables you to override settings that are attached to the call for a local WebSphere MQ configuration.

Name
This is the name that is displayed in the message call as a link to this protocol configuration.
Queue manager name
Name of the queue manager to which you want to send the call.
Queue name
Name of the queue that the queue manager manages.
Use local queue manager
Specifies whether the WebSphere MQ server is running on the local computer. If the server is located on a remote computer, clear this option to specify the remote MQ server details.
Queue manager address
Specifies the IP address or hostname of the remote MQ server.
Queue manager port
Specifies the listener port of the remote MQ server.
Client channel
Server-connection mode channel of the remote queue manager.
Use temporary queue
Specifies whether the MQ server creates a temporary queue. If selected, the temporary queue is created for the sole purpose of receiving specific messages, and then deleted.
Queue name
If Use temporary queue is cleared, this option specifies the name of the queue where message returns from the MQ server are received. The queue manager that is specified in Queue manager name must manage this queue.

The calls and message returns are associated by the correlation ID in the MQ message, which means that the report setting of the message is set to MQC.MQRO_COPY_MSG_ID_TO_CORREL_ID. The server must follow this constraint. This supports the transport for SOAP feature provided by WebSphere MQ.

Target service
This option is for using Microsoft .NET with the IBM WebSphere MQ transport for SOAP feature. This specifies the name of the ASPX file within the .NET listener directory.
Use RFH2 Header
Specifies whether the SOAP message uses an RFH2 header, which uses a predetermined MQ message format. When this option is selected, other Message Descriptor options are disabled. Use this option for the WebSphere MQ transport for SOAP feature. If you are using WebSphere Integration Developer (WID) MQ binding, the binding understands messages with or without the RFH2 header.
Message Descriptor
This section enables you to configure the fields of the message call. Replace a subset of an MQ message descriptor with a custom format for use with other server types, specifically when using an XML message call. See WebSphere MQ documentation for details about message descriptors.
Authentication
This section enables you to specify a user name and password for basic authentication on the application server.
SSL connection
Select this option to use an SSL configuration if a Client Channel setting refers to a secure channel. Click Open SSL Editor to create a new SSL configuration, or Change to change the SSL configuration that is associated with the current test.

If the Web Services Description Language (WSDL) that is used to create the message call uses a supported JMS URI to point to the WebSphere MQ server, then the SSL configuration is created automatically. If the test generator was unable to create the SSL configuration, you must create a new one manually.

If the WSDL was generated with the WebSphere MQ service (amqwdeployWMService), edit the WSDL to change the transport binding from HTTP to JMS to prevent the test generator from producing an HTTP configuration.

Cipher suite
Specify the cipher suite that is used in the channel configuration.


Local XML Security

This page allows you to add a custom security algorithm that is implemented in a Java class. Custom algorithms can be applied to the XML contents that are sent to and received from the server.

Add, Insert, Remove, Up, and Down
These buttons allow you to create a stack of security algorithms. Each algorithm is applied to the stack sequentially. Click Add to add a custom security algorithm.
Tools
This button allows you to change the way the algorithm stack is displayed.
Custom Security Algorithm

After you add a custom security algorithm to the stack, use this window to specify the Java class that implements the algorithm. The Java class uses the following interface:

/**
* ***************************************************************
* IBM Confidential
* 
* (c) Copyright IBM Corporation. 2008. All Rights Reserved.
* 
* The source code for this program is not published or otherwise
* divested of its trade secrets, irrespective of what has been
* deposited with the U.S. Copyright Office.
* *************************************************************** 
* 
*/
 
package com.ibm.rational.test.lt.models.wscore.datamodel.security.xmlsec;
 
import java.util.Properties;
import org.w3c.dom.Document;
 
 
public interface ICustomSecurityAlgorithm {
	
	/**
	 * The following methods can be used in both cases:
	 * execution in the workbench and execution of the test.
	 */
	
	/**
	 * Called to process the Document that is sent over a transport.
	 * @param subject
	 */
	void process(Document subject);
	
	/**
	 * Called to unprocess a Document that is received from a server.
	 * @param subject
	 */
	void unProcess(Document subject);
	
	/**
	 * Properties defined in the UI of the CustomSecurityAlgorithm.
	 * @param map
	 */
	void setProperties(Properties map);
	
	/**
	 * The following method can only be used to cast to the test service
	 * interface, or to access the previous XML information, when the jar
	 * containing the custom security algorithm is deployed in the
	 * performance test project. In this case you cannot use the algorithm
	 * directly from the workbench.
	 */
	
	/**
	 * The executionObject corresponds to the ITestExecutionService object.
	 * This applies only to an algorithm that must link to the execution of
	 * the test. If you plan to use this object, you must deploy the jar
	 * containing the implementation into your performance test project and
	 * not directly into the JRE.
	 *
	 * If you need the previous XML document received from the execution,
	 * you can obtain the value using:
	 * IDataArea area = ((ITestExecutionService)executionObject).findDataArea(IDataArea.VIRTUALUSER);
	 * String previousXML = (String) area.get("PREVIOUS_XML"); //$NON-NLS-1$
	 */
	void setExecutionContext(Object executionObject);
}
  • The process method modifies the XML before it is sent to the server.
  • The unProcess method modifies the XML after it is received from the server.
  • The setProperties method retrieves any properties that are defined in the security editor for this custom security interface.
  • The setExecutionContext method is called during the test with the ITestExecutionServices object that corresponds to the message that uses this custom security interface.
Custom Security Algorithm Class Name
This specifies the class that implements the security algorithm. Click Browse Class to select a class from the workspace.
Algorithm Name
This specifies a name for the current algorithm.
Properties
This list specifies properties that the setProperties method uses in the algorithm. Use Add, Remove, or Edit to create the properties list.
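To make the lifecycle concrete, the following minimal sketch shows DOM transformations of the kind a custom security algorithm might perform. The class name, attribute name, and property key are hypothetical; in a real algorithm these methods would belong to a class that implements the ICustomSecurityAlgorithm interface shown above.

```java
import java.util.Properties;

import org.w3c.dom.Document;
import org.w3c.dom.Element;

// Hypothetical sketch: stamps outgoing XML and strips the stamp from
// incoming XML. A real algorithm would implement ICustomSecurityAlgorithm
// from com.ibm.rational.test.lt.models.wscore.datamodel.security.xmlsec.
public class StampAlgorithm {
    private Properties properties = new Properties();

    // Receives the properties defined in the security editor.
    public void setProperties(Properties map) {
        this.properties = map;
    }

    // Applied to the XML document before it is sent to the server.
    public void process(Document subject) {
        Element root = subject.getDocumentElement();
        root.setAttribute("x-custom-stamp", properties.getProperty("stamp", "none"));
    }

    // Applied to the XML document received from the server.
    public void unProcess(Document subject) {
        subject.getDocumentElement().removeAttribute("x-custom-stamp");
    }
}
```

Properties added in the Properties list of the editor would arrive through setProperties before process or unProcess is called.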


6.4.4. Binary call details

Binary calls are specialized service calls that can be used to send binary messages. The transport information refers to the information that is required to send the call and receive an answer, depending on the selected protocol.


Message

Update node name automatically
Automatically rename the request in the Test Contents view.
Do not wait for response
Select this option to skip directly to the next request in the test after the current request is sent.
Time Out (ms)
This is the timeout value in milliseconds. If no response is received after the specified time, an error is produced.
Think Time (ms)
This specifies the programmatically calculated time delay that is observed for each user when this test is run with multiple virtual users. Think time is a statistical emulation of the amount of time actual users spend reading or thinking before performing an action.
Update Response
Click this button to invoke the request with the current settings and to use the response to create a binary response element or to update the existing response element.
Source
This page presents the binary contents of the request and provides access to data correlation. The same contents are presented in Binary and Raw ASCII views.
Attachments
This page lists the MIME or DIME attachments that are attached to the request. The contents of this view conform to the Multipurpose Internet Mail Extensions (MIME) or Direct Internet Message Encapsulation (DIME) specification. You can use this page to add workbench resources as MIME or DIME attachments and change properties.
Transport
This page covers the transport protocol used to send the request. The transport protocol can be HTTP, Java Message Service (JMS), or WebSphere MQ. You can create several configurations for each protocol so that you can easily switch protocols or variants of protocols.

Note: If you are using IBM Security AppScan, only the HTTP transport protocol is available.


Attachments

This page lists the MIME or DIME attachments that are attached to the request. The contents of this view conform to the Multipurpose Internet Mail Extensions (MIME) or Direct Internet Message Encapsulation (DIME) specification. You can use this page to add workbench resources as MIME or DIME attachments and change properties.

The Content ID is the identifier that the request uses to refer to the attachments. The method for using this identifier depends on your server requirements.

MIME or DIME
Select whether the attachment conforms to the Multipurpose Internet Mail Extensions (MIME) or Direct Internet Message Encapsulation (DIME) specification.
Use MTOM transmission mechanism
By default, the request uses SOAP Messages with Attachments (SwA) to handle attachments. Select this option to handle attachments with the SOAP Message Transmission Optimization Mechanism (MTOM).


Transport

This page covers the transport settings used to send the request. The transport protocol settings apply to a transport configuration, which can be either HTTP, Java Message Service (JMS), WebSphere MQ, or Microsoft .NET. You can create several configurations for each protocol so that you can easily switch protocols or variants of protocols.

Note: If you are using IBM Security AppScan, only the HTTP transport protocol is available.

HTTP
Select HTTP to use the HTTP transport for the request. At the request level, you can update a URL or SOAP action and the reference to the global configuration of a test.
Protocol configuration
Click Change to specify a predefined transport configuration or to create a configuration. HTTP transport configurations contain proxy and authentication settings that can be reused.
URL
Specify the URL end point of the service request.
Method and Version
Specify the HTTP method and version to be used to invoke the service request.
Headers
Specify the names and values of any custom HTTP headers required by the service. Click Add, Edit or Remove to modify the headers list.
Cookies
Specify the names and values of any cookies required by the service. Click Add, Edit or Remove to modify the cookies list.
JMS

Select JMS to use the Java Messaging Service transport for the request. This page enables you to add string properties that are attached to the request for a JMS configuration. These will be sent as message properties through JMS.

Protocol configuration
Click Change to specify a predefined transport configuration or to create a configuration. JMS transport configurations contain generic end point, reception point, and adapter settings that can be reused.
Properties
Specify the names and values of any string properties required by the request for the current JMS transport configuration. These are sent as message properties through JMS. Click Add, Edit or Remove to modify the properties list.
WebSphere MQ
Select MQ to use the IBM WebSphere MQ transport for the request. This page enables you to specify the SOAP action and override the settings for the WebSphere MQ configuration selected at the test level.
Protocol configuration
Click Change to specify a predefined transport configuration or to create a configuration. WebSphere MQ transport configurations contain generic queue, header, and SSL settings that can be reused.
SOAP Action
Specifies the SOAP action to be used to invoke the WebSphere MQ request.
Override MQ protocol configuration values
Select this option to configure the fields of the WebSphere MQ message. Replace a subset of an MQ message descriptor with a custom format for use with other server types, specifically when using an XML message request.
Customize message header
Select this option to specify custom headers for the transport for the SOAP over MQ feature that is provided by WebSphere MQ. This feature uses a predetermined MQ message format (RFH2); therefore, when this option is selected, other Message Descriptor options are disabled.
Message descriptor
These settings replace the message descriptor and header settings of the MQ protocol configuration. Refer to WebSphere MQ documentation for information about message descriptors.
Microsoft .NET
Select Microsoft .NET to use the Microsoft .NET Framework transport for requests based on Windows Communication Foundation (WCF). This page enables you to override the settings for the Microsoft .NET configuration selected at the test level.
Item
Click Add to specify the name and value of the WCF actions required by the service. This table is automatically generated when you import a Microsoft .NET WSDL file. Refer to the Microsoft .NET WCF documentation for more information.


6.4.5. Text call details

Text calls are specialized calls for sending plain text messages. The transport information refers to the information that is required to send the call and receive an answer, depending on the selected protocol.


Message

Update node name automatically
Automatically rename the request in the Test Contents view.
Do not wait for response
Select this option to skip directly to the next request in the test after the current request is sent.
Time Out (ms)
This is the timeout value in milliseconds. If no response is received after the specified time, an error is produced.
Think Time (ms)
This specifies the programmatically calculated time delay that is observed for each user when this test is run with multiple virtual users. Think time is a statistical emulation of the amount of time actual users spend reading or thinking before performing an action.
Update Response
Click this button to invoke the request with the current settings and to use the response to create a service response element or to update the existing response element.
Source
This page presents the plain text contents of the request and provides access to data correlation.
Attachments
This page lists the MIME or DIME attachments that are attached to the request. The contents of this view conform to the Multipurpose Internet Mail Extensions (MIME) or Direct Internet Message Encapsulation (DIME) specification. You can use this page to add workbench resources as MIME or DIME attachments and change properties.
Transport
This page covers the transport protocol used to send the request. The transport protocol can be HTTP, Java Message Service (JMS), WebSphere MQ, or Microsoft .NET. You can create several configurations for each protocol so that you can easily switch protocols or variants of protocols.

Note: If you are using IBM Security AppScan, only the HTTP transport protocol is available.


Attachments

This page lists the MIME or DIME attachments that are attached to the request. The contents of this view conform to the Multipurpose Internet Mail Extensions (MIME) or Direct Internet Message Encapsulation (DIME) specification. You can use this page to add workbench resources as MIME or DIME attachments and change properties.

The Content ID is the identifier that the request uses to refer to the attachments. The method for using this identifier depends on your server requirements.

MIME or DIME
Select whether the attachment conforms to the Multipurpose Internet Mail Extensions (MIME) or Direct Internet Message Encapsulation (DIME) specification.
Use MTOM transmission mechanism
By default, the request uses SOAP Messages with Attachments (SwA) to handle attachments. Select this option to handle attachments with the SOAP Message Transmission Optimization Mechanism (MTOM).


Transport

This page covers the transport settings used to send the request. The transport protocol settings apply to a transport configuration, which can be either HTTP, Java Message Service (JMS), WebSphere MQ, or Microsoft .NET. You can create several configurations for each protocol so that you can easily switch protocols or variants of protocols.

Note: If you are using IBM Security AppScan, only the HTTP transport protocol is available.

HTTP
Select HTTP to use the HTTP transport for the request. At the request level, you can update a URL or SOAP action and the reference to the global configuration of a test.
Protocol configuration
Click Change to specify a predefined transport configuration or to create a configuration. HTTP transport configurations contain proxy and authentication settings that can be reused.
URL
Specify the URL end point of the service request.
Method and Version
Specify the HTTP method and version to be used to invoke the service request.
Headers
Specify the names and values of any custom HTTP headers required by the service. Click Add, Edit or Remove to modify the headers list.
Cookies
Specify the names and values of any cookies required by the service. Click Add, Edit or Remove to modify the cookies list.
JMS

Select JMS to use the Java Messaging Service transport for the request. This page enables you to add string properties that are attached to the request for a JMS configuration. These will be sent as message properties through JMS.

Protocol configuration
Click Change to specify a predefined transport configuration or to create a configuration. JMS transport configurations contain generic end point, reception point, and adapter settings that can be reused.
Properties
Specify the names and values of any string properties required by the request for the current JMS transport configuration. These are sent as message properties through JMS. Click Add, Edit or Remove to modify the properties list.
WebSphere MQ
Select MQ to use the IBM WebSphere MQ transport for the request. This page enables you to specify the SOAP action and override the settings for the WebSphere MQ configuration selected at the test level.
Protocol configuration
Click Change to specify a predefined transport configuration or to create a configuration. WebSphere MQ transport configurations contain generic queue, header, and SSL settings that can be reused.
SOAP Action
Specifies the SOAP action to be used to invoke the WebSphere MQ request.
Override MQ protocol configuration values
Select this option to configure the fields of the WebSphere MQ message. Replace a subset of an MQ message descriptor with a custom format for use with other server types, specifically when using an XML message request.
Customize message header
Select this option to specify custom headers for the transport for the SOAP over MQ feature that is provided by WebSphere MQ. This feature uses a predetermined MQ message format (RFH2); therefore, when this option is selected, other Message Descriptor options are disabled.
Message descriptor
These settings replace the message descriptor and header settings of the MQ protocol configuration. Refer to WebSphere MQ documentation for information about message descriptors.
Microsoft .NET
Select Microsoft .NET to use the Microsoft .NET Framework transport for requests based on Windows Communication Foundation (WCF). This page enables you to override the settings for the Microsoft .NET configuration selected at the test level.
Item
Click Add to specify the name and value of the WCF actions required by the service. This table is automatically generated when you import a Microsoft .NET WSDL file. Refer to the Microsoft .NET WCF documentation for more information.


6.4.6. Service message return details

In the test editor, message return elements are located after every service call. Message returns describe the expected content returned by the service. You can use the information in the message return element for data correlation.

You can automatically generate or update the contents of a message return by clicking Update Return in the call element.


Message

This page shows the XML content of the request and provides access to data correlation. The same content is presented in three different ways.

Form
This view provides a simplified view of the message that focuses on editing the values of the XML content. Use the Schema menu to enable assistance with editing XML content so that the XML is valid and complies with the XSD specification.

In the Form view, add the XML headers required for standard web service calls. On the Header bar, click Add to create the default XML header structure for WS-Addressing, WS-ReliableMessaging, or WS-Coordination requests, or click More for other standards. You can enable or disable XML header elements and specify the correct values for each XML element. Checks are performed to ensure that the XML content is valid.

Note: To add XML headers to calls in IBM Security AppScan, add a Static XML Headers algorithm on the Request Stack tab of the request.

Tree

This view provides a hierarchical view of the XML structure of the message, including elements, namespaces, and the associated values. You can use Add, Insert, Remove, Up, and Down to edit the XML elements and namespaces in the tree.

Click Filter to hide or show namespace, attribute, or text nodes, depending on your requirements.

Click Allow only valid modifications to enable smart editing, based on a specified XML schema document (XSD). To specify a set of XSD documents for the workbench, in the test navigator, right-click the project and select Properties and Schema Catalog. Disable Allow only valid modifications if you do not have an XSD or if you want to bypass the schema.

You can right-click an XML element to convert it to an XML fragment. This enables you to perform data correlation (use datapools and create references) on the entire XML fragment instead of only on the value.

Source
This view displays the source XML content of the message or plain text content.

Important: In the Source view, do not edit the tags that start with SoaTag. If you delete or change these tags, any references and substitutions in the test will be broken. You cannot recreate these tags after you delete them.


Attachments

This page lists the MIME or DIME attachments that are attached to the request. The contents of this view conform to the Multipurpose Internet Mail Extensions (MIME) or Direct Internet Message Encapsulation (DIME) specification. You can use this page to add workbench resources as MIME or DIME attachments and change properties.

The Content ID is the identifier that the request uses to refer to the attachments. The method for using this identifier depends on your server requirements.

MIME or DIME
Select whether the attachment conforms to the Multipurpose Internet Mail Extensions (MIME) or Direct Internet Message Encapsulation (DIME) specification
Use MTOM transmission mechanism
By default, the request uses SOAP Messages with Attachments (SwA) to handle attachments. Select this option to handle attachments with the SOAP Message Transmission Optimization Mechanism (MTOM).


Response Properties

This page lists the names and values of properties of the response.


6.4.7. Service verification point details

Verification points enable you to test the behavior of a service by checking the message return of a call against criteria. You can perform checks on the contents of the XML document of the message return, the number of nodes returned by an XPath query, or the existence of a specific attachment.


Contain and equal verification points

Contain verification points return a Pass status when the message return object contains the specified XML message. Equal verification points return a Pass status when the message return object matches the specified XML message.

The verification occurs only if the message return object is a valid XML message. The verification is performed on both the name of the XML element and the final return value of the element. Attributes are not checked unless you select Test XML attributes.

Use the Form, Tree and Source views to edit the message content.

Test using XML namespace

Perform the verification on a qualified structure, including the XML namespace, instead of the simple name. For example, if the expected XML data is:

<ns1:responseElement xmlns:ns1="http://www.ibm.com/wbse"></ns1:responseElement>

When Namespace aware is enabled, the verification is made on the full name of the return value:

<ns1:responseElement xmlns:ns1="http://www.ibm.com/wbse"></ns1:responseElement>

When Namespace aware is disabled, the verification ignores the namespace tagging and checks only the simple name of the element and the final return value:

<ns1:responseElement xmlns:ns1="http://www.ibm.com/wbse"></ns1:responseElement>
In this case, you can simplify the value of the expected XML data to:
<responseElement></responseElement>
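To illustrate the difference, here is a minimal sketch (not the product's verification code) that contrasts a namespace-aware comparison with a simple-name comparison, using standard Java DOM APIs:

```java
import java.io.StringReader;

import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.xml.sax.InputSource;

// Illustrative sketch: how a namespace-aware check differs from a
// simple-name check on the same root element.
public class NamespaceCheck {
    public static Document parse(String xml) throws Exception {
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        factory.setNamespaceAware(true); // keep namespace information
        return factory.newDocumentBuilder().parse(new InputSource(new StringReader(xml)));
    }

    // Namespace aware enabled: the namespace URI and the local name must both match.
    public static boolean qualifiedMatch(Document doc, String nsUri, String localName) {
        return nsUri.equals(doc.getDocumentElement().getNamespaceURI())
                && localName.equals(doc.getDocumentElement().getLocalName());
    }

    // Namespace aware disabled: only the simple element name is compared.
    public static boolean simpleMatch(Document doc, String localName) {
        return localName.equals(doc.getDocumentElement().getLocalName());
    }
}
```

With the example above, `responseElement` passes a simple-name check regardless of its `ns1` prefix, while a qualified check also requires the `http://www.ibm.com/wbse` namespace.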
Test XML text nodes
Select this option to include XML text values in the verification.
Test XML attributes
Select this option to include XML attributes in the verification.
Form

This view provides a simple view of the elements of the call with their values. Use this view to quickly edit values in the form.

Tree

This view provides a hierarchical view of the elements of the call with their values, attributes, and the associated namespaces. You can use Add, Insert, Remove, Up, and Down to edit this list.

Click the namespace, attribute, or text filter buttons, depending on your requirements.

Click Allow only valid modifications to enable smart editing, based on a specified XML schema document (XSD). To specify a set of XSD documents for the workbench, in the test navigator, right-click the project and select Properties and Schema Catalog. Disable smart editing if you do not have an XSD or if you want to bypass the schema.

You can specify standard Java regular expressions. In the Regexp column, select the line of an attribute or text value and type the regular expression in the Value column. For example, the following regular expression checks for a correctly formatted email address: /^([a-zA-Z0-9_\.\-])+\@(([a-zA-Z0-9\-])+\.)+([a-zA-Z0-9]{2,4})+$/
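As a sketch, the same expression can be exercised with the standard java.util.regex classes; the surrounding /.../ delimiters are dropped and each backslash is doubled inside the Java string literal:

```java
import java.util.regex.Pattern;

// Sketch: checking the email expression from the text with standard
// Java regular expression classes.
public class EmailRegexCheck {
    static final Pattern EMAIL = Pattern.compile(
            "^([a-zA-Z0-9_\\.\\-])+\\@(([a-zA-Z0-9\\-])+\\.)+([a-zA-Z0-9]{2,4})+$");

    public static boolean isValid(String value) {
        return EMAIL.matcher(value).matches();
    }
}
```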

Source
This view displays the source XML document of the call.

Important: The ID tags that are shown in the Source page refer to an internal representation for the test. If you remove these tags, you will remove any existing references and substitutions. You cannot re-create these tags after you delete them.


Query verification points

Query verification points return a Pass status when the number of nodes returned by an XML Path language query matches the expected number of nodes specified in the verification point.

The verification occurs if the message return object is a valid XML document.

XPath expression

Specify a query using the XML Path language. Refer to the XPath specification for details on expressing an XPath query: http://www.w3.org/TR/xpath. Click Build Expression to open the XPath Expression Builder window.

Note: Because XPath expressions require that the qualified name has a prefix, XPath expressions will return null for the default namespace declared with xmlns.

Operator and Expected Count
These specify the expected number of nodes returned by the query.
Evaluate
Click this button to calculate the number of nodes based on the current input. This value automatically replaces the current Expected count.
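The count that a query verification point compares with Expected Count can be sketched with the standard javax.xml.xpath API (illustrative only; the product computes this internally):

```java
import java.io.StringReader;

import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

// Illustrative sketch: count the nodes matched by an XPath expression
// in an XML document, as a query verification point does.
public class XPathCount {
    public static int countNodes(String xml, String expression) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(new StringReader(xml)));
        NodeList nodes = (NodeList) XPathFactory.newInstance().newXPath()
                .evaluate(expression, doc, XPathConstants.NODESET);
        return nodes.getLength();
    }
}
```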


Attachment verification points

Attachment verification points return a Pass status when the message return attachment matches all of the criteria specified in the verification point.

The verification occurs only if the message return object is a valid XML document.

Index of the attachment to be verified

In the case of multiple attachments, this number specifies which attachment to check.

Attachment size

This specifies the expected size of the attachment.

MIME type
This specifies the expected MIME type of the attachment.
Encoding
This specifies the expected encoding of the attachment.


XSD verification points

XSD verification points check that the content returned by the service is validated by the specified XML Schema Definition (XSD) files or Web Services Description Language (WSDL) files that contain XSDs.

The verification occurs only if the message return object is a valid XML document.

Add XSD
Add an XSD to the list of validation checks.
Add WSDL

Add a WSDL containing an XSD to the list of validation checks.

Open

Open a selected XSD or WSDL file.
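The check that an XSD verification point performs can be sketched with the standard javax.xml.validation API (illustrative, not the product's implementation):

```java
import java.io.StringReader;

import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;
import org.xml.sax.SAXException;

// Illustrative sketch: validate XML content against an XML Schema
// Definition, as an XSD verification point does.
public class XsdCheck {
    public static boolean isValid(String xml, String xsd) throws Exception {
        SchemaFactory factory =
                SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        Schema schema = factory.newSchema(new StreamSource(new StringReader(xsd)));
        Validator validator = schema.newValidator();
        try {
            validator.validate(new StreamSource(new StringReader(xml)));
            return true; // content conforms to the schema
        } catch (SAXException e) {
            return false; // validation error reported
        }
    }
}
```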


6.4.8. Service callback details

Callback elements define the web service or XML call containing the element as an asynchronous call. The behavior of the test after invoking the asynchronous call is specified by the parallel, receive, and timeout elements that are contained in the callback element.

Callback endpoint location

This element specifies the XML element in the asynchronous call that defines the endpoint URL of the callback receiver. During a test, this endpoint is used to redirect the callback to the tester instead of the real receiver.

Display full path

Display the extended path of the endpoint XML element.


6.4.9. Service timeout details

Timeout elements describe the behavior of an asynchronous service test when the callback is not received after a specified timeout period. Timeout elements are created inside callback elements.

Timeout

This value specifies the timeout delay after which the test runs the elements that the timeout element contains.

Enable verification point for timeout

With this option, you can enable a verification point on the timeout. If the timeout delay is reached, the verification point reports a fail status in the test log.

The list displays the test elements that the timeout element contains. These are the same contents as displayed in the Test Contents of the test editor.


6.4.10. Service parallel details

Parallel elements describe the behavior of an asynchronous web service test after the asynchronous call has been made and while the tester is waiting for a callback message return. Parallel elements are created inside callback elements.

The list displays the test elements that the parallel element contains. These are the same contents as displayed in the Test Contents area of the test editor.


6.4.11. Service receive details

Service receive elements specify the callback message return from an asynchronous web service. A receive element can contain elements that describe the behavior of the test when the callback message return is received. Receive elements are created inside callback elements.

The contents of a receive element are the same as a typical message return element.


Message

This page shows the XML content of the request and provides access to data correlation. The same content is presented in three different ways.

Form
This view provides a simplified view of the message that focuses on editing the values of the XML content. Use the Schema menu to enable assistance with editing XML content so that the XML is valid and complies with the XSD specification.

In the Form view, add the XML headers required for standard web service calls. On the Header bar, click Add to create the default XML header structure for WS-Addressing, WS-ReliableMessaging, or WS-Coordination requests, or click More for other standards. You can enable or disable XML header elements and specify the correct values for each XML element. Checks are performed to ensure that the XML content is valid.

Note: To add XML headers to calls in IBM Security AppScan, add a Static XML Headers algorithm on the Request Stack tab of the request.

Tree

This view provides a hierarchical view of the XML structure of the message, including elements, namespaces, and the associated values. You can use Add, Insert, Remove, Up, and Down to edit the XML elements and namespaces in the tree.

Click Filter to hide or show namespace, attribute, or text nodes, depending on your requirements.

Click Allow only valid modifications to enable smart editing, based on a specified XML schema document (XSD). To specify a set of XSD documents for the workbench, in the test navigator, right-click the project and select Properties and Schema Catalog. Disable Allow only valid modifications if you do not have an XSD or if you want to bypass the schema.

You can right-click an XML element to convert it to an XML fragment. This enables you to perform data correlation (use datapools and create references) on the entire XML fragment instead of only on the value.

Source
This view displays the source XML content of the message or plain text content.

Important: In the Source view, do not edit the tags that start with SoaTag. If you delete or change these tags, any references and substitutions in the test will be broken. You cannot recreate these tags after you delete them.


Attachments

This page lists the MIME or DIME attachments that are attached to the request. The contents of this view conform to the Multipurpose Internet Mail Extensions (MIME) or Direct Internet Message Encapsulation (DIME) specification. You can use this page to add workbench resources as MIME or DIME attachments and change properties.

The Content ID is the identifier that the request uses to refer to the attachments. The method for using this identifier depends on your server requirements.

MIME or DIME
Select whether the attachment conforms to the Multipurpose Internet Mail Extensions (MIME) or Direct Internet Message Encapsulation (DIME) specification.
Use MTOM transmission mechanism
By default, the request uses SOAP Messages with Attachments (SwA) to handle attachments. Select this option to handle attachments with the SOAP Message Transmission Optimization Mechanism (MTOM).


Response Properties

This page lists the names and values of properties of the response.


6.5. Service stub editor reference

Service stub elements are part of the service stub and can be edited in the stub editor.


6.5.1. Stub operation details

Stub operation elements describe the format of the call that the service stub expects to receive. There is one stub operation for each operation that was detected in the WSDL specification. Each stub operation contains at least one default case element, or several case elements describing the response of the service stub depending on the incoming calls. The information from the stub operation element can be used for data correlation.

This page presents the XML or text contents of the call and provides access to data correlation. The same contents are presented in the Form, Tree, and Source views.

Form
This view provides a simplified view of the message that focuses on editing the values of the XML content. Use the Schema menu to enable assistance with editing XML content so that the XML is valid and complies with the XSD specification.

In the Form view, add the XML headers required for standard web service calls. On the Header bar, click Add to create the default XML header structure for WS-Addressing, WS-ReliableMessaging, or WS-Coordination requests, or click More for other standards. You can enable or disable XML header elements and specify the correct values for each XML element. Checks are performed to ensure that the XML content is valid.

Note: To add XML headers to calls in IBM Security AppScan, add a Static XML Headers algorithm on the Request Stack tab of the request.

Tree

This view provides a hierarchical view of the XML structure of the message, including elements, namespaces, and the associated values. You can use Add, Insert, Remove, Up, and Down to edit the XML elements and namespaces in the tree.

Click Filter to hide or show namespace, attribute, or text nodes, depending on your requirements.

Click Allow only valid modifications to enable smart editing, based on a specified XML schema document (XSD). To specify a set of XSD documents for the workbench, in the test navigator, right-click the project and select Properties and Schema Catalog. Disable Allow only valid modifications if you do not have an XSD or if you want to bypass the schema.

You can right-click an XML element to convert it to an XML fragment. This enables you to perform data correlation (use datapools and create references) on the entire XML fragment instead of only on the value.

Source
This view displays the source XML content of the message or plain text content.

Important: In the Source view, do not edit the tags that start with SoaTag. If you delete or change these tags, any references and substitutions in the test will be broken. You cannot recreate these tags after you delete them.


6.5.2. Stub case details

Stub case elements enable you to specify the response of a service stub according to the content of an incoming call. You can perform checks on the contents of the XML document of the message return, the number of nodes returned by an XPath query, or the existence of a specific attachment. Each case element has an associated response element. There can be multiple case elements in a stub operation, but the Case: Default element is mandatory.


Default case

The default case contains the default response when no other case criteria have been met. When multiple cases are defined, the default case is always the last one to be evaluated.


Contain and equal cases

Contain cases send their response when the incoming call contains the specified XML message. Equal cases send their response when the incoming call matches the specified XML message.

The verification occurs if the message return object is a valid XML message. The verification is performed on both the name of the XML element and final return value of the element. Attributes are not checked.

Use the Form, Tree and Source views to edit the message content.

Test using XML namespace

Perform the verification on a qualified structure, including the XML namespace, instead of the simple name. For example, if the expected XML data is:

<ns1:responseElement xmlns:ns1="http://www.ibm.com/wbse"></ns1:responseElement>

When Test using XML namespace is enabled, the verification is made on the full name of the return value:

<ns1:responseElement xmlns:ns1="http://www.ibm.com/wbse"></ns1:responseElement>

When Test using XML namespace is disabled, the verification ignores the namespace tagging and checks only the simple name of the element and the final return value:

<ns1:responseElement xmlns:ns1="http://www.ibm.com/wbse"></ns1:responseElement>
In this case, you can simplify the value of the expected XML data to:

<responseElement></responseElement>

Test XML text nodes
Select this option to include XML text values in the verification.
Test XML attributes
Select this option to include XML attributes in the verification.
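The effect of namespace-aware verification can be illustrated with the standard Java XML APIs. This is a sketch, not product code; the class name and sample document are illustrative:

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Element;

public class NamespaceDemo {
    // Returns the qualified name and the simple (local) name of the root element.
    public static String[] names(String xml) throws Exception {
        DocumentBuilderFactory f = DocumentBuilderFactory.newInstance();
        f.setNamespaceAware(true); // comparable to enabling Test using XML namespace
        Element root = f.newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")))
                .getDocumentElement();
        return new String[] { root.getNodeName(), root.getLocalName() };
    }

    public static void main(String[] args) throws Exception {
        String xml = "<ns1:responseElement xmlns:ns1=\"http://www.ibm.com/wbse\"/>";
        String[] n = names(xml);
        System.out.println(n[0]); // ns1:responseElement -- full, namespace-qualified name
        System.out.println(n[1]); // responseElement -- simple name only
    }
}
```

When the option is enabled, the comparison uses the first form; when it is disabled, only the simple name and the text value are compared.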
Form

This view provides a simplified view of the elements of the call with their values. Use this view to quickly edit values in the form.

Tree

This view provides a hierarchical view of the elements of the call with their values, attributes, and the associated namespaces. You can use Add, Insert, Remove, Up, and Down to edit this list.

Click the namespace, attribute, or text filter buttons, depending on your requirements.

Click Allow only valid modifications to enable smart editing, based on a specified XML schema document (XSD). To specify a set of XSD documents for the workbench, in the test navigator, right-click the project and select Properties and Schema Catalog. Disable smart editing if you do not have an XSD or if you want to bypass the schema.

You can specify standard Java regular expressions. In the Regexp column, select the line of an attribute or text value and type the regular expression in the Value column. For example, the following regular expression checks for a correctly formatted email address: /^([a-zA-Z0-9_\.\-])+\@(([a-zA-Z0-9\-])+\.)+([a-zA-Z0-9]{2,4})+$/
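For reference, the same expression can be tried outside the product with the java.util.regex package. Note that the surrounding slash delimiters shown above are not part of Java regular-expression syntax, so the sketch below omits them (the class name is illustrative):

```java
import java.util.regex.Pattern;

public class EmailRegexDemo {
    // The expression from the text, without the surrounding / / delimiters.
    static final Pattern EMAIL = Pattern.compile(
        "^([a-zA-Z0-9_\\.\\-])+\\@(([a-zA-Z0-9\\-])+\\.)+([a-zA-Z0-9]{2,4})+$");

    public static boolean matches(String s) {
        return EMAIL.matcher(s).matches();
    }

    public static void main(String[] args) {
        System.out.println(matches("john.doe@example.org")); // true
        System.out.println(matches("not-an-email"));         // false
    }
}
```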

Source
This view displays the source XML document of the call.

Important: The ID tags that are shown in the Source page refer to an internal representation for the test. If you remove these tags, you will remove any existing references and substitutions. You cannot re-create these tags after you delete them.


Query case

Query cases send their response when the number of nodes returned by an XML Path language query matches the expected number of nodes specified in the case element.

The verification occurs if the message return object is a valid XML document.

XPath expression

Specify a query using the XML Path Language (XPath). Refer to the XPath specification for details on expressing an XPath query: http://www.w3.org/TR/xpath. Click Build Expression to open the XPath Expression Builder window.

Note: Because XPath expressions require that the qualified name has a prefix, XPath expressions will return null for the default namespace declared with xmlns.
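The counting behavior, including the default-namespace caveat in the note above, can be reproduced with the standard javax.xml.xpath API (the class name and sample documents are illustrative):

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class XPathCountDemo {
    // Counts the nodes returned by an XPath query against an XML string.
    public static int count(String xml, String expr) throws Exception {
        DocumentBuilderFactory f = DocumentBuilderFactory.newInstance();
        f.setNamespaceAware(true);
        Document doc = f.newDocumentBuilder().parse(new InputSource(new StringReader(xml)));
        NodeList nodes = (NodeList) XPathFactory.newInstance().newXPath()
                .evaluate(expr, doc, XPathConstants.NODESET);
        return nodes.getLength();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(count("<items><item/><item/><item/></items>", "//item")); // 3
        // Elements in a default namespace are not matched by an unprefixed name:
        System.out.println(count("<items xmlns=\"http://example.com/ns\"><item/></items>", "//item")); // 0
    }
}
```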

Operator and Expected Count
These specify the expected number of nodes returned by the query.
Evaluate
Click this button to calculate the number of nodes based on the current input. This value automatically replaces the current Expected count.


Attachment case

Attachment verification points return a Pass status when the message return attachment matches all of the criteria specified in the verification point.

The verification occurs only if the message return object is a valid XML document.

Enable verification point
When selected, the test verifies whether the web service message return objects match the expected criteria of the verification point. An error is reported in the test log if the message return does not match the expected criteria.
Index of the attachment to be verified

In the case of multiple attachments, this number specifies which attachment to check.

Attachment size

This specifies the expected size of the attachment.

MIME type
This specifies the expected MIME type of the attachment.
Encoding
This specifies the expected encoding of the attachment.


6.5.3. Stub response details

In the stub editor, one response element is associated with each case element. Stub responses describe the content that is returned by the stub service, simulating the response of the original service.


Message

This page shows the XML content of the request and provides access to data correlation. The same content is presented in three different ways.

Form
This view provides a simplified view of the message that focuses on editing the values of the XML content. Use the Schema menu to enable assistance with editing XML content so that the XML is valid and complies with the XSD specification.

In the Form view, add the XML headers required for standard web service calls. On the Header bar, click Add to create the default XML header structure for WS-Addressing, WS-ReliableMessaging, or WS-Coordination requests, or click More for other standards. You can enable or disable XML header elements and specify the correct values for each XML element. Checks are performed to ensure that the XML content is valid.

Note: To add XML headers to calls in IBM Security AppScan, add a Static XML Headers algorithm on the Request Stack tab of the request.

Tree

This view provides a hierarchical view of the XML structure of the message, including elements, namespaces, and the associated values. You can use Add, Insert, Remove, Up, and Down to edit the XML elements and namespaces in the tree.

Click Filter to hide or show namespace, attribute, or text nodes, depending on your requirements.

Click Allow only valid modifications to enable smart editing, based on a specified XML schema document (XSD). To specify a set of XSD documents for the workbench, in the test navigator, right-click the project and select Properties and Schema Catalog. Disable Allow only valid modifications if you do not have an XSD or if you want to bypass the schema.

You can right-click an XML element to convert it to an XML fragment. This enables you to perform data correlation (use datapools and create references) on the entire XML fragment instead of only on the value.

Source
This view displays the source XML content of the message or plain text content.

Important: In the Source view, do not edit the tags that start with SoaTag. If you delete or change these tags, any references and substitutions in the test will be broken. You cannot recreate these tags after you delete them.


Attachments

This page lists the MIME or DIME attachments that are attached to the request. The contents of this view conform to the Multipurpose Internet Mail Extensions (MIME) or Direct Internet Message Encapsulation (DIME) specification. You can use this page to add workbench resources as MIME or DIME attachments and change properties.

The Content ID is the identifier that the request uses to refer to the attachments. The method for using this identifier depends on your server requirements.

MIME or DIME
Select whether the attachment conforms to the Multipurpose Internet Mail Extensions (MIME) or Direct Internet Message Encapsulation (DIME) specification.
Use MTOM transmission mechanism
By default, the request uses SOAP Messages with Attachments (SwA) to handle attachments. Select this option to handle attachments with the SOAP Message Transmission Optimization Mechanism (MTOM).


6.6.1. Socket test details

In the test editor, the socket test is the highest level element of a socket test.

Common options
Datapools
Lists details about each datapool that is used by the test: the name of the datapool, the columns that are used, and the location in the test where the datapool column is referenced. Click the location to navigate there.
Add datapool
Adds a reference to a datapool that you want a test to use. Clicking this option is the same as selecting a test, and then clicking Add > Datapool.
Delete
Removes the selected datapool reference from the test. This option is not available if the datapool is in use.
Enable response time breakdown
This option is not supported for socket performance tests.
Client program
Lists details about each client program that was used to record the test.


6.6.2. Socket connection details

In the test editor, socket connection elements describe the connection to a server. A connection must exist before you can send or receive data. These settings apply to all send, receive, and close elements that use the selected connection.

Establish a new connection to the host
Specify whether to create a new connection to the host or to reuse a connection from a different test. Select Reuse an existing connection if you are using multiple split tests in a performance schedule. For example, one test can open a connection, another test can reuse the connection, and a final test can close the connection. Specific name is a label that must match the name specified in the test that opens the connection.
Host and Port
Specify the computer name or IP address, and the port, to which the connection is made.
Timeout
Specify the timeout delay (in seconds) after which the test returns an error if no connection is established.
Symbolic name
Type the name of the connection as it will appear in the test results.
Think Time (ms)
Specify the programmatically calculated time delay that is observed before executing the current test element. Think time is a statistical emulation of the amount of time actual users spend reading or thinking before performing an action.
Connection time
This is the reference time that was measured when the test was recorded.
Client program
This is the name of the recorded client application.


6.6.3. Socket close details

In the test editor, socket close elements are located inside a socket test and represent the closing of a socket connection. These settings apply to the selected socket close element.

Connection
Socket connection that is closed. Click the link to navigate to the socket connection test element.
Think Time (ms)
Specify the programmatically calculated time delay that is observed before executing the current test element. Think time is a statistical emulation of the amount of time actual users spend reading or thinking before performing an action.


6.6.4. Socket-secure upgrade details

In the test editor, the socket-secure upgrade represents the SSL or TLS negotiation that upgrades an existing connection to a secure connection.

Connection
Specify the socket connection that is used to send the data. Click the link to navigate to the socket connection test element.
Recorded negotiation time
Displays the delay that was required to negotiate the secure connection when recording the test. This value is read-only.
Think Time (ms)
Specify the programmatically calculated time delay that is observed before executing the current test element. Think time is a statistical emulation of the amount of time actual users spend reading or thinking before performing an action.
Encryption protocol
Select whether the connection uses the SSL v3 or the TLS v1 protocol.
Recorded cipher suite
Displays the encryption algorithm that was detected when the test was recorded. This value is read-only.
Cipher suite
In most cases, select Auto to have the protocol negotiation occur automatically. If necessary, select a specific encryption algorithm.


6.6.5. Socket send details

In the test editor, socket send elements represent the sending of data to the server.

Connection
Socket connection that is used to send the data. Click the link to navigate to the socket connection test element.
Think Time (ms)
Specify the programmatically calculated time delay that is observed before executing the current test element. Think time is a statistical emulation of the amount of time actual users spend reading or thinking before performing an action.
Manipulate data from custom code
Process the data that is sent with a custom Java class. Type the name of a custom Java class that is located in your workspace or click Generate Code to create a new Java class template. Click View Code to open the class in the Java editor.
Data
Specify the data that is sent through the connection. Bytes are expressed as 7-bit alphanumeric characters or hexadecimal bytes preceded with "\x". Bytes displays the number of bytes sent as data through the connection.
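The byte notation used in the Data area can be modeled with a small decoder. This is a sketch of the documented notation only (7-bit characters, \xNN hexadecimal bytes, \r, \n, and \\); the product's own parser is authoritative, and the class name is illustrative:

```java
import java.io.ByteArrayOutputStream;

public class SocketDataNotation {
    // Decode the documented notation into raw bytes.
    public static byte[] decode(String s) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        for (int i = 0; i < s.length(); i++) {
            char c = s.charAt(i);
            if (c == '\\' && i + 1 < s.length()) {
                char next = s.charAt(++i);
                switch (next) {
                    case 'x': // \xNN: two-digit hexadecimal byte
                        out.write(Integer.parseInt(s.substring(i + 1, i + 3), 16));
                        i += 2;
                        break;
                    case 'r': out.write('\r'); break;  // carriage return
                    case 'n': out.write('\n'); break;  // line feed
                    case '\\': out.write('\\'); break; // literal backslash
                    default: out.write(next);
                }
            } else {
                out.write(c); // plain 7-bit character
            }
        }
        return out.toByteArray();
    }

    public static void main(String[] args) {
        byte[] b = decode("OK\\r\\n\\x00");
        System.out.println(b.length); // 5 bytes: 'O', 'K', CR, LF, 0x00
    }
}
```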


6.6.6. Socket receive details

In the test editor, socket receive elements represent the reception of data from the server.

Connection
Socket connection that is used to receive the data. Click the link to navigate to the socket connection test element.
Think Time (ms)
Specify the programmatically calculated time delay that is observed before executing the current test element. Think time is a statistical emulation of the amount of time actual users spend reading or thinking before performing an action.
Response Timeout
The maximum delay to receive the first byte of the response. If no data is received before the end of the response timeout delay, the receive action produces an error in the test log. The response timeout counter starts when the receive action starts after the think time; the counter is interrupted when the first byte is received.
End Policy
This specifies when to stop receiving data and to move on to the next test element.

  • Detects inactivity: The receive action stops when no bytes are received from the connection after a delay specified in Inactivity threshold (in milliseconds). After this delay, the remote computer has finished sending the response and is considered inactive. This is the default setting.
  • Receives exact number of bytes: The receive action stops when the recorded number of bytes is received. Specify a Timeout (in seconds) after which the receive action produces an error in the test log, if the correct number of bytes is not received. If Link data size is enabled, the receive action expects the number of bytes displayed in the Data area. If Link data size is disabled, the receive action expects the number of bytes displayed in Bytes.
  • Receives until end of stream: The receive action stops when the connection is closed by the remote computer. If Accepts empty response is selected, then the reception of a single byte is not required and the Response Timeout is ignored. Specify a Timeout (in seconds) after which the receive action produces an error in the test log if the connection is not closed.
  • Matches a string: The receive action stops when a specified sequence of bytes is received. Specify a Timeout (in seconds) after which the receive action produces an error in the test log if the specified sequence is not received.
  • Recognizes a regular expression: The receive action stops when a sequence of bytes that matches a regular expression is received. Specify a Timeout (in seconds) after which the receive action produces an error in the test log if no matching sequence is received.

For end policies that have a Timeout setting, this setting specifies a delay (in seconds) after which the receive action produces an error in the test log if the end policy criteria is not met. The timeout counter starts when the first byte is received.

Except when the Receives until end of stream policy is in force, receive actions produce an error in the test log when the connection is closed by the remote computer.

Data
Specify the data that is received through the connection. Bytes are expressed as 7-bit alphanumeric characters or two-digit hexadecimal bytes preceded with \x. Additionally, \r and \n respectively stand for carriage-return and line-feed, while \\ represents the backslash character.
Link data size
When Receives exact number of bytes is selected as the End Policy, if Link data size is enabled, the receive action expects the number of bytes displayed in the Data area. If the option is disabled, the receive action expects the number of bytes displayed in Bytes.
Bytes
When Link data size is disabled, this specifies the number of bytes expected as data through the connection.


6.6.7. Terminal screen details

In the test editor, terminal screen elements are located inside a TN3270 test and represent the display of the terminal screen.

Connection
Socket connection that is used to receive the data. Click the link to navigate to the socket connection test element.
Response Timeout
The maximum delay to receive the first byte of the response. If no data is received before the end of the response timeout delay, the receive action produces an error in the test log. The response timeout counter starts when the receive action starts after the think time; the counter is interrupted when the first byte is received.
Terminal Screen
This area displays the TN3270 terminal screen as it was received. To create a verification point on a portion of text from the display, select the text, right-click the selection, and click Add Verification Point. Alternatively, click the Add Verification Point button to automatically create a verification point from a significant portion of text.


6.6.8. Terminal input details

In the test editor, terminal input elements are located inside a TN3270 test and represent the user input that is sent to the server.

Connection
Socket connection that is used to send the data. Click the link to navigate to the socket connection test element.
Think Time (ms)
Specify the programmatically calculated time delay that is observed before executing the current test element. Think time is a statistical emulation of the amount of time actual users spend reading or thinking before performing an action.
Terminal Screen
Use this area to edit user input on the TN3270 screen as you would on the actual terminal screen. You can use data correlation on fields displayed on the terminal screen.
Input Key
Specifies the key that is pressed to send the current input.


6.6.9. Socket content verification point details

Content verification points enable you to test that the data received from a connection matches the expected data.

Content verification points return a Pass status when the received data matches the criteria specified in the verification point.

Comparison operator
Specify the criteria to use to perform the verification, among the following operators:
Equals
The verification point returns a Pass status if the received data exactly matches the text or binary content that is specified in the Data area.
Contains
The verification point returns a Pass status if the text or binary content that is specified in the Data area occurs at least once in the received data.
Starts with
The verification point returns a Pass status if the text or binary content that is specified in the Data area occurs at the beginning of the received data.
Ends with
The verification point returns a Pass status if the text or binary content that is specified in the Data area occurs at the end of the received data.
Differs from
The verification point returns a Pass status if the received data does not exactly match the text or binary content that is specified in the Data area.
Does not contain
The verification point returns a Pass status if the text or binary content that is specified in the Data area does not occur at least once in the received data.
Does not start with
The verification point returns a Pass status if the text or binary content that is specified in the Data area does not occur at the beginning of the received data.
Data
Specify the data that is expected to be received through the connection.
Binary
In this view, edit the expected content as binary data.
Raw ASCII
In this view, edit the expected content as raw ASCII data. Bytes are expressed as 7-bit alphanumeric characters or two-digit hexadecimal bytes preceded with \x. Additionally, \r and \n stand for carriage-return and line-feed, while \\ represents the backslash character.
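The operator semantics listed above can be summarized in a few lines of Java. String comparisons are shown for readability; the actual verification runs on the received bytes, and the class name is illustrative:

```java
public class ContentCheckDemo {
    // Semantics of the documented comparison operators.
    public static boolean check(String op, String received, String expected) {
        switch (op) {
            case "Equals":              return received.equals(expected);
            case "Contains":            return received.contains(expected);
            case "Starts with":         return received.startsWith(expected);
            case "Ends with":           return received.endsWith(expected);
            case "Differs from":        return !received.equals(expected);
            case "Does not contain":    return !received.contains(expected);
            case "Does not start with": return !received.startsWith(expected);
            default: throw new IllegalArgumentException("Unknown operator: " + op);
        }
    }

    public static void main(String[] args) {
        String data = "HTTP/1.1 200 OK";
        System.out.println(check("Starts with", data, "HTTP/1.1")); // true
        System.out.println(check("Contains", data, "404"));         // false
    }
}
```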


6.6.10. Socket size verification point details

Size verification points enable you to test that the size of the data received from a connection matches an expected number of bytes.

Size verification points return a Pass status when the received data matches the criteria specified in the verification point.

Comparison operator
Specify the criteria that is used to perform the verification with these operators:

  • Is
  • Is less than
  • Is less than or equal to
  • Is more than
  • Is more than or equal to
  • Is not
Value (bytes)
Specify the size criteria for the verification point.


6.6.11. Socket custom verification point details

Custom verification points enable you to perform advanced checks through a user-defined custom Java class.

Custom verification points return a Pass status when the custom class returns a Pass status after performing a verification written in Java code.

Class name
Specify the name of a Java class located in your workspace. The class must use the Rational Performance Tester API. See Extending test execution with custom code for more information.
Generate Code
Click this button to automatically create a Java class using the API template. You can extend this Java class to perform any advanced verification on the received data.
View Code
Click this button to open the class in the Java editor.


6.6.12. Terminal content verification point details

With terminal content verification points, you can test whether the text that is displayed on the terminal screen matches the expected text during a TN3270 test.

Comparison operator
Specify the criteria to use to perform the verification, among the following operators:
Equals at location
The verification point returns a Pass status if the terminal screen displays the expected text string starting at the point that is specified by Line and Column.
Differs at location
The verification point returns a Pass status if the terminal screen does not display the expected text string at the point that is specified by Line and Column.
Contains on line
The verification point returns a Pass status if the terminal screen displays the expected text string anywhere on the line that is specified in Line.
Contains in line range
The verification point returns a Pass status if the terminal screen displays the expected text string anywhere between the lines that are specified by First line and Last line.
Does not contain in line range
The verification point returns a Pass status if the terminal screen does not display the expected text string anywhere between the lines that are specified by First line and Last line.
Matches regular expression at location
The verification point returns a Pass status if the terminal screen displays a text string that matches the specified Java regular expression starting at the point that is specified by Line and Column.
Does not match regular expression at location
The verification point returns a Pass status if the terminal screen displays a text string that does not match the specified Java regular expression starting at the point that is specified by Line and Column.
Text
Expected text string or regular expression.
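The location-based operators above can be illustrated with plain Java regular expressions. The following self-contained sketch models the terminal screen as an array of line strings (the screen content and helper names are invented for illustration) and mimics the Equals at location, Contains on line, and Matches regular expression at location checks:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical model of a TN3270 screen as an array of line strings.
public class TerminalVerification {
    // "Equals at location": expected text starts at (line, column); 1-based as in the editor.
    static boolean equalsAtLocation(String[] screen, int line, int col, String expected) {
        return screen[line - 1].startsWith(expected, col - 1);
    }

    // "Contains on line": expected text appears anywhere on the given line.
    static boolean containsOnLine(String[] screen, int line, String expected) {
        return screen[line - 1].contains(expected);
    }

    // "Matches regular expression at location": the regex matches starting at (line, column).
    static boolean matchesRegexAtLocation(String[] screen, int line, int col, String regex) {
        String row = screen[line - 1];
        Matcher m = Pattern.compile(regex).matcher(row);
        m.region(col - 1, row.length());
        return m.lookingAt(); // anchored at the column, not required to reach end of line
    }

    public static void main(String[] args) {
        String[] screen = {
            "WELCOME TO CICS",
            "USERID: TESTER01",
        };
        System.out.println(equalsAtLocation(screen, 1, 1, "WELCOME") ? "PASS" : "FAIL");
        System.out.println(containsOnLine(screen, 2, "TESTER01") ? "PASS" : "FAIL");
        System.out.println(matchesRegexAtLocation(screen, 2, 9, "TESTER\\d{2}") ? "PASS" : "FAIL");
    }
}
```

Matcher.lookingAt anchors the match at the start of the region (the given column) without requiring it to extend to the end of the line, which mirrors "starting at the point that is specified by Line and Column".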


7. Schedule editor reference

Most schedule editor settings apply either to the entire schedule or to individual user groups.


7.1. Schedule properties

When you open a schedule, you can set its properties.


User Load page

Right-click in the table, and select Add to add a stage. To modify a stage, select the row, and then click Edit or click the user icon in the first column.

Users
Enter the total number of users to be active in the stage (not the number of users to add to or remove from those currently running).
Run for specified period of time
Enter the length of time (and the time units) for the stage to run. When the specified number of users is achieved, the users will run for up to this time. When the time expires, the users continue to run if they are required for the next stage; otherwise, they are stopped gracefully.

Click Show Advanced to set further options to prepare the system under test before the users actually enter the stage:

Change Rate
Enter a number to set a delay between adding or removing each user, rather than adding them or subtracting them all at once. Staggering users avoids overloading the system, which can cause connection timeouts. The User Load Preview shows this delay in black.
Settle Time
A system under test might react to a sudden change in user population. With a defined settle time, which starts when the target number of users is reached, the system under test can settle into a steady state so that it can accurately reflect the user population. The User Load Preview shows this time in black.
Time limit for a user to respond to a stop request
Optionally enter a value. When a virtual user is asked to stop, it completes its current action (such as an HTTP request) and then finishes. If a virtual user has not finished within the specified time limit, the user is forced to finish.
User Load Preview
Previews the user population stages over time. The red line segments indicate that the total number of users has been achieved for the stage.


Think Time page

Use the recorded think time
Select to play back a test at the same rate that it was recorded. This option has no effect on the think time.
Specify a fixed think time
Each user's think time is exactly the same value: the value that you specify. Although this does not emulate users accurately, it is useful if you want to play a test back quickly.
Increase/decrease the think time by a percentage
Type a percentage in Think time scale. Each user's think time is multiplied by that percentage. A value of 100 causes no change in think times. A value of 200 doubles the think times, so the schedule plays back half as fast as it was recorded. A value of 50 reduces the think times by half, so the schedule plays back twice as fast. A value of 0 indicates no delays.
Vary the think time by a random percentage
Each user's think time is randomly generated within the upper and lower bounds of the percentages that you supply. The percentage is based on the recorded think time. For example, if you enter 10 in Lower limit and enter 90 in Upper limit, the think times will be between 10 percent and 90 percent of the original recorded think time. The random time is uniformly distributed within this range.
Maximum think time
Setting a maximum think time is useful with tests that emulate actual think times. By setting a maximum, you do not have to search for and edit each long think time within a test. Numerous factors can generate long think times; for example, you might be interrupted while recording. To restore the original think times, clear this check box.
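The scaling rules above amount to simple arithmetic on the recorded think time. This hedged sketch (the helper names and millisecond values are invented) shows percentage scaling, random variation within bounds, and a maximum cap:

```java
import java.util.Random;

// Sketch of the think-time policies described above (values in milliseconds).
public class ThinkTime {
    // "Increase/decrease the think time by a percentage": recorded * scale / 100.
    static long scaled(long recordedMs, int scalePercent) {
        return recordedMs * scalePercent / 100;
    }

    // "Vary the think time by a random percentage": uniform between lower% and upper% of recorded.
    static long randomized(long recordedMs, int lowerPercent, int upperPercent, Random rng) {
        int pct = lowerPercent + rng.nextInt(upperPercent - lowerPercent + 1);
        return recordedMs * pct / 100;
    }

    // "Maximum think time": cap whatever the chosen policy produced.
    static long capped(long thinkMs, long maxMs) {
        return Math.min(thinkMs, maxMs);
    }

    public static void main(String[] args) {
        long recorded = 4000;                       // a 4-second recorded think time
        System.out.println(scaled(recorded, 200));  // 200: doubled
        System.out.println(scaled(recorded, 50));   // 50: halved
        System.out.println(scaled(recorded, 0));    // 0: no delay
        long varied = randomized(recorded, 10, 90, new Random());
        System.out.println(varied >= 400 && varied <= 3600); // always within 10%..90%
        System.out.println(capped(scaled(recorded, 200), 5000)); // capped at 5 seconds
    }
}
```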


Resource Monitoring page

Enable resource monitoring
Select to activate resource monitoring. The available data sources are captured from these sources:

  • Apache HTTP Server Managed Beans
  • Apache Tomcat Managed Beans
  • IBM Tivoli monitoring agents
  • IBM DB2 snapshot monitors
  • The IBM WebSphere Performance Monitoring Infrastructure
  • JBoss Application Server Managed Beans
  • Java Virtual Machine Managed Beans
  • Oracle Database
  • Oracle WebLogic Server Managed Beans
  • SAP NetWeaver Managed Beans
  • The UNIX rstatd monitor
  • Simple Network Management Protocol (SNMP) agents
  • Windows Performance Monitor
Resource monitoring data can provide a more complete view of a system to aid in problem determination.
Ignore invalid resources when executing the schedule
Select this setting to suppress any error messages that invalid resources cause, such as unreachable hosts or invalid host names. If you select this option, you must view logs to see error messages.


Statistics page

Statistics log level
These options are listed in order of the increasing amount of data that they collect for the test log.
None
Collects minimal statistical data. Use this option to run a schedule quickly for testing purposes.
Schedule Actions
Reports the number of active and completed users in the run.
Primary Test Actions
For HTTP tests, this option reports page-related actions (attempts, hits, and verification points). For SAP tests, this option reports information on SAP screens.
Secondary Test Actions
For HTTP tests, this option reports information that is related to page elements. This option does not apply to SAP tests.
All
Provides statistics for all actions.
Statistics sample interval
Sets the sampling interval for reports. When you run a schedule, the reports show information such as the response time during a specific interval, the frequency of requests being transferred during an interval, and the average response trend during an interval.
Only store All Hosts statistics
Select this option unless you are running a performance test over different WANs, and you are interested in seeing the data from each remote computer.


Variable Initialization

Use this page to initialize variables at the schedule level. When you initialize variables at the schedule level, all the user groups in the schedule use the variable initial values, except those for which a specific value is defined.

Add
Add a variable and initialize a value. The Used by column displays the test name that uses the corresponding variable. A warning icon is displayed for a variable that overrides the value specified at the schedule level or user group level and uses the value defined at the test level with the visibility set to This test only. Hover the cursor over the warning icon to view the tests that override the variable initial values.
Export
Export the variables defined at the schedule level to a file.
Use variable initial values file
Select this check box to use the variable values from a file. Click Browse to select an existing file or click New to create a file.


Performance Requirements page

Enable Performance Requirements
Select to enable the use of performance requirements for this schedule.
Name
Name of this set of performance requirements. This name is used in the Performance Requirements report. By default, the name is Performance Schedule - schedule_name.
Use Defaults
Click to reset Name to the default value.
Performance Requirement
All performance requirements are displayed in the table. Shaded requirements are not defined for this schedule. To define a requirement, set an Operator and Value.
Operator
Click this field to display a list of mathematical operators. Select an operator for this performance requirement.
Value
Click this field to set a value for the requirement.
Standard
Select to mark the requirement as standard. If a standard requirement is not met, the schedule run will have a verdict of fail, and this verdict will roll up to the entire run, like a verification point failure. Clear to make the requirement supplemental. In general, supplemental requirements are those that are tracked internally. A supplemental requirement cannot cause a run to fail, and its results are restricted to one page of the Performance Requirements report.
Hide Undefined Requirements
Select to see only the requirements that you have defined. This hides the shaded rows.
Clear
Select one or more requirements and click to remove the definition. The requirement is still available and can be redefined.
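How an Operator and Value pair combine into a pass or fail verdict can be sketched as follows (the enum, method names, and example thresholds are invented for illustration; the actual operator list in the editor may differ):

```java
// Hedged sketch of evaluating a performance requirement against a measured result.
public class Requirement {
    enum Op { LT, LE, GT, GE, EQ }

    static boolean evaluate(double measured, Op op, double value) {
        switch (op) {
            case LT: return measured < value;
            case LE: return measured <= value;
            case GT: return measured > value;
            case GE: return measured >= value;
            default: return measured == value;
        }
    }

    public static void main(String[] args) {
        double measuredMs = 1850; // hypothetical average response time
        // Standard requirement: average response time <= 2000 ms.
        // If unmet, the verdict rolls up and fails the schedule run.
        System.out.println(evaluate(measuredMs, Op.LE, 2000) ? "pass" : "fail");
        // Supplemental requirement: tracked, but cannot fail the run.
        System.out.println(evaluate(measuredMs, Op.LT, 1000) ? "pass" : "fail (supplemental)");
    }
}
```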


Test Log page

The default setting, to log all errors and warnings and primary test actions, fits most purposes. However, you can log any amount of information, from none at all to all information from all users, although neither extreme is typical.

If you are debugging a test, you might set all three What to Log fields to All or Action Details. These settings produce large test logs, especially if your tests are long or you are running a large number of users. Large test logs, in turn, increase the test log transfer time, and might even cause your computer to run out of disk space.

To reduce transfer times and the likelihood of running out of disk space, sample information from a very small subset of users; smaller even than the default of 5 users per user group. A fixed sampling rate samples the same number of virtual users from each group. A percentage sampling rate samples a percentage of virtual users from each group, but guarantees that at least one user is sampled from a group.
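The two sampling modes described above can be sketched as follows (the method names are illustrative); note how the percentage mode guarantees at least one sampled user per group:

```java
// Sketch of the test-log sampling rules described above.
public class LogSampling {
    // Fixed sampling rate: the same count from each group, bounded by the group size.
    static int fixedSample(int groupSize, int fixedCount) {
        return Math.min(groupSize, fixedCount);
    }

    // Percentage sampling rate: a percentage of each group, but never fewer than one user.
    static int percentSample(int groupSize, double percent) {
        return Math.max(1, (int) Math.round(groupSize * percent / 100.0));
    }

    public static void main(String[] args) {
        System.out.println(fixedSample(100, 5));    // default: 5 users per group
        System.out.println(fixedSample(3, 5));      // small group: everyone is sampled
        System.out.println(percentSample(200, 2));  // 2% of 200 users
        System.out.println(percentSample(10, 2));   // 2% of 10 rounds to 0, raised to 1
    }
}
```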


Response Time Breakdown page

Enable collection of response time data
Select to activate response time breakdown collection. This data shows you the response time breakdown for each page element.
Detail level
Select Low or Medium to limit the amount of collected data.
Only sample information from a subset of users
If you set the detail level to High or Medium, set a sampling rate to prevent the log from getting too large.
Fixed number of users
The number that you select is sampled from each user group. Unless you have specific reasons to collect data from multiple users, select Fixed number of users, and specify one user per user group.
Percentage of users
The percentage that you select is sampled from each user group, but at least one user is sampled from each user group.


Problem Determination page

Problem determination log level
In general, change the problem determination level only when asked to by IBM Software Support. However, under certain conditions, you might want to change the problem determination level. For example, if problems occur when a run reaches a certain number of users, you might increase the level to Config, which is the most detailed level to use without consulting IBM Software Support.
Only sample information from a subset of users
Select this option to set a sampling rate.
Fixed number of users
Specify the number of users to sample from each user group.
Percentage of users
The percentage that you select is sampled from each user group, but at least one user is sampled from each group.


7.2. User group properties

When you open a user group, you can set these properties.

Group size
Specifies either an absolute number of users, or a percentage of users, which you control dynamically.


Locations

Run this group on the local computer
Indicates that the user group should be run on your computer.
Run this group at the following locations
Indicates that the user group should be run on one or more remote computers, at the indicated locations. Typically, you run a user group at a remote location if you are running a large number of virtual users.


Options

Use the Options page to override the think time behavior of your schedule for a specific user group and to specify protocol specific options.

Override think time option
Select this check box to specify a think time behavior for the current user group.
Use the recorded think time
Select to play back a test at the same rate that it was recorded. This option has no effect on the think time.
Specify a fixed think time
Each user's think time is exactly the same value: the value that you specify. Although this does not emulate users accurately, it is useful if you want to play a test back quickly.
Increase/decrease the think time by a percentage
Type a percentage in the Think time scale. Each user's think time is multiplied by that percentage. A value of 100 causes no change in think times. A value of 200 doubles the think times, so the schedule plays back half as fast as it was recorded. A value of 50 reduces the think times by half, so the schedule plays back twice as fast. A value of 0 indicates no delays.
Vary the think time by a random percentage
Each user's think time is randomly generated within the upper and lower bounds of the percentages that you supply. The percentage is based on the recorded think time. For example, if you select a Lower limit of 10 and an Upper limit of 90, the think times will be between 10 percent and 90 percent of the original recorded think time. The random time is uniformly distributed within this range.
Limit think times to a maximum value
Setting a maximum think time is useful with tests that emulate actual think times. By setting a maximum, you do not have to search for and edit each long think time within a test if, for example, you were interrupted during recording. No think time will be greater than the maximum limit that you set, even if you have chosen to vary the think time by a percentage that would exceed this maximum. To restore the original think times, clear this check box.
Protocol-specific options
Click Edit options to set protocol-specific options for all tests in the user group. These settings override the protocol-specific options of the schedule.


Variable Initialization

Use this page to initialize variables at the user group level. When you initialize variables at the user group level, all the tests in the user group use the variables. If the same variable is defined at the schedule level, precedence is given to the variable at the user group level.

Add
Add a variable and initialize a value. The Used by column displays the test name that uses the corresponding variable. A warning icon is displayed for a variable that overrides the value specified at the schedule level or user group level and uses the value defined at the test level with the visibility set to This test only. Hover the cursor over the warning icon to view the tests that override the variable initial values.
Export
Export the variables defined at the user group level to a file.
Use variable initial values file
Select this check box to use the variable values from a file. Click Browse to select an existing file or click New to create a file.


8. Citrix monitoring panel reference

The Citrix monitoring panel is an optional panel that, when enabled, displays detailed information and control commands for each virtual user during the run of a schedule.

Monitoring Panel
This panel displays information about the execution of each virtual user.
Pool Name
Displays the name of the virtual user pool. There is one pool per location and user group.
Active Virtual Users
Displays the number of virtual users currently active. This value is updated continuously during the run of the schedule.
User Action Rate
Displays the number of Citrix user key or mouse actions that were simulated during the last 5-second interval.
Total Elapsed Time
Displays the total time elapsed since the start of the schedule run.
Current Action
Displays the last user action executed in the test.
Timeouts
Displays the number of synchronization timeouts for the virtual user. The color represents the status of the timeout:

  • Green: OK.
  • Yellow: a timeout occurred on a conditional synchronization.
  • Red: a timeout occurred on a mandatory synchronization.
Elapsed Time
Displays the time elapsed since the start of the virtual user run.
Status
Displays the execution status of the virtual user.
Go To
Click to display the Citrix session of the selected virtual user.
Pause or Play
Click to pause or resume the execution of the selected virtual user. You can also pause the execution by setting breakpoints in the test.
Step
When the test is paused, click to execute each user input action in the test, step by step. To pause test execution, you can either click the Pause button or set breakpoints in the test. Click Play to resume the test.
Interact
When the test is paused, click to allow manual actions in the virtual user session. Use this feature if a test fails to synchronize or gets stuck in an unexpected state. To pause test execution, you can either click the Pause button or set breakpoints in the test. Click Play again to resume the test execution at the point where it was paused.
Stop
Click to stop the execution of the selected virtual user. When all virtual users are stopped, the schedule ends.