Automated component testing
The automated component testing features in the Rational Developer products allow you to create, edit, deploy, and run automated tests for Java components, EJBs, and Web services. These features comply with the UML Testing Profile standard and use the JUnit testing framework.
With these features you can perform the following actions:
- Work with industry-standard JUnit test scripts
- Create test projects for Java components, EJBs, and Web services
- Create and edit tests and test data
- Use stub classes to replace dependency classes for Java components, EJBs, and Web services
- Deploy and run your tests
- Analyze test results
- Run regression tests
Working with JUnit test scripts
All tests that you create with the Rational Developer products are extensions of JUnit tests. In addition, you can import, edit, and execute existing JUnit tests. The automated component testing features extend JUnit with the following families of primitives:
- Initialization points (IP): initialize variables or attributes of a component-under-test (CUT).
- Validation actions (VA): verify that a variable or attribute holds an expected value.
- Timing constraints (TC): measure the duration of method calls.
A major difference between validation actions and the standard JUnit assert methods is that a failed validation action does not stop execution of the rest of the JUnit test suite.
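For reference, the following sketch shows the plain JUnit 3.x structure that generated test behavior scripts build on. The Account class is hypothetical, and the comments only mark where the IP and VA primitives would appear; they are not the product's actual primitive API.

```java
import junit.framework.TestCase;

/** Hypothetical component-under-test (CUT). */
class Account {
    private double balance;
    Account(double openingBalance) { this.balance = openingBalance; }
    void deposit(double amount) { balance += amount; }
    double getBalance() { return balance; }
}

/**
 * Plain JUnit 3.x test for the hypothetical Account component. A generated
 * test behavior script has the same overall shape, with initialization
 * points, validation actions, and timing constraints added around such calls.
 */
public class AccountTest extends TestCase {

    private Account account;   // the CUT instance

    protected void setUp() throws Exception {
        // An initialization point (IP) plays this role in a generated script
        account = new Account(50.00);
    }

    public void testDeposit() {
        account.deposit(25.00);
        // A validation action (VA) plays this role in a generated script;
        // unlike a failed assert, a failed VA lets the remaining tests run
        assertEquals(75.00, account.getBalance(), 0.001);
    }
}
```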
More information about JUnit is available on the JUnit web site (www.junit.org).
Creating test projects
To test your components, first create a test project.
The test project is linked to one or several development projects that contain the components you want to test. Development projects can be Java projects, Enterprise Application projects, or Dynamic Web projects. The component or components targeted by each test are known as the component-under-test (CUT).
Test projects contain execution-oriented elements (test suites and runs) and code-oriented elements (test behavior scripts and stubs). You can browse and edit test suites and test runs in the Test Navigator view, while test behavior scripts and stubs are shown in the Package Explorer view.
Creating and editing tests and test data
You create tests with the Component Test wizard. After you create the test, you can then edit the resulting test assets, which include a test behavior script, a test suite, test cases, and a test data table.
- A test behavior script is a JUnit file that defines the test behavior. The script is based on a test pattern that you select while using the wizard. You can view and edit the test behavior script in the Java code view.
- A test suite is an abstract construct containing a set of individual test cases and deployment information. The test cases in the test suite are linked to the CUT. Test suites can be deployed and run independently from each other. The Test perspective provides a Test Suite editor that you can use to edit the contents of a test suite.
- A test case is the expression of a test behavior within a test suite. The test behavior is a formal description of how the test case stimulates the CUT. Each test case is implemented as a JUnit test method in the test behavior script. A test case can, therefore, be seen as a link between the test suite and a particular method in the test behavior script.
- You supply test data to the generated test cases through the test data table view. Because each table maps to a particular test case, a separate test data table is created automatically for each test case. Each row in the table represents an object or expression in your code, and each column represents a data set (equivalence class); a rough illustration follows this list.
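As an illustration of how a test case relates to its data sets, the hand-written, plain-JUnit sketch below exercises one test method against several hard-coded data sets. The Calculator class and the in-code array are hypothetical; in a generated test, the values come from the test data table rather than from code.

```java
import junit.framework.TestCase;

/** Hypothetical component-under-test (CUT). */
class Calculator {
    double add(double a, double b) { return a + b; }
}

/** Hand-written approximation of a data-driven test case. */
public class CalculatorAddTest extends TestCase {

    // Each entry plays the role of one data set (one column of the test
    // data table): the two operands and the expected sum.
    private static final double[][] DATA_SETS = {
        {  1.0, 2.0, 3.0 },   // nominal values
        { -5.0, 5.0, 0.0 },   // boundary: result is zero
        {  0.0, 0.0, 0.0 }    // degenerate case
    };

    public void testAdd() {
        Calculator calculator = new Calculator();
        for (int i = 0; i < DATA_SETS.length; i++) {
            double[] dataSet = DATA_SETS[i];
            assertEquals("data set " + i, dataSet[2],
                         calculator.add(dataSet[0], dataSet[1]), 0.001);
        }
    }
}
```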
Using stubs
In testing, it is often necessary to stub out the components that the CUT interacts with. This lets you test the CUT in isolation, so you can be certain that you are exercising the CUT itself and not its dependencies. Like a test, a stub is defined by behavior and data. Stub behavior is defined in the stub's user code class, which you can view and edit in the Java code view. Stub data is supplied in the stub data table, which defines the output of a stubbed class in response to particular inputs. In the stub data table, you simulate the stubbed class by specifying the input values and corresponding return values for each stubbed method.
Stubs can be generated automatically through analysis of the CUT during test creation.
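For example, a hand-written stub for a hypothetical TaxRateService dependency might look like the sketch below. A generated stub works the same way, except that the canned return values are taken from the stub data table instead of being hard-coded, and the stub behavior can be refined in the user code class.

```java
/** Hypothetical interface of a dependency class used by the CUT. */
interface TaxRateService {
    double getRate(String region);
}

/**
 * Hand-written stub that replaces the real TaxRateService so the CUT can be
 * tested in isolation. It returns canned values for known inputs, which is
 * what the stub data table expresses declaratively.
 */
public class TaxRateServiceStub implements TaxRateService {

    public double getRate(String region) {
        if ("US".equals(region)) {
            return 0.07;    // canned return value for input "US"
        }
        if ("EU".equals(region)) {
            return 0.20;    // canned return value for input "EU"
        }
        return 0.0;         // default return value for any other input
    }
}
```

During a test run, the CUT is wired to the stub instead of the real service, so its behavior is fully predictable.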
Deploying and running tests
Test deployment is the phase in which you specify the conditions under which a test is executed. For EJBs in particular, deployment data includes application server information. Use the Test Suite editor to choose the server configuration on which to deploy the CUT. Server configurations are defined in the Server perspective.
You can specify a launch configuration for any component of a test project (a test suite, a test case, or a single equivalence class), so that the test can be run with or without profiling or debug options. During test execution, a data collector monitors the CUT to retrieve test results.
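Because the test behavior scripts are ordinary JUnit classes, a plain JUnit suite can in principle also group and launch them outside the Test perspective. The sketch below shows the standard JUnit 3.x mechanism, using the hypothetical class names from the earlier sketches.

```java
import junit.framework.Test;
import junit.framework.TestSuite;

/** Aggregates the hypothetical example tests into one plain JUnit suite. */
public class AllComponentTests {

    public static Test suite() {
        TestSuite suite = new TestSuite("Component tests");
        suite.addTestSuite(AccountTest.class);
        suite.addTestSuite(CalculatorAddTest.class);
        return suite;
    }

    public static void main(String[] args) {
        // Run from the command line with the text-based JUnit runner
        junit.textui.TestRunner.run(suite());
    }
}
```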
Analyzing test results
Test execution produces a test run, which can be seen in the Test Navigator view. You can expand a test run to see the individual tests and their verdicts (Pass, Fail, Inconclusive, or Error). From an individual test, you can go to the result details displayed in the Test Data Comparator view to see the actual test results compared to expected results.
Contextual cross-navigation is provided between the test verdicts, test data, test suites, test behavior scripts, and the relevant portions of code in the CUT.
Running regression tests
The main benefit of generating and setting up a persistent test environment is to facilitate reuse of your tests for regression testing. Regular regression testing is a reliable means of ensuring that new defects are not introduced and that existing ones are fixed.
A good practice is to create your tests early in the development project, because writing tests helps you interpret the product specifications. You can then run the same tests periodically during development to detect unexpected errors whenever the code is modified or moved to a new environment.
Related tasks
Creating new component test projects
Running component tests
Viewing component test results