The JUnit++ Testing Tool

JUnit++ is a freely available Java unit test framework that includes a test data repository, command-line arguments, and a TestRunner class that supports a built-in repetition counter and multithreading at the command line.


February 01, 2001
URL: http://www.drdobbs.com/jvm/the-junit-testing-tool/184404480


Java integration, system, load, and stress testing

Siegfried is a technical architect for Software Daten Service. He can be contacted at [email protected].


JUnit is a freely available Java unit test framework created by Erich Gamma and Kent Beck. Currently at Version 3.2, JUnit is extensively used in the Java community because of its elegance of design and ease of use. Over time, JUnit (available at http://members.pingnet.ch/gamma/junit.htm) has been extended to address issues related to integration, system, load, and stress testing. However, these extensions inevitably run into shortcomings that stem from the original design, such as the lack of a test data repository. Without a test data repository, test data ends up hard coded, which increases the maintenance costs of testing database-driven systems. This should not be underestimated, since test code should meet the same quality requirements as application code, and it is common to ship the regression test suite as part of the deliverables to increase customer confidence and supplement the documentation. To address this and other limitations, we've extended the JUnit test framework to provide a test data repository, command-line arguments, and an improved TestRunner class that supports a built-in repetition counter and multithreading on the command line. This improved version, called "JUnit++" (available electronically; see "Resource Center," page 5), has been used for more than a year to test an online banking application — currently at more than 100,000 lines of code — at Software Daten Service (Vienna, Austria).

Using the JUnit Extensions

The main improvement of JUnit++ is the provision of a test data repository accessed by subclassing junit.extensions.ConfigurableTestCase. Listing One illustrates how you can use the test data repository for a simple test. The class ConfigurableTestCase uses java.util.Properties to load the test data. The loading of the property file is triggered in the constructor of ConfigurableTestCase. For a master test suite that only invokes other test suites, this strategy doesn't work because no constructor is invoked. In this case, loading the property file has to be done manually by invoking initTestProperties() in the suite() method; see Listing Two, which shows how to use the test data repository for a master test suite.

How is the property file found when an instance of ConfigurableTestCase is executed? It is assumed that the property file has the same name as the class file but a different extension; that is, the property file for FooTest.class would be named FooTest.ini. The property file is looked up either in the current directory or in the directories listed in the classpath.
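As a rough illustration of that lookup, the resolution might be sketched as follows. This is only a sketch under the assumptions just described; the class and method names below are hypothetical and not part of the actual JUnit++ implementation.

import java.io.File;
import java.util.StringTokenizer;

// Rough sketch of the property-file lookup described above; not JUnit++ code.
class PropertyFileLocator {
    static File findPropertyFile(Class testClass) {
        // FooTest.class -> FooTest.ini
        String className = testClass.getName();
        String fileName =
            className.substring(className.lastIndexOf('.') + 1) + ".ini";

        // 1. Look in the current directory.
        File candidate = new File(fileName);
        if (candidate.exists())
            return candidate;

        // 2. Look in each directory listed in the classpath.
        String classpath = System.getProperty("java.class.path");
        StringTokenizer dirs =
            new StringTokenizer(classpath, File.pathSeparator);
        while (dirs.hasMoreTokens()) {
            candidate = new File(dirs.nextToken(), fileName);
            if (candidate.exists())
                return candidate;
        }
        return null; // not found
    }
}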

In some projects this approach might not work (for instance, when a single property file is used for all test suites); therefore, the system property junit.data can define either a property file or a starting directory for the search; see Example 1(a). How is an entry in the property file defined? To access the test data repository, the class name, the name of the test case, and the property name are concatenated to form candidate keys, which are tried in the following order:

1. Fully qualified class name plus test name plus property name.

2. Class name without package name plus test name plus property name.

3. Class name without package name plus property name.

4. Property name.

This implementation allows test data definitions to be reused and shared across test suites. For example, Example 1(b) shows the name of the server used for testing a client/server application with multiple test suites. In the case of a master test suite, the corresponding property file contains references to the property files of the contained test suites, as in Example 1(c), which are loaded recursively.
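To make the fallback concrete, the candidate keys can be pictured as a simple most-specific-first loop. The following sketch only illustrates that order; the helper name and its use of java.util.Properties are assumptions, not the actual JUnit++ code.

import java.util.Properties;

// Minimal sketch of the candidate-key fallback described above;
// lookupProperty() is a hypothetical helper, not part of the JUnit++ API.
class PropertyLookupSketch {
    static String lookupProperty(Properties props, Class testClass,
                                 String testName, String propertyName) {
        String fqcn = testClass.getName();                         // e.g., "foo.FooTest"
        String simple = fqcn.substring(fqcn.lastIndexOf('.') + 1); // e.g., "FooTest"
        String[] candidates = {
            fqcn + "." + testName + "." + propertyName,   // 1. fully qualified class + test + property
            simple + "." + testName + "." + propertyName, // 2. simple class + test + property
            simple + "." + propertyName,                  // 3. simple class + property
            propertyName                                  // 4. property name alone
        };
        for (int i = 0; i < candidates.length; i++) {
            String value = props.getProperty(candidates[i]);
            if (value != null)
                return value;
        }
        return null; // no entry found under any candidate key
    }
}

For the property "key1" of test method testFoo in class foo.FooTest, the candidate keys would be foo.FooTest.testFoo.key1, FooTest.testFoo.key1, FooTest.key1, and key1; Example 1(b) defines one entry at each of these levels for the four properties used in Listing One.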

What happens if a different type of test data repository already exists (for instance, in a relational database)? In JUnit++, the implementation of the test data repository is determined and instantiated at run time. Hence, it is possible to provide a different implementation of a test data repository by defining the implementation class at the command line, as in Example 2(a). After a successful test execution, you might reuse the test suite to gather performance data, either by simply measuring the execution time or by profiling the application. The profiling gives a good indication of how much time and how many resources are spent in an ORB run-time library or a servlet engine. It is necessary to fine-tune the number of test executions to find a balance between the effect of one-time initializations distorting the result and the time required to generate the profiling data. With junit.extensions.TestRunnerEx, the number of test repetitions can be defined on the command line, as in Example 2(b).
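Returning to Example 2(a): whatever interface JUnit++ actually expects from a repository class named via junit.data.class, the core of a JDBC-backed implementation is simply loading key/value pairs from a table into a java.util.Properties object. The sketch below shows only that part; the table and column names are assumptions for illustration only.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.Properties;

// Hedged sketch of a JDBC-backed test data repository; the table layout
// (TEST_DATA with TEST_KEY/TEST_VALUE columns) is hypothetical.
public class JDBCTestProperties extends Properties {
    public void loadFromDatabase(String url, String user, String password)
            throws SQLException {
        Connection con = DriverManager.getConnection(url, user, password);
        try {
            Statement stmt = con.createStatement();
            ResultSet rs = stmt.executeQuery(
                "SELECT TEST_KEY, TEST_VALUE FROM TEST_DATA");
            while (rs.next()) {
                // Each row becomes one entry of the repository.
                setProperty(rs.getString(1), rs.getString(2));
            }
            rs.close();
            stmt.close();
        } finally {
            con.close();
        }
    }
}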

If a client/server application is profiled on a single computer, that machine quickly becomes overloaded. In this case, a delay between test case invocations can be defined on the command line, as in Example 2(c). The next step might be overloading the server until it crashes due to race conditions and/or resource shortages. This can be accomplished by invoking one or more test suites using multiple threads, repetition counters, and test case invocation delays to simulate more realistic user behavior; see Example 2(d).
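Conceptually, the combination of repetition counter, thread count, and invocation delay amounts to something like the following loop. This is only a sketch of the idea, expressed with plain threads and the standard junit.textui.TestRunner, not the TestRunnerEx implementation.

import junit.framework.TestSuite;

// Sketch only: roughly what a combination like "-X 1000 -T 10 -W 100"
// asks TestRunnerEx to do, using plain threads and the text-mode runner.
public class StressRunSketch {
    public static void main(String[] args) {
        final int threads = 10, repetitions = 1000, delayMillis = 100;
        for (int t = 0; t < threads; t++) {
            new Thread(new Runnable() {
                public void run() {
                    for (int i = 0; i < repetitions; i++) {
                        // Run the whole suite once per iteration.
                        junit.textui.TestRunner.run(new TestSuite(FooTest.class));
                        try {
                            Thread.sleep(delayMillis); // invocation delay
                        } catch (InterruptedException e) {
                            return;
                        }
                    }
                }
            }).start();
        }
    }
}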

Some components require command-line parameters for initialization. JUnit++ provides a uniform way of defining application-specific command-line parameters by passing them as a system property junit.argv. Some development environments are unable to process quoted arguments when invoking the debugger. This is why multiple arguments can be separated by the pipe character, as in Example 2(e).
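On the receiving side, a test can read those arguments back from the junit.argv system property. The following sketch assumes the splitting rules described above (whitespace or the pipe character as separators); the helper is hypothetical and not part of the JUnit++ API.

import java.util.ArrayList;
import java.util.List;
import java.util.StringTokenizer;

// Hedged sketch: reading application arguments passed via -Djunit.argv.
// The exact splitting rules of JUnit++ may differ; this follows the
// description above (whitespace or '|' as separators).
class ArgvSketch {
    static String[] testArgs() {
        String raw = System.getProperty("junit.argv", "");
        List args = new ArrayList();
        StringTokenizer tok = new StringTokenizer(raw, " |");
        while (tok.hasMoreTokens()) {
            args.add(tok.nextToken());
        }
        return (String[]) args.toArray(new String[args.size()]);
    }
}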

Sometimes passing arguments via the command line is clumsy, for example, when initializing a JDBC connection for multiple test suites. In this case, a ConfigurableTestSetup can be used (see Listing Three), where the parameters are stored in a corresponding property file.

If a test suite does a lot of printing, it is useful to get a more verbose output of a test run. This can be accomplished by turning on the verbose mode of TestRunnerEx, as in Example 2(f).

Testing an Online Banking Application

The online banking application is a satellite system of the Global Entity Ordering System (GEOS), a host-based application consisting of 6 million lines of code written in C. It provides multiple front ends (Swing, HTML, WAP) and is implemented in Java utilizing CORBA, servlets, and XML. Figure 1 illustrates the application's architecture.

The GUI framework is based on the JWAM architecture (see "Frameworkbasierte Anwendungsentwicklung," by Guido Gryczan, Carola Lilienthal, Martin Lippert, Stefan Roock, Henning Wolf, and Heinz Züllighoven, OBJEKT Spektrum, January 1999) and generates requests, which are delegated to a Datasource. The Datasource, in turn, delegates the requests to the Transport Layer, which shields the client from the underlying middleware. The requests are transmitted to the CORBA-based Business Server implemented by a CorbaBusinessObject, which maps the requests to various Data Access Objects. The CorbaBusinessObject and SecurityManager are part of an application framework developed in-house. The Data Access Object is ultimately responsible for contacting the host-based GEOS, which uses a relational database. The results are propagated back through the various layers until they are displayed by the front end.

Using a bottom-up test strategy, the architectural layers are tested with the help of JUnit++ in reverse order, starting with the Data Access Objects, the CORBA application framework, and the Datasources.

The regression tests for the Data Access Objects are organized by means of a hierarchy of test suites. Each test suite is derived from ConfigurableTestCase and retrieves the test data from a single property file for each database. By simply defining a different property file on the command line, the test suites can be run using a different database.

The CORBA application framework regression test consists of 11 individual test suites that are invoked through a master test suite. The test suites are not only used for regular regression testing but also for stress testing the ORB and optimizing the performance.

Two Examples

The implementation of a data conversion module copying internal data structures to IDL data types was considered too slow — 10 iterations of the original regression test suite took 32 seconds to execute. By optimizing the code using a Java profiler, we were able to reduce the execution time significantly:

The regression test for the Datasources also acts as an integration and system test because this test exercises multiple layers of the application — the transport layer, middleware, business server, and Data Access Objects. By using JUnit++, this test can be easily turned into a stress test executed over the weekend.

In a second series of tests, we used a test configuration that simulated the load of 35,000 clients within nine hours; as a result, all server processes and infrastructure components failed. A closer inspection revealed the following faults:

With multiple regression tests in place, it is reasonable to combine the test suites into a Daily Build and Smoke Test (see Rapid Development, by Steve McConnell, Microsoft Press, 1996). This guarantees a certain level of quality because the regression tests cover a large portion of the functionality and span multiple layers of the software.

Conclusion

JUnit++ makes it possible to separate test data from test code, which is essential for testing database-driven applications. Omitting this separation results in hard-coded test data, which increases the tests' maintenance costs and decreases the acceptance of regression tests.

The ease of generating an arbitrary load under various conditions encourages developers to use stress tests for components early in development, which, in turn, increases the reliability of the final product. This observation is particularly relevant for distributed systems, where tracing a failure back to a single component can require expert knowledge.

DDJ

Listing One

import junit.extensions.ConfigurableTestCase;

public class FooTest extends ConfigurableTestCase {
    public FooTest ( String name ) {
        super(name);
    }
    public void testFoo() {
        // Each getXxx() call retrieves a value from the test data repository
        // (FooTest.ini by default) using the key lookup described in the text.
        String stringValue   = getString( "key1" );
        Boolean booleanValue = getBoolean( "key2" );
        Integer integerValue = getInteger( "key3" );
        Double doubleValue   = getDouble( "key4" );
    }
}


Listing Two

import junit.extensions.ConfigurableTestCase;
import junit.framework.Test;
import junit.framework.TestSuite;

public class AllTests extends ConfigurableTestCase {
    public AllTests ( String name ) {
        super(name);
    }
    public static Test suite() {
        // Load the property file manually; no constructor is invoked
        // for a master test suite.
        initTestProperties( AllTests.class );

        TestSuite suite = new TestSuite();
        suite.addTest( new TestSuite( FooTest.class ) );
        suite.addTest( new TestSuite( BarTest.class ) );
        return suite;
    }
}


Listing Three

import java.sql.Connection;
import java.sql.DriverManager;
import junit.extensions.ConfigurableTestSetup;

public class JDBCTestSetup extends ConfigurableTestSetup {
    static private Connection connection = null;
    public JDBCTestSetup () {}
    public void setUp() throws Exception {
        // The connection parameters are read from the corresponding property file.
        connection = DriverManager.getConnection(
            getString( "url" ),
            getString( "user" ),
            getString( "password" ) );
    }
    public void tearDown() throws Exception {
        connection.close();
    }
    public Connection getConnection() {
        return connection;
    }
}





(a)
java -Djunit.data=MyFooTest.ini junit.swingui.TestRunner FooTest
java -Djunit.data=./test/data junit.swingui.TestRunner FooTest

(b)
# FooTest.ini
foo.FooTest.testFoo.key1=XYZ
FooTest.testFoo.key2=true
FooTest.key3=9999
key4=3.1415927

(c)
# AllTests.ini
.default.0=FooTest.ini
.default.1=BarTest.ini

Example 1: (a) Specifying a test data repository; (b) content of a test data repository; (c) content of a test data repository for a master suite.



(a)
java -Djunit.data.class=xyz.JDBCTestProperties junit.extensions.TestRunnerEx sds.corba.datasource.AllTests

(b) 
java junit.extensions.TestRunnerEx -X 10 sds.corba.datasource.AllTests

(c)
java junit.extensions.TestRunnerEx -X 10 -W 100 sds.corba.datasource.AllTests

(d)
java junit.extensions.TestRunnerEx -X 1000 -T 10 sds.geos.datasource.AllTests

(e)
java -Djunit.argv="-c foo.ini" junit.textui.TestRunner FooTest
java -Djunit.argv=-c|foo.ini junit.textui.TestRunner FooTest

(f)
java junit.extensions.TestRunnerEx -V junit.extensions.test.ConfigurableTestCaseTest

Example 2: (a) Using a different implementation of a test data repository; (b) Starting TestRunnerEx with 10 repetitions; (c) executing tests with 10 repetitions and 100-ms invocation delay; (d) server crash test with a client executing 1000 times with 10 threads simultaneously; (e) defining command-line arguments; (f) using the verbose mode.


Figure 1: Application architecture.
