One of the main principles of Extreme Programming (XP) is that programmers perform unit tests regularly and continually integrate their changes into a near-production environment. In addition, XP recommends automating this process to the fullest extent possible. After all, if developers write test cases as eagerly as they write production code, the process should be relatively painless.
If unit testing is done well, you can be confident that each piece of code (typically represented by an individual class) functions correctly. Performing continuous, or at least regular, builds gives you an idea of how the code will behave once it is integrated into a production environment. Combining unit testing with regular automated builds during the development cycle assures you and your customers that the code will be reliable when it is released.
In this article, I'll take a practical approach to automating the build and test process. Using Ant 1.3 and the JUnit test framework, I will show you how to automate a process that captures information about each test suite run, generates an attractive report, and sends that report by e-mail. Although much of this capability is built into Ant, I have modified several of the core tasks to better meet these requirements. Those modifications are central to this article, and all of them will be incorporated into the next release of Ant.
Why use Ant and JUnit?
Ant 1.3 is the de facto standard among Java build tools. Written in the Java language, Ant is open source, runs on a variety of platforms, and provides a great deal of flexibility in the build process. The JUnit test framework is also open source, is widely used, and integrates with Ant's build process. (To learn more about Ant and JUnit, see Resources.)
With Ant 1.3 and its optional <junit> and <junitreport> tasks, you can set up a basic automated build and test process without any modifications. That process works as follows (a sample build target is sketched after the list):
Run the JUnit tests
Capture the test results
Create an attractive HTML summary report
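As a minimal sketch (the directory names, file patterns, and classpath entries here are assumptions for illustration, not taken from the article's build files), such a target might look like this:

  <target name="test" depends="compile">
    <!-- Run the JUnit tests; write one XML result file per test class -->
    <junit printsummary="yes" haltonfailure="no">
      <classpath>
        <pathelement location="build/classes"/>
        <pathelement location="lib/junit.jar"/>
      </classpath>
      <formatter type="xml"/>
      <batchtest todir="build/test-results">
        <fileset dir="src" includes="**/*Test.java"/>
      </batchtest>
    </junit>

    <!-- Aggregate the XML result files and render an HTML summary report -->
    <junitreport todir="build/test-results">
      <fileset dir="build/test-results" includes="TEST-*.xml"/>
      <report format="frames" todir="build/test-results/html"/>
    </junitreport>
  </target>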
Using the XML formatter, the results captured for each test suite include the number of failures and errors, the package and class name, and the suite's execution time. For each test case within a suite, the following information is also captured (an example of the resulting XML appears after the list):
The name of the test case
Duration of execution
Type of failure or error (if applicable)
Details of any failures or errors
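For illustration only (the class name, test names, and values below are invented, and the exact attributes vary slightly between Ant versions), the XML written by the formatter for one test suite looks roughly like this:

  <testsuite name="com.example.MathTest" tests="2" failures="1" errors="0" time="0.031">
    <testcase name="testAdd" classname="com.example.MathTest" time="0.004"/>
    <testcase name="testDivide" classname="com.example.MathTest" time="0.011">
      <failure message="expected:&lt;2&gt; but was:&lt;3&gt;"
               type="junit.framework.AssertionFailedError">
        junit.framework.AssertionFailedError: expected:&lt;2&gt; but was:&lt;3&gt;
          at com.example.MathTest.testDivide(MathTest.java:21)
      </failure>
    </testcase>
  </testsuite>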
What's missing from this picture?
Although functional, the automation described above is neither ideal nor complete. By modifying several of Ant's JUnit-related tasks, we can create an automated process that runs as follows (a sketch of the reporting and mailing steps follows the list):
Run the JUnit tests
Record the results to an XML file, or to a file in another format
Transform the results into a test report using an XSL stylesheet
Convert the report to HTML
Send the report by e-mail
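One possible sketch of the last three steps, using Ant's built-in <style> (XSLT) and <mail> tasks rather than the modified tasks described in this article, is shown below; the stylesheet name, file locations, and mail settings are invented, and depending on your Ant version the <mail> task may require the JavaMail libraries on Ant's classpath:

  <target name="report" depends="test">
    <!-- Transform the aggregated XML results into an HTML report via XSL -->
    <style in="build/test-results/TESTS-TestSuites.xml"
           out="build/test-results/summary.html"
           style="conf/test-summary.xsl"/>

    <!-- Mail the generated report to the team -->
    <mail mailhost="smtp.example.com"
          from="build@example.com"
          tolist="dev-team@example.com"
          subject="Automated build and test report"
          messagefile="build/test-results/summary.html"/>
  </target>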
Give me more data
In addition to extending Ant and JUnit to automate the build and test process, I extended the standard data captured during testing. Specifically, I wanted to know which operating system was in use, the date and time of the test run, the JVM version the tests ran under, and the classpath.
To capture this information, I made simple changes to four of Ant's JUnit-related classes: JUnitTask, JUnitTest, JUnitTestRunner, and XMLJUnitResultFormatter. You will find these changes in the accompanying source files.
As a side benefit of extending the captured data, you end up recording not only the state of the system at the moment each test suite runs, but also Ant's entire set of runtime properties, including system properties and internal Ant properties such as user-defined properties.
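As a purely illustrative example (the values shown are invented), the properties block written into the XML results might contain entries such as:

  <properties>
    <property name="os.name" value="Windows 2000"/>
    <property name="java.vm.version" value="1.3.0"/>
    <property name="java.class.path" value="build/classes;lib/junit.jar"/>
    <!-- user-defined Ant properties are captured as well -->
    <property name="build.dir" value="build"/>
  </properties>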