Case Study

Automated Regression Testing Trade Data Delivered By SOA Web Services


RTTS’ client, a major investment bank, delivers portfolio management functionality to internal applications wrapped as a SOA Web service. The Web service allows users to manage promissory notes, positions in those notes, and entire portfolios. It also publishes methods to create notes, positions, or portfolios; to edit related information; to search current and historic market information; and to delete entries.


How can application data be verified when the application is a Web service without a Graphical User Interface (GUI)?


Implement an "automated test user" using an industry-standard test automation tool and standard SOAP/HTTP components.


The client had two web services that required testing, with approximately 110 different methods such as Save (Create), Update, Delete, and Search (Historic and Current).

To test the client application, RTTS used a standard SOAP/HTTP package to build a custom SOAP request-response engine, along with code to load the response messages into the XML DOM for extraction of result data for comparison to baseline values.
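The case study does not show the engine itself, but the request-response-extract pattern it describes can be sketched in Python using the standard library's XML DOM support. The method name (getPosition), service namespace, and field names below are hypothetical, and a canned response string stands in for the live HTTP round trip:

```python
# Minimal sketch of a SOAP request-response check. The getPosition method,
# the service namespace, and the result fields are illustrative assumptions,
# not details from the case study.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://example.com/portfolio"  # hypothetical service namespace

def build_request(note_id: str) -> str:
    """Wrap a method call in a SOAP 1.1 envelope."""
    return (
        f'<soap:Envelope xmlns:soap="{SOAP_NS}">'
        f'<soap:Body><svc:getPosition xmlns:svc="{SVC_NS}">'
        f'<svc:noteId>{note_id}</svc:noteId>'
        f'</svc:getPosition></soap:Body></soap:Envelope>'
    )

def extract_result(response_xml: str) -> dict:
    """Load the response into the XML DOM and pull out the result fields."""
    root = ET.fromstring(response_xml)
    ns = {"soap": SOAP_NS, "svc": SVC_NS}
    result = root.find(".//svc:getPositionResult", ns)
    return {child.tag.split("}")[1]: child.text for child in result}

# Canned response standing in for the live HTTP round trip:
sample = (
    f'<soap:Envelope xmlns:soap="{SOAP_NS}"><soap:Body>'
    f'<svc:getPositionResponse xmlns:svc="{SVC_NS}">'
    f'<svc:getPositionResult>'
    f'<svc:noteId>N-1001</svc:noteId><svc:quantity>250</svc:quantity>'
    f'</svc:getPositionResult>'
    f'</svc:getPositionResponse></soap:Body></soap:Envelope>'
)
print(extract_result(sample))  # {'noteId': 'N-1001', 'quantity': '250'}
```

The extracted dictionary is what gets compared against the baseline values described next.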

The baselines for the web services were created by querying the underlying database to produce outputs that were expected from the various web service functionalities. These results were then compared against the SOAP/HTTP results. To account for the dynamic nature of the data, each regression run executed the baseline queries in parallel with the SOAP queries, so that constant updates to data (data entry, replication from production, live market feeds) did not affect test results.
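The baseline comparison can be sketched as follows, with an in-memory sqlite3 database standing in for the client's underlying database (whose schema is not described in the case study) and a pre-extracted dictionary standing in for the SOAP result:

```python
# Sketch of baseline verification. The positions table, its columns, and the
# field names are assumptions; sqlite3 stands in for the real database.
import sqlite3

def baseline_rows(conn, note_id):
    """Run the baseline query at test time, alongside the SOAP call, so that
    live data changes affect both sides of the comparison equally."""
    cur = conn.execute(
        "SELECT note_id, quantity FROM positions WHERE note_id = ?", (note_id,))
    return [dict(zip(("noteId", "quantity"), row)) for row in cur]

def verify(soap_result, db_rows):
    """A test case passes when the service's answer matches the baseline."""
    return [soap_result] == db_rows

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE positions (note_id TEXT, quantity INTEGER)")
conn.execute("INSERT INTO positions VALUES ('N-1001', 250)")

soap_result = {"noteId": "N-1001", "quantity": 250}  # as extracted from SOAP
print(verify(soap_result, baseline_rows(conn, "N-1001")))  # True
```

Running the query at the moment of comparison, rather than against a stored snapshot, is what keeps constant data updates from producing false failures.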

With each build, the development effort added new SOAP methods, and the testing effort prepared test cases for the new functionality by adding baseline and corresponding SOAP queries.

In most cases, RTTS was able to test the two web services, analyze results and enter defects within one day. As a result, developers received defect information prior to the start of the following day. With the quick turnaround, the development team was able to fix defects early and prepare for new builds. As shown in Figure 1, in most development cycles, with each build release, more test cases were created and executed and fewer defects were found.

Figure 1: SOA Trade Data

With these SOAP web service tools, RTTS engineers were able to start testing the application after the creation of the first application build, well before the development of the application user interface.

As an added benefit of the approach, when GUI applications were later built to consume the Web services, test automation was developed for those applications as well. Run alongside the automated SOAP data-verification suites, the combination provided a tool for rapidly localizing defects to either the application middle tier or the presentation layer.
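The localization logic implied here can be made explicit: a failure at the service level implicates the middle tier, while a GUI failure over a passing service implicates the presentation layer. A minimal sketch, assuming each test case yields both a SOAP-level and a GUI-level pass/fail verdict (the function name is illustrative, not from the case study):

```python
# Hypothetical defect-localization rule combining the two suites' verdicts.
def localize_defect(soap_passed: bool, gui_passed: bool) -> str:
    """Point a failed test case at the likely tier."""
    if not soap_passed:
        return "middle tier"         # the service itself returned bad data
    if not gui_passed:
        return "presentation layer"  # service is fine; the GUI mishandled it
    return "no defect"

print(localize_defect(False, False))  # middle tier
print(localize_defect(True, False))   # presentation layer
print(localize_defect(True, True))    # no defect
```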

The quick turnaround on defects helped move the project forward at a strong pace. RTTS’ approach enabled multiple successful releases to production, both before and after the development of the application user interface.


  • Starting the testing effort in the early stages of development allowed RTTS to find a significant portion of the defects at an early stage of the project, saving downstream costs and development time.
  • The test automation process allowed a rapid quality assessment of each build throughout the life cycle of the project. Without automation, the testing process would have been far more labor-intensive, since results would have had to be executed and compared manually.
  • The QA and development teams were able to have a build-by-build comparison during the development cycle.
  • RTTS’ approach provided a tool for rapid localization of defects.