Case Study

Load Testing A Geographic Information System (GIS)


RTTS’ client, with nearly 15,000 employees and $40 billion in assets, was implementing a GIS-based application for its operational and customer service teams that would provide account and demographic information for its 3.2 million customers by way of an interactive, customizable map. The web-based application was implemented as a mashup of Google Maps and Pitney Bowes MapXtreme® software.

The primary goal of the project was to determine the maximum number of concurrent users the application could support while still providing a positive end-user experience. Ancillary goals included establishing a long-term testing process that could be used for the lifetime of the application.


Since the application would be used by many employees concurrently, RTTS was engaged to proactively identify processing bottlenecks and areas for remediation. Additionally, since the public utility was regulated for compliance and regularly audited, processes were needed to verify key decision-making points throughout the application’s lifecycle.

Our Strategy

Using RTTS’ performance testing methodology along with the IBM Rational Jazz platform for software delivery collaboration, performance requirements were captured, a test plan was crafted, critical test cases were created and automated, numerous performance tests were executed and analyzed, and all non-functional defects were proactively managed. The primary Jazz products used were IBM Rational Quality Manager (RQM) and IBM Rational Performance Tester (RPT), with integration into Microsoft Visual Studio Team Foundation Server (TFS) for source code control.


Using a series of automated test scripts, RPT automated the pertinent navigational workflows by emulating the HTTP conversation between a user’s web browser and the application’s web server layer. The scripts emulated mutually exclusive user sessions, connections, etc., while intelligently interacting with the application and satisfying its data requirements. Additionally, the scripts collected key performance metrics relevant to the end-user experience, such as response times and application reliability.

RPT test schedules used the automated RPT test scripts to model a particular test scenario for the application, such as the maximum number of concurrent users, the load pattern, and the length of the test. Additionally, the test schedules enabled the collection of system resource statistics from the application’s server machine. The collection of end-user performance metrics along with server health metrics permitted effective root cause analysis of any anomalies that may have occurred. 
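The pattern described above — a fixed number of concurrent virtual users driving timed requests while response-time statistics are aggregated — can be sketched outside of RPT as well. The following is a minimal, generic illustration of that concurrency pattern, not RPT itself; the `fetch` callable, URL, and user counts are placeholders, not details from the client's environment.

```python
# Minimal sketch of the load-test pattern an RPT test schedule automates:
# N simulated users issue timed requests and response times are aggregated.
import time
from concurrent.futures import ThreadPoolExecutor
from statistics import mean


def timed_request(fetch, url):
    """Run one emulated user step and return its response time in seconds."""
    start = time.perf_counter()
    fetch(url)  # stand-in for a real HTTP call, e.g. urllib.request.urlopen
    return time.perf_counter() - start


def run_load_test(fetch, url, concurrent_users, iterations):
    """Drive `concurrent_users` parallel sessions and collect timing stats."""
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        futures = [
            pool.submit(timed_request, fetch, url)
            for _ in range(concurrent_users * iterations)
        ]
        timings = [f.result() for f in futures]
    return {"mean": mean(timings), "max": max(timings), "samples": len(timings)}
```

A real harness such as RPT layers session handling, data correlation, ramp-up patterns, and server-side resource monitoring on top of this basic loop.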

RQM provided the test management interface from which tests were executed and tracked for project health reporting, and where all performance-related findings were logged and tracked.

RTTS executed a battery of load tests that determined a single-server application environment could support a maximum of 25 concurrent users, at which point server CPU usage became the limiting factor. As presented in the following chart, at 25 users CPU utilization on the target server neared 81% and web page response times began degrading, increasing by a factor of three compared to a 5-user test.

[Chart: server CPU utilization and web page response time vs. concurrent user load]

Additional findings included the following:

  1. Exporting data out of the maps and into Microsoft Excel would intermittently produce an error message stating “COM+ Activation Failure”. It was observed that exports of very large files (upwards of 200 MB) correlated with the times at which these errors occurred. This behavior was flagged for review, with optimizations to be discussed in a subsequent release.
  2. Using ancillary tools, including Fiddler, it was observed that each MapXtreme® map tile took 0.5 seconds to download and that tiles were not downloaded in parallel. Therefore, for map renderings that sometimes required downloading 15 tiles, web page response times were a minimum of 7.5 seconds. For comparison, tiles delivered via Google Maps were an order of magnitude faster, i.e., roughly 0.05 seconds each. The difference in response times was most likely related to the additional business logic required to render the MapXtreme® tiles. This issue was tabled for another release.
  3. Full table scans were frequently occurring within the database. As this behavior is known to create latency in a multi-user environment, SQL optimization was highlighted as an avenue for performance remediation in a future release.
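The tile-download finding above is fundamentally about serialized latency: 15 tiles at 0.5 seconds each cost 7.5 seconds when fetched one after another, but their latencies overlap when fetched concurrently. The sketch below illustrates the difference in the abstract; the `fetch` callable is a stand-in for a real tile request, not the client's tile server.

```python
# Illustration of serial vs. parallel tile fetching. Serial downloads pay
# the full latency of every tile; parallel downloads overlap them.
import time
from concurrent.futures import ThreadPoolExecutor


def fetch_tiles_serial(fetch, tiles):
    """Download tiles one at a time (the behavior observed via Fiddler)."""
    return [fetch(t) for t in tiles]


def fetch_tiles_parallel(fetch, tiles):
    """Download tiles concurrently, overlapping their latencies."""
    with ThreadPoolExecutor(max_workers=len(tiles)) as pool:
        return list(pool.map(fetch, tiles))
```

With 15 tiles at 0.5 seconds apiece, the serial path takes roughly 15 × 0.5 = 7.5 seconds, while the parallel path approaches the latency of a single tile, which is why parallelizing tile delivery was a natural remediation candidate.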
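The full-table-scan finding can be demonstrated generically with any database's query planner. The sketch below uses SQLite's `EXPLAIN QUERY PLAN` purely for illustration; the `accounts` table and `zip_code` column are hypothetical names, not the client's schema, and the client's database was not SQLite.

```python
# Illustration of the full-table-scan finding: without an index on the
# filtered column, the planner scans every row; adding an index lets it
# seek directly. Table/column names here are illustrative placeholders.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER, zip_code TEXT)")


def plan_for(query):
    """Return the query planner's description of how a query will execute."""
    rows = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
    return " ".join(str(r[-1]) for r in rows)


query = "SELECT id FROM accounts WHERE zip_code = '10001'"
before = plan_for(query)  # planner reports a full scan of accounts
conn.execute("CREATE INDEX idx_zip ON accounts(zip_code)")
after = plan_for(query)   # planner now reports a search using idx_zip
```

Eliminating such scans with targeted indexes (or rewritten queries) is the kind of SQL optimization flagged above, since full scans grow more expensive as row counts and concurrent users increase.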


RTTS successfully demonstrated that a single-server implementation of the application could support 25 concurrent users. Supporting additional users would require additional hardware, application changes to reduce CPU usage, and/or architectural changes to partition the web/application server layer from the database server layer. Additional avenues for remediation included SQL optimization, tweaks to the exporting features, and reducing the latency of MapXtreme® map tile delivery.