RTTS’ client, a utility with 15,000+ employees and 3+ million customers, had engaged in a project to consolidate many of its disparate reporting systems into one web-based business intelligence (BI) application. The BI application was to support decision-making needs across various organizational units, such as accounts payable, accounts receivable, purchasing, accounting, and field services, and therefore required validation that it would support multiple concurrent users running complex reports. The BI implementation used Oracle’s Business Intelligence Enterprise Edition (OBIEE) along with a clustered, RAC-enabled Oracle 11g database and Oracle WebLogic. Many of the reports were published within graphical dashboards, provided interactive analytical capabilities, and were delivered to the end user via AJAX technology.
RTTS was tasked with validating that the BI implementation could support 250 concurrent users actively running and interacting with its reporting dashboards. An ancillary goal was to establish a long-term testing process that could be used for the lifetime of the application.
Because the BI implementation was highly visible within the customer’s organization and posed a high level of risk to decision-making capabilities, RTTS was engaged to lead the effort of proactively characterizing the performance and scalability of the application, as well as to assist with root cause analysis and remediation.
Using RTTS’ performance testing methodology and best practices, a testing initiative was constructed that would assess the performance requirements and risks, devise a structured testing roadmap, implement a judicious suite of performance tests, and effectively manage all of the non-functional defects that were discovered. RTTS implemented its methodology using the IBM Rational Jazz platform for establishing requirements, devising test plans, orchestrating test execution, and managing defects. The primary Jazz products used were IBM Rational Quality Manager (RQM) and IBM Rational Performance Tester (RPT). Additionally, there was integration with Visual Studio Team Foundation Server (TFS) for source code control. Ancillary tools, such as Fiddler, were used to provide transaction profiling between the web client and the web server.
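Fiddler can export captured traffic as a HAR (HTTP Archive) file, which makes this style of client-to-server transaction profiling easy to automate. The sketch below (an illustration, not RTTS' actual tooling; the field names follow the HAR format, and the 50 KB threshold echoes the payload finding later in this case study) totals the request-body bytes uploaded per URL so unusually heavy client-to-server transactions stand out.

```python
def summarize_request_sizes(har):
    """Total request-body bytes per URL from a parsed HAR capture.

    `har` is the dict produced by json.load() on a HAR export; the
    HAR format records request body size in request["bodySize"]
    (-1 or absent when unknown).
    """
    totals = {}
    for entry in har["log"]["entries"]:
        request = entry["request"]
        url = request["url"]
        body_size = request.get("bodySize") or 0
        totals[url] = totals.get(url, 0) + max(body_size, 0)
    return totals


def flag_heavy_uploads(har, threshold_bytes=50 * 1024):
    """Return URLs whose cumulative uploaded payload exceeds the threshold."""
    return {url: size
            for url, size in summarize_request_sizes(har).items()
            if size > threshold_bytes}
```

A capture exported from Fiddler could then be loaded with `json.load()` and passed to `flag_heavy_uploads()` after each test run to watch for payload regressions.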
RQM was used to capture the performance requirements, create a Test Plan, create the Test Cases, execute the RPT Performance Schedules, and manage defects. RPT Performance Schedules were used to implement a workload model that consisted of a cross-section of 50 RPT Performance Tests (i.e. business transactions) reflecting the individual reporting dashboards that various organizational units might run. Additionally, the RPT Performance Schedules enabled the collection of system resource statistics from the BI application, so that effective root cause analysis could be performed if any anomalies occurred. The entire application stack was monitored for system health, including Linux, WebLogic, and the Oracle database.
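Building such a workload model means splitting the 250-user target across transaction groups in proportion to their expected real-world mix. The following sketch shows one way to compute that allocation; the group names and weights are purely illustrative, not the client's actual mix.

```python
def allocate_users(total_users, weights):
    """Split a concurrent-user target across transaction groups by weight.

    Uses largest-remainder rounding so the integer allocations always
    sum exactly to total_users.
    """
    total_weight = sum(weights.values())
    exact = {name: total_users * w / total_weight
             for name, w in weights.items()}
    allocation = {name: int(v) for name, v in exact.items()}
    leftover = total_users - sum(allocation.values())
    # Hand remaining users to the groups with the largest fractional parts.
    for name in sorted(exact, key=lambda n: exact[n] - allocation[n],
                       reverse=True)[:leftover]:
        allocation[name] += 1
    return allocation


# Hypothetical dashboard mix across organizational units.
example_weights = {"accounts_payable": 20, "accounts_receivable": 20,
                   "purchasing": 25, "accounting": 15, "field_services": 20}
```

Each group's allocation would then map to the user-group settings of an RPT Performance Schedule.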
RTTS executed a collection of performance tests that identified issues throughout the different layers of the application relating to application design, configuration, and infrastructure. Several of the findings follow:
- It was determined that very large XML payloads (50+ KB) were being sent by the web client when users interacted with the reporting dashboards.
- When testing complex reports and/or reports that involved large datasets, WebLogic execution threads would often become stuck and limit scalability.
- Database tablespaces were tuned to support the large datasets of some of the reports.
- JDBC connection pools were tuned to support the number of concurrent user sessions.
- Administrative processes were put in place to ensure that database statistics stayed up to date and that the query optimizer chose appropriate SQL execution plans, especially after running an ETL job from an external system.
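The statistics check in the last finding can be approximated in code. Oracle considers a table's optimizer statistics stale once the changed-row count passes a staleness threshold (10% by default via the `STALE_PERCENT` preference), and the relevant counters are exposed in views such as `DBA_TAB_STATISTICS` and `DBA_TAB_MODIFICATIONS`. The sketch below (an illustration of the policy, not the client's actual script; table names and the seven-day age limit are hypothetical) flags tables that should be re-analyzed after an ETL load.

```python
from datetime import datetime, timedelta


def stale_tables(tables, stale_percent=10.0, max_age_days=7):
    """Flag tables whose optimizer statistics look stale.

    `tables` maps a table name to a dict with:
      num_rows      - row count at the last analyze
      rows_changed  - inserts + updates + deletes since the last analyze
      last_analyzed - datetime of the last analyze, or None if never
    A table is flagged if it was never analyzed, its statistics are
    older than max_age_days, or its churn exceeds stale_percent.
    """
    now = datetime.now()
    flagged = []
    for name, t in tables.items():
        never_analyzed = t["last_analyzed"] is None
        too_old = (not never_analyzed and
                   now - t["last_analyzed"] > timedelta(days=max_age_days))
        churned = (t["num_rows"] > 0 and
                   100.0 * t["rows_changed"] / t["num_rows"] > stale_percent)
        if never_analyzed or too_old or churned:
            flagged.append(name)
    return flagged
```

In practice the inputs would come from the data dictionary views above, and each flagged table would be passed to `DBMS_STATS.GATHER_TABLE_STATS` as a post-ETL step.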
RTTS’ performance testing methodology and expertise were successful in proactively uncovering performance-related issues with the BI application that impacted the end-user experience and the scalability of the system. Where solutions could not be implemented immediately to address the findings, remediation plans were put in place and expectations were set with the end users; all issues were tracked within RQM. By the conclusion of the testing, the BI application was able to support 250 concurrent users over the course of a business day while exhibiting one- to two-second response times.