Case Study

Testing a Portfolio Management System with Quality Center and TestPartner

Background

A brokerage firm has an existing Windows-based Portfolio Management System used to facilitate trades for various clients. Sales traders manage incoming portfolios that contain globally traded securities, and these portfolios are routed to various downstream trading systems based on region. Because of the global nature of the system, it must be available 24 hours a day, six days a week.

The Portfolio Management System is the centerpiece of the firm’s portfolio workflow and handles communication between the clients, trading, reporting, and booking systems. A major software release is deployed to production about twice per quarter, with minor patch releases deployed in between. Each release contains several enhancements based on the needs of the traders, as well as fixes for issues within the system.

The goal of the software quality effort is to verify the functional integrity of each new release. Since we are dealing with potentially high-value trades and a system that communicates directly with clients, it is imperative that the software be of the highest quality.

Challenges

How can new releases of a Portfolio Management System be tested in a timely manner while preserving the highest standards of software quality and performance?

Strategy

Test all new changes and ensure they meet the requirements provided by the Business Analyst for each release. Verify that the existing functionality of the system is not adversely affected by maintaining and running an automated regression suite.

Solution

  • Create and maintain a modular automated system incorporating a suite of regression test cases stored in Mercury’s Quality Center and executed using Micro Focus TestPartner.
  • Focus on requirements provided by Business Analysts to ensure that new items will be fully tested when new builds are received.
  • Coordinate with development when testing new builds to ensure a complete understanding of how the changes were implemented.
  • Create or modify automated test cases to add to the existing regression suite when new functionality is introduced.
  • Maintain a UAT testing environment and coordinate with teams from other systems for end-to-end testing.
  • Maintain an additional testing environment containing the production-equivalent version so that new builds can be compared against existing behavior. This also provides a separate environment where clients and other systems can test without disrupting UAT.

The system itself was quite complex and had connections to several other systems within the brokerage firm. For each release, QA had to coordinate testing between several teams in order to verify that the new changes did not affect any of the upstream or downstream systems.

Figure: The Portfolio Management System within the brokerage.

Trade data was exchanged between the systems using the Financial Information eXchange (FIX) protocol. The system recorded all incoming and outgoing FIX messages in log files on a shared drive. We used these logs to verify that information was being sent correctly to each of the downstream systems, and our automation was adapted to read, record, and compare the messages to ensure they carried the proper information.
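To make the comparison step concrete, the sketch below shows one way to parse logged FIX messages and check selected tags. It is written in Python purely for illustration (the actual automation was built with TestPartner), and it assumes one message per line delimited by the standard SOH (\x01) character; the log path and expected tag values are hypothetical.

    SOH = "\x01"

    def parse_fix(message: str) -> dict:
        """Split a raw FIX message into a {tag: value} dictionary."""
        fields = {}
        for pair in message.strip().split(SOH):
            if "=" in pair:
                tag, value = pair.split("=", 1)
                fields[tag] = value
        return fields

    def verify_orders(log_path: str, expected: dict) -> list:
        """Return mismatches between logged New Order Single messages and expected tags."""
        mismatches = []
        with open(log_path, encoding="utf-8") as log:
            for line_no, line in enumerate(log, start=1):
                msg = parse_fix(line)
                if msg.get("35") != "D":          # 35=D is a New Order Single
                    continue
                for tag, want in expected.items():
                    got = msg.get(tag)
                    if got != want:
                        mismatches.append((line_no, tag, want, got))
        return mismatches

    # Hypothetical usage: verify symbol (55), quantity (38) and side (54)
    issues = verify_orders("/shared/fixlogs/outbound.log",
                           {"55": "IBM", "38": "1000", "54": "1"})
    for line_no, tag, want, got in issues:
        print(f"line {line_no}: tag {tag} expected {want!r}, got {got!r}")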

Because this system was the centerpiece of the portfolio workflow, most test cases depended heavily on the other systems (e.g., receiving files from a downstream system or incoming New Order Single messages from clients). It was sometimes difficult to coordinate testing with these systems, as they were in different regions around the globe. To gain more control over our testing, we recorded FIX messages from several of these systems and used our automation to play the messages back in. The system under test used WebSphere MQ to communicate between the different applications, so we replayed the recorded FIX messages by placing them onto those queues to simulate an execution or a new order. This greatly reduced our dependencies on the other systems and allowed us to test the functionality whenever we needed.
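As an illustration of the replay step, the sketch below uses the pymqi client library to put recorded FIX messages onto an IBM WebSphere MQ queue. The queue manager, channel, queue name, and file path are placeholders, and the real harness was driven from TestPartner rather than Python.

    import pymqi

    # Illustrative connection details only; the actual queue manager, channel,
    # and queue names used by the Portfolio Management System are not shown here.
    QUEUE_MANAGER = "QM1"
    CHANNEL = "APP.SVRCONN"
    CONN_INFO = "mqhost(1414)"
    INBOUND_QUEUE = "PORTFOLIO.INBOUND"

    def replay_fix_messages(capture_path: str) -> int:
        """Put each recorded FIX message from a capture file onto the inbound queue."""
        qmgr = pymqi.connect(QUEUE_MANAGER, CHANNEL, CONN_INFO)
        queue = pymqi.Queue(qmgr, INBOUND_QUEUE)
        sent = 0
        try:
            with open(capture_path, "rb") as capture:
                for raw in capture:
                    message = raw.rstrip(b"\r\n")
                    if message:                 # skip blank lines
                        queue.put(message)      # simulate an execution or a new order
                        sent += 1
        finally:
            queue.close()
            qmgr.disconnect()
        return sent

    # Hypothetical usage: replay a previously captured session of new orders
    count = replay_fix_messages("/shared/fixlogs/captured_new_orders.log")
    print(count, "messages replayed")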

Communication between the front end and back end of the application under test used the TIBCO Rendezvous protocol. We were able to simulate several of these calls to the back end, which helped automate processes such as accepting portfolios and submitting them to downstream systems. This reduced the need for manual tester interaction and resulted in fully automated test cases.
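The case study does not describe how these Rendezvous calls were constructed, so the sketch below only illustrates the pattern of driving back-end actions from the test harness instead of the GUI. The send_rv_message helper, subject names, and fields are hypothetical stand-ins for real TIBCO Rendezvous publish calls.

    # Pattern sketch only: driving back-end actions (accept / submit a portfolio)
    # from the automation instead of the front end. send_rv_message stands in for
    # a real TIBCO Rendezvous publish call; the subjects and fields are invented.

    def send_rv_message(subject: str, fields: dict) -> None:
        """Hypothetical wrapper around a TIBCO Rendezvous publish call."""
        # In the real harness this would build an RV message and send it on the
        # daemon/network/service used by the application under test.
        print(f"RV publish on {subject}: {fields}")

    def accept_portfolio(portfolio_id: str) -> None:
        # Simulates the front end accepting an incoming portfolio.
        send_rv_message("pms.portfolio.accept", {"portfolioId": portfolio_id})

    def submit_portfolio(portfolio_id: str, region: str) -> None:
        # Simulates submitting the portfolio to a downstream system for a region.
        send_rv_message("pms.portfolio.submit",
                        {"portfolioId": portfolio_id, "region": region})

    # Hypothetical usage within an automated test case
    accept_portfolio("PF-001")
    submit_portfolio("PF-001", "EMEA")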

Working with the development team, we were able to set up two environments for testing. The first was a UAT environment containing the version under test. The second environment contained the current production version of the application and was used for comparison. The other advantage of this second environment was that clients and other systems could test their changes without disrupting testing in our UAT environment.

Benefits

  • A total of 1,400 test cases were automated and included in the Regression Test Suite.
  • The automated regression suite allowed for a quicker turnaround on the regression test cycle while also providing greater test coverage.
  • Because FIX messages could be simulated, no time was wasted waiting to coordinate testing with other systems.
  • No defect with a monetary impact on a client’s trade was ever released to production.
  • Clients of the firm were able to test their changes in a production-like environment before enabling them on their end, without impacting our own testing.