Performance Testing High-Scale Load Generators in the Cloud
A large educational institution, serving more than 230,000 undergraduate students, undertook an effort to migrate its admissions application from Microsoft Azure to Amazon Web Services (AWS). To ensure a seamless transition in time for the upcoming fall registration period, RTTS was tasked with characterizing the performance, scalability, and reliability of the application on the new infrastructure.
With the web-based undergraduate admissions application moving to a new cloud-based platform, one of the exit criteria was to stress the infrastructure well beyond the anticipated user load. The client, however, did not have the infrastructure required to execute high-scale load tests.
Our plan was to perform a cost-benefit analysis of using a commercial SaaS performance testing solution versus building our own load generators in the cloud.
There are several commercial SaaS options to choose from; however, each had some limitation on the types of tests that could be executed. For example, some plans cap the maximum number of virtual users, while others limit the number of tests that can be run.
For this engagement, the best option was to set up our own load generators in the cloud and utilize the distributed testing functionality of Apache JMeter to scale out as needed.
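As a sketch of how JMeter's distributed mode is typically driven (the test plan name and generator IP addresses below are hypothetical, not taken from this engagement), each load generator runs the jmeter-server process, and a controller machine fans the test out to them with the -R flag:

```shell
# On each load generator VM: start the JMeter server process,
# which accepts work dispatched by the controller.
jmeter-server

# On the controller: run the test plan in non-GUI mode (-n),
# distributing load across the listed generators (-R) and
# aggregating results into a single file (-l).
jmeter -n -t admissions_test_plan.jmx \
       -R 10.0.0.4,10.0.0.5,10.0.0.6 \
       -l results.jtl
```

Adding capacity is then a matter of deploying more generator VMs and appending their addresses to the -R list.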
The cloud provider that RTTS utilized for the load generators was Microsoft Azure. To speed up the process of scaling out load generators on demand, a virtual machine image was created with all the necessary software installed, from which new VMs could be deployed quickly.
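Deploying an additional generator from such an image with the Azure CLI might look like the following; the resource group, image, VM name, and size here are illustrative assumptions, not the actual values used:

```shell
# Create a new load-generator VM from the prepared image.
# All names and the VM size below are hypothetical.
az vm create \
  --resource-group loadgen-rg \
  --name loadgen-07 \
  --image LoadGenImage \
  --size Standard_D4s_v3 \
  --admin-username azureuser \
  --generate-ssh-keys
```

Because the image already contains JMeter and its dependencies, a new VM is test-ready as soon as it boots.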
By setting up our own load generators hosted in Microsoft Azure, we had the capacity to simulate up to 75,000 virtual users at a cost of $60 per hour, with no limit on the duration or number of tests executed. To reduce costs when load tests were not actively being run, the VMs were put into a stopped state, which carried a maintenance cost of approximately $15 per day.
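Using the rates quoted above (which are the figures reported for this engagement, not general Azure pricing), the back-of-the-envelope arithmetic works out as follows:

```shell
# Cost figures as reported for this engagement.
run_rate_per_hour=60     # full fleet running
idle_rate_per_day=15     # fleet stopped between test cycles

test_cost=$((run_rate_per_hour * 2))     # one 2-hour test run
idle_month=$((idle_rate_per_day * 30))   # ~30 days of standby

echo "$test_cost"    # 120
echo "$idle_month"   # 450
```

So a single two-hour, full-scale test cost about $120 in compute, while keeping the stopped fleet on standby for a month cost roughly $450.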
RTTS’ performance testing methodology and expertise were successful in proactively uncovering performance issues that impacted both the end-user experience and the scalability of the system. The application was originally unable to support more than 37,000 concurrent users; by the conclusion of testing, it supported 75,000 concurrent users over a two-hour period while exhibiting low response times.
RTTS is the premier pure-play QA & Testing organization that specializes in Test Automation. Headquartered in New York, RTTS has had 1,000+ successful engagements at over 700 corporations since 1996.