Jon has made performance and scalability testing the focus of his career since 1988. He has completed hundreds of engagements encompassing thousands of tests against applications from every major industry and then some. To him, technology is a playground where the rules of the game are constantly changing... Jon works out of the Arizona office (360+ sunny days a year; sorry, east coasters) and lives in Goodyear with his wife and son.
Following Directions and Asking the Right Questions
I write quite a few Statements of Work (SOW); some call them proposals, and I'm sure there are a number of other terms as well. The RTTS format I follow is built around its intended outcome: acceptance by the client, or winning the project in a competitive situation. I take the client's information and requirements, sift through them for relevance to the project at hand, and match the details against what can and cannot be done with automated performance and scalability testing. When communicating with potential clients and putting together the SOW, there are two aspects that must be covered even before getting to the "how long" and "how much" part. The first is that we understand their business motivations and have the methodologies in place to address them successfully. The second is that we understand their technical requirements and have the tools, skill sets, methodologies, and a plan to address them successfully. However, I have found that not all companies keep their questions relevant to the project at hand.
Recently, a prospective client issued a Request For Proposal (RFP) and supplied quite a bit of information regarding the testing environment, the application, business transaction volume requirements, expectations, and instructions on what to include and exclude from the proposal and strategy presentation. Following the initial RFP, all responding vendors had the opportunity to ask questions. The unusual part, or at least out of the ordinary since I had not had this occur before, was that all of the questions and their answers were combined into a single response document and returned to all of the vendors. I believe this was in response to another vendor's request for access to all of the questions and answers (maybe to size up the competition, huh...).
After going through all of the questions and answers, I can surmise that there were maybe four or five vendors involved. Some of the questions duplicated ones I had asked as well. Some of the questions, though valid, came out of left field from my perspective. One question referred to additional services and tools when the RFP explicitly stated to include only the tools and services needed to test the application. Another question asked for the procedure for reporting issues. While I can see that this is an important point once the project starts, I cannot see its relevance to forming a testing strategy. One of the questions asked if the demonstration would be against the client's application. One of the requirements of the presentation was to demonstrate the tools selected for the testing, with the overall length of the strategy presentation and demo not to exceed two hours. If you've ever performed a demo against a live application you have never seen before, you know it can take hours, if not longer, just to set it up, not to mention the time needed to discover the nuances of the script modifications required to get it to play back correctly. A number of questions were answered simply by referring the vendor back to information already contained in the RFP. I can only guess whether the people asking the questions were testers or salesmen.
Anyway, from my perspective, it's all about decoding information, making inferences, forming educated hypotheses, and putting together the best SOW possible. There are always good questions to ask; just make sure they are the right questions.
Posted by Jon Harris on Sunday, March 23, 2008 12:04 PM EDT