
How To: Conduct Performance Testing Using Automated Load Test Tools

Mark Tomlinson

Applies To

Summary

This document describes an approach to conducting performance testing for your applications using an automated load testing tool. Performance testing your application is very valuable for assessing the quality of the application, identifying serious bottlenecks, and improving the scalability of the underlying systems that support it. You will also find that the results from performance testing can help you estimate the hardware configuration required to support the application before you go live to production.

Contents

Objectives


Overview

Performance testing can be described as identifying how an application utilizes system resources when a component or the application as a whole is loaded with some type of activity. To accomplish this, you will typically use a performance testing tool that emulates increasing quantities of end-user activity on your application. These test tools require you to develop multiple, individual performance test scripts in order to cover the most important parts of the application. By examining your application's behavior under emulated load conditions, you can typically identify whether your application's performance is good or bad, and whether it will meet the specified requirements for performance.
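To make the idea of emulated load concrete, the following is a minimal sketch, written in Python purely for illustration (the target URL, step sizes, and think time are hypothetical, and a real load testing tool does far more), of ramping up the number of concurrent virtual users in steps:

    import threading
    import time
    import urllib.request

    TARGET_URL = "http://localhost:8080/"   # hypothetical application under test

    def virtual_user(stop_event):
        """One emulated end user issuing requests in a loop."""
        while not stop_event.is_set():
            try:
                with urllib.request.urlopen(TARGET_URL, timeout=10) as response:
                    response.read()
            except OSError:
                pass                         # a real tool would record the error
            time.sleep(1)                    # think time between requests

    def ramp_load(users_per_step=5, steps=4, step_duration_seconds=30):
        """Increase the number of concurrent virtual users in steps."""
        stop_event = threading.Event()
        threads = []
        for step in range(steps):
            for _ in range(users_per_step):
                t = threading.Thread(target=virtual_user, args=(stop_event,), daemon=True)
                t.start()
                threads.append(t)
            print(f"step {step + 1}: {len(threads)} concurrent virtual users")
            time.sleep(step_duration_seconds)
        stop_event.set()

    if __name__ == "__main__":
        ramp_load()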

The basic approach to conducting performance testing starts with test planning and preparation, followed by test execution and performance analysis. It is important to pay proper attention during the planning and preparation steps to ensure you have everything ready when it comes time to execute the tests on your application. These steps can be used for projects ranging from the smallest unit-level test to a large-scale production simulation and capacity planning initiative; the same overall approach and activities apply equally across projects of varying scope.

The most common reasons for conducting performance testing can be summarized as follows: <reference here to Types of Performance Testing>

Input

Here’s a list of the documented information commonly required to start performance testing:

Output

Here’s a list of the common output items that result from conducting performance testing:

Steps

Step 1. Performance Test Planning

Just as with any project, your Performance Test project should begin with a Performance Test Plan. The test plan is a document that captures all the information required to conduct a performance test. Creating a test plan enables you to make key decisions and is critical to the success of the whole testing effort. Take extra time to think critically about each part of the test plan, and be as specific as possible.

Here are the items you should consider including in a performance test plan:
It is important to capture your performance test plan in a way that can be updated and shared throughout the entire testing effort, which may not necessarily be a single document. Collaboration tools are a great way to share information and decisions as you work on the plan. Make sure the test plan is available as you work on the other steps in this approach.

<Insert reference to Identify Performance Testing Objectives>
< How To: Quantify End-User Response Time Goals>
<Insert reference to Performance Test Planning Explain>?
<Insert reference to Model the Workload for Web Applications>
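As a hedged illustration of capturing part of the plan in a form that is easy to update and share, the workload profile below is sketched in Python with hypothetical scenario names, percentages, and user counts; your own profile comes from the planning and workload modeling referenced above:

    # A minimal, shareable representation of a workload profile from the test plan.
    # Scenario names, percentages, and user counts are hypothetical examples.
    workload_profile = {
        "peak_concurrent_users": 200,
        "ramp_up_minutes": 15,
        "test_duration_minutes": 60,
        "scenario_mix": {            # percentage of virtual users per business scenario
            "browse_catalog": 50,
            "search_products": 30,
            "place_order": 20,
        },
        "think_time_seconds": {"min": 2, "max": 10},
    }

    # The scenario percentages should account for the entire user population.
    assert sum(workload_profile["scenario_mix"].values()) == 100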

Step 2. Developing Performance Tests

Using the information from the test plan, you should start developing the performance tests using a test tool or utility. For each component to be tested, write a test that makes calls against or generates load on that component, and include the test criteria for each component where you plan to measure response time or data throughput.

Here are some of the steps used to develop a test:
Many of the test tools you may be using will have built-in features to capture response time and throughput measures from the tests you write. Be sure to familiarize yourself with the testing tool you are using, especially for end-user time measurements, data parameter settings and end-user simulation configuration.
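If you want to understand what such a measurement involves, or your tool does not capture a particular timing automatically, the following is a minimal Python sketch of timing a single component call and deriving a throughput figure; the URL and transaction name are hypothetical:

    import time
    import urllib.request

    def timed_request(url, name):
        """Issue one request and return (transaction name, elapsed seconds, bytes read)."""
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=30) as response:
            body = response.read()
        elapsed = time.perf_counter() - start
        return name, elapsed, len(body)

    if __name__ == "__main__":
        # Hypothetical component under test
        name, elapsed, size = timed_request("http://localhost:8080/products", "browse_catalog")
        print(f"{name}: {elapsed:.3f} s response time, {size / elapsed:.0f} bytes/s throughput")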

In addition to developing test scripts, you will also configure the testing tool with the workload profile you created in the test plan. Most test tools enable you to combine several scripts and execute them simultaneously, so that you can generate a comprehensive load on your application. Here are the steps for creating the load test project or scenario:

There are four basic metrics that should be included in performance monitoring:
<Insert reference to Step Through Creating a Load Test In VS 2005>
<Insert reference to How To - Create a Load Test Plug-in to Control the Number of Test Iterations During a Load Test in VS.NET 2005>
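Most tools configure the load test scenario declaratively rather than in code, but as a conceptual sketch, the Python fragment below shows how several scripts might be combined under a weighted mix and executed simultaneously by a group of virtual users; the script bodies, weights, and user count are hypothetical placeholders:

    import random
    import threading
    import time

    # Hypothetical test scripts; in practice these are the scripts developed above.
    def browse_catalog():
        time.sleep(0.2)      # placeholder for the real browse script

    def search_products():
        time.sleep(0.3)      # placeholder for the real search script

    def place_order():
        time.sleep(0.5)      # placeholder for the real order script

    # Weighted scenario mix taken from the workload profile in the test plan.
    scenario_mix = [(browse_catalog, 50), (search_products, 30), (place_order, 20)]
    scripts, weights = zip(*scenario_mix)

    def virtual_user(duration_seconds=60):
        """Each virtual user repeatedly picks a script according to the mix weights."""
        end = time.time() + duration_seconds
        while time.time() < end:
            script = random.choices(scripts, weights=weights)[0]
            script()
            time.sleep(1)    # think time between iterations

    if __name__ == "__main__":
        users = [threading.Thread(target=virtual_user, daemon=True) for _ in range(25)]
        for user in users:
            user.start()
        for user in users:
            user.join()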

Step 3. Build Testing Environment

In the test plan you have documented the existing or planned architecture for your application, including the hardware, software, and network architecture required to support it. To ensure accurate test results, the test environment should be configured to match the production environment as closely as possible.
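As a simple, hedged illustration of verifying that a test machine matches the documented architecture, the Python sketch below records a few basic configuration facts and compares them with hypothetical values taken from the test plan:

    import os
    import platform

    # Hypothetical values documented in the test plan for the production environment.
    expected = {"cpu_count": 8, "os": "Windows", "machine": "AMD64"}

    # Basic configuration facts gathered from the machine this script runs on.
    actual = {
        "cpu_count": os.cpu_count(),
        "os": platform.system(),
        "machine": platform.machine(),
    }

    for key, want in expected.items():
        status = "OK" if actual[key] == want else "MISMATCH"
        print(f"{key}: expected {want}, found {actual[key]} -> {status}")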

The following tasks may be included when building the test environment:

Step 4. Execute Tests

Test execution is the most exciting part of conducting a performance test. Coordination and change control are the key skills for doing test execution well. It is important to coordinate the beginning and ending of each test run and to keep a comprehensive test log. Change control should be used to manage all changes between test runs, and it is also very important to communicate each change to everyone involved.

Here are some general tips to help you get started:
Test execution will continue and repeat according to the performance testing objectives listed in the test plan. Test execution phases may include the following types of test runs:

At the end of each test run you should gather a quick summary of what happened during the test and add these comments to the test log. This can include machine failures, application exceptions and errors, network problems, or exhausted disk space or log files. When you complete your final test run, be sure to save all the test results and performance logs before you tear down the test environment.
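One possible way to keep such a test log is sketched below in Python, appending one structured record per test run; the file name, fields, and sample values are purely illustrative:

    import json
    from datetime import datetime, timezone

    TEST_LOG = "test_log.jsonl"   # illustrative file name; one JSON record per run

    def log_test_run(run_id, scenario, changes, observations):
        """Append a summary record for one test run to the test log."""
        record = {
            "run_id": run_id,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "scenario": scenario,
            "changes_since_last_run": changes,
            "observations": observations,
        }
        with open(TEST_LOG, "a", encoding="utf-8") as log:
            log.write(json.dumps(record) + "\n")

    # Example entry with invented details.
    log_test_run(
        run_id="run-014",
        scenario="peak load, 200 users",
        changes=["increased database connection pool to 50"],
        observations=["two application exceptions", "disk on web server at 90%"],
    )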

<Insert reference to Load Test Web Applications>
<Insert reference to Transactional Stress Test in Web Applications>
<Insert reference to Tune Performance of Web Apps>

Step 5. Analyze Test Results

Analyzing the results to find performance bottlenecks can be performed between test runs or after all test execution has been completed. This requires training and experience in graphing the performance measurements for system resource utilization, as well as the ability to correlate the graph data with end-user measurements. The testing tool will typically have capabilities for displaying all these results in an organized way.

Using the results from the tool, you should produce the following graphs:
You must correlate the data between the graphs, looking for points during the test where the results are similar or dissimilar. For example, you may observe slower end-user response times at the same time you see application requests queuing up on the server. The end goal is to determine the root cause of the request queuing, which could be a slow or blocking transaction in the back-end database, or a system that has maxed out CPU or disk throughput.
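As a hedged illustration of this kind of correlation (the sample values below are invented), the Python sketch computes the correlation between end-user response time and the number of queued requests sampled at the same points in time:

    import math

    # Invented samples taken at the same points in time during a test run.
    response_time_s = [0.4, 0.5, 0.6, 0.9, 1.4, 2.1, 2.0]   # end-user response time
    queued_requests = [2, 3, 4, 9, 15, 22, 21]               # requests queued on the server

    def pearson(x, y):
        """Pearson correlation coefficient between two equal-length series."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    print(f"correlation: {pearson(response_time_s, queued_requests):.2f}")
    # A value near 1.0 suggests response time degrades as requests queue up,
    # pointing the root-cause analysis at whatever is causing the queuing.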

For tuning and optimization projects, you will commonly do this analysis quickly between each test run (as described in Step 4). As you find opportunities for code optimization or system tuning, you will analyze the results, implement the optimizations, and retest to measure the effect.

Step 6. Create Test Reports

Once you have completed all the test execution and analysis of the performance results, you should prepare a report that summarizes and details the entire performance testing effort. This report should be comprehensive and communicate a complete understanding of the test results without requiring additional explanation from the testing team. Save all the graphs and reports used in your analysis and include them in the report as evidence for your recommendations and findings.

Good performance test reports include the following components:

Additional Resources

<<TBD>>