Generally speaking, storage performance testing (SPT) is the art and science of determining the operating characteristics of a storage system. A storage product is placed under actual operating conditions, or conditions are simulated as closely as possible within a controlled environment, and a variety of operating parameters are measured and reported. Typical operating parameters include data throughput and I/Os per second (IOps). In many cases, performance is tested across a variety of workload patterns and intensity levels to gauge the system's response against a range of conditions. For example, storage system performance can be evaluated with read-intensive, write-intensive or mixed workloads.
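A mixed-workload test of this kind might be expressed as a job file for an open-source load generator such as fio. The specific parameter values below are illustrative only, not a recommended test profile:

```ini
; Illustrative fio job file: a mixed random workload of 70% reads and
; 30% writes in 4 KB blocks, run for 60 seconds. All values are examples.
[global]
ioengine=libaio       ; asynchronous I/O engine on Linux
direct=1              ; bypass the page cache to measure the device itself
runtime=60
time_based=1

[mixed-workload]
rw=randrw             ; mixed random read/write pattern
rwmixread=70          ; 70% of I/Os are reads, 30% are writes
bs=4k                 ; 4 KB block size, typical of transactional I/O
iodepth=32            ; outstanding I/Os per job (workload intensity)
size=1g
filename=/tmp/fio-testfile
```

Varying `rwmixread`, `bs` and `iodepth` across runs is one simple way to sweep the read/write mix and intensity levels described above, with fio reporting the resulting IOps and throughput for each run.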
"Storage performance testing matters during the purchasing cycle and during tuning and troubleshooting efforts," says Brian Garrett, industry analyst with the Enterprise Strategy Group. Suppose that a new storage system is being considered for an environment with 200 users, but another 200 users may be added over the next two years. SPT could be used to simulate the workload expected from up to 400 users. Or, it could be used to investigate the maximum number of users that a storage system will adequately support before performance degrades to an unacceptable level. Test results can provide an early indication of that system's future performance. Analysts note that SPT can be used to "break" a storage system or find its practical limitations before making a purchase.
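The sizing question in the 200-to-400-user scenario comes down to simple arithmetic once test results are in hand. The sketch below shows the idea; the per-user I/O load, the array's IOps ceiling and the safety margin are all hypothetical figures, not measurements:

```python
# Capacity-headroom estimate from hypothetical SPT results.
# All three constants below are illustrative assumptions.

PER_USER_IOPS = 25          # assumed average I/O load generated per user
ARRAY_MAX_IOPS = 12_000     # assumed sustainable IOps ceiling of the array
DEGRADATION_MARGIN = 0.80   # keep 20% headroom before response times suffer


def max_supported_users(array_max_iops: float,
                        per_user_iops: float,
                        margin: float) -> int:
    """Largest user count the array supports within the safety margin."""
    return int(array_max_iops * margin // per_user_iops)


projected_users = 400

limit = max_supported_users(ARRAY_MAX_IOPS, PER_USER_IOPS, DEGRADATION_MARGIN)
print(f"Supports up to {limit} users within the margin")
print(f"Projected {projected_users}-user load: "
      f"{projected_users * PER_USER_IOPS} IOps "
      f"of {ARRAY_MAX_IOPS} available")
```

With these assumed numbers the array tops out at 384 users within its margin, just short of the projected 400 -- exactly the kind of shortfall that testing is meant to surface before a purchase rather than after.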
SPT also has an important role when conducting an upgrade or change analysis. An administrator may need to know how the performance of a storage system will be affected when a hardware device is changed or a software version is upgraded. SPT can be applied to almost any "what if" scenario, allowing an IT team to evaluate any benefit, or impact, from an upgrade or new product before a rollout is ever executed. That said, storage performance testing is not an everyday activity; it should be performed only when specific needs dictate.
As a general rule, SPT cannot, and should not, be performed on a production network. Not only would this unnecessarily put production data at risk, but the results would be skewed by normal network activity, which in turn would be adversely impacted by the added testing workload. SPT should be approached as a lab activity where equipment and software can be allocated for testing purposes -- a requirement that many organizations are simply not prepared to handle. "It's just a big endeavor to create something that's real world and can emulate the problems you're having in a benchmark environment," Garrett says. "For the large majority of storage users, they just don't have the resources or the time to be thinking about this stuff."
Some organizations turn to third-party testing services when there simply isn't enough space, equipment or available talent to tackle a testing project in-house. Independent testing facilities, such as Diogenes Analytical Laboratories Inc. or Lionbridge Technologies Inc., are usually quite good at designing experiments, employing a variety of available test tools and analyzing the volumes of test results that are generated. Outside firms also bring a level of expertise to the test process that may be absent at the user's own site. For example, a data center with significant EMC Corp. system experience may have trouble running accurate comparative tests on a storage array from Hitachi Data Systems Inc. (HDS) simply because there isn't enough familiarity with the new product. By comparison, an outside testing house can usually bring a broad range of expertise across various hardware and software platforms.
Ideally, an independent test center should be able to avoid the "politics of preference" and report accurate data regardless of the system being tested. However, some vendors engage test facilities for comparative tests that can sometimes lead to bias -- potentially compromising the test center's reputation. "Vendors will hire a particular lab to run a series of tests, and those tests may or may not be biased," Schulz says. "There's a lot of suspicion whenever a lab comes out and declares a particular product as being the best." Vendor-sponsored testing can generate a lot of skepticism regarding the results.
As with in-house testing, outside testing is not something to be undertaken lightly. A testing cycle can take anywhere from several weeks to several months, depending on the complexity of the individual project. The costs for outside testing can also vary widely, ranging from about $15,000 to several hundred thousand dollars per engagement. Actual costs are influenced by the amount of labor, hardware needs and analytical detail required by the client.
Reliance on vendors and the SPC
In actual practice, the commitment of labor, time and equipment may be too much for an organization to bear, and there simply may not be enough capital in the IT budget to support testing from an outside organization. When this is the case, storage administrators often turn to test results from the Storage Performance Council (SPC) for general performance data about prospective products.
Industry governing bodies like the SPC do not actually perform testing, but they develop standardized test suites that vendors can use to perform their own testing. Those results are then audited by the SPC and posted to the SPC Web site for easy reference. Thus, standardized test results may be readily available for storage products currently under consideration. As you might expect, this system has its limitations. Vendors are not obliged to submit their results, so the results that do appear are often "best case" results for individual products. Some of the most notable names in the industry, including EMC, do not participate in SPC testing.
Ultimately, most storage administrators forgo the investment of performance testing and rely on the individual product vendor for testing results, best practice guidelines and configuration assistance within their own environment. "For small and midsized companies, they probably go to their system or storage providers and have them [vendors] help solve the problem because they have the resources and they've already been through the sizing issues," Garrett says.
This was first published in March 2006