The Rise of Storage Benchmarks

As new storage technologies gain acceptance, a method to make valid performance comparisons is crucial.

Elliot King

December 12, 2004

Last month, Datacore--a 7-year-old storage software company specializing in storage control, management, and consolidation--released what it described as the first results for the Storage Performance Council's SPC-1 benchmark involving the iSCSI standard for storage connectivity. According to the company, the results indicate that, compared with its own Fibre Channel result, the iSCSI configuration achieved about half the performance at about half the price. Moreover, the company argued, the absolute performance of the iSCSI configuration compared favorably with many "name brand" Fibre Channel arrays at a fraction of their cost.

The Datacore announcement raises several interesting questions for storage administrators. Most important, it might bring increased attention to the use of benchmarks in the storage community. If benchmarks become more widely used and understood, they could help smaller companies crack the tier-1 storage arena.

The industry considers the Storage Performance Council's SPC-1 benchmark to be the only objective test for evaluating enterprise storage subsystems. Announced after 4 years of development at the Computer Measurement Group conference in December 2001 and made available the following year, the benchmark gives vendors, Value Added Resellers (VARs), and IT managers a standardized, vendor-neutral methodology for measuring and comparing the performance of storage subsystems. The need for a standard method of comparison emerged as storage networks became more pervasive: storage networks comprise many different components, and the performance of any single component can't predict the end-to-end performance of the entire system.

The benchmarks were modeled after the Transaction Processing Performance Council (TPC) benchmarks, which measure the performance of database-centric information infrastructures. The test generates two baseline statistics. The first is I/O operations per second (IOPS), a raw performance measure that gauges the workload a storage infrastructure can manage and its ability to keep up with demand during peak periods. The second is Least Response Time (LRT), which measures the time required to complete tasks (e.g., backup and restore operations, rebuilding large databases) that consist of a series of I/O requests.
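To make those two summary statistics concrete, here's a minimal sketch--illustrative Python only, not the SPC test kit--that derives throughput and average response time from an I/O trace. The IORecord structure and the trace values are invented for this example.

```python
from dataclasses import dataclass

@dataclass
class IORecord:
    start: float  # request issue time, in seconds since the test began
    end: float    # request completion time, in seconds since the test began

def iops(trace, duration):
    """Completed I/O operations per second over the measured interval."""
    return len(trace) / duration

def avg_response_time(trace):
    """Mean per-request latency, in seconds."""
    return sum(r.end - r.start for r in trace) / len(trace)

# Hypothetical 10-second trace containing three completed requests.
trace = [IORecord(0.000, 0.004), IORecord(0.100, 0.105), IORecord(0.200, 0.212)]
print(f"{iops(trace, 10.0):.1f} IOPS, "
      f"{avg_response_time(trace) * 1000:.1f} ms average response time")
```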

As with the TPC benchmarks, the performance statistics can be compared according to the capacity of the system as well as the cost. In this way, end users can conduct apples-to-apples comparisons when evaluating storage infrastructures.
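As a back-of-the-envelope illustration of that kind of comparison, the sketch below divides total system price by measured IOPS. All figures are hypothetical--they are not Datacore's published numbers--and are chosen so that "half the performance at half the price" works out to the same cost per IOPS.

```python
# Hypothetical prices and IOPS figures, invented to show the arithmetic.
systems = {
    "iSCSI configuration": {"price_usd": 49_000, "iops": 25_000},
    "Fibre Channel configuration": {"price_usd": 98_000, "iops": 50_000},
}

for name, s in systems.items():
    ratio = s["price_usd"] / s["iops"]
    print(f"{name}: ${ratio:.2f} per IOPS")  # both work out to $1.96 per IOPS
```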

Although establishing the SPC benchmark tests has been a long and arduous process, the tests seem to be gaining momentum. Over the past year, 16 tests have been submitted for review. (Storage vendors conduct the benchmark tests themselves; the results are then independently audited before the Storage Performance Council accepts them for publication.) Major storage heavyweights--including Fujitsu, HP, IBM, StorageTek, and Sun--have published SPC-1 benchmark results this year. However, industry leader EMC hasn't yet posted any benchmark results.

According to Ziya Aral, chief technology officer and chairman of Datacore, the use of storage benchmarking has until now faced several challenges. The testing for the standard, he noted, wasn't trivial. He also argued that the concept of storage as a separate infrastructure that needs to be tested independently is only now gaining widespread attention. Finally, except for enterprises at the very high end, knowledge about storage benchmarks hasn't percolated widely through the storage community.

These days, the storage infrastructure represents an increasingly significant portion of the total IT environment. And while computer processing power roughly doubles every 18 months, storage performance isn't increasing nearly as fast. Consequently, said Aral, the storage infrastructure can become a performance bottleneck.
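A quick compound-growth calculation shows why that gap matters. The 18-month doubling figure comes from the paragraph above; the 10 percent annual storage-performance improvement is purely an assumed number for illustration.

```python
# Compute power: doubles every 18 months (per the article).
# Storage performance: assumed 10% annual improvement, for illustration only.
years = 6
compute_factor = 2 ** (years / 1.5)   # 2^4 = 16x over six years
storage_factor = 1.10 ** years        # about 1.8x over the same span

print(f"After {years} years: compute x{compute_factor:.0f}, "
      f"storage x{storage_factor:.1f}")
```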

The Datacore benchmark, Aral said, makes a compelling price/performance case for the use of iSCSI SANs. He noted that the test was completed on a network that uses true commodity components. Having the benchmark results, he added, gives smaller companies such as his more credibility as they introduce innovative technologies into arenas dominated by much larger companies.

Benchmarks, of course, aren't a panacea. As veterans of the TPC benchmarking wars know, the test environment is unlikely to match any individual company's production environment. In the database world, benchmarks should be the first step in the evaluation process, not the only or final step. And while the Storage Performance Council is still working on SPC-2, the TPC has already developed benchmarks for a wide array of application environments. So, there's still work to be done in storage.

Nonetheless, the use of industry-accepted benchmarks could be an important step forward. As new storage technologies gain acceptance, a method to make valid performance comparisons is crucial.
