Calculating TCO for NAS and SAN Solutions

Read why determining the total cost of ownership (TCO) for mission-critical systems, such as those that form the storage infrastructure in data-intensive operations, is particularly important.

Elliot King

January 13, 2002


For 2001, total cost of ownership (TCO) was the runner-up in the buzzword-of-the-year contest (Return on Investment—ROI—was the winner). The TCO metric represents an attempt to anticipate and measure the complete cost of using a specific technical solution over the entire life of its implementation. The TCO for a storage solution includes the initial cost of hardware and software plus the upgrades that a business will need to purchase over time; implementation costs, which include both installation and training; support, operation, and maintenance costs; and, significantly, the business cost of system downtime. Determining the TCO for mission-critical systems, such as those that form the storage infrastructure in data-intensive operations, is particularly important.
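The cost categories listed above can be summed in a simple model. The following is a minimal sketch of that calculation; the function name and all dollar figures are hypothetical placeholders, not data from the INPUT study.

```python
def storage_tco(hardware_software, upgrades, implementation,
                support_ops_maintenance, downtime_hours,
                downtime_cost_per_hour):
    """Sum the TCO cost categories over the life of the implementation."""
    # Downtime is converted to a business cost via an hourly rate.
    downtime_cost = downtime_hours * downtime_cost_per_hour
    return (hardware_software + upgrades + implementation
            + support_ops_maintenance + downtime_cost)

# Example with purely illustrative numbers (USD):
tco = storage_tco(hardware_software=250_000, upgrades=60_000,
                  implementation=40_000, support_ops_maintenance=120_000,
                  downtime_hours=44, downtime_cost_per_hour=2_000)
print(tco)  # 558000
```

The point of the model is that the first argument, the initial purchase, is only one of several terms; the downtime term can dominate as the hourly business cost rises.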

In a recently released study, INPUT, a leading provider of Web-based e-business market research and marketing services, found that on a per-gigabyte basis, Network Attached Storage (NAS) installations using equipment from Network Appliance provided a 60 percent to 80 percent lower TCO over the lifecycle of the implementation compared to Storage Area Network (SAN) solutions from Compaq, EMC, and Hitachi Data Systems (HDS).

INPUT based its results on a survey of users who employed each product in real-world operating environments, generally as a storage solution associated with an Oracle database environment. The study focused on data availability in addition to system availability: The file system and the application server needed to be operating along with the disk subsystem to qualify for the study's performance statistics. The data-storage requirements in each setting were roughly equal.

Several factors drove NAS's lower TCO. Perhaps the most significant metric, however, was data availability. According to INPUT, 67 percent of the Network Appliance installations achieved a data-availability service level of 99.5 percent or more. Only 25 percent of the HDS installations and 19 percent of the Compaq installations reached that level. None of the respondents using EMC technology reported a 99.5 percent level of data availability. In fact, in an earlier INPUT study, only 67 percent of the respondents using EMC technology achieved the 99 percent mark.

Even the 99.5 percent mark for data availability means that significant downtime exists. In Web-centric environments that are expected to be up and running 24 hours a day, 7 days a week, 99.5 percent implies nearly 44 hours of downtime, equal to more than a full standard workweek. The 99 percent availability mark implies more than two workweeks of downtime. Such loss of data availability can range from being merely an inconvenience to significantly damaging productivity. Moreover, as enterprises integrate their information systems with the systems of their suppliers and customers, the need for robust data availability grows.
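The downtime figures quoted above follow directly from the availability percentages. A short calculation, assuming a 24 x 7 operation over a full year, reproduces them:

```python
HOURS_PER_YEAR = 24 * 365  # 8,760 hours for a 24 x 7 operation

def annual_downtime_hours(availability_pct):
    """Hours per year a system is unavailable at a given availability level."""
    return HOURS_PER_YEAR * (1 - availability_pct / 100)

print(round(annual_downtime_hours(99.5), 1))  # 43.8 -- nearly 44 hours
print(round(annual_downtime_hours(99.0), 1))  # 87.6 -- more than two 40-hour workweeks
```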

The gap in the data-availability service-level marks for SAN and NAS has several causes. According to the study, the NAS implementations required less downtime as the systems were scaled up. Moreover, restoring a failed system took about one-third as much time in the NAS environment as it did in the SAN environment.

A second key factor driving TCO involves IT staff use. According to the INPUT study, administrators using SAN solutions spent 40 percent to 50 percent of their time performing routine tasks, compared with the 11 percent that NAS administrators spent. At the low end, SAN administrators reported that they spent about 35 percent of their time on backup and recovery routines and 5 percent of their time on version changes. In contrast, NAS administrators spent only about 11 percent of their time on routine activities and spent the rest of their time performing value-added tasks such as performance tuning and architectural improvements.
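The staff-time gap can be put in concrete terms. The percentages below come from the study; the annual-hours figure is an assumed value for a full-time administrator, used only for illustration:

```python
ANNUAL_WORK_HOURS = 2000  # assumed full-time hours per administrator

def routine_hours(routine_fraction):
    """Hours per year spent on routine tasks at a given time fraction."""
    return ANNUAL_WORK_HOURS * routine_fraction

san_low  = routine_hours(0.40)  # 800 hours -- low end for SAN admins
san_high = routine_hours(0.50)  # 1000 hours -- high end for SAN admins
nas      = routine_hours(0.11)  # 220 hours for NAS admins
print(san_low - nas)  # 580.0 hours/year freed for value-added work
```

Even at the low end of the SAN range, the difference is hundreds of hours per administrator per year that could go to tuning and architectural work instead.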

The implications of this study are clear. In calculating true TCO, the initial investment in hardware and software is only the first step of the process. Over the lifecycle of a storage implementation, the usage factors that drive the cost are data availability and operational factors, such as the ease with which administrators can explore value-added functions.

