Just What Is TCO, Anyway?

You've been hearing a lot about TCO lately, but do you really know what it is? Here's an examination of TCO: where it comes from, how to calculate it, and how important it can be to your enterprise.

Mark Minasi

June 30, 1998


Understand what it means, then put it to work in your organization

Buzzwords fly into and out of the language of computing all the time, but only a few stay around to influence the way we buy--and the way vendors sell--computers and computer software. One buzzword that seems destined for longevity is total cost of ownership (TCO). Used well, TCO is a terrific tool for determining how to get the most computing for the least money. Used poorly, TCO is a high-tech way for vendors to prove that, yes, your first impression was right--a big shiny nickel is clearly better than a dinky little dime, so why don't you take this big shiny nickel and we'll get rid of that piddly dime for you?

TCO has made headlines for two reasons. First, the idea of examining relative costs makes sense. Second, Microsoft competitors Sun Microsystems, Oracle, and Netscape attempt to use TCO to demonstrate that Microsoft solutions are not cost-effective. This strategy forces Microsoft to use TCO to counter that its competitors' products are not cost-effective. Careful examination reveals that none of these industry giants computes TCO in the same way or with completely objective criteria, but at least they've brought an important issue to center stage. Understanding TCO is easier if you understand where TCO comes from, what the two most vocal players--Microsoft and Sun--are claiming about TCO, and why accurately computing TCO figures is so hard.


TCO's Story: PCs Cost Too Much
Here's the foundation of TCO: PCs cost too much. That claim might seem obvious today, but it wasn't always. Back in the late 1970s, when the first appliance microcomputers (the ones you didn't have to solder to make work) appeared, the machines were a godsend for scientific and technical users. In those days, as an econometrician (an economist who analyzes economic statistics and builds models that help people understand economic markets), I couldn't compute the simplest statistics on a pile of numbers without having to submit my calculations to the local mainframe as a batch job--a job whose processing might take hours. The advent of simple microcomputers meant I could piece together my own BASIC programs and get my statistics in minutes instead of hours. So of course I wanted a desktop microcomputer.

How did I justify asking my company to buy one of these computers? Cost. Mainframes cost millions of dollars, I argued, but a decent basic desktop microcomputer was available for about $3000, which was less than 10 percent of my salary. If I completed three times as many statistical jobs in a day, wouldn't I be more productive? At the time, productivity was a buzzword, so I got the computer. I was arguing that although buying a desktop computer would cost money, not buying the computer would mean I would waste time, which also costs money; therefore, I'd lower total costs if I had a desktop computer. The mainframe guys didn't like my having a computer, but the fact that a desktop microcomputer delivered a far better response time than their mainframe delivered carried my argument.

In the mid-1980s, nearly every private firm and government organization looked hard at whether to buy PCs. Nearly all of these groups decided to buy--not because of cost considerations but to facilitate office automation. There wasn't a lot of software to run on the computers of the late 1970s, so many companies didn't buy computers in quantity. With the flood of word processing, spreadsheet, and database programs that appeared in the mid-1980s, all of a sudden the mainframe guys faced another argument: Mainframes simply couldn't do things such as word processing and creating spreadsheets, and graphical computing of any kind on a mainframe system was cost-prohibitive. In this context, PCs weren't competition for mainframes but for calculators and typewriters. The initial cost of PCs was more than that of calculators or typewriters, but the productivity argument won again: PCs were cheap compared with the time spent by the people who used them.

Today, nearly 25 years after the first Altair 8080-based microcomputers appeared, PCs are a fixture of home and enterprise life, and US business probably couldn't exist without them. The question today is, should we keep to the path of incessant hardware and software upgrades? Or to put it simply, should we keep throwing money at desktop computers?

Many PC proponents argue that PCs increase productivity. However, documenting such increases is hard. For example, from the end of World War II to 1974, US nonfarm productivity rose over 3.4 percent per year. But from 1974 (the year the Altair was introduced) to today, annual productivity growth has averaged only about 1.2 percent. It's hard to argue that PCs have increased productivity if you use annual productivity growth figures as evidence. It's even harder to argue that PCs increase productivity if you analyze computing costs now that computing's focus has shifted from centralized mainframes to decentralized PCs and LANs. For example, consider that in the 1960s, the average Fortune 500 company spent about 1 percent of its gross income on computing; today that company spends about 3 percent of its gross income on computing.

One of the fastest-growing areas of employment is in jobs with titles such as PC support, network specialist, NT administrator, and email administrator. All these people cost something, even though their jobs barely existed 10 years ago. Support folks aren't the only PC-related cost component. Sure, we probably save time by using word processors, spreadsheets, and email. But stack those gains against entire days spent installing new versions of operating systems (OSs) and applications, and hours re-creating lost work, figuring out software upgrades, surfing the Web, and--let's face it--playing Solitaire. Add to the purchase price of PCs the salaries and other costs of the people who keep the networks and PCs running, and you can see why some feel that PCs are less than a productivity panacea.

Is there a solution to the problem of time wasted when employees fiddle with their computers? An employer can decide not to upgrade OS or application software, or to restrict upgrades to once every 5 years. That employer can also ban computer games and formulate a well-enforced company Web usage oversight policy. To cut down on time lost to tweaking and configuration, the employer can simply standardize a set of applications, create a tested configuration using those applications, and then drop the configuration on every desktop, forbidding installation of new applications.

Why don't most companies take these measures? First, most companies are afraid that if they don't upgrade to the latest software, they'll miss out on an important competitive advantage. Second, if employers lock down users' desktops, they face political fallout, because employees see the PCs on their desks as private--like the inside of a wallet or purse--areas an employer has no business controlling. However, employers must look at the value of their PC investment and ask whether constant upgrades are necessary. At what point does the value of standardization outweigh the value of innovation? Users of new technologies have asked these questions, and sooner or later they choose standardization.

The problem of standardization vs. innovation brings us to the real reason why Microsoft, Sun, and others are talking so much about TCO: By developing products that lower TCO, these companies are postponing the day of reckoning when companies say no more upgrades. Behind every ease-of-use feature that's designed to make users' lives happier--graphical interfaces, networks, application integration--lies a structure that's more difficult to support than its predecessor. Which would you rather do: rebuild a config.sys file from memory, or rebuild a Registry?

Sun's Answer
Sun, realizing that Microsoft platforms are popular and powerful but increasingly difficult to support, started the whole TCO discussion with the network computer (NC). In July 1996, Sun, IBM, Apple, Netscape, and Oracle jointly published a reference document describing an NC, with the intention of making this machine standard, cheap, and useful. A basic NC is a computer with a 640 x 480 graphics screen, a mouse and keyboard, a speaker, and a network connection. Its system software includes TCP/IP and Web and Java client support. The NC is a stripped-down Web browser. But what good is that?

The key is Java. Have you ever accessed a Web site with a Java applet on it? You probably don't know, and that's one of the cool things about Java, according to its supporters. A Web page's ticking clock or spinning logo may well be a small program written in Java--an applet--running on your computer's Java runtime. The delivery of Java applets appeals to support technicians, and Sun wants to exploit that appeal. A developer writes a Java applet and attaches it to a Web page. You look at the Web page with your Java-aware browser, and the browser invisibly downloads and automatically runs the applet. No setup programs. No configuration. No support troubles. Compare that ease of use, NC proponents challenge, to the annoyance of installing Word, Freelance, WordPerfect, and other Windows applications.
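
To make that concrete, here's roughly what the developer's side of the spinning-logo scenario looks like. The class name, message text, and page snippet below are invented for illustration; the mechanism--a class that extends java.applet.Applet, which the browser fetches with the page and runs in its built-in Java runtime--is the standard applet model the article describes.

   // SpinningLogoApplet.java -- an illustrative JDK 1.x-style applet; the class
   // name and message are made up. The browser downloads the compiled .class file
   // along with the page and runs it; the user never sees a setup program.
   import java.applet.Applet;
   import java.awt.Graphics;

   public class SpinningLogoApplet extends Applet {
       // The browser calls paint() whenever the applet's screen area needs redrawing.
       public void paint(Graphics g) {
           g.drawString("Hello from an applet -- no setup program required", 20, 20);
       }
   }

   /* The page author embeds the applet with the period-correct <applet> tag:
      <applet code="SpinningLogoApplet.class" width="300" height="60"></applet> */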

Industry analysts have argued, why not take this Java stuff a step further? Rewrite your favorite applications in Java and put them on a Web server inside your company. Then, when you need to run a word processor, you just point your Web browser to a particular URL and the browser will download and install the word processor. The browser then becomes a kind of OS, or at least an operating environment.

The NC hasn't caught on, for several reasons. Most important, Java hasn't proven to be a significantly better way to build big applications, such as word processing programs. For example, Corel tried to implement its PerfectOffice suite in Java but gave up. The Java applets that download and install so easily are small--hence, the name applet. Writing a program in any language so that it installs unobtrusively is easy if the program is small. A second reason for the NC's slow acceptance is that it doesn't run the Windows applications that are ubiquitous on today's desktops. NCs offer benefits, but the price you pay to use an NC is throwing away all your old applications.

Microsoft's Answer
Microsoft's response to the NC challenge is twofold: more support-friendly software (Zero Administration for Windows--ZAW) and more support-friendly hardware (the Net PC).

ZAW is a combination of software and jawboning Microsoft hopes will accomplish several simple goals. The first goal is to develop PC OSs that make it easier for users to keep all their data on central servers without worrying about whether the network is up at any given moment. The second goal is to make PC OS software easier to install, and PC application software easier to install without user intervention. The third goal is to make PC software self-healing: An erased DLL or two (or even a hundred) shouldn't keep the application from running. The fourth goal is to make PC software easier to uninstall. A final goal is to give support people an easier way to control user desktops from a central location. Microsoft intends to accomplish these goals with technologies such as IntelliMirror, Server Intelligent Storage, Windows-based Terminal Server, a new Installer service, and the Windows Logo Program. (To learn more about ZAW, see "Zero Administration for Windows," December 1997.)
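
Microsoft hasn't published the plumbing behind these goals, so the following is only a conceptual sketch of the third goal, self-healing: before an application launches, verify that the files it needs are present and quietly restore any that are missing from a central master copy. Every file name and path here is hypothetical--this illustrates the idea, not Microsoft's actual Installer service or IntelliMirror.

   // SelfHealCheck.java -- a hypothetical illustration of the "self-healing" idea,
   // not Microsoft's ZAW technology. All file names and paths are invented.
   import java.io.*;

   public class SelfHealCheck {
       // Files the (imaginary) application needs locally, and the central share
       // that holds known-good master copies.
       static final String[] REQUIRED = { "app.exe", "core.dll", "report.dll" };
       static final File LOCAL_DIR  = new File("C:\\Apps\\Example");
       static final File MASTER_DIR = new File("\\\\server\\masters\\Example");

       public static void main(String[] args) throws IOException {
           for (String name : REQUIRED) {
               File local = new File(LOCAL_DIR, name);
               if (!local.exists()) {
                   // An erased DLL shouldn't stop the application: copy it back.
                   copy(new File(MASTER_DIR, name), local);
                   System.out.println("Restored missing file: " + name);
               }
           }
           System.out.println("All required files present -- safe to launch.");
       }

       static void copy(File from, File to) throws IOException {
           try (InputStream in = new FileInputStream(from);
                OutputStream out = new FileOutputStream(to)) {
               byte[] buffer = new byte[8192];
               int n;
               while ((n = in.read(buffer)) > 0) out.write(buffer, 0, n);
           }
       }
   }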

Will ZAW be cheap? The answer depends. Much of the ZAW technology aims to make NT 5.0 workstation support easier; currently, no ZAW technology works on Windows 98. As a result, to get the full benefits of ZAW, you'd have to throw out all your old systems and replace them with NT 5.0. You can keep your old systems only if they can serve as dumb terminals for Terminal Server. With Terminal Server, a central NT machine does the heavy work--the computer on your desk just displays graphical screens and supplies mouse clicks and keystrokes. Presumably, you can get by with a 386 running Windows for Workgroups. Of course, you'd better be ready to invest in some serious hardware back at the Terminal Server.

On the hardware side, Microsoft in collaboration with a raft of hardware manufacturers has offered an alternative to the NC called the Net PC. A Net PC is a PC that's designed to be cheap and centrally controlled--a machine locked so tight that users can't alter its configuration. (You can read about the Net PC on Microsoft's Web site at http://www.microsoft.com/windows/platform/info/netpc-mb2.htm.)

The low-end design for a Net PC is a locked-box computer with a fixed amount of RAM, a hard disk, a mouse, a network card, Universal Serial Bus (USB) ports, and a monitor--no floppy drive, CD-ROM drive, expansion slots, or serial or parallel ports. A Net PC has Dynamic Host Configuration Protocol (DHCP), Trivial File Transfer Protocol (TFTP), and a universal network driver built into its BIOS, all of which let the Net PC get on the network and get noticed before it even needs to access a hard disk. Designed to be a $500 throwaway PC, a Net PC would shine as a Windows terminal machine or as a ZAW client, once Active Directory (AD) servers appear with NT 5.0. The mid-1997 Net PC specifications call for a 133MHz processor and 16MB of RAM, but the cost of speedier processors and RAM has fallen enough that Net PC vendors can probably up the ante to 200MHz and 64MB of RAM. This improvement creates a comfortable configuration for Windows 9x, and an acceptable one for NT 4.0 and possibly NT 5.0.
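
To make that boot sequence a little more concrete: once DHCP has handed the machine an address and the name of a boot file, the firmware fetches that file over TFTP. The sketch below builds the TFTP read request (RRQ) packet defined in RFC 1350 and sends it over UDP; the server address and boot-file name are invented, and a real Net PC would do this in its BIOS, not in Java.

   // TftpReadRequest.java -- a minimal sketch of the TFTP read request (RRQ) a
   // network-boot client sends after DHCP. Server and file name are invented.
   import java.io.ByteArrayOutputStream;
   import java.net.DatagramPacket;
   import java.net.DatagramSocket;
   import java.net.InetAddress;

   public class TftpReadRequest {
       public static void main(String[] args) throws Exception {
           String bootFile = "netpc/boot.img";   // hypothetical boot image name
           String server   = "192.168.1.10";     // hypothetical TFTP server from DHCP

           // RFC 1350 RRQ layout: 2-byte opcode (1), filename, 0, transfer mode, 0.
           ByteArrayOutputStream pkt = new ByteArrayOutputStream();
           pkt.write(0); pkt.write(1);                  // opcode 1 = read request
           pkt.write(bootFile.getBytes("US-ASCII"));
           pkt.write(0);
           pkt.write("octet".getBytes("US-ASCII"));     // binary transfer mode
           pkt.write(0);

           byte[] data = pkt.toByteArray();
           DatagramSocket socket = new DatagramSocket();
           socket.send(new DatagramPacket(data, data.length,
                   InetAddress.getByName(server), 69)); // TFTP listens on UDP port 69
           socket.close();
           System.out.println("Sent RRQ for " + bootFile + " (" + data.length + " bytes)");
       }
   }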

A lot of people scorn the Net PC, but there's much to like about the idea. First, it's built around the familiar Windows and Intel (Wintel) PC architecture, which reduces the hardware-support learning curve. Second, the locked box means that the Net PC is really a toaster--you just take it out of the box and plug it in. Granted, the instant-on part won't work too well until you run Terminal Server or have an AD server, but the whole idea of the Net PC is to build something that will be a good solution in the future. The Net PC is entirely Plug and Play (PnP is easy when the box has no expansion slots), and you can reconfigure it remotely. Even the familiar Press DEL to enter CMOS prompt doesn't appear at startup, because local users can't access the CMOS settings (although administrators can do so across the network).

How They Do the Math
TCO is not about setting up a particular computing configuration in your organization--it's about what that configuration will cost. Although this concept seems simple, it's anything but. Try to figure the costs of running a Wintel ZAW shop. Then, figure the costs of running a Sun-Java-NC shop and see which total is lower. If you attempt this or a similar exercise, you'll quickly find that many of the costs of running a computerized organization are difficult to quantify. To understand why this problem is so complex, let's examine a study Deloitte & Touche conducted for Microsoft and Digital Equipment in October 1997. The study attempted to show that NT was cheaper, from a TCO perspective, than UNIX. (You can examine this study at http://www.microsoft.com/windows/platform/info/nt-unix_tco_study.htm.) This study did not compare UNIX Web servers and NCs to NT running Terminal Server and PCs acting as Windows terminals. Rather, it compared UNIX boxes to NT boxes as technical workstations.

The Deloitte & Touche study defined a simple cost model with four cost components. The first component, direct hardware and software costs, is the easiest to define. The second component covers the costs for maintenance and support staff, a figure that's a bit more difficult to nail down. The third component is what the study's authors call opportunity costs, which are the costs of time that users spend doing their own technology support. Opportunity costs are difficult to quantify, unless you're willing to stand over a user night and day for weeks at a time (which will make the user behave differently). I'll discuss the fourth component, the productivity factor, shortly, but first let's look at how the Deloitte & Touche analysts derived the second and third cost components.
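
Restated as arithmetic, the model is nothing more than a per-seat sum of the four components. The sketch below uses invented placeholder figures, not the study's data; the point is that the bottom line is only as trustworthy as the estimates you feed into it.

   // TcoModel.java -- the study's four-component cost model restated as simple
   // arithmetic. All figures in main() are placeholders, not the study's data.
   public class TcoModel {
       final double hardwareAndSoftware; // direct purchase costs over the period
       final double supportStaff;        // maintenance and support staff costs
       final double opportunityCost;     // user time lost to do-it-yourself support
       final double productivityFactor;  // e.g., the second machine you still need

       TcoModel(double hw, double staff, double opportunity, double productivity) {
           hardwareAndSoftware = hw;
           supportStaff = staff;
           opportunityCost = opportunity;
           productivityFactor = productivity;
       }

       double totalPerSeat() {
           return hardwareAndSoftware + supportStaff + opportunityCost + productivityFactor;
       }

       public static void main(String[] args) {
           // Hypothetical multi-year, per-seat estimates for two unnamed platforms.
           TcoModel platformA = new TcoModel(9_000, 24_000, 33_000, 0);
           TcoModel platformB = new TcoModel(15_000, 20_000, 39_000, 30_000);
           System.out.printf("Platform A: $%,.0f per seat%n", platformA.totalPerSeat());
           System.out.printf("Platform B: $%,.0f per seat%n", platformB.totalPerSeat());
       }
   }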

The study's authors arrived at the maintenance and support figure by "normalizing the support model across the sample population," which sounds as if they simply took maintenance and support staff costs and divided that figure by the number of users. This approach seems reasonable, but it masks an important part of the analysis: What are the fixed and variable costs? For example, Microsoft Premier Support has a very large fixed component. Premier Support for both a 10-person company and a 50-person company costs about $30,000 a year. Thus, Premier Support will appear expensive in a TCO study of a small company. Suppose a Microsoft competitor decides to offer support with a smaller fixed component and a larger per-seat component. In that case, a TCO analysis for a small company might make the imaginary Microsoft competitor's support package look better than Premier Support. However, if you redo the math for a larger company, Premier Support might average out to be cheaper per seat.
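
A quick back-of-the-envelope calculation shows how much company size matters here. The $30,000-a-year Premier Support figure is the one quoted above; the competing plan ($500 base plus $900 per seat) is purely imaginary, invented only to show where the crossover falls.

   // SupportCostCrossover.java -- fixed-fee vs. per-seat support pricing.
   // The $30,000/year flat fee is from the article; the competing plan is invented.
   public class SupportCostCrossover {
       public static void main(String[] args) {
           int[] companySizes = { 10, 50, 250 };
           for (int seats : companySizes) {
               double flatFeePerSeat = 30_000.0 / seats;                 // Premier-style contract
               double perSeatPlan    = (500.0 + 900.0 * seats) / seats;  // imaginary rival plan
               System.out.printf("%4d seats: flat fee $%,7.0f/seat   per-seat plan $%,7.0f/seat%n",
                       seats, flatFeePerSeat, perSeatPlan);
           }
           // Around 33 seats, the flat-fee contract becomes the cheaper option per seat.
       }
   }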

Of the study's first three components, the hardest one to quantify is the opportunity cost: How many hours a day do you spend tinkering with your PC, trying to make it work? The study's authors designed a user questionnaire to find out. Not surprisingly, the numbers the analysts came up with demonstrated that NT was cheaper to support than UNIX. On average, the study put an NT workstation's total 3-year cost at $66,000 and a UNIX workstation's at $104,000. Sounds pretty impressive. Then you learn about the productivity effect.

The study's first three cost components showed relative costs of $66,000 for NT and $74,000 for UNIX. Not content with merely beating UNIX in three components, however, the study's authors threw in a fourth component: the productivity effect. The study analysts argued that if you have a UNIX box on your desk, you'll need to get a PC anyway for email, word processing, and other applications. The study therefore added an extra $10,000--the per-year cost of a basic networked PC--to each UNIX desktop, which, if you read it as $10,000 in each of the study's 3 years, lifts the UNIX figure from $74,000 to the $104,000 quoted earlier. (The study doesn't specify whether any of the cost flows were discounted to reflect their changing values across time.) All of a sudden, UNIX is a lot more expensive.
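
For what it's worth, the quoted figures reconcile only under that 3-year reading of the $10,000 adjustment--an assumption on my part, since the study doesn't spell it out:

   // ProductivityEffect.java -- reconciling the article's quoted figures. The
   // 3-year reading of the $10,000 adjustment is an assumption, not the study's wording.
   public class ProductivityEffect {
       public static void main(String[] args) {
           double ntThreeComponents   = 66_000;  // NT, first three components
           double unixThreeComponents = 74_000;  // UNIX, first three components
           double extraPcPerYear      = 10_000;  // the PC a UNIX user still needs
           int years = 3;                        // assumed study horizon

           double unixTotal = unixThreeComponents + years * extraPcPerYear;
           System.out.printf("NT total:   $%,.0f%n", ntThreeComponents);  // prints $66,000
           System.out.printf("UNIX total: $%,.0f%n", unixTotal);          // prints $104,000
       }
   }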

Two factors spring to mind that, if incorporated in the study, might have swung the results in either direction. First, what about the cost of user training? The authors of the study might have incorporated user training costs in the opportunity cost component, but the study doesn't make this factor clear. Second, which OS is harder to train people to use? Anyone who's mastered Solitaire is close to being proficient with NT, which leads me to guess that training someone to use NT is cheaper than training someone to use UNIX. And what about the cost of support people? Is finding UNIX support people easier than finding NT support people? I have no idea what UNIX support folks cost or how plentiful they are, but NT support people are both scarce and expensive.

You can sympathize with the people commissioned to do the study, but sympathy shouldn't stop you from suspecting the accuracy of the results. Some cost values in the study are unquestionably shaky, which makes the results just as shaky. The study's methodology leads to a difficult question: If there's no objective way to measure the costs of ownership, is it ethical to publish TCO conclusions based on cost estimates? Not every IS manager will stop and analyze the Deloitte & Touche study's methodology. Some IS managers, forced by the pressure of time, will simply opt for NT over UNIX. That choice might be the right answer for them, but they would have arrived at the answer for the wrong reason.

The Truth About TCO
After you've run the numbers and hashed and rehashed the analyses, you find that no independent and unbiased TCO study can prove that a particular computing alternative based on modern popular computing platforms is radically cheaper or more expensive than any other alternative. That conclusion rests on simple economics--in particular, simple computer economics. For instance, UNIX boxes are a lot cheaper now than they were 10 years ago. Why? Simple: PCs appeared and forced down UNIX costs. The only other alternatives for UNIX would have been extinction or a retreat into a specific niche in the hope that the PC wouldn't follow. Computer manufacturers who didn't offer lower-cost computing aren't around now to register entries in the desktop TCO sweepstakes.

Microsoft recently trumpeted a GartnerGroup report, "TCO: New Technologies, New Benchmarks." The report compared NCs, Windows terminals, and Windows PCs, and proved that Windows PCs are cheaper than NCs. However, a closer look at that report reveals that one of the subjects of the report was a PC with the proper set of system policies (policies restricting it to run only Internet Explorer--IE--and no other applications), which cost $6469 a year. That PC's competitor in the report, the NC, cost $6547 a year. The cost difference between the Windows PC and the NC was a grand total of $78 per user, per year. The level of uncertainty in quantifying many cost components renders a difference of $78 meaningless. If one computing alternative were radically cheaper than the others, we'd be using it, not discussing it.

So is TCO bogus? Not at all. It's high time we stopped blindly buying the latest hardware and software just because Microsoft, Lotus, Sun, or Netscape tells us, "It's newer, so it must be better--and besides, you'd better buy it because we're going to stop supporting the old version." It's high time we did some dollars-and-cents examination of what we're getting for our computing dollars. And it's high time OS and applications manufacturers focused not only on the needs of end users and programmers, but on the needs of support staff as well.
