Running Them Up the Flagpole

Performance testing of Web server packages.

Joel Sloss, T.J. Harty

August 31, 1996

Performance testing these Web server packages was a little simpler, logistically speaking, than analyzing their user-appreciation attributes (although we had problems with some products not working with our test). We designed a test that represented medium traffic on a typical Web server. Such a system serves HTML pages, text files, and images, rather than supporting video streaming, CGI programs, and so forth. This approach made the tests easier and faster to run and represents how some people will use the packages. Our test characterizes average performance for each server package and demonstrates differences in the software.

We ported Silicon Graphics' WebStone 2.0 to Windows NT, tweaked it a little, and ran it on a four-system setup in the Windows NT Magazine Lab, all on an isolated 10BaseT Ethernet network. The setup consisted of a Primary Domain Controller (PDC), the Web server, and two client systems dividing the simulated user access load. We used an NEC RISC-Server 2200 as the PDC, configured with a single 200MHz MIPS R4400, 64MB of RAM, and a 2GB SCSI disk (look for a review of this server in the October issue). The PDC ran NT 4.0 beta 2 with DNS.

The Web server (on which we ran all the Web software packages) was an Intergraph InterServe Web-300, with a 150MHz Pentium Pro CPU, 64MB of RAM, and two 1GB fast SCSI-2 drives (for a review of this server, see Joel Sloss, "Serving with Style," on page 45). This system ran NT Server 3.51 with Service Pack 4.

A Micron Millennia and a Canon Innova Pro 5400ST were the clients. We configured each system with NT Workstation 3.51 and Service Pack 4, and installed Remote Shell and Executive services.

We configured the tests with runs of 20, 25, 30, and 35 client sessions distributed between the two workstation systems. Each run lasted 10 minutes, and we repeated each run three times. The clients retrieved a 5120-byte file (50% of the time), a 500-byte file (35% of the time), and a 51,200-byte file (15% of the time) from the Web server. Each test took two hours to complete.
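If you want to approximate this workload on your own server, the following sketch simulates a single client session fetching files with roughly the same 50/35/15 mix. It is not part of our WebStone harness; the host name, file paths, and run length shown here are hypothetical placeholders.

```python
import random
import time
import urllib.request

# Hypothetical workload mix modeled on the test: three file sizes,
# requested 50%, 35%, and 15% of the time, respectively.
FILES = ["/file5120.html", "/file500.html", "/file51200.html"]
WEIGHTS = [0.50, 0.35, 0.15]

BASE_URL = "http://webserver.example.com"   # placeholder server name
RUN_SECONDS = 600                           # one 10-minute run

def run_client_session():
    """Fetch weighted-random files for RUN_SECONDS and report basic stats."""
    requests_made = 0
    bytes_read = 0
    start = time.time()
    while time.time() - start < RUN_SECONDS:
        path = random.choices(FILES, weights=WEIGHTS, k=1)[0]
        with urllib.request.urlopen(BASE_URL + path) as resp:
            bytes_read += len(resp.read())
        requests_made += 1
    elapsed = time.time() - start
    print(f"{requests_made} requests, {bytes_read} bytes in {elapsed:.1f}s")

if __name__ == "__main__":
    run_client_session()
```

Running several such sessions in parallel, split across two client machines, approximates the 20- to 35-session runs described above.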

To ensure that the test ran properly and that no leftover configuration from one package influenced the next, we reinstalled NT Server for each run by booting the Web server to NT Workstation and copying a clean version of Server back to the root directory. This way, we restored the Registry and system files to their original state for each package.
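The restore step amounts to replacing the live system directory with a pristine copy before each package's run. The sketch below illustrates the idea with hypothetical directory names (the article does not give exact paths); in practice, we did this manually after booting into NT Workstation.

```python
import shutil
from pathlib import Path

# Hypothetical locations: a pristine copy of the NT Server installation
# kept on a second partition, and the live system directory to overwrite.
CLEAN_COPY = Path(r"D:\clean\WINNT35")
LIVE_SYSTEM = Path(r"C:\WINNT35")

def restore_clean_server():
    """Replace the live system directory (Registry and system files included)
    with the pristine copy, returning the server to a known state."""
    if LIVE_SYSTEM.exists():
        shutil.rmtree(LIVE_SYSTEM)
    shutil.copytree(CLEAN_COPY, LIVE_SYSTEM)

if __name__ == "__main__":
    restore_clean_server()
```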

We extracted some meaningful results from the test by evaluating several factors for each server package: server connection rate (connections per second), server throughput (Mbits per second, or Mbps), average response time (in seconds), average client throughput (Kbits per second, or Kbps), and total number of pages read. Although we attempted to test all 15 Web servers, only 10 worked within our test criteria. Table A shows the statistical results, and Figure A shows the connection rates for the server packages we tested. We relied on server connection rate as the primary indicator of each package's capacity under this test. The other values show what Web users can expect from your system when you run each package.
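To show roughly how these figures fall out of the raw client data, here is a sketch that computes the same five metrics from a list of per-request records. The record format and names are hypothetical, not WebStone's actual log layout.

```python
from dataclasses import dataclass

@dataclass
class Request:
    client_id: int        # which simulated client issued the request
    bytes_read: int       # size of the page returned
    response_time: float  # seconds from request to last byte

def summarize(requests, run_seconds, num_clients):
    """Compute the five reported metrics from raw request records."""
    total_pages = len(requests)
    total_bytes = sum(r.bytes_read for r in requests)

    connection_rate = total_pages / run_seconds                      # connections/sec
    server_throughput_mbps = (total_bytes * 8) / run_seconds / 1_000_000
    avg_response_time = sum(r.response_time for r in requests) / total_pages
    # Approximate per-client throughput as the aggregate rate split evenly
    # across the simulated clients, expressed in Kbits/sec.
    avg_client_throughput_kbps = server_throughput_mbps * 1000 / num_clients

    return {
        "connection_rate": connection_rate,
        "server_throughput_mbps": server_throughput_mbps,
        "avg_response_time_s": avg_response_time,
        "avg_client_throughput_kbps": avg_client_throughput_kbps,
        "total_pages": total_pages,
    }

# Example: two tiny records over a 10-second window with one client.
print(summarize([Request(1, 5120, 0.08), Request(1, 500, 0.02)], 10, 1))
```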

Microsoft's IIS 2.0 edged out Netscape's SuiteSpot (and FastTrack Server 2.0) and Internet Factory's Commerce Builder Pro 1.51. Tight integration with NT contributed to this winning performance.

Design also had a lot of impact on how each package performed during testing. Software such as IIS 2.0 showed little effect on the Web server's CPU utilization during the test (with barely 40% average utilization), whereas other packages, such as WebSTAR NT/95 from Quarterdeck, maxed out the CPU. Quarterdeck pulled WebSTAR from the market before we finished the review, which is why it was not included in the results.
