Web Server Performance Testing: The Case for Quality

June 20, 1996

Recently published benchmark reports for Netscape's FastTrack Web server and Microsoft's Internet Information Server (IIS) demonstrate the need for quality performance testing and quality result reporting. Mindcraft's policy and practice is to publish test results and test-environment information complete enough for others to verify our work. Our expectation that others do the same has led to the recent defamatory attack on Mindcraft by Haynes & Company and Shiloh Consulting. More about this later.

Quality Testing

Mindcraft, Inc. has been in the software testing business for over 11 years. We have developed millions of lines of portable test-suite code and have provided testing services. Mindcraft's software testing lab is accredited by the National Voluntary Laboratory Accreditation Program (NVLAP), which is part of the U.S. Government's National Institute of Standards and Technology (NIST). Our quality system meets international standards for software testing.

We have run hundreds of conformance certification tests for our clients, who include most of the major vendors of POSIX-conforming computer systems. This testing is an exacting task: system and test-suite configuration parameters must be set correctly and recorded, run-time procedures must be followed consistently, and complete records must be kept. We have never had the validity of one of our certifications questioned.

Mindcraft has brought an equally methodical, high-quality approach to Web server performance testing.

The Controversy

In May we published a benchmark report of Netscape's FastTrack 1.0 Beta 3 Web server. The testing was done under contract to Netscape using the WebStone 1.1 test suite. This report compared our results with those for the Microsoft Internet Information Server published by Haynes & Company and Shiloh Consulting for an unnamed client. Recently Haynes and Shiloh responded to Mindcraft's report with a smear on Mindcraft's reputation in an apparent attempt to discredit our conclusions.

The real issue here is that the Mindcraft report fully discloses all of the information needed to reproduce our test results while the Haynes/Shiloh report does not.

Let's look at the issues Haynes and Shiloh raise in their so-called review of our report.

Different API Code: As we stated in our May report, the Haynes/Shiloh report did not include enough information to allow others to reproduce their tests. Their report did not include the Microsoft ISAPI code they substituted for the nsapi-send.c module of WebStone 1.1. We contacted Shiloh Consulting to ask for the code. They declined to make it available, and they did not respond to our request that they ask their client to release it.

Haynes and Shiloh stated that they used modified API test code but would not reveal their changes. To make sure that our testing would produce results comparable to theirs, we had to try several of the approaches a test consultant might have taken when code must be changed with no clear specification of the required behavior. We therefore ran the unmodified NSAPI code from WebStone 1.1 and, in addition, did several test runs using modified versions of that NSAPI code. We published the results of all of these tests along with the code used.
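
For readers who have not seen the API code in question, the sketch below shows the general shape of an NSAPI service function that returns a requested number of bytes of dynamically generated data, which is the kind of work the WebStone 1.1 API test does. It is illustrative only: it is not the actual nsapi-send.c module, and the function name, the "bytes" parameter, and the header set are assumptions that will vary with the NSAPI version a given server ships with.

    /* Illustrative sketch only -- not the actual WebStone 1.1 nsapi-send.c.
     * An NSAPI Service function that writes a requested number of bytes of
     * dynamically generated data back to the client.  The "bytes" parameter
     * name and the header list are assumptions for this example. */
    #include <stdlib.h>
    #include <stdio.h>
    #include <string.h>
    #include "netsite.h"
    #include "base/pblock.h"     /* pblock_findval, pblock_nvinsert */
    #include "base/session.h"
    #include "base/net.h"        /* net_write */
    #include "frame/req.h"
    #include "frame/protocol.h"  /* protocol_status, protocol_start_response */

    #define CHUNK 8192

    NSAPI_PUBLIC int send_bytes(pblock *pb, Session *sn, Request *rq)
    {
        char buf[CHUNK];
        char clen[32];
        char *arg = pblock_findval("bytes", pb);   /* byte count from obj.conf */
        int  left = arg ? atoi(arg) : CHUNK;

        memset(buf, 'a', sizeof(buf));

        /* Set the status and Content-length headers, then start the response. */
        sprintf(clen, "%d", left);
        pblock_nvinsert("content-length", clen, rq->srvhdrs);
        protocol_status(sn, rq, PROTOCOL_OK, NULL);
        if (protocol_start_response(sn, rq) != REQ_PROCEED)
            return REQ_PROCEED;

        /* Stream the requested number of bytes to the client. */
        while (left > 0) {
            int n = left < CHUNK ? left : CHUNK;
            if (net_write(sn->csd, buf, n) == IO_ERROR)
                return REQ_EXIT;
            left -= n;
        }
        return REQ_PROCEED;
    }

In a FastTrack-style configuration, a function like this would be loaded with an Init load-modules directive and mapped to requests with a Service directive in obj.conf.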

After our report was published, Microsoft sent us ISAPI code that they said was used for the Haynes/Shiloh testing. The code they sent seems to be a straightforward re-casting of the WebStone 1.1 code for a different API.
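
To show what such a re-casting involves, here is a comparable sketch written to the ISAPI interface. Again, this is illustrative only and is not the code Microsoft sent us; the query-string convention for the byte count and the buffer sizes are assumptions made for this example.

    /* Illustrative sketch only -- not the ISAPI code used in the
     * Haynes/Shiloh tests.  A minimal ISAPI extension that writes a
     * requested number of bytes back to the client, roughly paralleling
     * the NSAPI sketch above. */
    #include <windows.h>
    #include <httpext.h>
    #include <stdlib.h>
    #include <stdio.h>
    #include <string.h>

    #define CHUNK 8192

    BOOL WINAPI GetExtensionVersion(HSE_VERSION_INFO *pVer)
    {
        pVer->dwExtensionVersion = MAKELONG(HSE_VERSION_MINOR, HSE_VERSION_MAJOR);
        lstrcpynA(pVer->lpszExtensionDesc, "send-bytes test (sketch)",
                  HSE_MAX_EXT_DLL_NAME_LEN);
        return TRUE;
    }

    DWORD WINAPI HttpExtensionProc(EXTENSION_CONTROL_BLOCK *pECB)
    {
        char  buf[CHUNK];
        char  headers[128];
        DWORD left = pECB->lpszQueryString ? atoi(pECB->lpszQueryString) : CHUNK;
        DWORD n;

        memset(buf, 'a', sizeof(buf));
        sprintf(headers, "Content-Type: text/plain\r\nContent-Length: %lu\r\n\r\n",
                (unsigned long)left);

        /* Send the status line and response headers. */
        pECB->ServerSupportFunction(pECB->ConnID, HSE_REQ_SEND_RESPONSE_HEADER,
                                    "200 OK", NULL, (LPDWORD)headers);

        /* Stream the requested number of bytes to the client. */
        while (left > 0) {
            n = left < CHUNK ? left : CHUNK;
            if (!pECB->WriteClient(pECB->ConnID, buf, &n, 0))
                return HSE_STATUS_ERROR;
            left -= n;
        }
        return HSE_STATUS_SUCCESS;
    }

The point of the comparison is that a straightforward re-casting changes the API plumbing, not the amount of work the server must do for each request.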

Server System Configuration: Again, the Haynes/Shiloh report did not include enough information to allow others to reproduce their tests. The system software configuration was not reported in detail. Since their report, like ours, covered only Web server performance, we configured our test system for optimum server performance by disabling unneeded processes and logging out from the console. To do otherwise would have been to produce numbers that understated the performance of the server under test.

Haynes and Shiloh complain that our comparison was unfair because they ran their tests with various NT services enabled and with other processes running. On this point, the Haynes/Shiloh criticism reflects more on their reporting than on our testing procedures.

Client Hardware: One cannot always use exactly the same test equipment as another test lab, for many reasons. To make sure that our test configuration was comparable to the one used for the Haynes/Shiloh tests, we tested the released version of Microsoft's IIS under the same conditions we used for Netscape's FastTrack. The performance of IIS on our test network was about two percent below the results published in the Haynes/Shiloh report, which we considered a good baseline for comparison. Microsoft's licensing terms for IIS prevented us from publishing these results.

Conclusion

It is vital that published benchmark reports contain enough information for any interested party to reproduce the tests. Keeping that information out of a report is a disservice to the server vendors and to the user community.

 

