
What does SPECweb99 measure?


SPECweb99 measures the maximum number of simultaneous connections, requesting the predefined benchmark workload, that a web server is able to support while still meeting specific throughput and error rate requirements. The connections are made and sustained at a specified maximum bit rate with a maximum segment size intended to more realistically model conditions that will be seen on the Internet during the lifetime of this benchmark.
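To make the idea of a "conforming" connection concrete, here is a minimal Python sketch (not the official SPECweb99 harness or its real limits): it counts how many measured connections kept their sustained bit rate inside a required band over a run. The byte counts, time window, and the 320/400 kbit/s-style bounds are placeholder values for illustration only.

```python
from dataclasses import dataclass

# Placeholder throughput band (bits per second) for this sketch; the real
# benchmark defines its own conformance limits and error-rate rules.
MIN_BPS = 320_000
MAX_BPS = 400_000

@dataclass
class ConnectionStats:
    bytes_received: int
    elapsed_seconds: float

def is_conforming(stats: ConnectionStats) -> bool:
    """A connection 'conforms' if its sustained bit rate stays within the band."""
    bps = stats.bytes_received * 8 / stats.elapsed_seconds
    return MIN_BPS <= bps <= MAX_BPS

def score(connections: list[ConnectionStats]) -> int:
    """The reported metric is the number of simultaneous conforming connections."""
    return sum(is_conforming(c) for c in connections)

# Example: three connections measured over a 60-second window.
conns = [
    ConnectionStats(bytes_received=2_700_000, elapsed_seconds=60.0),  # ~360 kbit/s
    ConnectionStats(bytes_received=1_500_000, elapsed_seconds=60.0),  # ~200 kbit/s (too slow)
    ConnectionStats(bytes_received=2_900_000, elapsed_seconds=60.0),  # ~387 kbit/s
]
print(score(conns))  # -> 2
```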


The key metric for SPECweb99 is simultaneous connections (conns): that is, how many simultaneous conforming web connections (of the specific SPECweb99 “mix”) are sustained over the length of the benchmark. The “synthetic web request mix” is about 70% Static GET, 25% Dynamic GET, and 5% Dynamic POST operations. File sizes retrieved from the web server also vary, from a trivial 100 Bytes up to 1 Million Bytes. The file space used for the “served documents” also grows as the size of the benchmark grows. Efforts are even made, through the use of special random-number distributions, to simulate the effect of a localization of accesses to a subset of the directories. This “localization of access” is a natural effect of some things being more interesting than others! A rough sketch of such a mix generator follows below.

The components stressed by this benchmark are the network adapters (usually multiple Gigabit Ethernet adapters), CPU, and memory. Memory should be large enough to cache the entire fileset being served, or I/O becomes a bottleneck.
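For a rough feel of how such a mix can be produced, here is a small illustrative Python sketch (not the official SPECweb99 load generator): it draws operation types with the 70/25/5 weights mentioned above and skews directory choice toward a “hot” subset to mimic the localization of accesses. The directory count, file-size list, and distribution parameters are assumptions for illustration only.

```python
import random

# Operation mix from the answer above: 70% Static GET, 25% Dynamic GET, 5% Dynamic POST.
OPERATIONS = ["STATIC_GET", "DYNAMIC_GET", "DYNAMIC_POST"]
OP_WEIGHTS = [0.70, 0.25, 0.05]

# Illustrative file sizes (bytes) spanning the 100 B .. 1 MB range mentioned above;
# the real benchmark defines its own file classes and frequencies.
FILE_SIZES = [100, 1_000, 10_000, 100_000, 1_000_000]

NUM_DIRS = 100  # assumed directory count for this sketch

def pick_directory() -> int:
    """Skew accesses toward low-numbered 'hot' directories to mimic the
    localization of accesses described above (heavy-tailed skew)."""
    # random.paretovariate returns a heavy-tailed value >= 1.0
    return min(NUM_DIRS - 1, int(random.paretovariate(1.2)) - 1)

def next_request() -> dict:
    """Draw one synthetic request from the illustrative mix."""
    op = random.choices(OPERATIONS, weights=OP_WEIGHTS, k=1)[0]
    return {
        "op": op,
        "directory": pick_directory(),
        "size_bytes": random.choice(FILE_SIZES),
    }

if __name__ == "__main__":
    for req in (next_request() for _ in range(10)):
        print(req)
```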
