
Beyond FLOPS: The co-evolving world of computer benchmarking

Lamont Wood | Sept. 19, 2014
It used to be simple: Multiply the microprocessor's clock rate by four, and you could measure a computer's computational power in megaFLOPS (millions of floating point operations per second) or gigaFLOPS (billions of FLOPS).
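As a rough sketch of that old rule of thumb (the factor of four assumed a chip could retire four floating-point operations per clock cycle; the clock rates below are purely illustrative, not measurements of any real machine), the arithmetic looks like this:

```python
# Back-of-the-envelope peak estimate behind the old rule of thumb:
# peak FLOPS = clock rate (Hz) x floating-point operations per cycle.
# The figures here are hypothetical, chosen only to show the arithmetic.

def peak_flops(clock_hz: float, flops_per_cycle: int = 4) -> float:
    """Theoretical peak, assuming a fixed number of FLOPs per clock cycle."""
    return clock_hz * flops_per_cycle

print(peak_flops(25e6) / 1e6)    # 100.0 -> a 25 MHz chip rated at 100 megaFLOPS
print(peak_flops(250e6) / 1e9)   # 1.0   -> a 250 MHz chip rated at 1 gigaFLOPS
```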

But for servers and high-performance machines, three names keep coming up: TPC, SPEC and Linpack.

TPC

Formed in 1988, the Transaction Processing Performance Council (TPC) is a non-profit group of IT vendors. It promotes benchmarks that simulate the performance of a system in an enterprise, such as a stock brokerage (the TPC-E benchmark) or a large warehouse (TPC-C). (The newest TPC benchmark measures Big Data systems.) The scores reflect results specific to that benchmark, such as "trade-result transactions per second" in the case of the TPC-E benchmark, rather than raw machine speed.
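The headline score, in other words, is a throughput figure: business transactions completed during a timed run, divided by the length of that run. Below is a minimal sketch of that bookkeeping; the names and numbers are hypothetical and not taken from any TPC specification, which defines the transaction mix and timing rules in far more detail:

```python
# Minimal sketch of a throughput-style benchmark score: count the business
# transactions completed during a measurement window and divide by its length.
# All values here are illustrative, not real TPC results.

from dataclasses import dataclass

@dataclass
class Run:
    completed_transactions: int   # e.g. "trade-result" transactions
    elapsed_seconds: float        # length of the measurement interval

def transactions_per_second(run: Run) -> float:
    return run.completed_transactions / run.elapsed_seconds

run = Run(completed_transactions=1_260_000, elapsed_seconds=600.0)
print(f"{transactions_per_second(run):,.0f} tps")   # 2,100 tps (illustrative)
```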

TPC benchmarks typically require significant amounts of hardware and staff to monitor them, are expensive to set up and may take weeks to run, explains Michael Majdalany, TPC spokesman. Additionally, an independent auditor must certify the results. Consequently, these benchmarking tests are usually carried out by the system manufacturers themselves, he adds.

After results are posted, any other TPC member can challenge the results within 60 days and a technical advisory board will respond, adds Wayne Smith, TPC's general chairman. Most controversies have involved pricing, since benchmarks are often run on machines before the systems — and their prices — are publicly announced, he adds. One that did get some press: In 2009 the TPC reprimanded and fined Oracle $10,000 for advertising benchmarking results that rival IBM complained were not based on audited tests.

The oldest TPC benchmark still in use is the TPC-C for warehouse simulation, going back to the year 2000. Among the more than 350 posted results, scores have varied from 9,112 transactions per minute (using a single-core Pentium-based server in 2001) to more than 30 million (using an Oracle SPARC T3 server with 1,728 cores in 2010). TPC literature says such differences reflect "a truly vast increase in computing power."

The TPC also maintains a list of obsolete benchmarks for reference purposes. Smith recalls that some were rendered obsolete almost overnight. For instance, query times for the TPC-D decision-support benchmark dropped from hours to seconds after database systems began supporting "materialized views," which store the results of frequently used queries as precomputed data objects, he recalls.
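A materialized view, roughly speaking, saves the result of an expensive query as its own table so that later lookups skip the heavy recomputation. The sketch below imitates the idea with Python's built-in sqlite3 module, which has no native materialized views; the schema and data are hypothetical:

```python
# Imitating a materialized view: run an expensive aggregate query once,
# store its output in a table, and answer later queries from that table
# instead of rescanning the base data. Schema and rows are made up.

import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (region TEXT, amount REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?)",
               [("east", 10.0), ("east", 20.0), ("west", 5.0)])

# "Materialize" the aggregate once...
db.execute("CREATE TABLE sales_by_region AS "
           "SELECT region, SUM(amount) AS total FROM orders GROUP BY region")

# ...then later lookups read the small precomputed table, not the raw orders.
print(db.execute("SELECT total FROM sales_by_region WHERE region='east'").fetchone())
# (30.0,)
```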

Smith says that the TPC has decided to move away from massive benchmarks requiring live auditors and toward "express benchmarks" based on kits that vendors can simply download and run, especially for big data and virtualization applications.

"But the process of writing and approving a benchmark is still lengthy, in terms of getting everyone to agree," Smith adds.

SPEC

Also founded in 1988, the Standard Performance Evaluation Corporation (SPEC) is a non-profit corporation that promotes standardized benchmarks, publishes the results and sells the source code needed to run the tests. Currently, SPEC offers benchmarks for the performance of CPUs, graphics systems, Java environments, mail servers, network file servers, Web servers, power consumption, virtualized environments and various aspects of high-performance computing.

 
