Grosch's law

Grosch's law is the following observation of computer performance attributed to Herb Grosch in 1965:

There is a fundamental rule, which I modestly call Grosch's law, giving added economy only as the square root of the increase in speed -- that is, to do a calculation 10 times as cheaply you must do it 100 times as fast.

This adage is more commonly stated as

Computer performance increases as the square of the cost. If computer A costs twice as much as computer B, you should expect computer A to be four times as fast as computer B.[1]

Two years before Grosch's statement, Seymour Cray was quoted in Business Week (August 1963) expressing this very same thought:

Computers should obey a square law -- when the price doubles, you should get at least four times as much speed.[2]

The law can also be interpreted as meaning that computers exhibit economies of scale: the more expensive the computer, the better its price-performance ratio, which under the law improves linearly with cost. This would imply that low-cost computers cannot compete in the market, and that in the end a few huge machines would serve all the world's computing needs. Supposedly, this reasoning might have prompted Thomas J. Watson to predict at the time a total global computing market of five mainframe computers.
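Stated numerically, the adage says speed grows as the square of cost, so price-performance (speed per unit cost) grows linearly with cost, and the cost per calculation falls as cost rises. A minimal sketch of this arithmetic (Python, using an arbitrary proportionality constant purely for illustration):

```python
def grosch_speed(cost, k=1.0):
    """Speed predicted by Grosch's law: speed = k * cost**2.

    k is an arbitrary proportionality constant chosen for illustration;
    the law only concerns ratios, so k cancels in every comparison below.
    """
    return k * cost ** 2

# If computer A costs twice as much as computer B, the law predicts
# A is four times as fast:
speed_ratio = grosch_speed(2.0) / grosch_speed(1.0)
print(speed_ratio)  # 4.0

# Price-performance (speed per unit cost) improves linearly with cost:
print(grosch_speed(2.0) / 2.0, grosch_speed(4.0) / 4.0)  # 2.0 4.0

# Equivalently, cost per calculation ~ cost / speed = 1 / (k * cost),
# so doing a calculation 10 times as cheaply requires a machine with
# 10 times the cost, and hence 100 times the speed, matching Grosch's
# original "square root" phrasing.
```

This is a toy restatement of the law, not a claim about real hardware pricing.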

Debates

The relevance of Grosch's law today is a debated subject. Paul Strassmann asserted in 1997 that Grosch's law is now "thoroughly disproved" and serves "as a reminder that the history of economics of computing has had an abundance of unsupported misperceptions."[3] Grosch himself has stated that the law was more useful in the 1960s and 1970s than it is today. He originally intended the law to be a "means for pricing computing services."[4] Grosch also explained that more sophisticated ways of figuring out costs for computer installations mean that his law has limited applicability for today's IT managers. However, some scholars have recently rehabilitated Grosch's law, looking at the history of cloud computing and claiming that "Grosch was wrong about the cost model of cloud computing, [but] he was correct in his assumption that significant economies of scale and efficiencies could be achieved by relying on massive, centralized data centers rather than an over-reliance on storage in end units."[5]

Law applied to clusters

For clusters, the original Grosch's law would imply that doubling a cluster from 50 machines to 100 (twice the cost) should quadruple its processing power, which on first inspection appears false. Even linear scaling, in which the 100-machine cluster is merely twice as powerful as the 50-machine one, would be a challenge in practice.
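The objection above can be made explicit with a toy calculation, assuming machine count is a proxy for cost (an assumption made only for this sketch):

```python
machines_before, machines_after = 50, 100
cost_ratio = machines_after / machines_before  # 2.0: twice the cost

# Grosch's law would predict power scaling with the square of cost:
grosch_prediction = cost_ratio ** 2  # 4x the processing power

# Ideal linear scaling, where every added machine contributes fully,
# predicts only a doubling, and even that is hard to achieve once
# coordination overhead is accounted for:
linear_prediction = cost_ratio  # 2x

print(grosch_prediction, linear_prediction)  # 4.0 2.0
```

The gap between the two predictions is why the law is argued not to transfer from single processors to clusters.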

However, Grosch's law was formulated for a single CPU, not a cluster: clusters introduce lag times from instruction and workload allocation, software overhead, and physical layout constraints. Grosch's law does not apply to clusters directly, any more than horsepower measured at the engine equals the horsepower delivered to the wheels of a gasoline-engined car.

When Google was deciding on the architecture for its Web search service, it concluded that scaling up clusters of large or medium-sized computers as the business grew would be too expensive, and opted for arrays of cheap processors and disk drives.[6]

References

  1. Lobur, Julia; Null, Linda (2006). The Essentials of Computer Organization and Architecture. Jones & Bartlett. p. 589. ISBN 0-7637-3769-0. Retrieved 2008-04-02.
  2. "Computers get faster than ever". Business Week, 31 August 1963, p. 28.
  3. Strassmann, Paul. "Will big spending on computers guarantee profitability?" Excerpts from The Squandered Computer.
  4. Gardner, W. David. "Author of Grosch's Law Going Strong at 87". TechWeb News, April 12, 2005. Article discussing Grosch's law and Herb Grosch's personal career.
  5. Ryan; Merchant; Falvey (2011). "Regulation of the Cloud in India". Journal of Internet Law 15 (4). SSRN 1941494.
  6. Gilder, George. "The Information Factories". Article on cloudware.