Grosch's law

Grosch's law is the following observation about computer performance, made by Herb Grosch in 1953:

"There is a fundamental rule, which I modestly call Grosch's law, giving added economy only as the square root of the increase in speed -- that is, to do a calculation 10 times as cheaply you must do it 100 times as fast."

The law is more commonly stated as:

Computer performance increases as the square of the cost. If you want to do it twice as cheaply, you have to do it four times faster.
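
Read as a proportionality, and writing P for performance and C for cost (symbols introduced here for illustration, not Grosch's own notation), the two statements above amount to:

    P \propto C^{2}
    \quad\Longleftrightarrow\quad
    C \propto \sqrt{P},
    \qquad\text{hence}\qquad
    \frac{C}{P} \propto \frac{1}{\sqrt{P}}

That is, the cost of a unit of computation falls as the square root of the speed-up: multiplying speed by 100 divides the cost per calculation by \sqrt{100} = 10, matching Grosch's example above, and doubling the economy requires quadrupling the speed, matching the common restatement.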

The law can also be interpreted to mean that computers exhibit economies of scale: bigger computers are more economical. This runs counter to the later trend, driven by Moore's law (the observation that transistor counts double roughly every two years), in which many small, cheap computers became more cost-effective than a few big ones.

The relevance of Grosch's law today is a matter of debate. In his book The Squandered Computer, Paul Strassmann asserts that Grosch's law is now "thoroughly disproved" and serves "as a reminder that the history of economics of computing has had an abundance of unsupported misperceptions."[1] Grosch himself has said that the law was more useful in the 1960s and 1970s than it is today; he originally intended it as a "means for pricing computing services."[2] He has also explained that more sophisticated methods of costing computer installations leave the law with limited applicability for today's IT managers.

References

