Tyranny of numbers

From Wikipedia, the free encyclopedia

Through the 1960s, computer engineers faced the problem of being unable to increase the performance of their designs due to the huge number of components involved. In theory, every component needed to be wired to every other one, and these connections were typically strung and soldered by hand. Improving performance required more components, and it seemed that future designs would consist almost entirely of wiring. This problem became known as the tyranny of numbers.
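The scale of the wiring problem can be illustrated with a little arithmetic (an illustrative sketch, not from the article itself): in the worst case described above, where each component needs a link to every other, the number of point-to-point connections grows quadratically, as n(n−1)/2 for n components.

```python
# Illustrative only: worst-case wiring count when every one of n
# components must be connected to every other component.
def pairwise_connections(n: int) -> int:
    """Number of distinct point-to-point links among n fully
    interconnected components: n * (n - 1) / 2."""
    return n * (n - 1) // 2

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} components -> {pairwise_connections(n):>12,} links")
```

Even if real machines needed only a fraction of these links, the quadratic trend explains why designers feared that future computers would be "almost entirely wiring".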

The problem was eventually solved with the widespread introduction of the integrated circuit (IC). An IC is essentially a number of related components that together perform a single function, what had previously been known in computer design as a "module". Unlike the individual components in existing module designs, however, an IC was "wired up" via photoetching, allowing ICs to be mass-produced. Computers now consisted of a number of ICs wired together, dramatically reducing the overall complexity of the machines.

Another problem with hand-wired designs was the notorious unreliability of the soldering. Any bad component, wire, or solder joint would render the entire module, and thus the computer, inoperable. Although an IC had the same chance of not working as any module, ICs were so inexpensive to produce that a failed one could simply be thrown out. Early ICs had failure rates of over 90%, but this mattered little because they could be produced on an automated assembly line that turned out thousands at a time.
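The reliability argument can be made concrete with a short calculation (an illustrative sketch with assumed numbers, not figures from the article): if a machine works only when every one of its n solder joints works, and each joint independently works with probability p, the whole machine works with probability p^n, which collapses quickly as n grows.

```python
# Illustrative only: overall reliability of a machine that fails if any
# single joint fails, assuming independent joints. The joint reliability
# and joint counts below are assumptions chosen to show the trend.
def machine_reliability(p_joint: float, n_joints: int) -> float:
    """Probability the whole machine works: p_joint ** n_joints."""
    return p_joint ** n_joints

for n in (100, 1_000, 10_000):
    print(f"{n:>6} joints at 99.9% each -> "
          f"machine works {machine_reliability(0.999, n):.2%} of the time")
```

Even with 99.9%-reliable hand-soldered joints, a design with ten thousand of them almost never works, which is why moving the "wiring" into a mass-produced, disposable IC changed the economics so dramatically.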

ICs from the early 1960s were not complex enough for general computer use, but as their complexity increased through the decade, practically all computers switched to IC-based designs. The result was what are today referred to as third-generation computers, which became commonplace during the early 1970s. The progeny of the integrated circuit, the microprocessor, eventually superseded the use of individual ICs as well, placing the entire collection of modules onto one chip.

Seymour Cray was particularly well known for making complex designs work in spite of the tyranny of numbers. His attention to detail, and his ability to fund several attempts at a working design if need be, meant that pure engineering effort could overcome the problems he faced. Yet even Cray eventually succumbed to the problem during the CDC 8600 project, which ultimately led to his leaving Control Data. Cray nevertheless continued to use the module approach into the 1970s on his Cray-1, perhaps the last first-rate second-generation machine.[citation needed]