"Why the future doesn't need us" is an article written by Bill Joy (then Chief Scientist at Sun Microsystems) in the April 2000 issue of Wired magazine. In the article, he argues (quoting the sub title) that "Our most powerful 21st-century technologies — robotics, genetic engineering, and nanotech — are threatening to make humans an endangered species." Joy warns:
While some critics have characterized Joy's stance as obscurantism or neo-Luddism, others share his concerns about the consequences of rapidly expanding technology.[1]
Joy argues that these developing technologies pose a far greater danger to humanity than any technology before them. In particular, he focuses on genetic engineering, nanotechnology, and robotics. He argues that 20th-century technologies of destruction, such as the nuclear bomb, were limited to large governments due to the complexity and cost of such devices, as well as the difficulty in acquiring the required materials. He uses Frank Herbert's novel The White Plague as a potential nightmare scenario, in which a mad scientist creates a virus capable of wiping out humanity.
He also voices concern about increasing computer power. His worry is that computers will eventually become more intelligent than humans, leading to dystopian scenarios such as robot rebellion. He notably quotes the Unabomber on this topic.
In The Singularity Is Near, Ray Kurzweil questioned the regulation of potentially dangerous technology, asking, "Should we tell the millions of people afflicted with cancer and other devastating conditions that we are canceling the development of all bioengineered treatments because there is a risk that these same technologies may someday be used for malevolent purposes?" However, some authors, such as John Zerzan and Chellis Glendinning, believe that modern technologies are damaging both to freedom and to health, contributing to problems such as cancer, and that the two issues are connected.[2][3][4]
In an article titled "A Response to Bill Joy and the Doom-and-Gloom Technofuturists" in the AAAS Science and Technology Policy Yearbook 2001, Joy was criticized for technological tunnel vision in his predictions and for failing to consider social factors.[5]
Martin Ford, author of The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future,[6] makes the case that the risk posed by accelerating technology may be primarily economic in nature. Ford argues that before technology reaches the point where it represents a physical existential threat, it will become possible to automate nearly all routine and repetitive jobs in the economy. In the absence of a major reform of the capitalist system, this could result in massive unemployment, plunging consumer spending and confidence, and an economic crisis potentially even more severe than the Great Depression. If such a crisis were to occur, subsequent technological progress would slow dramatically, because there would be insufficient incentive to invest in innovation.
After the publication of the article, Joy suggested assessing technologies to gauge their implicit dangers, and urged scientists to refuse to work on technologies that have the potential to cause harm.
In Wired's 15th-anniversary issue in 2008, Lucas Graves reported that genetics, nanotechnology, and robotics had not reached the level of development that would make Joy's scenario come true.[7]