Centre for the Study of Existential Risk

Founders: Huw Price, Lord Martin Rees, Jaan Tallinn
Purpose: Higher education and research
Headquarters: Cambridge, England
Parent organization: University of Cambridge
Website: cser.org

The Centre for the Study of Existential Risk (CSER) is a research centre at the University of Cambridge, intended to study possible extinction-level threats posed by present or future technology.[1] The co-founders of the centre are Huw Price (a philosophy professor at Cambridge), Martin Rees (a cosmologist, astrophysicist, and former President of the Royal Society) and Jaan Tallinn (a computer programmer and co-founder of Skype).[2] CSER's advisors include philosopher Peter Singer, computer scientist Stuart J. Russell, statistician David Spiegelhalter, and cosmologists Stephen Hawking and Max Tegmark.[3] The centre's stated "goal is to steer a small fraction of Cambridge's great intellectual resources, and of the reputation built on its past and present scientific pre-eminence, to the task of ensuring that our own species has a long-term future."[3][4]

Areas of focus

The centre's founding was announced in November 2012.[5] Its name stems from Oxford philosopher Nick Bostrom's concept of existential risk, or risk "where an adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential".[6] This includes technologies that might permanently deplete humanity's resources or block further scientific progress, in addition to ones that put the species itself at risk.

Among the global catastrophic risks to be studied by CSER are those stemming from possible future advances in artificial intelligence. Early discussions of CSER highlighted the potential dangers of artificial general intelligence, which some press coverage likened to a robot uprising à la The Terminator.[7][8] Price explained to the AFP news agency, "It seems a reasonable prediction that some time in this or the next century intelligence will escape from the constraints of biology". He added that when this happens "we're no longer the smartest things around," and will risk being at the mercy of "machines that are not malicious, but machines whose interests don't include us".[9] Price has also described synthetic biology as dangerous because "[as a result of] new innovations, the steps necessary to produce a weaponized virus or other bioterror agent have been dramatically simplified", and consequently "the number of individuals needed to wipe us all out is declining quite steeply."[10][11]

Other risks CSER seeks to evaluate include molecular nanotechnology,[12] extreme climate change,[13] and systemic risks from fragile networks.[14][15]

Media coverage

CSER has been covered in many newspapers (particularly in the United Kingdom), with articles addressing a range of its areas of interest.[4][7][8][9][10][14][16] University of Cambridge Research News' coverage of the centre focuses on risks from artificial general intelligence.[17]

References

  1. Biba, Erin (1 June 2015). "Meet the Co-Founder of an Apocalypse Think Tank". Scientific American. Retrieved 2 July 2016.
  2. Lewsey, Fred (25 November 2012). "Humanity's last invention and our uncertain future". Research News. Retrieved 24 December 2012.
  3. "About CSER". Centre for the Study of Existential Risk.
  4. Connor, Steve (14 September 2013). "Can We Survive?". The New Zealand Herald.
  5. "Cambridge to study technology's risk to humans". AP News. 25 November 2012.
  6. Bostrom, Nick (2002). "Existential Risks: Analyzing Human Extinction Scenarios" (PDF). Journal of Evolution and Technology. 9 (1). Retrieved 27 March 2014.
  7. Gaskell, Adi (27 November 2012). "Risk of a Terminator Style Robot Uprising to be Studied". Technorati. Archived from the original on 30 November 2012. Retrieved 2 December 2012.
  8. Naughton, John (2 December 2012). "Could robots soon add to mankind's existential threats?". The Observer. Retrieved 24 December 2012.
  9. Hui, Sylvia (25 November 2012). "Cambridge to study technology's risks to humans". Associated Press. Retrieved 30 January 2012.
  10. Paramaguru, Kharunya (29 November 2012). "Rise of the machines: Cambridge University to study technology's 'existential risk' to mankind". Time. Retrieved 2 May 2014.
  11. "Biological and Biotechnological Risks". Retrieved 29 May 2015.
  12. "Molecular nanotechnology". Centre for the Study of Existential Risk. Retrieved 4 May 2014.
  13. "Extreme Climate Change". Retrieved 29 May 2015.
  14. Osborne, Hannah (13 September 2013). "Doomsday list for apocalypse: bioterrorism, cyber-attacks and hostile computers threaten mankind". International Business Times. Retrieved 2 May 2014.
  15. "Systemic risks and fragile networks". Centre for the Study of Existential Risk. Retrieved 2 May 2014.
  16. "CSER media coverage". Centre for the Study of Existential Risk. Archived from the original on 30 June 2014. Retrieved 19 June 2014.
  17. "Humanity's Last Invention and Our Uncertain Future". University of Cambridge Research News.


This article is issued from Wikipedia. The text is licensed under Creative Commons - Attribution - Sharealike. Additional terms may apply for the media files.