Max More

From Wikipedia, the free encyclopedia

Max More at a 2006 summit at Stanford University.

Max More (born Max T. O'Connor, January 1964) is a philosopher and futurist who writes, speaks, and consults on advanced decision making and foresight methods for handling the impact of emerging technologies.

Born in Bristol, England, More has a degree in Philosophy, Politics and Economics from St Anne’s College, Oxford University (1987). His 1995 University of Southern California doctoral dissertation "The Diachronic Self: Identity, Continuity, and Transformation" examined several issues that concern transhumanists, including the nature of death, and what it is about each individual that continues despite great change over time.

Founder of the Extropy Institute, More has written many articles espousing transhumanism and the transhumanist philosophy of extropy, most notably his Principles of Extropy (currently version 3.11). In a 1990 essay, "Transhumanism: Toward a Futurist Philosophy", he introduced the term "transhumanism" in its modern sense, connecting its high-tech means to its humanist roots.

More is also noted for his writings about the impact of new and emerging technologies on businesses and other organizations. Published critics have often positioned More as a utopian, but a survey of his work suggests a critical, practical approach to the future. Most recently, More wrote "The Proactionary Principle", intended as a balanced guide to weighing the risks and benefits of technological innovation.

According to his personal website, “Max is concerned that our burgeoning technological capabilities are racing far ahead of our standard ways of thinking about future possibilities. His work aims to improve our ability to anticipate, adapt to, and shape the future for the better.” Until 1989, Max More published under his former name, Max T. O'Connor.

Quote

"People’s freedom to innovate technologically is highly valuable, even critical, to humanity. This implies a range of responsibilities for those considering whether and how to develop, deploy, or restrict new technologies. Assess risks and opportunities using an objective, open, and comprehensive, yet simple decision process based on science rather than collective emotional reactions. Account for the costs of restrictions and lost opportunities as fully as direct effects. Favor measures that are proportionate to the probability and magnitude of impacts, and that have the highest payoff relative to their costs. Give a high priority to people’s freedom to learn, innovate, and advance." -- Max More, "The Proactionary Principle"
