Portal:Philosophy/Selected article/38

Singularitarianism is a moral philosophy based on the belief that a technological singularity — the technological creation of smarter-than-human intelligence — is possible, and that deliberate action should be taken to bring it about and ensure its safety. While many futurists and transhumanists speculate on the possibility and nature of this kind of singularity (often referred to simply as the Singularity, capitalized to mark its magnitude as a historical event), Singularitarians hold that it is not only possible but desirable, provided it is guided safely. Accordingly, they "dedicate their lives" to acting in ways they believe will contribute to its safe arrival.