Talk:Singularitarianism
New Talk
A Technological Singularity will be created when an intelligence is created that is able to create a higher intelligence than itself; it need not necessarily be smarter than a human being. Any intelligence that could do that would progress exponentially and create a singularity. Inyuki 18:43, 26 February 2006 (UTC)
Old talk
Sp.
"Singularitarians are Technological Singularity activists, specially people who are dedicated to effecting a positive Singularity." I feel 'specially' is likely a typo here, but I will let Gordon correct it himself as he sees fit rather than put words into his mouth. --4.65.244.206 06:18, 10 Mar 2004 (UTC)
Origins of term
Minority Report, you give the Singularity Institute too much credit. The term "Singularitarian" predates SIAI and many Singularitarians are not affiliated with them. There are Singularitarians who openly denounce SIAI. If you want to erase an article, please try not to replace it with such a misleading statement. I'm reverting to your previous edit. Also, I'm removing several more of the links, as most of them seem to be only distantly related to the topic of the article. — Schaefer 20:45, 23 Nov 2004 (UTC)
- I find it difficult to find any mention of this word online that isn't obviously directly from these same half dozen or so people. I'm concerned about making this appear to be a major intellectual movement of some kind, when it has all the appearance of being some kind of coffee club. Between the Ruritanian titles and the vastly overinflated speculation based on no factual data, I'm not seeing signs that would enable me to endorse the claims in the original article as confirmed. Show me something that will make me take these people's claims seriously. Some references to papers published in peer-reviewed professional science or engineering journals would convince. --Minority Report (IT or PR enormity) 21:49, 23 Nov 2004 (UTC)
- You don't have to take their claims seriously. All of Singularitarianism could be one big delusional cult and it shouldn't matter as far as Wikipedia is concerned. The fact that some people believe it is what's being reported. As the article stands now, it doesn't take any stance on whether this singularity thing is going to happen. There are plenty of people that think the whole idea is absurd, and I and others have tried to represent their views in the criticisms sections of technological singularity and transhumanism (both of these articles, admittedly, have problems regarding unattributed speculation, which is possibly grounds for a neutrality dispute).
- I see where you're coming from, and I agree that these articles may give unfamiliar readers the impression that more people subscribe to these ideas than actually do. The number of actual Singularitarians (that is, people who consciously and explicitly work to bring about the Singularity) is quite small. The number of people who believe in the Singularity, or at least the number of people who buy Singularity-themed futurist books like The Spike and The Age of Spiritual Machines, is many times larger. I don't have actual numbers, so I don't supply them. However, I must object to the current method you're employing to counteract this possible miscommunication. You use qualifiers like "self-styled" and "so-called" which are, in my opinion, inherently and irredeemably POV. Please see Wikipedia:Words to avoid. The use of quotation marks also contributes to a discrediting tone, possibly more so than you intended. There must be better, more neutral ways to express what you're trying to get across.
- Your most recent edit changes the meaning of the word "Singularitarian". A major purpose of this article is to clarify the distinction between Singularitarians and futurists: that futurists speculate about a singularity while Singularitarians work towards it. You changed "dedicated to effecting" to "believes that it is possible to achieve", which eliminates the distinction being stressed. I'm guessing you did this because you felt the original sentence seemed to take the POV that the Singularity is possible, which I don't believe it does. Defining a Frogglebobblist as a person who is dedicated to effecting a global ascension to spiritual oneness with treefrogs does nothing to validate those goals.
- Finally, I'm changing the word "futurist" to the less-common "futurologist" to clear any confusion with the artistic/political movement. — Schaefer 23:15, 23 Nov 2004 (UTC)
- "All of Singularitarianism could be one big delusional cult and it shouldn't matter as far as Wikipedia is concerned."
- I beg to differ. An encyclopedia should try to maintain a separation from the beliefs (correct or incorrect) of others. When reporting, it should try to distinguish between its own standards of reasoning and those of the people upon whom it reports.
- On taking Singularitarianism seriously, I'm afraid it really does come down to peer-reviewed research papers published and cited, not popular books sold. I would be failing as an encyclopedist if I gave equal space and equal weight to every single human belief, however ill-defined and poorly supported. The thoughts of René Descartes merit more time and space than those of George Michael. A movement that relies on a misapplication of Moore's Law as its chief reasoning tool doesn't merit as much time and space as serious AI research.
- "You use qualifiers like "self-styled" and "so-called" which are, in my opinion, inherently and irredeemably POV."
- On the contrary, they are neutral. If Yudkowsky's qualifications as an AI researcher extend beyond the bounds of his imagination, then I will feel comfortable no longer calling him a "self-styled" AI researcher. The use of the term "so-called" was in relation to the use of the term "futurists", which is well known to most educated people as a school of art.
- "You changed "dedicated to effecting" to "believes that it is possible to achieve", which eliminates the distinction being stressed."
- You and I as encyclopedists cannot possibly know whether singularitarians are or are not capable of achieving such a goal. The most we can do is observe that they appear to have a sincere belief that it is achievable and are doing what they believe will achieve it. Having said that, I have no objection per se to the use of the word "dedicated", especially in the context in which it appears in the current version. If you go back to my version you will see that I made it plain that they not only believed that the benign transformation was possible, but were also attempting to bring it forward. --Minority Report (IT or PR enormity) 00:10, 24 Nov 2004 (UTC)
- "On taking singularitatianism seriously, I'm afraid it really does come down to peer reviewed research papers published and cited, not popular books sold."
- Again, it doesn't have to be taken seriously. Futurology is by no means an established and respected scientific process, and concepts like technological singularities are obviously the products of futurologists' speculations. I'm not seeing any attempt to pass this off as real science. Upon reading the article Technological singularity, did you think it was real science? Did you read it and get the impression that everyone who's anyone in AI research believes all this? I should hope not. It's clear to everyone that all this is based on futurology, not on falsifiable science. If you think it isn't clear, maybe a few of the especially speculative articles need "In futurology," tacked onto the first line. I wouldn't recommend it for this particular one, as it's mostly here to contradistinguish Singularitarianism from futurology.
- "On the contrary, [terms such as "self-styled" and "so-called"] are neutral."
- I strongly disagree, but this is now a non-issue since the word "futurist" has been changed and Yudkowsky no longer has an article.
- "The most we can do is observe that they appear to have a sincere belief that it is achievable and are doing what they believe will achieve it."
- Agreed, and that's all I intend to report. — Schaefer 12:19, 24 Nov 2004 (UTC)
- "Upon reading the article technological singularity, did you think it was real science?"
- No, but there is a clear attempt to pass science fiction off as science, perhaps because many of the participants are not really aware of science or don't care about it. My first impression was that this is a quasi-religious movement that uses some of the concepts of science with the language of religion. Perhaps "science fiction religion" fits best. --Minority Report (IT or PR enormity) 16:37, 24 Nov 2004 (UTC)
- As interesting as this discussion is, I think we've strayed a bit off topic for this talk page. If you'd like to discuss this further, please contact me at edmund.schaefer@REPLACE with the word "REPLACE" replaced with "gmail.com" (I'm paranoid about spam robots). --Schaefer 18:34, 24 Nov 2004 (UTC)
Move to "Singularitarianism"
I'm moving this article to Singularitarianism, for the same reason there are redirects at Buddhist, communist, Nazi, etc. I'll obviously have to change some wording around to fit the title, but I'll try to keep everything fairly close to what it is now. --Schaefer 20:48, 24 Nov 2004 (UTC)
Religion?
Do singularitarians believe in the Singularity, or do they simply think it is highly likely to happen and act accordingly? The difference between a religion, based on unshakable belief, and a philosophical movement based on facts and scientific-method verification of those facts (i.e. allowing doubt and such) is both clear and important. --Piotr Konieczny aka Prokonsul Piotrus Talk 20:29, 3 May 2005 (UTC)
- Depends. The meaning is not really fixed. As Yudkowsky would have it, the term would refer to those who believe that a Singularity is possible and a good thing, and who will work towards making it a reality at all, and sooner rather than later at that. Some other people think it probably will happen, and that it is a bad thing (i.e. it is more likely that goo will eat us all or an Unfriendly AI will take over than that a happy ending will occur). As to whether the term connotes a religion or a philosophical movement: again, that depends on the person, whether they think it is a historical inevitability or the hand of god (in which case it is indeed the "rapture of the nerds") or simply a statistically likely set of possible scenarios, which like any other should be preplanned and managed to maximise the benefits (the SI and supporters would fall in this latter camp). Hope this helps; I've been watching this movement almost from the inception of the SI, and this is the best I can explain it. --Maru (talk) 20:29, 3 October 2005 (UTC)
- Believing that the Singularity will happen is like a Conservative believing that we will remove all US national debt someday given the right circumstances. It isn't a religious belief, but more of a secular one. Just as getting rid of all national debt sounds absurd these days, it is theoretically possible if we really try. That is the way to see Singularitarianism: not as a faith that the Singularity will happen and all our problems will be magically solved at that point, but rather that if we put forth effort at (excuse the term) an accelerating pace, we will see technological results that change so rapidly that every day will be a totally different scenario. (As it is now, we are on a five-year scenario... I remember shopping for cell phones in 2000 that had black-and-white screens; can you buy one today?) Rather, I think those who are unable to adapt to changing technologies at an increasing pace will have extreme problems adapting. I personally believe that in 5-10 years FiOS (fiber to the curb) will have a 75% market share in the States. These may be a bit optimistic and unfounded numbers I'm pulling off the top of my head, but a religion of belief in a magic FiOS god it does not make. -James
- As it stands, I do believe Singularitarianism is a bit of a religion, and somewhat of a cultish one at that. Don't get me wrong, I rather suspect that there is no real scientific reason why many of these things could not actually happen. I mean, outside of the scientific discovery of a "soul", what's really to stop us all from someday downloading our minds into computer hardware? What's to stop us from doing all sorts of crazy stuff like that which Yudkowsky and the SIAI predict? I tend to agree with the quite succinct and pointed statement, "Believing that the Singularity will happen is like a Conservative believing that we will remove all US national debt someday given the right circumstances." An excellent analogy. What I tend to take exception to with regard to Singularitarianism is that they seem to be going way off the deep end with their futurisms. Computronium? Jupiter Brains? Omega Point theory? Some of these things are really far out, even by the standards of others who embrace the idea of the Singularity, such as Transhumanists. Many prominent scientists have written about the possibility of the Singularity, or an event similar to it. The Universe in a Nutshell, the famous "physics-for-laymen" book by Stephen Hawking, has a whole chapter describing an event similar to the beginnings of a Singularity, of course without using that term, in order to avoid a great deal of eye-rolling. Bill Joy described possible negative consequences of newer technology, including Singularity-related consequences, in his infamous Wired article "Why the future doesn't need us". Leon Kass, who is on the President's Council on Bioethics, and the respected political scientist Francis Fukuyama both wrote about and oppose many Singularity-based or Singularity-like technological developments, and Fukuyama published an article in the popular political magazine Foreign Policy describing Transhumanism as the world's most dangerous idea. Even John von Neumann and Stanislaw Ulam had serious discussions regarding a singularity-like event, according to a quote by the latter. But this is not the point. It's fully possible to believe in the probability of the Singularity, and it is even possible to work towards enabling it either politically or scientifically, directly or indirectly. However, what Singularitarians engage in is much greater than that. It must be remembered that futurism is purely philosophical; that is, it is purely speculation. It is the examination of current trends and the estimation of possible outcomes of those trends, minus any unforeseen or spontaneous events or interruptions. I believe the difference between a Singularitarian cult and a Singularity- or technophilia-themed group or movement is the conflation of futurism with the idea of Futurology, the "study" of the future. Singularitarians take reasonable, if somewhat grandiose, naturalistic speculation and turn it into a rapture in which the "God" of technology will finally save us and transform us into god-like beings after his own image, so long as we all help each other to help him do so; a clearly religious scenario, whether or not Singularitarians actually see things this way. Ultimately, the distinction comes down to practicality: reprogenetic engineering is possible, artificially intelligent machines are theoretically possible, and nanotechnology is theoretically possible to some extent.
Science currently deals with these things; Transhumanists and others engaged in futurism wish to see these technologies taken to their furthest logical conclusions. Singularitarianism crosses this line and overextends itself into the realm of Science Fiction. -jove
- You can argue that singularitarians cross the line into Science Fiction, but remember the cliché about the Moon landing being way out in the realms of sci-fi once. I don't think singularitarianism is that way out. Let's look at its three parts. 1 - thinking that the technological singularity is possible in theory. If you accept that machines may one day become intelligent enough to redesign their own hardware and software to become ever more intelligent, you accept the singularity as being possible. Is it really so far-fetched? We are intelligent enough to make computers that become ever more powerful in line with Moore's Law. If you think that an AI could, in theory, become as intelligent as a human being (at which point it will inevitably be able to redesign itself and other machines), you have accepted the singularity as possible. 2 - the singularity is good and desirable. Well, this is a moral issue and comes down to the individual. Some anti-singularitarians and neo-Luddites specialize in horror prophecies about technology becoming "too" advanced, while other futurists concentrate on the possible positive outcomes. I take the pragmatic view that, good or evil, the singularity (or something like it) is probably inevitable at this point. As an atheist and an advocate of transhumanism, I tend to see it as a positive outcome anyway. Which leads us to 3 - working towards the singularity. This can be a much simpler and humbler thing than you seem to imply by describing singularitarians as crossing a line into sci-fi. It could mean working to further the science of AI in some way. It could mean donating to AI research. You could even just spend time chatting to a jabberwacky bot, growing and refining its database in the hope that it may one day pass the Turing Test. Or it could simply be trying to get your friends and people you know interested in the field, to generally encourage a public appetite for futuristic technology. The Great Wall of China was built one humble brick at a time. In this case even helping to construct one fragment of a brick is a step in one particular evolutionary direction. -Neural 12:10, 8 September 2006 (UTC)
Merge
Due to the lack of reliable independent sources cited in the article, I believe this article should be merged into Technological singularity. The academic article cited by Nick Bostrom discusses moral issues surrounding the Singularity, but never actually uses the term "Singularitarianism". A search of his site for the term reveals only two trivial mentions in self-published essays, with no real discussion. Also note that all of the links I recently removed from this article were self-published, with the exception of an article on CNNMoney.com that doesn't mention Singularitarianism. With such a paucity of reliable sources to establish notability, I strongly doubt this article could withstand an AfD. Its content is likely notable enough for a section on Technological singularity, and if there are no objections I'd like to begin moving it there. -- Schaefer (talk) 00:24, 17 June 2007 (UTC)
- Yup. The sentence about the term originally being defined as this and now being redefined as that was a red flag to me as to whether "singularitarianism" was really anything in remotely common usage with an agreed-upon meaning. I might put it under Singularity Institute as co-founder Yudkowsky's term for the position. --Abu-Fool Danyal ibn Amir al-Makhiri 20:17, 27 June 2007 (UTC)
Oppose merge with Singularity Institute. --Procrastinating@talk2me 13:17, 30 August 2007 (UTC)
- Making an argument for why the articles shouldn't be merged is helpful, but just saying "oppose" isn't. Decisions on Wikipedia aren't made (or shouldn't be made) through votes. -- Schaefer (talk) 14:43, 30 August 2007 (UTC)
- Actually, the founder's general guideline is to try to achieve consensus and to vote where that isn't possible. Some local Wikipedias got so bureaucratic and hypocritical about it that they actually have internal politics and lobbies erasing each other's debate, trying to get as many admins as they can... Anyway, you're right, I'm sorry. I thought my position was obvious in this article.
- I oppose this merger, since the singularity, as a concept or a belief system/framework, does not stem from and should not be affiliated with or monopolized by a single institute. They do not claim such a monopoly, as can be seen in previous comments. I believe this to be like trying to merge Veganism with the World Vegan Institute. :) --Procrastinating@talk2me 18:25, 4 September 2007 (UTC)
fact tag
Thanks to all for the great work. I added the fact tag after "(now partially obsolete)" because the Singularitarian Principles and its ideas make up a lot of the definition (and the other definitions overlap greatly). I'm pretty familiar with the topic, but still wanted details re in what ways it's obsolete. A newcomer (for whom the article must be written) would be justifiably confused: what parts are obsolete, and how much of it? So I think this needs to be explained -- or at least cited. Hope this helps, "alyosha" (talk) 02:12, 18 January 2008 (UTC)
- I removed "(now partially obsolete)", since no one sourced or explained it for almost 3 months. I left the Singularitarian Principles stuff, because it is a historically significant document and definition. If anyone can say and source what about it is obsolete (or for whom), please add that in. Hope this helps, "alyosha" (talk) 03:26, 14 April 2008 (UTC)
Notability tag
Why's there a notability tag? There are hundreds of labs, institutes, and nonprofit organizations around the globe devoted to the singularity, and hundreds of thousands of people who qualify as singularitarians. 24.252.195.3 (talk) 20:34, 2 June 2008 (UTC)