Talk:Sentience


Sentience is part of WikiProject Animal rights, a project to create and improve articles related to animal rights. If you would like to help, please consider joining the project. All interested editors are welcome.


Animal Rights Edits

"but allows non-human suffering. However, some animal rights activists hold that many of the distinguishing features of humanity - intelligence; language; self-awareness etc. - are not present in marginal cases such as young children or mentally disabled people, and conclude only what they call irrational speciesism can save this distinction."

I have edited this statement twice, removing

'However, some animal rights activists hold that many of the distinguishing features of humanity - intelligence; language; self-awareness etc. - are not present in marginal cases such as young children or mentally disabled people',

and replacing with simply

'However, many supposed distinguishing features of humanity - intelligence; language; self-awareness etc. - are not present in marginal cases such as young children or mentally disabled people'.

I feel this is a fair edit, as it is not disputed that human cases do exist wherein these features are lacking (see 'Animal Liberation' - Peter Singer). If this were not the case, then Singer's argument for animal rights would be flawed. As it is standard scientific belief that intelligence differs from human to human, and that it is possible for humans to lack language ('A Man Without Words' - Susan Schaller), I fail to see the need to add 'some animal rights activists hold...'. This part of the argument is not a claim from animal rights activists, but rather a scientifically accepted fact (that is, that these distinctions vary within humanity) - the animal rights argument comes from the application of this fact.

Secondly, I have edited

'and conclude only what they call irrational speciesism can save this distinction.'

to

'and conclude only that speciesism (prejudice on the basis of species) is the mistaken justification.'

Although I grant that this is not the best of wording (and needs to be altered), I stand by the spirit of the edit. Again, the point of the term 'speciesism' is to sum up prejudice on the basis of species. By definition, prejudice is not rational, and so again I fail to see the need to include 'only what they (animal rights supporters) call...'. If you wish to offer disputes (though I don't know of any specifically against this argument) to the conclusion that speciesism is the only justification left available, then by all means do so - but it is not correct to imply that only animal rights activists consider prejudice on the basis of species irrational. This is a given due to the meaning of the words, and wouldn't be disputed by either side. Though some may attempt to point out alternative justifications that aren't matched by marginal cases (this is what some philosophers have attempted to do), none would question either that humans vary in the aforementioned attributes, or that prejudice is irrational.

On this basis, it seems to me that, after agreeing on better wording, the statements should be reverted to more or less what I had written before the edit. otashiro 01:17, 3 July 2006 (UTC)

Principles of Sentient and Sapient Life

Is there a proposed set of principles that can help determine or define sentient and sapient life?

Principles that might apply are self-sacrifice, self-awareness, creativity, level of intelligence (and how it would be determined), ability to hope, and even to regret.
These were ideas that I thought might apply.
I don't know of any agreed upon tests, although you seem to be talking about sapience instead of sentience. Humans are included, but some (e.g., Descartes) think only humans have it! (See the great ape project for a different opinion.) MShonle 01:46, 11 Jun 2005 (UTC)
Good point. I think I am looking for some combined definition of sentient and sapient life. There may not be a nice compact definition. We will probably end up with a hybrid classification system.
Hmm... It's tricky to define. It seems the general logic is 'if it has emotions, it's sentient'. But then (too) many things can be described as sentient. Suppose you have a computer program that prompts repeatedly for 'Current Emotion Level?' and stores the result in a variable, but doesn't do anything with it. This computer experiences (manually input) emotions, but doesn't do anything with them. So let's add a new condition: 'it' has to act upon its emotions in order to be sentient. Still quite easy: if we go to the moon and visit dual-head universal Turing machines and say some arbitrary tape position represents emotion, you got your sentience. But this machine may actually be a 'smart machine'... in fact it is. But then we get all confused 'cuz we're standing on the wrong side of the road. Why *is* this sentient? Let's say it's not sentient, but we are. What can we do that this can't? Good luck if you find out. The Nobel Prize is waiting... --Ihope127 7 July 2005 22:15 (UTC)
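For concreteness, here is a minimal sketch of the 'emotion variable' thought experiment described above - a hypothetical illustration, not drawn from any real system - of a program that records an emotion level but never acts on it:

```python
# Hypothetical sketch of the thought experiment above: a program that
# repeatedly records a manually input "emotion level" but never acts on it.

def run():
    emotions = []  # values are stored here, but nothing is ever done with them
    while True:
        reply = input("Current Emotion Level? (blank to stop) ")
        if not reply:
            break
        emotions.append(reply)  # the stored "emotion" influences no behaviour

if __name__ == "__main__":
    run()
```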
Perhaps a more important question than deciding if other objects are sentient is whether we ourselves are "sentient." By our definition, yes, we are. But what if an arbitrary construct decides that sentience, for example, can only exist for creatures of some certain defining quality? Sentience, as it is to us, the ability to reason and ponder our own existence, may not hold true for a being with a completely different physical or mental state. All I'm trying to say is that our idea of sentience is based on what we perceive as being sentient and self-aware. Our "sentience" may not necessarily be another being's "sentience." Nick 01:48, 8 June 2006 (UTC)

I'm not too sure about what previous users have said in this section. I'm pretty sure the definition of sentience is consciousness, that is awareness - sense. What you are commenting on may be more related to sapience.

sen·tient adj.


1. Having sense perception; conscious: “The living knew themselves just sentient puppets on God's stage” (T.E. Lawrence).

2. Experiencing sensation or feeling. (Dictionary.com)

Sentience does not require emotions, or the ability to act on them, as someone said previously (and in your example, it's very debatable whether or not your computers are experiencing emotion anyway - they'd have to be sentient in the first place in order to feel emotion. In fact this is Descartes's point: although animals react emotionally, and appear to be thinking and feeling, they are not sentient, that is aware (as he claims) - thus they are mechanical - they act exactly as if they were aware of feelings, thoughts and sense, but they are not, being simply robotic. It's also worth noting that Descartes's claims are strongly contested scientifically nowadays, with biology very strongly implying he's wrong (putting aside philosophical debates about knowing other minds), and there are very few scientists today who don't believe animals are aware, or sentient). otashiro 23:23, 28 June 2006 (UTC)

This article makes sentience seem like a black-and-white thing: a creature is either sentient or "non-sentient". It does not mention the idea that perhaps sentience can occur in different degrees. Creatures could have varying amounts of sentience.

[edit] "Woot!" is all I can say

So... I discovered as a direct result of my being human (yes. I *am* human) that the human mind is an odd and fascinating thing. What better thing for an entity to study than others of its class? The mind can have a personality and switch from one to another at will; multiple can even coexist quite peacefully and converse with each other before merging back into a whole (nonphysically of course). My thinking: why should this be limited to humans? We did intelligence tests on lots of animals, but how do we know this is an accurate measure? These animals often have no experience with these tests, whereas humans do such things all the time. It is tricky to teach these things to an animal, though... but is it really their mind that's at fault? Humans can articulate, and dolphins can click as such (I like that sound :-), but cats, dogs, rats, etc. converse entirely in speech disfluencies. Furthermore, their "communication" with the physical world itself is limited: humans and other primates have hands, while cats only have paws. I feel that if animals could make better use of their brains, they would. I could be completely wrong, however--just how did we find that dolphins were as smart as they are? But if we gave them the right toys (Cetacean Linux? Could happen), we might be able to find out a lot. --Ihope127 03:34, 22 August 2005 (UTC)

Cats communicate using a sign language of 30-36 signs; dolphins alone are able to solve "complex" "natural" problems they have not previously been taught. Not having hands is a serious hindering factor in the evolution of their brains. The neocortex is almost non-existent in non-human animals.--Procrastinating@talk2me 11:52, 8 June 2006 (UTC)
Actually, that's not true - it is non-existent in fish, but in the other species groups, moving up the evolutionary scale, it becomes increasingly large, and it is certainly not almost non-existent in mammals at least. It's also worth noting, just as a side point, that the neocortex's size or absence has no bearing on sentience, emotion or the ability to suffer (see 'Animals in Translation', Temple Grandin). otashiro 23:07, 28 June 2006 (UTC)

Artificial Intelligence

"Some science fiction uses the term sentience to describe a species with human-like intelligence, but a more appropriate term is sapience."

I disagree; the only shows I've seen that consider a robotic "species" to be sentient are those in which they have more than just intelligence. Any programmable machine with enough memory storage can be taught to have "human-like intelligence", but they can't make choices for themselves, nor are they capable of any sort of feeling (as many in Sci-Fi are). --Anon.

In shows that's probably likely, because there are actors or voice actors playing the robots. But I think the sentence is primarily about sci-fi novels. MShonle 14:01, 8 September 2005 (UTC)
Then shouldn't that be stated? The live page simply says Sci-Fi, it doesn't say which medium. --Anon.
Anon, you aren't going to refute Strong AI by simply baldly stating that it is impossible and that things with "human-like intelligence" are nevertheless not conscious. --maru (talk) contribs 20:58, 18 July 2006 (UTC)
Anon, I'm confused. How are you disagreeing? You and the quoted sentence say the very same thing: Sci-Fi works often use "sentient" to mean "able to make judgments or choices." Do you mean that they are using the word correctly? Then you're wrong: sapient = able to make judgments; sentient = able to sense. Are you saying that "human-like intelligence" does not include the ability to make judgments? Fine, but that's not what this sentence is saying. That's what this sentence _assumes_. If that's your problem, change the wording. Solemnavalanche 06:08, 23 November 2006 (UTC)

Hotly debated?

A paragraph from the text:

Science is making some progress on animal psychology, and evidence of sentience is gradually being seen in animals, such as apes and dolphins. Still, it is a hotly debated issue.

This seems just so stupid. How can people debate if animals can feel? Are those people the blind or non-sentient ones? I was going to remove the paragraph since this should be obvious, and no further scientific research should be needed (it has even been proved that all the mammals share the neural patterns for basic feelings), but I'm letting people remark on this first. Rbarreira 15:45, 21 January 2006 (UTC)

I agree that the paragraph should be removed, but it's not obvious that animals are sentient, and it's not fair to describe it as stupid. I can't even be sure that any human other than myself is sentient. What about insects, bacteria, plants? Where does one draw the line? Very little about sentience is obvious. --Aaron McDaid 18:24, 22 January 2006 (UTC)
Considering the definition of sentience given in the text, I think that one can say that animals are sentient as much as human beings (a bit as you've pointed out). So, I don't think it's fair at all to put a focus on animals.
Considering that there are absolutely no references cited anywhere in the section, this discussion is completely useless. It would be nice if someone could take care of that. 24.251.0.143 03:25, 2 June 2006 (UTC)

useless debate? rewrite

How does one measure consciousness? Are there levels of it? What is the threshold for sapience? How can you devise such an experiment? What is the difference between self-awareness and consciousness? Are sentient beings also sapient? If not, how can you regard such qualia? What did various people in history say about this?

This article is VERY lacking; I marked it for rewrite and references.--Procrastinating@talk2me 11:01, 6 June 2006 (UTC)

Is there a word that means neither sentience nor sapience?

Is there a word other than "non-sentient" or "non-sapient" that exists which means the same thing? --24.18.98.100 02:47, 22 June 2006 (UTC)

Yes, the word is automaton, and it implies robotic - that is, not conscious (sentient), which of course entails not self-aware (sapient).

The word "bathtub" means neither sentience nor sapience. There are others. David Olivier 07:57, 7 December 2006 (UTC)

such as Nano-elephants, for example. --Procrastinating@talk2me 16:32, 7 December 2006 (UTC)

Synthesis via sentience

Sentience is the abductive part of Inquiry relative to Interaction. Data is brought into the body and compared to existing information (deductive), then either rejected and/or stored in memory (inductive), or integrated (or synthesized) as knowledge, or sapience.

The integral process of synthesizing tools, including more advanced organs, is relative to conditions, or conditional (see Interaction). The brain, as do all interconnected systems within our bodies, generates more flexibility, or adaptability, to continue the advancement, or synthesis, process.

These adaptations are encoded, fused, or synthesized into our DNA and exercised from generation to generation. Major paradigm shifts of more advanced generations constitute oscillations that empower more tools, more synthesis - but not necessarily more code.

Artificial intelligence is a tool, as are we, in our struggle to advance ourselves through the diversity of origin - which constitutes the magnification of the simple synthesis process based on the irrefutable laws of physics. Hypotheses of the integration of AI and our bodies may be modeled by the similarity in nanotechnology patterns, and Synthesis patterns (see Technological Singularity as well).

The synchrony of magnified power has vacillating effects that mirror a fireworks universe. We may be on a path unfolding as vacillating patterns; but hypotheses such as RNA and PAH can empower us to break these patterns toward a more oscillating synthesis with our planet so we can persist. --Dialectic 22:43, 11 July 2006 (UTC)