Intentional stance

The intentional stance is a term coined by philosopher Daniel Dennett for the level of abstraction in which we view the behavior of a thing in terms of mental properties. It is part of a theory of mental content proposed by Dennett, which provides the underpinnings of his later works on free will, consciousness, folk psychology, and evolution.

Here is how it works: first you decide to treat the object whose behavior is to be predicted as a rational agent; then you figure out what beliefs that agent ought to have, given its place in the world and its purpose. Then you figure out what desires it ought to have, on the same considerations, and finally you predict that this rational agent will act to further its goals in the light of its beliefs. A little practical reasoning from the chosen set of beliefs and desires will in most instances yield a decision about what the agent ought to do; that is what you predict the agent will do.
Daniel Dennett, The Intentional Stance, p. 17
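
Taken at face value, the passage above describes a procedure, and it can be made concrete in code. The following Python sketch is purely illustrative and is not anything Dennett specifies: the Agent class, the action table, and the scoring rule are invented assumptions, with the table of believed outcomes standing in for the "little practical reasoning" the quote mentions.

    from dataclasses import dataclass, field

    @dataclass
    class Agent:
        """An object we have decided to treat as a rational agent."""
        beliefs: set = field(default_factory=set)  # what it ought to believe, given its situation
        desires: set = field(default_factory=set)  # what it ought to want, given its purpose

    def predict(agent: Agent, actions: dict) -> str:
        """Predict the action that, by the agent's own lights, best furthers its desires."""
        def progress(action):
            # The action table encodes the agent's beliefs about what each action leads to.
            believed_outcomes = actions[action]
            return len(believed_outcomes & agent.desires)
        return max(actions, key=progress)

    # The bird example used later in the article: it believes the cat is
    # coming and desires to stay alive, so we predict that it flies away.
    bird = Agent(beliefs={"the cat is coming"}, desires={"stay alive"})
    actions = {
        "fly away": {"stay alive"},
        "keep feeding": {"food", "get eaten"},
    }
    print(predict(bird, actions))  # -> "fly away"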

Dennett's three levels

The core idea is that, when explaining and predicting the behavior of an object, we can choose to view it at varying levels of abstraction. The more concrete the level, the more accurate in principle our predictions are; the more abstract, the greater the computational power we gain by zooming out and skipping over the irrelevant details.

Dennett defines three levels of abstraction:[1]

  • The most concrete is the physical stance, which is the domain of physics and chemistry. At this level, we are concerned with such things as mass, energy, velocity, and chemical composition. When we predict where a ball is going to land based on its current trajectory, we are taking the physical stance. Another example of this stance comes when we look at a strip made up of two types of metal bonded together and predict how it will bend as the temperature changes, based on the physical properties of the two metals.
  • Somewhat more abstract is the design stance, which is the domain of biology and engineering. At this level, we are concerned with such things as purpose, function, and design. When we predict that a bird will fly when it flaps its wings, on the basis that wings are made for flying, we are taking the design stance. Likewise, we can understand the bimetallic strip as a particular type of thermometer, without concerning ourselves with the details of how this type of thermometer happens to work. We can also recognize the purpose that this thermometer serves inside a thermostat, and even generalize to other kinds of thermostats that might use a different sort of thermometer. We can even explain the thermostat in terms of what it is good for, saying that it keeps track of the temperature and turns on the heater whenever it gets below a minimum, turning it off once it reaches a maximum (a code sketch following this list makes this description concrete).
  • Most abstract is the intentional stance, which is the domain of software and minds. At this level, we are concerned with such things as belief, thinking and intent. When we predict that the bird will fly away because it knows the cat is coming and is afraid of getting eaten, we are taking the intentional stance. Another example would be when we predict that Mary will leave the theater and drive to the restaurant because she sees that the movie is over and is hungry.
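
The thermostat in the design-stance example can be made concrete. The following sketch is illustrative only: the class name, the thresholds, and the keep-current-state rule between them are assumptions, not part of Dennett's account. At the design stance we care solely about what the device is for, turning the heater on below a minimum and off above a maximum, not about which physical mechanism senses the temperature.

    class Thermostat:
        """A design-stance description of a thermostat: what it is for.
        Whether the sensor is a bimetallic strip or a tube of mercury
        is invisible at this level of abstraction."""

        def __init__(self, min_temp: float, max_temp: float):
            self.min_temp = min_temp
            self.max_temp = max_temp
            self.heater_on = False

        def update(self, current_temp: float) -> bool:
            # On below the minimum, off above the maximum;
            # between the two, keep the current state.
            if current_temp < self.min_temp:
                self.heater_on = True
            elif current_temp > self.max_temp:
                self.heater_on = False
            return self.heater_on

    thermostat = Thermostat(min_temp=18.0, max_temp=22.0)
    print(thermostat.update(15.0))  # True: below the minimum, heater switches on
    print(thermostat.update(20.0))  # True: within the band, state is kept
    print(thermostat.update(23.0))  # False: above the maximum, heater switches off

As the next paragraphs note, this description holds only within the device's design envelope; outside it, prediction must fall back to the physical stance.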

A key point is that switching to a higher level of abstraction has its risks as well as its benefits. For example, when we view both a bimetallic strip and a tube of mercury as thermometers, we can lose track of the fact that they differ in accuracy and temperature range, leading to false predictions as soon as the thermometer is used outside the circumstances for which it was designed. The actions of a mercury thermometer heated to 500 °C can no longer be predicted on the basis of treating it as a thermometer; we have to sink down to the physical stance to understand it as a melted and boiled piece of junk. For that matter, the "actions" of a dead bird are not predictable in terms of beliefs or desires.

Even when there is no immediate error, a higher-level stance can simply fail to be useful. If we were to try to understand the thermostat at the level of the intentional stance, ascribing to it beliefs about how hot it is and a desire to keep the temperature just right, we would gain no traction over the problem as compared to staying at the design stance, but we would generate theoretical commitments that expose us to absurdities, such as the possibility of the thermostat not being in the mood to work today because the weather is so nice. Whether to take a particular stance, then, is determined by how successful that stance is when applied.

Dennett argues that it is best to understand human behavior at the level of the intentional stance, without making any specific commitments to any deeper reality of the artifacts of folk psychology. Beyond the controversy inherent in this, there is also some dispute about the extent to which Dennett commits to realism about mental properties. His position was initially read as leaning towards instrumentalism, but as the idea has been used to support more extensive theories of consciousness, it has increasingly been read as closer to realism. His own words hint at something in the middle: he suggests that the self is as real as a center of gravity, "an abstract object, a theorist's fiction", but operationally valid.[2]

Memes

The intentional stance is an epistemic position taken up toward something that cannot otherwise be thought through; its purpose is to make that thing easier to understand. When this purpose is forgotten, the result is fallacious reification. This is argued to have occurred in the case of memetics.

… a cultural meme pool can be thought of as an intentional system only insofar as it remains an object of philosophical enquiry. As soon as it is reified as actually involving intentions, adopting either a "design stance" (What did the individual human meme-maker intend?) or a "physical stance" (What is an intended meme made of?) becomes more appropriate. (p. 95)[3]

The author of this analysis, Burman, called this "Dennett's Rule". He showed that, in the case of memes, the intentional stance became increasingly implicit over time: as the idea of an "idea virus" was further popularized, the stance eventually dropped away entirely.

The main difference between Blackmore's replication of the meme and Dennett's … was that Blackmore dropped the intentional stance even as she kept its active interpretation. While the stance had been implicit in Dennett's discussions of the meme, it was absent in Blackmore's. As a result, following the publication of The Meme Machine, the meme was reified completely. (p. 97)[3]

Objections and replies

The most obvious objection to Dennett is the intuition that it "matters" to us whether an object has an inner life or not. The claim is that we do not just imagine the intentional states of other people in order to predict their behaviour; the fact that they have thoughts and feelings just as we do is central to notions such as trust, friendship, and love. The Blockhead argument proposes that someone, Jones, has a twin who is in fact not a person but a very sophisticated robot that looks and acts like Jones in every way but (it is claimed) somehow has no thoughts or feelings at all, just a chip that controls his behaviour; in other words, "the lights are on but no one's home". According to intentional systems theory (IST), Jones and the robot have precisely the same beliefs and desires, but this is claimed to be false: the IST expert assigns the same mental states to Blockhead as to Jones, "whereas in fact [Blockhead] has not a thought in his head." Dennett argues against this by denying the premise, on the grounds that the robot is a philosophical zombie and therefore metaphysically impossible. In other words, if something acts in all ways conscious, it necessarily is conscious, since consciousness is defined in terms of behavioral capacity rather than ineffable qualia.[4]

Another objection attacks the premise that treating people as ideally rational creatures will yield the best predictions. Stephen Stich argues that people often have beliefs or desires that are irrational or bizarre, and that IST does not allow us to say anything about these. If a person's "environmental niche" is examined closely enough, and the possibility of malfunction in their brain (which might affect their reasoning capacities) is looked into, it may be possible to formulate a predictive strategy specific to that person. Indeed, this is what we often do when someone is behaving unpredictably: we look for the reasons why. In other words, we can only deal with irrationality by contrasting it against a background assumption of rationality. On this reply, person-specific strategies do not undermine the intentional stance so much as presuppose it, since irrationality is identifiable only as a departure from a rational baseline.

The rationale behind the intentional stance is grounded in evolutionary theory, particularly the idea that the ability to make quick predictions of a system's behaviour, based on what we think it might be thinking, conferred an adaptive advantage. That our predictive powers are not perfect is itself a consequence of the advantages sometimes gained by acting contrary to expectations.

References

  1. Dennett, D. C. (1987). "Three Kinds of Intentional Psychology". In The Intentional Stance, pp. 43–68. Cambridge, MA: MIT Press.
  2. Dennett, D. C. "The Self as a Center of Narrative Gravity". Retrieved 2008-07-03.
  3. Burman, J. T. (2012). "The misunderstanding of memes: Biography of an unscientific object, 1976–1999". Perspectives on Science, 20(1), 75–104. doi:10.1162/POSC_a_00057 (open access, courtesy of MIT Press).
  4. Dennett, D. C. "The Unimagined Preposterousness of Zombies".

Further reading

  • Daniel C. Dennett (1996), The Intentional Stance (6th printing), Cambridge, Massachusetts: The MIT Press, ISBN 0-262-54053-3  (First published 1987).
  • Daniel C. Dennett (1997), "Chapter 3. True Believers: The Intentional Strategy and Why it Works", in John Haugeland (ed.), Mind Design II: Philosophy, Psychology, Artificial Intelligence. Cambridge, Massachusetts: The MIT Press. ISBN 0-262-08259-4 (first published in Scientific Explanation, 1981, edited by A. F. Heath, Oxford: Oxford University Press; originally presented as a Herbert Spencer Lecture at Oxford in November 1979; also published as chapter 2 of Dennett's The Intentional Stance).
  • Dennett, D., "Three Kinds of Intentional Psychology", in Heil, J. (ed.), Philosophy of Mind: A Guide and Anthology, Oxford: Clarendon Press, 2004.
  • Braddon-Mitchell, D., & Jackson, F. Philosophy of Mind and Cognition, Basil Blackwell, Oxford, 1996
  • Dennett, D. "True Believers" in Dennett, D. The Intentional Stance, MIT Press, Cambridge, Mass., 1987
  • Fodor, J. Psychosemantics, MIT Press, Cambridge, Mass., 1987.
  • Lycan, W. Mind & Cognition, Basil Blackwell, Oxford, 1990
  • Fano, Vincenzo. "Holism and the naturalization of consciousness" in Holism, Massimo Dell'Utri. Quodlibet. 2002.