Intentional stance

The intentional stance is a theory of mental content proposed by Daniel C. Dennett. The theory provides the underpinnings of his later works on free will, consciousness, folk psychology, and evolution. The intentional stance is a level of abstraction at which we view the behavior of a thing in terms of mental properties.

"Here is how it works: first you decide to treat the object whose behavior is to be predicted as a rational agent; then you figure out what beliefs that agent ought to have, given its place in the world and its purpose. Then you figure out what desires it ought to have, on the same considerations, and finally you predict that this rational agent will act to further its goals in the light of its beliefs. A little practical reasoning from the chosen set of beliefs and desires will in most instances yield a decision about what the agent ought to do; that is what you predict the agent will do." (Daniel Dennett, The Intentional Stance, p. 17)

Dennett's three levels

The core idea is that, when explaining and predicting the behavior of an object, we can choose to view it at varying levels of abstraction. The more concrete the level, the more accurate in principle our predictions are; the more abstract the level, the greater the computational economy we gain by zooming out and skipping over irrelevant details. Dennett defines three levels of abstraction:

  • The most concrete is the physical stance, which is at the level of physics and chemistry. At this level, we are concerned with things such as mass, energy, velocity, and chemical composition. When we predict where a ball is going to land based on its current trajectory, we are taking the physical stance. Another example of this stance comes when we look at a strip made up of two types of metal bonded together and predict how it will bend as the temperature changes, based on the physical properties of the two metals.
  • Somewhat more abstract is the design stance, which is at the level of biology and engineering. At this level, we are concerned with things such as purpose, function and design. When we predict that a bird will fly when it flaps its wings, on the basis that wings are made for flying, we are taking the design stance. Likewise, we can understand the bimetallic strip as a particular type of thermometer, not concerning ourselves with the details of how this type of thermometer happens to work. We can also recognize the purpose that this thermometer serves inside a thermostat and even generalize to other kinds of thermostats that might use a different sort of thermometer. We can even explain the thermostat in terms of what it's good for, saying that it keeps track of the temperature and turns on the heater whenever it gets below a minimum, turning it off once it reaches a maximum (a functional rule sketched in code after this list).
  • Most abstract is the intentional stance, which is at the level of software and minds. At this level, we are concerned with things such as belief, thinking and intent. When we predict that the bird will fly away because it knows the cat is coming, we are taking the intentional stance. Another example would be when we predict that Mary will leave the theater and drive to the restaurant because she sees that the movie is over and is hungry.
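
The thermostat example from the design-stance bullet can be made concrete with a toy model. This Python sketch is only an illustration under assumed set points; the class and its numbers are hypothetical, and the point is that the rule mentions function (hold the temperature between two bounds) rather than any particular mechanism.

    # Hypothetical design-stance model of the thermostat: only the
    # function matters, not whether the sensor is a bimetallic strip,
    # a tube of mercury, or something else entirely.
    class Thermostat:
        def __init__(self, minimum, maximum):
            self.minimum = minimum
            self.maximum = maximum
            self.heater_on = False

        def update(self, temperature):
            # Turn the heater on below the minimum and off once the
            # temperature reaches the maximum; otherwise leave it as-is.
            if temperature < self.minimum:
                self.heater_on = True
            elif temperature >= self.maximum:
                self.heater_on = False
            return self.heater_on

    t = Thermostat(minimum=18.0, maximum=22.0)
    for reading in (17.0, 19.0, 22.5):
        print(reading, t.update(reading))  # heater: on, still on, off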

A key point is that switching to a higher level of abstraction has its risks as well as its benefits. For example, when we view both a bimetallic strip and a tube of mercury as thermometers, we can lose track of the fact that they differ in accuracy and temperature range, leading to false predictions as soon as a thermometer is used outside the circumstances it was designed for. The actions of a mercury thermometer heated to 500°C, for instance, can no longer be predicted on the basis of treating it as a thermometer; we have to sink down to the physical stance to understand it as a melted and boiled piece of junk. For that matter, the actions of a dead bird are not predictable in terms of beliefs or desires.

Even when there is no outright error, a higher level stance can simply fail to be useful. If we were to try to understand the thermometer at the level of the intentional stance, ascribing to it beliefs about how hot it is and its desire to keep the temperature just right, we would gain no traction over the problem as compared to staying at the design stance, but we would assume increased risks of error. Whether to take a particular stance, then, is determined by how successful that stance is when applied.

Dennett argues that it is best to understand human beliefs and desires at the level of the intentional stance, without making any specific commitment to any deeper reality underlying the artifacts of folk psychology. In addition to the controversy inherent in this, there is also some dispute about the extent to which Dennett is committed to realism about mental properties. Initially, Dennett's interpretation was seen as leaning more towards instrumentalism, but over the years, as this idea has been used to support more extensive theories of consciousness, it has been taken as being more like realism. His own words suggest something in the middle, as he suggests that the self is as real as a center of gravity: "an abstract object, a theorist's fiction", but operationally valid.[1]

Objections and replies

The most obvious objection to Dennett is the instinctive thought that it matters to us whether an object has an inner life or not. We do not just imagine the intentional states of other people in order to predict their behaviour; the fact that they have thoughts and feelings just as we do is central to notions such as trust, friendship and love. The Blockhead argument, as it is known, proposes that someone, Jones, has a twin who is in fact not a person but a very sophisticated robot that looks and acts like Jones in every way but has no thoughts or feelings at all, just a chip that controls his behaviour; in other words, "the lights are on but no one's home." According to intentional systems theory (IST), Jones and Blockhead have precisely the same beliefs and desires, but this seems false. The IST expert assigns the same mental states to Blockhead as to Jones, "whereas in fact (Blockhead) has not a thought in his head."

The problem with this objection is that it assumes that IST claims to tell us true things about our mental states, which it does not. In fact it remains agnostic about what beliefs and desires really are in human beings, and uses intentional terms in a purely technical way to predict behaviour. Discovering that Blockhead is an automaton may have many implications for the people around him, but what it will not do is change the way they should reason about his behaviour.

There is another objection, which attacks the premise that treating people as ideally rational creatures will yield the best predictions. Stephen Stich points out that people often have beliefs or desires which are irrational or bizarre, and IST does not allow us to say anything about these. Of course, if the person's "environmental niche" is examined closely enough, and the possibility of malfunction in their brain (which might affect their reasoning capacities) is looked into, it may be possible to formulate a predictive strategy specific to that person; indeed, this is what we often do when someone is behaving unpredictably: we look for the reasons why. This development takes away from the simplicity of the theory, but is not explicitly an argument against it.

A third objection takes the reverse case to the Blockhead example: a person who is completely paralysed. Such a person exhibits no behaviour, so IST seems forced to conclude that they have no intentional states. The standard solution is problematic: the IST expert looks to the person's circumstances and reasons that they probably believe that they are paralysed and desire that they were not, and predicts from these that their behaviour will be nil; hence, IST works. But could anything, then, be an intentional system? What about a lectern? Why not say that a lectern mourns the fact that it used to be a tree, and desires to be one again, but due to its circumstances it just stays where it is? This presents a strong challenge to the claim that IST can adequately account for beliefs and desires, for we surely do not want to say that a lectern is an intentional system.

The rationale behind the intentional stance is based on evolutionary theory: the ability to make quick predictions of a system's behaviour based on what we think it might be thinking would have been an adaptive advantage. That our predictive powers are not perfect is a further result of the advantages sometimes accrued by acting contrary to expectations. If we take the intentional stance for what it is, a conceptual tool rather than a literal description of the mind, then it works in a roughly constructive empiricist fashion to yield accurate results. "What is done, not how it is done, is what counts", and the intentional stance may help us to understand the mind better than many other available theories which make stronger claims but are more easily refuted.

It has been objected that Dennett's theory depends crucially on the assumption that humans are evolutionarily adapted to be rational agents, and that this assumption is unwarranted: from the fact that it would have been desirable and convenient for humans to evolve as rational beings suited to the needs of their environment, it does not follow that the random mutations on which natural selection operates would actually bring about that result. That is, we cannot maintain that humans are rational agents just because it would have been evolutionarily convenient for them to have evolved as such. As Vincenzo Fano has pointed out, the determination of whether a system is rational must be an empirical matter and cannot be presupposed, as it is in Dennett's account of intentional attitudes.

References

  • Dennett, D. "Three kinds of intentional psychology" (IP) in Heil, J. - Philosophy of Mind: A guide and anthology, Clarendon Press, Oxford, 2004
  • Braddon-Mitchell, D., & Jackson, F. Philosophy of Mind and Cognition, Basil Blackwell, Oxford, 1996
  • Dennett, D. "True Believers" in Dennett, D. The Intentional Stance, MIT Press, Cambridge, Mass., 1987
  • Fodor, J. Psychosemantics, MIT Press, Cambridge, Mass., 1987.
  • Lycan, W. Mind & Cognition, Basil Blackwell, Oxford, 1990
  • Fano, Vincenzo. "Holism and the naturalization of consciousness" in Holism, Massimo Dell'Utri. Quodlibet. 2002.