B. F. Skinner


B.F. Skinner at the Harvard Psychology Department, c.1950
Born 20 March 1904
Susquehanna, Pennsylvania
Died 18 August 1990 (aged 86)
Cambridge, Massachusetts
Nationality American
Fields Psychology, linguistics, philosophy
Institutions University of Minnesota
Indiana University
Harvard University
Alma mater Hamilton College
Harvard University
Known for Operant conditioning
Influences Charles Darwin
Ivan Pavlov
Ernst Mach
Jacques Loeb
Edward Thorndike
William James
Jean-Jacques Rousseau
Henry David Thoreau
Notable awards National Medal of Science (1968)


Burrhus Frederic (B. F.) Skinner (March 20, 1904 – August 18, 1990) was an American psychologist, behaviorist, author, inventor, and social philosopher.[1][2][3][4] He was the Edgar Pierce Professor of Psychology at Harvard University from 1958 until his retirement in 1974.[5]

Skinner believed that human free will is an illusion and that any human action is the result of the consequences of previous actions. If the consequences are bad, there is a high chance that the action will not be repeated; however, if the consequences are good, the actions that led to them become more probable.[6] Skinner called this the principle of reinforcement.[7]

Skinner called the use of reinforcement to strengthen behavior operant conditioning, and he considered the rate of response to be the most effective measure of response strength. To study operant conditioning he invented the operant conditioning chamber, also known as the Skinner Box,[8] and to measure rate he invented the cumulative recorder. Using these tools he and C. B. Ferster produced his most influential experimental work, which appeared in the book Schedules of Reinforcement.[9][10]

Skinner developed a philosophy of science that he called radical behaviorism,[11] and founded a school of experimental research psychology—the experimental analysis of behavior. He imagined the application of his ideas to the design of a human community in his utopian novel Walden Two,[12] and his analysis of human behavior culminated in his work Verbal Behavior.[13]

Skinner was a prolific author who published 21 books and 180 articles.[14][15] Contemporary academia considers Skinner a pioneer of modern behaviorism along with John B. Watson and Ivan Pavlov. A June 2002 survey listed Skinner as the most influential psychologist of the 20th century.[16]

Biography

The grave of B.F. Skinner and his wife Eve at Mount Auburn Cemetery

Skinner was born in Susquehanna, Pennsylvania, to William and Grace Skinner. His father was a lawyer. He became an atheist after a Christian teacher tried to assuage his fear of the hell that his grandmother described.[17] His brother Edward, two and a half years younger, died at age sixteen of a cerebral hemorrhage. Skinner attended Hamilton College in New York with the intention of becoming a writer, but found himself at a social disadvantage there due to his intellectual attitude.[18] While attending, he joined the Lambda Chi Alpha fraternity. He wrote for the school paper, but, as an atheist, he was critical of the religious school he attended.

After receiving his B.A. in English literature in 1926, he attended Harvard University, where he would later research, teach, and eventually become a prestigious board member. While he was at Harvard, a fellow student, Fred Keller, convinced Skinner that he could make an experimental science from the study of behavior. This led Skinner to invent his prototype for the Skinner Box and to join Keller in the creation of other tools for small experiments.[18] After graduation, he unsuccessfully tried to write a great novel while living with his parents, a period that he later called the Dark Years.[18] He became disillusioned with his literary skills despite encouragement from the renowned poet Robert Frost, concluding that he had little world experience and no strong personal perspective from which to write. His encounter with John B. Watson's Behaviorism led him into graduate study in psychology and to the development of his own version of behaviorism.[19]

Skinner received a Ph.D. from Harvard in 1931, and remained there as a researcher until 1936. He then taught at the University of Minnesota at Minneapolis and later at Indiana University, where he was chair of the psychology department from 1946 to 1947, before returning to Harvard as a tenured professor in 1948. He remained at Harvard for the rest of his life. In 1973 Skinner was one of the signers of the Humanist Manifesto II.[20]

In 1936, Skinner married Yvonne Blue. The couple had two daughters, Julie (m. Vargas) and Deborah (m. Buzan).[21][22] He died of leukemia on August 18, 1990,[23] and is buried in Mount Auburn Cemetery, Cambridge, Massachusetts.[24] Skinner continued to write and work until just before his death. A few days before Skinner died, he was given a lifetime achievement award by the American Psychological Association and delivered a 15-minute address concerning his work.[25]

A controversial figure, Skinner has been depicted in many different ways. He has been widely revered for bringing a much-needed scientific approach to the study of human behavior; he has also been vilified for attempting to apply findings based largely on animal experiments to human behavior in real-life settings.

Contributions to psychological theory

Behaviorism

Main articles: Behaviorism and Radical behaviorism

Skinner called his approach to the study of behavior radical behaviorism.[26] This philosophy of behavioral science assumes that behavior is a consequence of environmental histories of reinforcement (see Applied behavior analysis). In contrast to the approach of cognitive science, behaviorism does not accept private events such as thinking, perceptions, and unobservable emotions as causes of an organism's behavior. However, in contrast to methodological behaviorism, Skinner's radical behaviorism did accept thoughts, emotions, and other "private events" as responses subject to the same rules as overt behavior. In his words:

The position can be stated as follows: what is felt or introspectively observed is not some nonphysical world of consciousness, mind, or mental life but the observer's own body. This does not mean, as I shall show later, that introspection is a kind of psychological research, nor does it mean (and this is the heart of the argument) that what are felt or introspectively observed are the causes of the behavior. An organism behaves as it does because of its current structure, but most of this is out of reach of introspection. At the moment we must content ourselves, as the methodological behaviorist insists, with a person's genetic and environmental histories. What are introspectively observed are certain collateral products of those histories.


...

In this way we repair the major damage wrought by mentalism. When what a person does [is] attributed to what is going on inside him, investigation is brought to an end. Why explain the explanation? For twenty five hundred years people have been preoccupied with feelings and mental life, but only recently has any interest been shown in a more precise analysis of the role of the environment. Ignorance of that role led in the first place to mental fictions, and it has been perpetuated by the explanatory practices to which they gave rise.[27]

Theoretical structure

Skinner's behavioral theory was largely set forth in his first book, The Behavior of Organisms.[28] Here he gave a systematic description of the manner in which environmental variables control behavior. He distinguished two sorts of behavior, which are controlled in different ways. First, respondent behaviors are elicited by stimuli; these may be modified through respondent conditioning, which is often called "Pavlovian conditioning" or "classical conditioning", in which a neutral stimulus is paired with an eliciting stimulus. Operant behaviors, in contrast, are "emitted," meaning that initially they are not induced by any particular stimulus. They are strengthened through operant conditioning, sometimes called "instrumental conditioning," in which the occurrence of a response yields a reinforcer. Respondents might be measured by their latency or strength, operants by their rate. Both of these sorts of behavior had already been studied experimentally, for example, respondents by Pavlov[29] and operants by Thorndike.[30] Skinner's account differed in some ways from earlier ones,[31] and was one of the first accounts to bring them under one roof.

The idea that behavior is strengthened or weakened by its consequences raises several questions. Among the most important are these: (1) Operant responses are strengthened by reinforcement, but where do they come from in the first place? (2) Once it is in the organism's repertoire, how is a response directed or controlled? (3) How can very complex and seemingly novel behaviors be explained?

The origin of operant behavior

Skinner's answer to the first question was very much like Darwin's answer to the question of the origin of a "new" bodily structure, namely, variation and selection. Similarly, the behavior of an individual varies from moment to moment; a variation that is followed by reinforcement is strengthened and becomes prominent in that individual's behavioral repertoire. "Shaping" was Skinner's term for the gradual modification of behavior by the reinforcement of desired variations. As discussed later in this article, Skinner believed that "superstitious" behavior can arise when a response happens to be followed by reinforcement to which it is actually unrelated.

The control of operant behavior

The second question, "how is operant behavior controlled?" arises because, to begin with, the behavior is "emitted" without reference to any particular stimulus. Skinner answered this question by saying that a stimulus comes to control an operant if it is present when the response is reinforced and absent when it is not. For example, if lever-pressing only brings food when a light is on, a rat, or a child, will learn to press the lever only when the light is on. Skinner summarized this relationship by saying that a discriminative stimulus (e.g. light) sets the occasion for the reinforcement (food) of the operant (lever-press). This "three-term contingency" (stimulus-response-reinforcer) is one of Skinner's most important concepts, and sets his theory apart from theories that use only pair-wise associations.[31]
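
As an illustration only, the three-term contingency can be sketched as a toy simulation; the update rule, learning rate, and starting probabilities below are invented for the example and are not Skinner's own formalism:

```python
import random

# Minimal sketch of the three-term contingency (stimulus -> response -> reinforcer).
# The learning rule and its parameters are illustrative assumptions, not Skinner's model.

class Operant:
    def __init__(self, learning_rate=0.1):
        # Probability of emitting the lever-press in each stimulus condition.
        self.p_press = {"light_on": 0.5, "light_off": 0.5}
        self.lr = learning_rate

    def trial(self, stimulus):
        pressed = random.random() < self.p_press[stimulus]
        # Food is delivered only if the press occurs while the light is on.
        reinforced = pressed and stimulus == "light_on"
        if pressed:
            target = 1.0 if reinforced else 0.0  # non-reinforced presses extinguish
            self.p_press[stimulus] += self.lr * (target - self.p_press[stimulus])
        return pressed, reinforced

rat = Operant()
for _ in range(2000):
    rat.trial(random.choice(["light_on", "light_off"]))

print(rat.p_press)  # pressing becomes likely with the light on, unlikely with it off
```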

Explaining complex behavior

Most behavior of humans cannot easily be described in terms of individual responses reinforced one by one, and Skinner devoted a great deal of effort to the problem of behavioral complexity. Some complex behavior can be seen as a sequence of relatively simple responses, and here Skinner invoked the idea of "chaining." Chaining is based on the fact, experimentally demonstrated, that a discriminative stimulus not only sets the occasion for subsequent behavior, but it can also reinforce a behavior that precedes it. That is, a discriminative stimulus is also a "conditioned reinforcer." For example, the light that sets the occasion for lever pressing may be used to reinforce "turning around" in the presence of a noise. This results in the sequence "noise - turn-around - light - press lever - food". Much longer chains can be built by adding more stimuli and responses.

However, Skinner recognized that a great deal of behavior, especially human behavior, cannot be accounted for by gradual shaping or the construction of response sequences.[32] Complex behavior often appears suddenly in its final form, as when a person first finds his way to the elevator by following instructions given at the front desk. To account for such behavior Skinner introduced the concept of rule-governed behavior. First, relatively simple behaviors come under the control of verbal stimuli: the child learns to "jump", "open the book", and so on. After a large number of responses come under such verbal control, a sequence of verbal stimuli can evoke an almost unlimited variety of complex responses.[32]

Reinforcement

Main article: Reinforcement

Reinforcement, a key concept of behaviorism, is the primary process that shapes and controls behavior, and it occurs in two ways: "positive" and "negative". In The Behavior of Organisms (1938), Skinner defined "negative reinforcement" to be synonymous with punishment, that is, the presentation of an aversive stimulus. Subsequently, in Science and Human Behavior (1953), Skinner redefined negative reinforcement. In what has now become the standard set of definitions, positive reinforcement is the strengthening of behavior by the occurrence of some event (e.g., praise after some behavior is performed), whereas negative reinforcement is the strengthening of behavior by the removal or avoidance of some aversive event (e.g., opening and raising an umbrella over your head on a rainy day is reinforced by the cessation of rain falling on you).

Both types of reinforcement strengthen behavior, or increase the probability of a behavior reoccurring; the difference is in whether the reinforcing event is something applied (positive reinforcement) or something removed or avoided (negative reinforcement). Punishment is the application of an aversive stimulus/event (positive punishment or punishment by contingent stimulation) or the removal of a desirable stimulus (negative punishment or punishment by contingent withdrawal). Though punishment is often used to suppress behavior, Skinner argued that this suppression is temporary and has a number of other, often unwanted, consequences.[33] Extinction is the discontinuation of reinforcement following a previously reinforced behavior, which weakens that behavior.

Writing in 1981, Skinner pointed out that Darwinian natural selection is, like reinforced behavior, "selection by consequences." Though, as he said, natural selection has now "made its case", he regretted that essentially the same process, "reinforcement", was less widely accepted as underlying human behavior.[34]

Schedules of reinforcement

Skinner recognized that behavior is typically reinforced more than once, and, together with C. B. Ferster, he did an extensive analysis of the various ways in which reinforcements could be arranged over time, which he called "schedules of reinforcement".[35]

The most notable schedules of reinforcement studied by Skinner were continuous, interval (fixed or variable) and ratio (fixed or variable). All are methods used in operant conditioning.
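
For illustration, the logic of two such schedules can be sketched in Python; the ratio size, interval length, and response pattern below are arbitrary choices, not values from Skinner and Ferster's experiments:

```python
import random

# Illustrative sketch of two reinforcement schedules; parameter values are arbitrary.

def fixed_ratio(n):
    """Reinforce every n-th response (FR-n)."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count == n:
            count = 0
            return True   # reinforcer delivered
        return False
    return respond

def variable_interval(mean_seconds):
    """Reinforce the first response after a randomly varying interval (VI schedule)."""
    next_time = random.expovariate(1.0 / mean_seconds)
    def respond(t):
        nonlocal next_time
        if t >= next_time:
            next_time = t + random.expovariate(1.0 / mean_seconds)
            return True
        return False
    return respond

fr5 = fixed_ratio(5)
print([fr5() for _ in range(12)])  # every fifth response is reinforced

vi30 = variable_interval(30.0)
reinforcers = sum(vi30(t) for t in range(0, 3600, 2))  # one response every 2 s for an hour
print(reinforcers)  # roughly 3600/30 = 120 reinforcers, largely independent of response rate
```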

Scientific inventions

Operant conditioning chamber

An operant conditioning chamber (also known as a Skinner Box) is a laboratory apparatus used in the experimental analysis of animal behavior. It was invented by Skinner while he was a graduate student at Harvard University, where he received his doctorate in 1931. As used by Skinner, the box had a lever (for rats) or a disk in one wall (for pigeons). A press on this "manipulandum" could deliver food to the animal through an opening in the wall, and responses reinforced in this way increased in frequency. By controlling this reinforcement together with discriminative stimuli such as lights and tones, or punishments such as electric shocks, experimenters have used the operant box to study a wide variety of topics, including schedules of reinforcement, discriminative control, delayed response ("memory"), punishment, and so on. By channeling research in these directions, the operant conditioning chamber has had a huge influence on the course of research in animal learning and its applications. It enabled great progress on problems that could be studied by measuring the rate, probability, or force of a simple, repeatable response. However, it discouraged the study of behavioral processes not easily conceptualized in such terms: spatial learning, in particular, is now studied in quite different ways, for example by the use of the water maze.[31]

Cumulative recorder

The cumulative recorder makes a pen-and-ink record of simple repeated responses. Skinner designed it for use with the operant conditioning chamber as a convenient way to record and view the rate of responses such as a lever press or a key peck. In this device, a sheet of paper gradually unrolls over a cylinder. Each response steps a small pen across the paper, starting at one edge; when the pen reaches the other edge, it quickly resets to the initial side. The slope of the resulting ink line graphically displays the rate of responding: rapid responding yields a steeply sloping line, while slow responding yields a line of low slope. The cumulative recorder was a key tool used by Skinner in his analysis of behavior, and it was very widely adopted by other experimenters, gradually falling out of use only with the advent of the laboratory computer. Skinner's major experimental exploration of response rates, presented in his book with C. B. Ferster, Schedules of Reinforcement, is full of cumulative records produced by this device.[35]
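
A rough sketch of what the cumulative recorder computes is given below; the response times are fabricated for illustration, and the point is simply that the slope of the record over any window equals the response rate in that window:

```python
# Sketch of a cumulative record: cumulative response count plotted against time.
# The response times here are fabricated for illustration.

response_times = [0.5 * i for i in range(1, 121)]          # fast responding: 2 per second
response_times += [60 + 5 * i for i in range(1, 61)]       # then slow: 1 every 5 seconds

def cumulative_record(times):
    """Return (time, cumulative count) pairs, the curve the recorder's pen traces."""
    return [(t, i + 1) for i, t in enumerate(times)]

record = cumulative_record(sorted(response_times))

def local_rate(record, t_start, t_end):
    """Slope of the record over a window = responses per unit time in that window."""
    points = [(t, n) for t, n in record if t_start <= t <= t_end]
    return (points[-1][1] - points[0][1]) / (points[-1][0] - points[0][0])

print(local_rate(record, 0, 60))     # steep segment: about 2 responses per second
print(local_rate(record, 65, 360))   # shallow segment: about 0.2 responses per second
```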

Air crib

The air crib is an easily cleaned, temperature and humidity-controlled enclosure intended to replace the standard infant crib.[39] Skinner invented the device to help his wife cope with the day-to-day tasks of child rearing. It was designed to make early childcare simpler (by reducing laundry, diaper rash, cradle cap, etc.), while allowing the baby to be more mobile and comfortable, and less prone to cry. Reportedly it had some success in these goals.[40]

The air crib was a controversial invention. It was popularly mischaracterized as a cruel pen, and it was often compared to Skinner's operant conditioning chamber, commonly called the "Skinner Box." This association with laboratory animal experimentation discouraged its commercial success, though several companies attempted production.[40][41]

A 2004 book by Lauren Slater, entitled Opening Skinner's Box: Great Psychological Experiments of the Twentieth Century,[42] caused a stir by mentioning the rumors that Skinner had used his baby daughter Deborah in some of his experiments and that she had subsequently committed suicide. Although Slater's book stated that the rumors were false, a reviewer in The Observer in March 2004 misquoted Slater's book as supporting the rumors. This review was read by Deborah Skinner (now Deborah Buzan, an artist and writer living in London) who wrote a vehement riposte in The Guardian.[43]

Teaching machine

The teaching machine, a mechanical invention to automate the task of programmed instruction

The teaching machine was a mechanical device whose purpose was to administer a curriculum of programmed instruction. The machine embodied key elements of Skinner’s theory of learning and had important implications for education in general and classroom instruction in particular.[44]

In one incarnation, the machine was a box that housed a list of questions that could be viewed one at a time through a small window. (See picture). There was also a mechanism through which the learner could respond to each question. Upon delivering a correct answer, the learner would be rewarded.[45]

Skinner advocated the use of teaching machines for a broad range of students (e.g., preschool aged to adult) and instructional purposes (e.g., reading and music). For example, one machine that he envisioned could teach rhythm. He wrote:

A relatively simple device supplies the necessary contingencies. The student taps a rhythmic pattern in unison with the device. "Unison" is specified very loosely at first (the student can be a little early or late at each tap) but the specifications are slowly sharpened. The process is repeated for various speeds and patterns. In another arrangement, the student echoes rhythmic patterns sounded by the machine, though not in unison, and again the specifications for an accurate reproduction are progressively sharpened. Rhythmic patterns can also be brought under the control of a printed score.[46]

The instructional potential of the teaching machine stemmed from several factors: it provided automatic, immediate and regular reinforcement without the use of aversive control; the material presented was coherent, yet varied and novel; the pace of learning could be adjusted to suit the individual. As a result, students were interested, attentive, and learned efficiently by producing the desired behavior, "learning by doing." [47]

Teaching machines, though perhaps rudimentary, were not rigid instruments of instruction. They could be adjusted and improved based upon the students’ performance. For example, if a student made many incorrect responses, the machine could be reprogrammed to provide less advanced prompts or questions, the idea being that students acquire behaviors most efficiently if they make few errors. Multiple-choice formats were not well suited to teaching machines because they tended to increase student mistakes and left the contingencies of reinforcement relatively uncontrolled.

Not only useful in teaching explicit skills, machines could also promote the development of a repertoire of behaviors that Skinner called self-management. Effective self-management means attending to stimuli appropriate to a task, avoiding distractions, reducing the opportunity of reward for competing behaviors, and so on. For example, machines encourage students to pay attention before receiving a reward. Skinner contrasted this with the common classroom practice of initially capturing students’ attention (e.g., with a lively video) and delivering a reward (e.g., entertainment) before the students have actually performed any relevant behavior. This practice fails to reinforce correct behavior and actually counters the development of self-management.

Skinner pioneered the use of teaching machines in the classroom, especially at the primary level. Today computers run software that performs similar teaching tasks, and there has been a resurgence of interest in the topic related to the development of adaptive learning systems.[48]

Pigeon-guided missile

Main article: Project Pigeon

During World War II the US Navy required a weapon effective against surface ships, such as the German Bismarck-class battleships. Although missile and TV technology existed, the size of the primitive guidance systems available rendered automatic guidance impractical. To solve this problem Skinner initiated Project Pigeon,[49][50] which was intended to provide a simple and effective guidance system. This system divided the nose cone of a missile into three compartments, with a pigeon placed in each. Lenses projected an image of distant objects onto a screen in front of each bird. Thus, when the missile was launched from an aircraft within sight of an enemy ship, an image of the ship would appear on the screen. The screen was hinged such that pecks at the image of the ship would guide the missile toward the ship.[51]

Despite an effective demonstration the project was abandoned, and eventually more conventional solutions, such as those based on radar, became available. Skinner complained that "our problem was no one would take us seriously."[52] It seemed that few people would trust pigeons to guide a missile no matter how reliable the system appeared to be.[53]

Verbal summator

Early in his career Skinner became interested in "latent speech" and experimented with a device he called the "verbal summator."[54] This device can be thought of as an auditory version of the Rorschach inkblots.[54] When using the device, human participants listened to incomprehensible auditory "garbage" but often read meaning into what they heard. Thus, as with the Rorschach blots, the device was intended to yield overt behavior that projected subconscious thoughts. Skinner's interest in projective testing was brief, but he later used observations with the summator in creating his theory of verbal behavior. The device also led other researchers to invent new tests such as the tautophone test, the auditory apperception test, and the Azzageddi test.[55]

Verbal Behavior

Challenged by Alfred North Whitehead during a casual discussion while at Harvard to provide an account of a randomly provided piece of verbal behavior,[56] Skinner set about attempting to extend his then-new functional, inductive approach to the complexity of human verbal behavior.[57] Developed over two decades, his work appeared in the book Verbal Behavior. Although Noam Chomsky was highly critical of Verbal Behavior, he conceded that Skinner's "S-R psychology" was worth a review. (Behavior analysts reject the "S-R" characterization: operant conditioning involves the emission of a response which then becomes more or less likely depending upon its consequence; see above.)[58]

Verbal Behavior had an uncharacteristically cool reception, partly as a result of Chomsky's review, partly due to Skinner's failure to address or rebut any of Chomsky's criticisms.[59] Skinner's peers may have been slow to adopt the ideas presented in Verbal Behavior due to the absence of experimental evidence — unlike the empirical density that marked Skinner's experimental work.[60] However, in applied settings there has been a resurgence of interest in Skinner's functional analysis of verbal behavior.[61]

Influence on education

Skinner's views influenced education as well as psychology. Skinner argued that education has two major purposes: (1) to teach repertoires of both verbal and nonverbal behavior; and (2) to interest students in learning. He recommended bringing students’ behavior under appropriate control by providing reinforcement only in the presence of stimuli relevant to the learning task. Because he believed that human behavior can be affected by small consequences, something as simple as “the opportunity to move forward after completing one stage of an activity” can be an effective reinforcer (Skinner, 1961, p. 380). Skinner was convinced that, to learn, a student must engage in behavior, and not just passively receive information (Skinner, 1961, p. 389).

Skinner believed that effective teaching must be based on positive reinforcement which is, he argued, more effective at changing and establishing behavior than punishment. He suggested that the main thing people learn from being punished is how to avoid punishment. For example, if a child is forced to practice playing an instrument, the child comes to associate practicing with punishment and thus learns to hate and avoid practicing the instrument. This view had obvious implications for the then widespread practice of rote learning and punitive discipline in education. The use of educational activities as punishment may induce rebellious behavior such as vandalism or absenteeism.[62]

Because teachers are primarily responsible for modifying student behavior, Skinner argued that teachers must learn effective ways of teaching. In The Technology of Teaching, Skinner has a chapter on why teachers fail (pages 93–113). He says that teachers have not been given an in-depth understanding of teaching and learning. Without knowing the science underpinning teaching, teachers fall back on procedures that work poorly or not at all, such as:

  • using aversive techniques (which produce escape and avoidance and undesirable emotional effects);
  • relying on telling and explaining ("Unfortunately, a student does not learn simply when he is shown or told." p. 103);
  • failing to adapt learning tasks to the student's current level;
  • failing to provide positive reinforcement frequently enough.

Skinner suggests that any age-appropriate skill can be taught. The steps, illustrated in the toy sketch after this list, are:

  1. Clearly specify the action or performance the student is to learn.
  2. Break down the task into small achievable steps, going from simple to complex.
  3. Let the student perform each step, reinforcing correct actions.
  4. Adjust so that the student is always successful until finally the goal is reached.
  5. Shift to intermittent reinforcement to maintain the student's performance.
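
A toy Python sketch of this progression is given below; the task, the learner model, and the adjustment rule are invented for illustration and are not drawn from Skinner's writings:

```python
import random

# Toy sketch of Skinner's programme: small steps, immediate reinforcement of correct
# responses, and adjustment so the learner stays successful. The task, the learner
# model, and the thresholds are invented assumptions, not taken from Skinner.

steps = ["clap once", "clap twice", "clap a 3-beat pattern", "clap a 4-beat pattern"]

def attempt(step_index, skill):
    """Pretend learner: success is likelier on easier steps and with more practice."""
    return random.random() < min(0.95, skill - 0.15 * step_index)

skill, step, history = 0.6, 0, []
while step < len(steps):
    if attempt(step, skill):
        history.append(("correct", steps[step]))   # reinforce immediately
        skill += 0.05                               # reinforced practice builds skill
        step += 1                                   # advance by one small step
    else:
        history.append(("error", steps[step]))
        step = max(0, step - 1)                     # drop back so success stays frequent

print(f"reached the goal after {len(history)} attempts")
print(f"errors: {sum(1 for outcome, _ in history if outcome == 'error')}")
```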

Skinner's views on education are extensively presented in his book The Technology of Teaching. They are also reflected in Fred S. Keller's Personalized System of Instruction and Ogden R. Lindsley's Precision Teaching.

Walden Two and Beyond Freedom and Dignity

Skinner is popularly known mainly for his books Walden Two and Beyond Freedom and Dignity, for which he made the cover of TIME Magazine.[63] The former describes a visit to a fictional "experimental community"[64] in 1940s United States, where the productivity and happiness of the citizens are far greater than in the outside world because of their practice of scientific social planning and use of operant conditioning in the raising of children.

Walden Two, like Thoreau's Walden, champions a lifestyle that does not support war or foster competition and social strife. It encourages a lifestyle of minimal consumption, rich social relationships, personal happiness, and satisfying work and leisure.[65] In 1967, Kat Kinkade founded the Twin Oaks Community, using Walden Two as a blueprint. The community still exists and continues to use the Planner-Manager system and other aspects described in Skinner's book.

In Beyond Freedom and Dignity, Skinner suggests that a technology of behavior could help to make a better society. We would, however, have to accept that an autonomous agent is not the driving force of our actions. Skinner offers alternatives to punishment and challenges his readers to use science and modern technology to construct a better society.

Political views

Skinner's political writings emphasized his hopes that an effective and humane science of behavioral control – a technology of human behavior – could help with problems as yet unsolved and often aggravated by advances in technology such as the atomic bomb. Indeed, one of Skinner's goals was to prevent humanity from destroying itself.[66] He saw political activity as the use of aversive or non-aversive means to control a population. Skinner favored the use of positive reinforcement as a means of control, citing Jean-Jacques Rousseau's novel Emile: or, On Education as an example of literature that "did not fear the power of positive reinforcement".[2]

Skinner's book, Walden Two, presents a vision of a decentralized, localized society, which applies a practical, scientific approach and futuristically advanced behavioral expertise to peacefully deal with social problems. Skinner's utopia is both a thought experiment and a rhetorical piece. In his book, Skinner answers the problem that exists in many utopian novels – "What is the Good Life?" In Walden Two, the answer is a life of friendship, health, art, a healthy balance between work and leisure, a minimum of unpleasantness, and a feeling that one has made worthwhile contributions to a society in which resources are ensured, in part, by a lack of consumption.

If the world is to save any part of its resources for the future, it must reduce not only consumption but the number of consumers.
B. F. Skinner,  Walden Two, p. xi.

This ethos was to be achieved through behavioral technology, which could offer alternatives to coercion,[2] as good science applied correctly would help society[3] and allow all people to cooperate with each other peacefully.[2] Skinner described his novel as "my New Atlantis", in reference to Bacon's utopia.[67] He opposed corporal punishment in schools, and wrote a letter to the California Senate that helped lead to a ban on spanking.[68]

When Milton's Satan falls from heaven, he ends in hell. And what does he say to reassure himself? 'Here, at least, we shall be free.' And that, I think, is the fate of the old-fashioned liberal. He's going to be free, but he's going to find himself in hell.
B. F. Skinner,  from William F. Buckley Jr, On the Firing Line, p. 87.

Superstition in the pigeon

One of Skinner's experiments examined the formation of superstition in one of his favorite experimental animals, the pigeon. Skinner placed a series of hungry pigeons in a cage attached to an automatic mechanism that delivered food to the pigeon "at regular intervals with no reference whatsoever to the bird's behavior." He discovered that the pigeons associated the delivery of the food with whatever chance actions they had been performing as it was delivered, and that they subsequently continued to perform these same actions.[69]

One bird was conditioned to turn counter-clockwise about the cage, making two or three turns between reinforcements. Another repeatedly thrust its head into one of the upper corners of the cage. A third developed a 'tossing' response, as if placing its head beneath an invisible bar and lifting it repeatedly. Two birds developed a pendulum motion of the head and body, in which the head was extended forward and swung from right to left with a sharp movement followed by a somewhat slower return.[70][71]

Skinner suggested that the pigeons behaved as if they were influencing the automatic mechanism with their "rituals" and that this experiment shed light on human behavior:

The experiment might be said to demonstrate a sort of superstition. The bird behaves as if there were a causal relation between its behavior and the presentation of food, although such a relation is lacking. There are many analogies in human behavior. Rituals for changing one's fortune at cards are good examples. A few accidental connections between a ritual and favorable consequences suffice to set up and maintain the behavior in spite of many unreinforced instances. The bowler who has released a ball down the alley but continues to behave as if she were controlling it by twisting and turning her arm and shoulder is another case in point. These behaviors have, of course, no real effect upon one's luck or upon a ball half way down an alley, just as in the present case the food would appear as often if the pigeon did nothing—or, more strictly speaking, did something else.[70]
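
The adventitious-reinforcement mechanism Skinner proposed can be sketched as a toy simulation; the set of behaviors, the strengthening rule, and the food interval below are illustrative assumptions rather than Skinner's actual procedure:

```python
import random

# Sketch of adventitious ("superstitious") reinforcement: food arrives on a fixed-time
# schedule, yet whichever action happens to precede it is strengthened anyway.
# The behaviors and the update rule are illustrative assumptions.

actions = ["turn", "head_thrust", "peck_floor", "wing_flap"]
weights = {a: 1.0 for a in actions}   # relative tendency to emit each action

FOOD_INTERVAL = 15   # seconds between deliveries, independent of behavior
last_action = None

for second in range(1, 3601):
    # The bird emits one action per second, chosen according to its current tendencies.
    last_action = random.choices(actions, weights=[weights[a] for a in actions])[0]
    if second % FOOD_INTERVAL == 0:
        # Whatever the bird was doing when food arrived is (spuriously) strengthened.
        weights[last_action] *= 1.5

print(max(weights, key=weights.get), weights)
# One arbitrary action typically comes to dominate, mimicking the pigeons' "rituals".
```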

Modern behavioral psychologists have disputed Skinner's "superstition" explanation for the behaviors he recorded. Subsequent research (e.g. Staddon and Simmelhag, 1971), while finding similar behavior, failed to find support for Skinner's "adventitious reinforcement" explanation for it. By looking at the timing of different behaviors within the interval, Staddon and Simmelhag were able to distinguish two classes of behavior: the terminal response, which occurred in anticipation of food, and interim responses, which occurred earlier in the interfood interval and were rarely contiguous with food. Terminal responses seem to reflect classical (as opposed to operant) conditioning, rather than adventitious reinforcement, guided by a process like that observed in 1968 by Brown and Jenkins in their "autoshaping" procedures. The causation of interim activities (such as the schedule-induced polydipsia seen in a similar situation with rats) also cannot be traced to adventitious reinforcement, and its details are still obscure (Staddon, 1977).[72]

This experiment was also repeated on humans, in a less controlled manner, on the popular British TV series Trick or Treat, leading to similar conclusions to those of Skinner.[73]

B.F. Skinner Quotations

"I do not admire myself as a person. My successes do not override my shortcomings"[74]

"Ethical control may survive in small groups, but the control of the population as a whole must be delegated to specialists—to police, priests, owners, teachers, therapists, and so on, with their specialized reinforcers and their codified contingencies"[75]

"It is a mistake to suppose that the whole issue is how to free man. The issue is to improve the way in which he is controlled"[76]

"Education is what survives when what has been learned has been forgotten."[77]

"As the senses grow dull, the stimulating environment becomes less clear. When reinforcing consequences no longer follow, we are bored, discouraged and depressed."[74]

Criticism

J. E. R. Staddon

As understood by Skinner, ascribing dignity to individuals involves giving them credit for their actions. To say "Skinner is brilliant" means that Skinner is an originating force. If Skinner's determinist theory is right, he is merely the locus of his environment. He is not an originating force and he had no choice in saying the things he said or doing the things he did. Skinner's environment and genetics both allowed and compelled him to write his book. Similarly, the environment and genetic potentials of the advocates of freedom and dignity cause them to resist the reality that their own activities are deterministically grounded. J. E. R. Staddon (The New Behaviorism, 2nd edition, 2014) has argued the compatibilist position: Skinner's determinism is not in any way contradictory to traditional notions of reward and punishment, as Skinner believed it to be.[78]

Noam Chomsky

Noam Chomsky, perhaps Skinner's best-known critic, published a review of Skinner's Verbal Behavior two years after the book appeared.[79] The 1959 review became better known than the book itself.[80] Chomsky's review has been credited with launching the cognitive movement in psychology and other disciplines. Skinner, who rarely responded directly to critics, never formally replied to Chomsky's critique. Many years later, Kenneth MacCorquodale's reply[81] was endorsed by Skinner.

Chomsky also reviewed Skinner's Beyond Freedom and Dignity, using the same basic arguments as his Verbal Behavior review. Among Chomsky's criticisms were that Skinner's laboratory work could not be extended to humans, that when it was extended to humans it represented 'scientistic' behavior attempting to emulate science but which was not scientific, that Skinner was not a scientist because he rejected the hypothetico-deductive model of theory testing, and that Skinner had no science of behavior.[82]

Psychodynamic psychology

Skinner has been repeatedly criticized for his supposed animosity towards Freud, psychoanalysis, and psychodynamic psychology. There is clear evidence, however, that Skinner shared several of Freud's assumptions, and that he was influenced by Freudian points of view in more than one field, among them the analysis of defense mechanisms, such as repression.[83] To study such phenomena, Skinner even designed his own projective test, the "verbal summator" described above.[84]

List of awards and positions

Honorary degrees

Skinner received honorary degrees from:

In popular culture

Jon Vitti, a writer for The Simpsons, named the character Principal Skinner after the behavioral psychologist B. F. Skinner.[85]

Bibliography

See also

References

  1. Smith, L. D.; Woodward, W. R. (1996). B. F. Skinner and behaviorism in American culture. Bethlehem, PA: Lehigh University Press. ISBN 0-934223-40-8.
  2. 2.0 2.1 2.2 2.3 Skinner, B. F. (1948). Walden Two. A utopian novel in which the science of human behavior is used to address problems such as poverty and war.
  3. 3.0 3.1 Skinner, B. F. (1972). Beyond freedom and dignity. New York: Vintage Books. ISBN 0-553-14372-7. OCLC 34263003.
  4. https://behavioranalysishistory.pbworks.com/w/page/2039033/Skinner%2C%20Burrhus%20Frederic
  5. Muskingum.edu
  6. Schacter, Daniel L.; Gilbert, Daniel T. (2011). Psychology (2nd ed.). New York: Worth Publishers. Web. 22 March 2013.
  7. Schacter, Daniel (2011). Psychology (2nd ed.). New York: Worth Publishers. p. 17. ISBN 978-1-4292-3719-2.
  8. Schacter, D. L., Gilbert, D. T., & Wegner, D. M. (2011).
  9. B. F. Skinner, (1938) The Behavior of Organisms.
  10. C. B. Ferster & B. F. Skinner, (1957) Schedules of Reinforcement.
  11. B. F. Skinner, About Behaviorism
  12. Skinner, B.F. (1948). Walden Two. Indianapolis: Hackett. ISBN 0-87220-779-X.
  13. Skinner, B. F. (1958) Verbal Behavior. Acton, MA: Copley Publishing Group. ISBN 1-58390-021-7
  14. Lafayette.edu, accessed on 5-20-07.
  15. BFSkinner.org, Smith Morris Bibliography
  16. Haggbloom, Steven J.; Warnick, Jason E.; Jones, Vinessa K.; Yarbrough, Gary L.; Russell, Tenea M.; Borecky, Chris M.; McGahhey, Reagan et al. (2002). "The 100 most eminent psychologists of the 20th century". Review of General Psychology 6 (2): 139–152. doi:10.1037/1089-2680.6.2.139.
  17. "Within a year I had gone to Miss Graves to tell her that I no longer believed in God. 'I know,' she said, 'I have been through that myself.' But her strategy misfired: I never went through it." B.F. Skinner, pp. 387-413, E.G. Boring and G. Lindzey's A History of Psychology in Autobiography (Vol. 5), New York: Appleton Century-Crofts, 1967.
  18. 18.0 18.1 18.2 Bjork, Daniel W. B. F. Skinner: A Life. ISBN 9781557984166.
  19. B. F. Skinner: a Life.
  20. "Humanist Manifesto II". American Humanist Association. Retrieved October 9, 2012.
  21. Skinner, Deborah. "About". Horses by Skinner. Retrieved 4 September 2014.
  22. Buzan, Deborah Skinner (12 March 2004). "I was not a lab rat". The Guardian. Retrieved 4 September 2014.
  23. http://www.humanistsofutah.org/humanists/bfskinner.html
  24. Bjork, D.W. (1993). B.F. Skinner, A Life. New York: Basic Books.
  25. "Skinner, Burrhus Frederic (1904 - 1990).". Credo Reference, Topic Pages. Credo Reference, Gale. Retrieved 1 October 2013.
  26. About Behaviorism Ch. 1 Causes of Behaviour § 3 Radical Behaviorism B. F. Skinner 1974 ISBN 0-394-71618-3
  27. ibid. pp. 18−20 of the paperback edition which had the redacted typo s/it/is/.
  28. Skinner, B.F. (1938). Behavior of Organisms. New York: Appleton-Century-Crofts.
  29. Pavlov, I. P. (1927). Conditioned Reflexes. Oxford: Oxford Univ. Press.
  30. Thorndike, E. L. (1911). Animal Intelligence: Experimental Studies. New York: Macmillan.
  31. 31.0 31.1 31.2 Jenkins, H. M. "Animal Learning and Behavior", Ch. 5 in Hearst, E. (ed.), The First Century of Experimental Psychology (1979). Hillsdale, NJ: Erlbaum.
  32. 32.0 32.1 Skinner, B. F.(1966) Contingencies of Reinforcement, New York; Appleton-Century-Crofts. reprinted 2013, B. F. Skinner Foundation.
  33. Skinner, B. F. Science and Human Behavior (1953) New York: Macmillan
  34. Skinner, B.F (31 July 1981). "Selection by Consequences" (PDF). Science 213 (4507): 501–504. Bibcode:1981Sci...213..501S. doi:10.1126/science.7244649. PMID 7244649. Archived (PDF) from the original on 2 July 2010. Retrieved 14 August 2010.
  35. 35.0 35.1 Ferster, C. B. and Skinner, B. F. Schedules of Reinforcement. New York: Appleton-Century-Crofts, 1957
  36. "Different Types of Reinforcement Scedules" (PDF). http://autismpdc.fpg.unc.edu/sites/autismpdc.fpg.unc.edu/files/Reinforcement-Table1.pdf''. National Professional Development Center for Autism Spectrum Disorders. Retrieved 14 February 2015.
  37. 37.0 37.1 Psychology 2nd Edition
  38. Daniel L. Schacter, Daniel T. Gilbert, Daniel M. Wegner.(2011).Schedules of Reinforcement. Psychology second edition.
  39. Air-crib photograph in "What Man Can Make of Man", by James Bennet. The Atlantic, June 2012. http://www.theatlantic.com/magazine/archive/2012/06/what-man-can-make-of-man/308973/
  40. 40.0 40.1 Snopes.com "One Man and a Baby Box", accessed on 12-29-07.
  41. "Burrhus Fredrick Skinner". Skinner, Burrhus Frederic (1904 - 1990). Gale, Credo Reference. Retrieved 1 October 2013.
  42. Slater, L. (2004) Opening Skinner's Box: Great Psychological Experiments of the Twentieth Century, London, Bloomsbury
  43. Buzan, Deborah Skinner (12 March 2004). "I was not a lab rat". The Guardian. Retrieved 29 May 2012.
  44. Skinner, B. F. (1961). "Why we need teaching machines". Harvard Educational Review 31: 377–398.
  45. "Programmed Instruction and Task Analysis". College of Education, University of Houston.
  46. Skinner, B. F. (1961). "Teaching machines." Scientific American, 205, 90–112. doi:10.2307/1926170. p. 381.
  47. Skinner, B. F. and Holland, J. "The Analysis of Behavior: A Program for Self Instruction", 1961, p.387
  48. Philip McRae, Ph.D.
  49. Skinner, B. F. (1960). Pigeons in a pelican. American Psychologist, 15, 28−37. Reprinted in: Skinner, B. F. (1972). Cumulative record (3rd ed.). New York: Appleton-Century-Crofts,pp. 574−591.
  50. Described throughout Skinner, B. F. (1979). The shaping of a behaviorist: Part two of an autobiography. New York: Knopf.
  51. "Nose Cone, Pigeon-Guided Missile". National Museum of American History, Smithsonian Institution. Archived from the original on 16 May 2008. Retrieved 2008-06-10.
  52. "Skinner's Utopia: Panacea, or Path to Hell?". TIME. September 20, 1971.
  53. Richard Dawkins. "Design for a Faith-Based Missile". Free Inquiry magazine 22 (1). The project was also featured by "Top secret weapons revealed". Military Channel. 2012-08-14.
  54. 54.0 54.1 Skinner, B. F. (1936). "The Verbal Summator and a Method for the Study of Latent Speech". Journal of Psychology 2 (1): 71–107. doi:10.1080/00223980.1936.9917445.
  55. Rutherford, A., B. F. Skinner and the auditory inkblot: The rise and fall of the verbal summator as a projective technique, History of Psychology, 2003,4,362-378.
  56. B. F. Skinner, (1957) Verbal Behavior. The account in the appendix is that he asked Skinner to explain why he said "No black scorpion is falling upon this table."
  57. "Skinner, Burrhus Frederick(1904 - 1990).". Credo Reference, Gale. Credo Reference, Gale. Retrieved 1 October 2013.
  58. A. N. Chomsky, (1957) "A Review of BF Skinner's Verbal Behavior." in the preface, 2nd paragraph
  59. Richelle, M. (1993). B. F. Skinner: A reappraisal. Hillsdale: Lawrence Erlbaum Associates
  60. Michael, J. (1984). "Verbal behavior". Journal of the Experimental Analysis of Behavior 42 (3): 363–376. doi:10.1901/jeab.1984.42-363. PMC 1348108. PMID 16812395.
  61. The Analysis of Verbal Behavior (Journal)
  62. Holland, J. (1992). B.F Skinner. Pittsburgh: American Psychologist
  63. "B.F. Skinner Sep. 20, 1971." http://www.time.com/time/covers/0,16641,19710920,00.html. Web.
  64. B. F. Skinner, (1968). "The Design of Experimental Communities", International Encyclopedia of the Social Sciences (Volume 16). New York: Macmillan, 1968, pages 271-275.
  65. Ramsey, Richard David, Morning Star: The Values-Communication of Skinner's Walden Two, Ph.D. dissertation, Rensselaer Polytechnic Institute, Troy, NY, December 1979, available from University Microfilms, Ann Arbor, MI. Attempts to analyze Walden Two, Beyond Freedom and Dignity, and other Skinner works in the context of Skinner's life; lists over 500 sources.
  66. see Beyond Freedom and Dignity, 1974 for example
  67. A matter of Consequences, p. 412.
  68. Asimov, Nanette (1996-01-30). "Spanking Debate Hits Assembly". SFGate (San Francisco Chronicle). Retrieved 2008-03-02.
  69. ECON 252, Lecture 8 by Professor Robert Shiller at Yale University
  70. 70.0 70.1 Skinner, B. F. "'Superstition' in the Pigeon," Journal of Experimental Psychology 38, 1948.
  71. Classics in the History of Psychology — Skinner (1948)
  72. Timberlake & Lucas, (1985) "JEAB"
  73. "Derren Brown: Trick or Treat - 4oD". Channel 4. Retrieved 2012-11-07.
  74. 74.0 74.1 Journal of Humanistic Psychology Spring 1991 vol. 31 no. 2 112-113
  75. Beyond Freedom and Dignity, New York: Knopf, 1971 p. 155
  76. "I have been misunderstood" An interview with B.F.Skinner 1972 March/April Center Magazine, pp. 63−65
  77. New methods and new aims in teaching, B.F. Skinner, New Scientist, May 1964, No. 392, pp. 484
  78. Staddon, J. (1995) On responsibility and punishment. The Atlantic Monthly, Feb., 88−94. Staddon, J. (1999) On responsibility in science and law. Social Philosophy and Policy, 16, 146-174. Reprinted in Responsibility. E. F. Paul, F. D. Miller, & J. Paul (eds.), 1999. Cambridge University Press, pp. 146−174.
  79. Chomsky, Noam (1959). "Reviews: Verbal behavior by B. F. Skinner". Language 35 (1): 26–58. JSTOR 411334.
  80. B. F. Skinner, (1970) "On 'Having' A Poem" talks about the poem, its publication, and contains the poem and a reply to it as well. Real Audio mp3 Ogg
  81. On Chomsky's Review of Skinner's Verbal Behavior
  82. A. N. Chomsky, (1972) "The Case Against B. F. Skinner."
  83. Toates, F. (2009). Burrhus F. Skinner: The shaping of behavior. Houndmills, Basingstoke, England: Palgrave Macmillan.
  84. Rutherford, A. (2003). "B. F. Skinner and the auditory inkblot: The rise and fall of the verbal summator as a projective technique". History of Psychology 6 (4): 362–378. doi:10.1037/1093-4510.6.4.362.
  85. Reiss, Mike. (2002). Commentary for "Principal Charming", in The Simpsons: The Complete Second Season [DVD]. 20th Century Fox.

Further reading

External links

Wikiquote has quotations related to: B. F. Skinner
Wikimedia Commons has media related to B. F. Skinner.