The Psychology Book

by DK

  What Lashley found was that no matter which part of the brain he removed, the rats’ memory of the task remained. Their learning and retention of new tasks were impaired, but the amount of impairment depended on the extent, not the location, of the damage. He came to the conclusion that the memory trace is not localized in a particular place, but distributed evenly throughout the cerebral cortex; each part of the brain is therefore equally important, or equipotential. Decades later, he said that his experiment had led him to “sometimes feel…that the necessary conclusion is that learning just is not possible.”

  See also: John B. Watson • Donald Hebb • George Armitage Miller • Daniel Schacter • Roger Brown

  IN CONTEXT

  APPROACH

  Ethology

  BEFORE

  1859 English biologist Charles Darwin publishes On the Origin of Species, describing the theory of natural selection.

  1898 Lorenz’s mentor, German biologist Oskar Heinroth, begins his study of duck and goose behavior, and describes the phenomenon of imprinting.

  AFTER

  1959 Experiments by the German psychologist Eckhard Hess show that in imprinting, what has been learned first is remembered best, whereas in association learning, recent learning is remembered best.

  1969 John Bowlby argues that the attachment of newborn babies to their mothers is a genetic predisposition.

  The Austrian zoologist and doctor Konrad Lorenz was one of the founding fathers of ethology—the comparative study of animal behavior in the natural environment. He began his work observing geese and ducks at his family’s summer house in Altenberg, Austria. He noticed that the young birds rapidly made a bond with their mother after hatching, but could also form the same attachment to a foster parent if the mother was absent. This phenomenon, which Lorenz called “imprinting,” had been observed before, but he was the first to study it systematically. Famously, he even persuaded young geese and ducks to accept him as a foster parent by letting them imprint on his Wellington boots.

  What distinguishes imprinting from learning, Lorenz discovered, is that it happens only at a specific stage in an animal’s development, which he called the “critical period.” Unlike learning, it is rapid, operates independently of behavior, and appears to be irreversible; imprinting cannot be forgotten.

  Lorenz went on to observe many other stage-linked, instinctive behaviors, such as courtship behavior, and described them as “fixed-action patterns.” These remain dormant until triggered by a specific stimulus at a particular critical period. Fixed-action patterns, he emphasized, are not learned but genetically programmed, and as such have evolved through the process of natural selection.

  Lorenz discovered that geese and other birds follow and become attached to the first moving object they encounter after emerging from their eggs—in this case, his boots.

  See also: Francis Galton • Ivan Pavlov • Edward Thorndike • Karl Lashley • John Bowlby

  IN CONTEXT

  APPROACH

  Radical behaviorism

  BEFORE

  1890 William James outlines the theories of behaviorism in The Principles of Psychology.

  1890s Ivan Pavlov develops the concept of conditioned stimulus and response.

  1924 John B. Watson lays the foundations for the modern behaviorist movement.

  1930s Zing-Yang Kuo claims that behavior is continually being modified throughout life, and that even so-called innate behavior is influenced by “experiences” as an embryo.

  AFTER

  1950s Joseph Wolpe pioneers systematic desensitization as part of behavior therapy.

  1960s Albert Bandura’s social learning theory is influenced by radical behaviorism.

  Burrhus Frederic Skinner, better known as B.F. Skinner, is possibly the most widely known and influential behaviorist psychologist. He was not, however, a pioneer in the field, but developed the ideas of his predecessors, such as Ivan Pavlov and John B. Watson, by subjecting theories of behaviorism to rigorous experimental scrutiny in order to arrive at his controversial stance of “radical behaviorism.”

  Skinner proved to be an ideal advocate of behaviorism. Not only were his arguments based on the results of scrupulous scientific methodology (so they could be proved), but his experiments tended to involve the use of novel contraptions that the general public found fascinating. Skinner was an inveterate “gadget man” and a provocative self-publicist. But behind the showman image was a serious scientist, whose work helped to finally sever psychology from its introspective philosophical roots and establish it as a scientific discipline in its own right.

  Skinner had once contemplated a career as an author, but he had little time for the philosophical theorizing of many of the early psychologists. Works by Pavlov and Watson were his main influences; he saw psychology as following in the scientific tradition, and anything that could not be seen, measured, and repeated in a rigorously controlled experiment was of no interest to him.

  Processes purely of the mind, therefore, were outside Skinner’s interest and scope. In fact, he reached the conclusion that they must be utterly subjective, and did not exist at all separately from the body. In Skinner’s opinion, the way to carry out psychological research was through observable behavior, rather than through unobservable thoughts.

  Although a strict behaviorist from the outset of his career, Skinner differed from earlier behaviorists in his interpretation of conditioning—in particular, the principle of “classical conditioning” as described by Pavlov. While not disagreeing that a conditioned response could be elicited by repeated training, Skinner felt that this was something of a special case, involving the deliberate, artificial introduction of a conditioning stimulus.

  To Skinner, it seemed that the consequences of an action were more important in shaping behavior than any stimulus that had preceded or coincided with it. He concluded from his experiments that behavior is primarily learned from the results of actions. As with so many great insights, this may appear to be self-evident, but it marked a major turning point in behaviorist psychology.

  "The objection to inner states is not that they do not exist, but that they are not relevant in a functional analysis."

  B.F. Skinner

  Skinner boxes

  While working as a research fellow at Harvard, Skinner carried out a series of experiments on rats, using an invention that later became known as a “Skinner box.” A rat was placed in one of these boxes, which had a special bar fitted on the inside. Every time the rat pressed this bar, it was presented with a food pellet. The rate of bar-pressing was automatically recorded. Initially, the rat might press the bar accidentally, or simply out of curiosity, and as a consequence receive some food. Over time, the rat learned that food appeared whenever the bar was pressed, and began to press it purposefully in order to be fed. When Skinner compared results from rats given the “positive reinforcement” of food for their bar-pressing behavior with results from rats that were not, or that were presented with food at different rates, it became clear that when food appeared as a consequence of the rat’s actions, this influenced its future behavior.

  Skinner concluded that animals are conditioned by the responses they receive from their actions and environment. As the rats explored the world around them, some of their actions had a positive consequence (Skinner was careful to avoid the word “reward” with its connotations of being given for “good” behavior), which in turn encouraged them to repeat that behavior. In Skinner’s terms, an “organism” operates on its environment, and encounters a stimulus (a food pellet), which reinforces its operant behavior (pressing on the bar). To distinguish this from classical conditioning, he coined the term “operant conditioning.” The major distinction is that operant conditioning depends not on a preceding stimulus, but on what follows as a consequence of a particular type of behavior. It is also different in that it represents a two-way process, in which an action, or behavior, operates on the environment just as much as the environment shapes that behavior.

  In the course of his experiments, Skinner began to run short of food pellets, forcing him to change the rate at which they were given to the rats. Some rats now received a food pellet only after they had pressed the bar several times, either at fixed intervals or at random. The results of this variation reinforced Skinner’s original findings, but they also led to a further discovery: while a reinforcing stimulus increased the probability of a behavior occurring, stopping that stimulus decreased the likelihood of the behavior.

  Skinner continued making his experiments ever more varied and sophisticated, including changes of schedule to establish whether the rats could distinguish and respond to differences in the rate of delivery of food pellets. As he suspected, the rats adapted very quickly to the new schedules.

  "The ideal of behaviorism is to eliminate coercion, to apply controls by changing the environment."

  B.F. Skinner

  Skinner boxes were one of many ingenious devices that the psychologist created, giving him total control over the environment of the animals whose behavior he was observing.

  Negative reinforcement

  In later experiments, the floors of the Skinner boxes were each fitted with an electric grid, which would give the rats an unpleasant shock whenever it was activated. This allowed for the investigation of the effect of negative reinforcement on behavior. Again, just as Skinner avoided the word “reward,” he was careful not to describe the electric shock as “punishment,” a distinction that became increasingly important as he examined the implications of his research.

  Negative reinforcement was not a new concept in psychology. As early as 1890, William James had written in The Principles of Psychology: “Animals, for example, awaken in a child the opposite impulses of fearing and fondling. But if a child, in his first attempts to pat a dog, gets snapped at or bitten, so that the impulse of fear is strongly aroused, it may be that for years to come no dog will excite in him the impulse to fondle again.” Skinner was to provide the experimental evidence for this idea.

  Winning at gambling often boosts the compulsion to try again, while losing lessens it, just as changes in the rate at which Skinner’s rats were fed made them modify their behavior.

  Positive reinforcement

  As expected, Skinner found that whenever a behavior resulted in the negative consequence of an electric shock, there was a decrease in that behavior. He went on to redesign the Skinner boxes used in the experiment, so that the rats inside were able to switch off the electrified grid by pressing a bar, which provided a form of positive reinforcement arising from the removal of a negative stimulus. The results that followed confirmed Skinner’s theory—if a behavior leads to the removal of a negative stimulus, that behavior increases.

  However, the results also revealed an interesting distinction between behavior learned by positive reinforcement and behavior elicited by negative stimuli. The rats responded better and more quickly to positive stimuli (as well as to the removal of negative stimuli) than when their behavior resulted in a negative response. While still careful to avoid the notions of “reward” and “punishment,” Skinner concluded that behavior was shaped much more efficiently by a program of positive reinforcement. In fact, he came to believe that negative reinforcement could even be counter-productive, with the subject continuing to seek positive responses for a specific behavior, despite this leading to a negative response in the majority of cases.

  This has implications in various areas of human behavior too; for example, in the use of disciplinary measures to teach children. If a boy is continually punished for something he finds enjoyable, such as picking his nose, he is likely to avoid doing so only when adults are around. The child may modify his behavior, but only so far as it enables him to avoid punishment. Skinner himself believed that ultimately all forms of punishment were unsuitable for controlling children’s behavior.

  Positive reinforcement can stimulate particular patterns of behavior, as Skinner demonstrated by placing a rat in one of his specially designed boxes, fitted with a lever or bar. Pellets of food appeared every time the animal pressed the bar, encouraging it to perform this action again and again.

  Genetic predisposition

  The “shaping” of behavior by operant conditioning has striking parallels with Charles Darwin’s theory of natural selection—in essence, that only organisms suited by their genetic make-up to a particular environment will survive to reproduce, ensuring the “success” of their species. The likelihood of a rat behaving in a way that will result in a reinforcing stimulus, triggering the process of operant conditioning, is dependent on the level of its curiosity and intelligence, both of which are determined by genetic make-up. It was this combination of predisposition and conditioning that led Skinner to conclude that “a person’s behavior is controlled by his genetic and environmental histories”—an idea that he explored further in his article Selection by Consequences, written for the journal Science in 1981.

  In 1936, Skinner took up a post at the University of Minnesota, where he continued to refine his experimental research in operant conditioning and to explore practical applications for his ideas, this time using pigeons instead of rats. With the pigeons, Skinner found that he was able to devise more subtle experiments. Using what he described as a “method of successive approximations,” he could elicit and investigate more complex patterns of behavior.

  Skinner gave the pigeons positive reinforcement for any behavior similar to the one he was trying to elicit. For example, if he was trying to train a pigeon to fly in a circle clockwise, food would be given for any movement the pigeon made to the right, however small. Once this behavior had been established, food was given only for longer flights to the right, and the process was repeated until the pigeon had to fly a full circle in order to receive some food.

  Skinner’s pigeon experiments showed that the positive reinforcement of being fed on completing a task helped to speed up and strengthen the learning of new behavior patterns.

  Teaching program

  Skinner’s research led him to question teaching methods used in schools. In the 1950s, when his own children were involved in formal education, students were often given long tasks that involved several stages, and usually had to wait until the teacher had graded work carried out over the entire project before finding out how well they had done. This approach ran contrary to Skinner’s findings about the process of learning and, in his opinion, was holding back progress. In response, Skinner developed a teaching program that gave incremental feedback at every stage of a project—a process that was later incorporated into a number of educational systems. He also invented a “teaching machine” that gave a student encouraging feedback for correct answers given at every stage of a long series of test questions, rather than just at the end. Although it only achieved limited approval at the time, the principles embodied in Skinner’s teaching machine resurfaced decades later in self-education computer programs.

  It has to be said that many of Skinner’s inventions were misunderstood at the time, and gained him a reputation as an eccentric. His “baby tender,” for example, was designed as a crib alternative to keep his infant daughter in a controlled, warm, and draft-free environment. However, the public confused it with a Skinner box, and it was dubbed the “heir conditioner” by the press, amid rumors that Skinner was experimenting on his own children. Nevertheless, the baby tender attracted publicity, and Skinner was never shy of the limelight.

  Praise or encouragement given at frequent intervals during the progress of a piece of work, rather than one large reward at the end, has been shown to boost the rate at which children learn.

  War effort

  Yet another famous experiment called “Project Pigeon” was met with skepticism and some derision. This practical application of Skinner’s work with pigeons was intended as a serious contribution to the war effort in 1944. Missile guidance systems were yet to be invented, so Skinner devised a nose cone that could be attached to a bomb and steered by three pigeons placed inside it. The birds had been trained, using operant conditioning, to peck at an image of the bomb’s target, which was projected into the nose cone via a lens at the front. This pecking controlled the flight-path of the missile. The National Defense Research Committee helped fund the project, but it was never used in combat, because it was considered too eccentric and impractical. The suspicion was that Skinner, with his passion for gadgets, was more interested in the invention than in its application. When asked if he thought it right to involve animals in warfare, he replied that he thought it was wrong to involve humans.

 
