Introduction to Operant Conditioning

By Rita Chaney

There are several types of learning discussed in the psychology literature. Operant conditioning, also known as instrumental conditioning, involves behavior that the organism emits on its own rather than behavior cued by external events. Simply put, raising your hand to hail a taxicab or depositing money in a soda machine to get a soda are examples of operant behaviors. Because performing those behaviors most likely produces the results we seek, operant conditioning occurs.

Operant conditioning was initially discovered and studied by Edward Thorndike, an American teacher and psychologist. Studying how cats learn, he placed them in puzzle boxes and observed how they worked to escape. Thorndike noted two primary elements of operant conditioning: (1) the experimenter selects a specific response (the operant response) and concentrates on observing and altering it, and (2) a consequence follows the response.

Thorndike distinguished between two types of consequences of behavior: reinforcers and punishers. Reinforcers follow a behavior and make it more likely to recur; they can be either positive or negative. A positive reinforcer adds something desirable to the situation: when the cab stops for us after we raise an arm to signal it, we will no doubt repeat the act of raising an arm to hail a cab in the future. A negative reinforcer also strengthens behavior, but it does so by removing something undesired from the situation. A rat in a cage will learn to open a door to escape an electric shock; the removal of the shock is the negative reinforcer.

Punishers, by contrast, add something undesired after a behavior, thus diminishing it. When we exceed the speed limit and get a $200 ticket from the police, we adjust to driving at a slower speed. The ticket is the punishment for the behavior of speeding, and our speeding will most likely decrease or cease completely as a result.

Another factor worth mentioning in operant conditioning is extinction. When a behavior is followed by no consequence, it tends to fade away, or "extinguish." When little Sally whines and cries to get a piece of candy and Mother ignores her, the whining and crying will probably cease because Sally received no consequence for those behaviors. The behavior extinguishes.

B.F. Skinner, another American psychologist who studied operant conditioning, devised a cage in which to examine learning in rats and pigeons. This cage or "box" is known as the "Skinner box" and is still used today in laboratory research on how mammals acquire responses.

To summarize, when behavior operates on the environment and produces the desired results, operant conditioning has taken place. Consequences, whether reinforcers (positive or negative) or punishers, follow the behavior and either strengthen or weaken it. If no consequence follows the behavior, it will probably extinguish. Operant conditioning, first formulated by Thorndike and later extended by Skinner, is one type of learning studied in psychology.
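The summary above can be sketched as a toy simulation: reinforcement (positive or negative) raises the probability that a response is emitted, punishment lowers it, and the absence of any consequence lets it decay (extinction). The update rule, rates, and function name here are illustrative assumptions, not a model from the psychology literature.

```python
# Toy sketch of operant conditioning dynamics.
# The update rule and parameters are illustrative assumptions only.

def update_probability(p, consequence, rate=0.2):
    """Nudge the probability of emitting a response after one trial.

    consequence: "reinforce" (positive or negative reinforcer),
                 "punish", or None (no consequence, i.e. extinction).
    """
    if consequence == "reinforce":
        return min(1.0, p + rate * (1.0 - p))  # strengthen the behavior
    if consequence == "punish":
        return max(0.0, p - rate * p)          # weaken the behavior
    return max(0.0, p - 0.05 * p)              # extinction: slow decay

p = 0.5
for _ in range(10):                # ten reinforced trials
    p = update_probability(p, "reinforce")
print(f"after reinforcement: {p:.2f}")  # rises well above 0.5

for _ in range(10):                # ten trials with no consequence
    p = update_probability(p, None)
print(f"after extinction:    {p:.2f}")  # drifts back down
```

Note that both reinforcement branches move the probability upward; only punishment and extinction reduce it, matching the distinction drawn above between negative reinforcement and punishment.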

