An Introduction to Operant (Instrumental) Conditioning

Citation: Huitt, W., & Hummel, J. (1997). An introduction to operant (instrumental) conditioning. Educational Psychology Interactive. Valdosta, GA: Valdosta State University. Retrieved [date] from http://www.edpsycinteractive.org/topics/behavior/operant.html
A human being fashions his consequences as surely as he fashions his goods or his dwelling. Nothing that he says, thinks or does is without consequences.

The major theorists in the development of operant conditioning are Edward Thorndike, John Watson, and B. F. Skinner. This approach to behaviorism played a major role in the development of the science of psychology, especially in the United States. These theorists proposed that learning results from the application of consequences; that is, learners come to connect certain responses with certain stimuli. This connection changes the probability of the response (i.e., learning occurs). Thorndike labeled this type of learning instrumental: using consequences, he taught kittens to manipulate a latch (i.e., the latch served as an instrument). Skinner renamed instrumental learning operant because the term is more descriptive (i.e., in this kind of learning, one "operates" on, and is influenced by, the environment). Where classical conditioning illustrates S-->R learning, operant conditioning is often viewed as R-->S learning, since it is the consequence that follows the response that influences whether the response is likely or unlikely to occur again. It is through operant conditioning that voluntary responses are learned.

The three-term model of operant conditioning (S-->R-->S) incorporates the idea that a response cannot occur without an environmental event (e.g., an antecedent stimulus) preceding it. While the antecedent stimulus in operant conditioning does not elicit or cause the response (as it does in classical conditioning), it can influence it. When the antecedent influences the likelihood of a response occurring, it is technically called a discriminative stimulus. It is the stimulus that follows a voluntary response (i.e., the response's consequence) that changes the probability of the response occurring again.
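The R-->S dynamic described above can be made concrete with a toy simulation. Everything below (the class name, the fixed +/-0.1 probability-update rule) is a hypothetical illustration for intuition only, not an established quantitative model of operant learning.

```python
import random

class Operant:
    """Toy model of the 3-term contingency: antecedent -> response -> consequence."""

    def __init__(self):
        # Probability of emitting the response in the presence of the
        # discriminative stimulus (the antecedent). Starts at chance.
        self.p_response = 0.5

    def trial(self, consequence_is_reinforcing: bool) -> bool:
        """Run one trial; the consequence that follows a response shifts p_response."""
        responded = random.random() < self.p_response
        if responded:
            # It is the stimulus FOLLOWING the voluntary response that
            # changes the probability of the response recurring.
            if consequence_is_reinforcing:
                self.p_response = min(1.0, self.p_response + 0.1)
            else:
                self.p_response = max(0.0, self.p_response - 0.1)
        return responded

learner = Operant()
for _ in range(30):
    learner.trial(consequence_is_reinforcing=True)
print(round(learner.p_response, 2))  # rises toward 1.0 across reinforced trials
```

Because only the consequence (not the antecedent) drives the update here, the sketch also mirrors the point that the discriminative stimulus sets the occasion for the response without causing it.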
There are two types of consequences: positive (sometimes called pleasant) and negative (sometimes called aversive). Either type can be added to or taken away from the environment in order to change the probability of a given response occurring again.

General Principles
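Crossing the two consequence types (positive/pleasant vs. negative/aversive) with the two operations (added vs. removed) yields the four standard contingencies named later in this article. The lookup function below is only an illustrative sketch; the function and argument names are hypothetical.

```python
def classify_consequence(stimulus: str, operation: str) -> str:
    """Map a consequence to its operant-conditioning label.

    stimulus:  "positive" (pleasant) or "negative" (aversive)
    operation: "added" or "removed"
    """
    table = {
        # Adding a pleasant stimulus, or removing an aversive one,
        # makes the response MORE likely (reinforcement).
        ("positive", "added"):   "positive reinforcement (response more likely)",
        ("negative", "removed"): "negative reinforcement (response more likely)",
        # Removing a pleasant stimulus, or adding an aversive one,
        # makes the response LESS likely.
        ("positive", "removed"): "response cost (response less likely)",
        ("negative", "added"):   "punishment (response less likely)",
    }
    return table[(stimulus, operation)]

print(classify_consequence("negative", "added"))  # punishment (response less likely)
```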
Schedules of Consequences
In summary, the schedules of consequences are often called schedules of reinforcement because only one schedule is appropriate for administering response cost and punishment: a continuous schedule, or a fixed ratio of one. In fact, certainty of application is the most important aspect of using response cost and punishment. Learners must know, without a doubt, that an undesired or inappropriate target behavior will be followed by the removal of a positive/pleasant stimulus or the addition of a negative/aversive stimulus. Using an intermittent schedule when attempting to reduce a behavior may actually strengthen the behavior, certainly an unwanted result.

Premack Principle
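To make the schedule distinction concrete, here is a minimal sketch (class and method names are hypothetical) of a fixed-ratio schedule. A ratio of one is a continuous schedule, the only form the paragraph above recommends for response cost and punishment; any larger ratio is intermittent.

```python
class FixedRatioSchedule:
    """Deliver a consequence after every `ratio`-th response.

    ratio == 1 is a continuous schedule: the consequence follows
    every single response, so its application is certain.
    """

    def __init__(self, ratio: int):
        self.ratio = ratio
        self.count = 0

    def record_response(self) -> bool:
        """Return True if this response should be followed by the consequence."""
        self.count += 1
        if self.count >= self.ratio:
            self.count = 0
            return True
        return False

continuous = FixedRatioSchedule(ratio=1)    # fixed ratio of one
intermittent = FixedRatioSchedule(ratio=3)  # intermittent: most responses escape

print([continuous.record_response() for _ in range(4)])    # [True, True, True, True]
print([intermittent.record_response() for _ in range(4)])  # [False, False, True, False]
```

With the intermittent schedule, most target responses go unconsequated, which is why, per the text, such a schedule can inadvertently strengthen the very behavior one is trying to reduce.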
Analyzing Examples of Operant Conditioning
Rules for analyzing examples. The following questions can help determine whether operant conditioning has occurred.
Examples. The following examples are provided to assist you in practicing this type of analysis.
All materials on this website [http://www.edpsycinteractive.org] are, unless otherwise stated, the property of William G. Huitt. Copyright and other intellectual property laws protect these materials. Reproduction or retransmission of the materials, in whole or in part, in any manner, without the prior written consent of the copyright holder, is a violation of copyright law.