Glossary of Behavioral Terms

Glossary of Terms for the Experimental Analysis of Behavior
Authors of the glossary: Brittan Barker, Joy Kreider, Jessie Peissig, Greta Sokoloff,
Maura Stansfield
http://www.psychology.uiowa.edu/Faculty/wasserman/Glossary/index%20set.html
Attention
Skinner (1953) defined attention behaviorally as "...a controlling relation -- the relation
between a response and a discriminative stimulus" (p. 123).
For some authors, like Skinner, attention is simply synonymous with stimulus control. For
others, attention refers to selective stimulus control. Here, only a portion of the available
discriminative stimuli in the environment (so-called nominal stimuli) actually control
operant behavior (so-called functional stimuli). Experimental analysis can disclose
precisely which of the available stimuli in the environment actually control operant
behavior.
In operant conditioning, attention can be experimentally examined by varying the
contingencies of reinforcement for making an operant response in the presence of
different discriminative stimuli. Three key attentional effects in operant conditioning are:
overshadowing, blocking, and relative validity. Each effect reveals that only a subset of the
available stimuli may effectively control operant behavior.
A further important insight is that, without making an appropriate observing response,
discriminative stimuli cannot exert control over operant behavior. A motorist who runs a
red light might be said to have been inattentive; perhaps a better statement might be that
he failed to look at the traffic light, thus precluding his coming under the control of its color.

Behavior Stream
All living things behave. The fact that we and other animals are always doing something
throughout our lives led Schoenfeld and Farmer (1970) to describe the ongoing flow of
responding as the behavior stream.
The experimental analysis of behavior divides that behavior stream into discrete parts for
identifying the fundamental laws of behavior.
One of the first psychologists to divide the behavior stream into discrete parts for
experimental analysis was B. F. Skinner. The three analytical terms that Skinner devised
are included in his notion of the three-term contingency.
Blocking
If a discriminative stimulus (A) is first presented alone and is followed by reinforcement,
then that stimulus may gain strong stimulus control over operant behavior. If that
discriminative stimulus is later combined with a novel stimulus (X) and the stimulus
compound (AX) is followed by the same reinforcement, then little or no control may be
exerted by the second stimulus (X) when tests with it alone are conducted.
An additional comparison is critical to show that stimulus control by X has been blocked by
prior training with A. Here, only AX training is given. Stimulus control by X alone in this
comparison condition greatly exceeds that in the first condition.
For example, the printed word "stop" on a black sign might easily be able to control a
motorist's braking response. But, after a long history of braking at red stop signs, a
motorist might very well speed past a black sign with "stop" printed on it. We would then
say that the color of the stop sign had blocked the lettering on it in controlling the
motorist's braking behavior.
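The two-condition comparison described above can be sketched in a short simulation. The sketch below uses the Rescorla-Wagner learning rule, one common formal account of cue-competition effects such as blocking; the rule, the stimulus labels, and the parameter values are illustrative assumptions, not part of this glossary's definitions.

```python
# Sketch of the blocking design using the Rescorla-Wagner learning rule
# (an assumed model, not a definition from this glossary). V[s] is the
# associative strength of stimulus s; beta is a learning rate; lam is the
# asymptote set by reinforcement (1.0) or its absence (0.0).

def train(trials, beta=0.2):
    V = {"A": 0.0, "X": 0.0}
    for stimuli, lam in trials:
        error = lam - sum(V[s] for s in stimuli)  # shared prediction error
        for s in stimuli:
            V[s] += beta * error
    return V

# Blocking condition: A alone is reinforced first, then the AX compound.
blocking = train([(("A",), 1.0)] * 50 + [(("A", "X"), 1.0)] * 50)

# Comparison condition: only AX training is given.
control = train([(("A", "X"), 1.0)] * 50)

# X acquires far less control in the blocking condition than in the control.
print(round(blocking["X"], 3), round(control["X"], 3))
```

Because A already predicts the reinforcer perfectly before X is introduced, the prediction error on compound trials is near zero and X gains almost no strength; in the control condition A and X share the available strength.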
Conceptual Behavior
Conceptual behavior is seen when an organism makes the same response to a group of
discriminably different objects. An organism is said to respond conceptually when it
responds similarly to stimuli within the same class and when it responds differently to
stimuli in different classes.
Thus, a child who responds "cat" to different examples of cats and who responds "dog" to
different examples of dogs is exhibiting conceptual behavior.
Conceptual behavior involves both generalization and discrimination: generalization within
classes and discrimination between classes.
Conditioned Reinforcer
A conditioned reinforcer is a previously neutral stimulus. If the neutral stimulus is paired
with a primary reinforcer, it acquires reinforcing properties like those of the primary
reinforcer.
Money is a conditioned reinforcer. The actual paper bills are not themselves reinforcing.
However, the paper bills can be used to acquire primary reinforcers such as food, water,
and shelter. Therefore, the paper bills become reinforcers as a result of pairing them with
the acquisition of food, water, and shelter.
Conditioned response
A conditioned response in Pavlovian conditioning is the response that the conditioned
stimulus elicits after it has been repeatedly paired with an unconditioned stimulus. The
conditioned response may be similar in form to the unconditioned response. For example,
the eye blink to the tone conditioned stimulus may involve the same bodily musculature as
the eye blink to the puff of air to the cornea.
Conditioned stimulus
A conditioned stimulus in Pavlovian conditioning is an initially neutral stimulus that is
paired with the unconditioned stimulus. For example, a tone sounded just prior to the puff
of air being delivered to the cornea of the eye. Without prior training, the tone does not
elicit an eye blink; however, after a number of tone-puff pairings, the tone alone comes to
elicit the blinking response.
Discrete Trial
A discrete trial represents an isolated opportunity for an organism to make a single
operant response to a discriminative stimulus. Successive trials are separated by intertrial
intervals during which no discriminative stimuli are presented and operant responses are
either precluded or are not reinforced.
Thorndike's puzzle box, the Skinner Box, and the T-maze are all apparatuses that can be
used for experimental designs involving discrete trials.
An example of a discrete trial procedure can be illustrated with the Skinner Box. When the
lever is inserted into the box, a hungry rat's presses can deliver food; otherwise, the lever
is unavailable and no presses can produce food.
Discriminated Operant
The discriminated operant is an operant response that is under the stimulus control of a
discriminative stimulus. Such control is established by reinforcing the response in the
presence of that discriminative stimulus.
For example, after appropriate training, your dog will lift his paw to the verbal command
"shake."
Discrimination
Different consequences may follow the same behavior in different situations. When we
respond differently in those different situations, we have formed a discrimination between
the situations. For instance, telling a ribald tale to friends at a party but refraining from
doing so at a church gathering is an example of discrimination. A past history of
positive reinforcement in the first case and a past history of positive punishment in the
second could clearly be responsible for this illustration of discriminative responding.
Failure to have discriminated between these different situations would represent a case of
inappropriate stimulus generalization.
Discrimination comes about when you choose the content of your joke depending on who is
the listener (e.g., friend versus priest). Based on the joke that you tell, the positive
reinforcement of the listener's laughter or the positive punishment of the listener's frown
can tell you whether or not you made the right choice in the joke told.
Discriminative Stimulus
A discriminative stimulus influences the occurrence of an operant response because of
the contingencies of schedules of reinforcement or paradigms of
reinforcement/punishment that are or have been associated with that response. Many
authors further suggest that discriminative stimuli provide information to the organism,
allowing it to respond appropriately in the presence of different stimuli. An observing
response is sometimes necessary for presentation of the discriminative stimulus/stimuli.
For example, different individuals can serve as discriminative stimuli in a joke-telling
situation. The jokes that you tell your priest are probably different from the jokes that you
tell your best friend because of your past history of telling jokes to both people.
Experimentally, we can observe the discrimination of stimuli by associating different
discriminative stimuli with different schedules of reinforcement or paradigms of
reinforcement/punishment.
For instance, in the laboratory, a pigeon could be required in the presence of a steady
chamber light to peck a key on a Fixed Interval schedule to produce food, whereas it could
be required in the presence of a blinking chamber light to pull a chain on a Variable Ratio
schedule to turn off a loud noise. The discriminative stimuli clarify the "rules of the game,"
making each prevailing three-term contingency unambiguous.
Eliciting Stimulus
An eliciting stimulus is a change in the environment that is highly correlated with the
occurrence of a later response.
An eliciting stimulus is an essential component of Pavlovian conditioning.
For example, if a piece of chocolate (unconditioned stimulus) is placed into your mouth,
then you will probably salivate copiously (unconditioned response).
Placing the piece of chocolate into the mouth is said to elicit salivation.
Emotional Stimulus
Some stimuli may produce an emotional reaction which may influence the occurrence of
behavior.
For example, a game of backgammon might be interrupted by news of the unexpected
death of a famous politician.
Extinction
A special and important schedule of reinforcement is extinction, in which the reinforcement
of a response is discontinued. Discontinuation of reinforcement leads to the progressive
decline in the occurrence of a previously reinforced response.
Free Operant
Once an operant response occurs, it may be "free" or available to occur again without
obstacle or delay. This would be the case, for example, of someone picking up a stone
from a rocky beach and skipping it across the water.
Other operants are only available for very limited periods of time and cannot be freely
repeated. This would be the case, for example, of someone wishing their friend a happy
birthday.
Functional Operant
Not all responses can be modified by their consequences, but a functional operant is a
response that can be. It is important to demonstrate that a response can be modified by its
consequences before it is considered an operant response.
For example, when learning to drive a car with a standard transmission, you must learn to
release the clutch and press the gas with precise timing or the car will stall. Only when you
release the clutch and press the gas at the proper time will the car's gears shift
appropriately.
Functional Stimulus
The functional stimulus refers to the specific attributes of the discriminative stimulus that
exert control over the organism's behavior.
In the sports car example given under nominal stimulus, your friend may have been
particularly interested in the car's color.
Generalization
Discrimination results when different situations occasion different responses based on the
contingencies of reinforcement. Inappropriate stimulus generalization occurs when those
different situations fail to produce discriminative operant responding. Generalization is not
always inappropriate, however; it occurs whenever you respond in the same way to two
stimuli that are not identical.
For example, a child may learn to say "dog" when it sees the drawing of a rottweiler in a
book. If the child later says "dog" when it sees a schnauzer on the street, it has
generalized between the two distinct stimuli (the rottweiler and the schnauzer).
Inhibition
People and animals learn to suppress responses to a stimulus if it is not followed by
reinforcement.
Normally, when you put money in a candy machine and press a button, you receive
candy. If on several occasions you put money in a candy machine and press the button
without receiving any candy, you will not continue to use the candy machine. However,
you continue to use the soda machine located beside the same candy machine.
Interval Schedule
Interval schedules require a minimum amount of time to pass between successive
reinforced responses (e.g., 5 minutes). Responses made before this time has
elapsed are not reinforced. Interval schedules may specify a fixed time period between
reinforcers (Fixed Interval schedule) or a variable time period between reinforcers
(Variable Interval schedule).
Fixed Interval schedules produce an accelerated rate of response as the time of
reinforcement approaches. Students' visits to the university library show a decided
increase in rate as the time of final examinations approaches.
Variable Interval schedules produce a steady rate of response. Presses of the "redial"
button on the telephone are sustained at a steady rate when you are trying to reach your
parents and get a "busy" signal on the other end of the line.
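As a rough sketch, an interval schedule can be written as a rule that decides whether a given response is reinforced. The class and parameter names below are illustrative, not standard terminology, and the Variable Interval draw is just one simple choice.

```python
# Sketch of interval schedules as reinforcement rules (illustrative names).
# A response is reinforced only if at least `required` seconds have elapsed
# since the last reinforcer; under a Variable Interval schedule the required
# interval is redrawn after each reinforcer.
import random

class IntervalSchedule:
    def __init__(self, mean_interval, variable=False):
        self.mean = mean_interval
        self.variable = variable
        self.last_reinforcer = 0.0
        self.required = self._next_interval()

    def _next_interval(self):
        # Fixed Interval: always the same; Variable Interval: varies around the mean.
        return random.uniform(0, 2 * self.mean) if self.variable else self.mean

    def respond(self, t):
        """Return True if the response at time t is reinforced."""
        if t - self.last_reinforcer >= self.required:
            self.last_reinforcer = t
            self.required = self._next_interval()
            return True
        return False

fi = IntervalSchedule(mean_interval=300)  # an FI 5-minute schedule
print(fi.respond(100))   # too early: not reinforced
print(fi.respond(300))   # 5 minutes have elapsed: reinforced
print(fi.respond(400))   # the timer has restarted: not reinforced
```

Note that the passage of time alone never delivers a reinforcer here; a response is still required once the interval has elapsed.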
Negative Patterning
The negative patterning procedure also involves presenting simultaneous compound
stimuli. However, the simultaneous compound stimuli are not paired with reinforcement
(AB-) while the stimuli presented individually are paired with reinforcement (A+, B+).
Negative patterning decreases responding to the compound stimuli and increases
responding to the stimuli presented individually.
Students frequently encounter examples of negative patterning. As a result of poor
performance on a midterm you are prompted to study intensely for the final to raise your
class grade. As a result of poor performance on quizzes in another class, you are again
prompted to study intensely for the final to raise your grade. However, if you perform
poorly on the midterm and on quizzes in the same class, you will not be prompted to study
at all for the final because a high grade on the final can't raise your overall grade in the
class. Therefore, you will be more likely to study for final exams when you have performed
poorly on a midterm in one class and quizzes in another.
Negative Punisher
A negative punisher is an appetitive event whose removal follows an operant response.
The negative punisher decreases the likelihood of that behavior occurring again under the
same circumstances.
Negative Punishment
In an attempt to decrease the likelihood of a behavior occurring in the future, an operant
response is followed by the removal of an appetitive stimulus. This is negative
punishment.
When a child "talks back" to his/her mother, the child may lose the privilege of watching
her favorite television program. Therefore, the loss of viewing privileges will act as a
negative punisher and decrease the likelihood of the child talking back in the future.
Negative Reinforcement
In an attempt to increase the likelihood of a behavior occurring in the future, an operant
response is followed by the removal of an aversive stimulus. This is negative
reinforcement.
When a child says "please" and "thank you" to his/her mother, the child may not have to
engage in his/her dreaded chore of setting the table. Therefore, not having to set the table
will act as a negative reinforcer and increase the likelihood of the child saying "please"
and "thank you" in the future.
Negative Reinforcer
A negative reinforcer is an aversive event whose removal follows an operant response.
The negative reinforcer increases the likelihood of that behavior occurring again under the
same circumstances.
Nominal Stimulus
A discriminative stimulus may have many identifiable attributes. Although we can readily
observe the organism's response to the whole stimulus, it may not be clear exactly which
attributes of the stimulus are controlling the behavior (see functional stimulus). The
unanalyzed stimulus as a whole is said to be the nominal stimulus.
For example, your friend asks you to look at a passing sports car. It is not clear just what
your friend wanted you to note about the car: its color, make, speed, location, driver, etc.
Observing response
An observing response is a response that leads to exposure to a discriminative stimulus or
discriminative stimuli. An observing response can also be viewed as a type of attention.
For example, before you can take money from your account at an ATM, you have to enter
your personal identification number (PIN). Entering the PIN is the observing response
required to view the transaction choices.
Operant Conditioning
Operant conditioning, also called instrumental conditioning, is a method for modifying
behavior (an operant) which utilizes contingencies between a discriminative stimulus, an
operant response, and a reinforcer to change the probability of a response occurring again
in that situation. This method is based on Skinner's three-term contingency and it differs
from the method of Pavlovian conditioning.
An everyday illustration of operant conditioning involves training your dog to "shake" on
command. Using the operant conditioning technique of shaping, you speak the command
to "shake" (the discriminative stimulus) and then wait until your dog moves one of his
forepaws a bit (operant response). Following this behavior, you give your dog a tasty treat
(positive reinforcer). After demanding ever closer approximations to shaking your hand,
your dog finally comes to perform the desired response to the verbal command "shake."
Skinner is famous for the invention of the Skinner box, an experimental apparatus which
he designed to modify animal behavior within an operant conditioning paradigm.
Operant Level
Operant behaviors occur at some base rate prior to reinforcement. This unconditioned
level of responding is called the operant level.
The operant level is one of a number of ways (like the yoked control) in which we can
determine whether or not changes in the occurrence of the operant response are due to
the prevailing contingencies of reinforcement.
Operant Response
An operant response is a behavior that is modifiable by its consequences. When behavior
is modified by its consequences, the probability of that behavior occurring again may
either increase (in the case of reinforcement) or decrease (in the case of punishment).
For example, speeding through a red light may lead to getting struck broadside by another
vehicle. If this consequence follows such a response, then the likelihood of a person's
responding in the same way under similar conditions should drop significantly.
It is also possible for temporal or topographical properties of behavior to be modified by
reinforcement (in the case of response differentiation).
Overshadowing
A discriminative stimulus when it is presented alone may exert strong stimulus control
over operant behavior. However, if that discriminative stimulus is accompanied by
another, then stimulus control by the first (or overshadowed) stimulus may be reduced or
eliminated by the second (or overshadowing) stimulus.
For example, I might easily recognize my son by the cowlick in his hair. But, I would be
more likely to recognize him from his distinctive gait when he is walking in the playground.
We would say that control by his cowlick was overshadowed by control by his gait.
Basic Paradigms of Reinforcement/Punishment

                        Operant Response         Operant Response
                        Increases                Decreases

Stimulus Presentation   Positive Reinforcement   Positive Punishment

Stimulus Removal        Negative Reinforcement   Negative Punishment
Pavlovian Conditioning
Pavlovian conditioning is an important form of learning that involves the pairing of stimuli
independent of an organism's behavior. The key stimulus and response elements of
Pavlovian conditioning are:
Unconditioned stimulus
This type of stimulus unconditionally elicits a response, also referred to as a respondent.
For example, a puff of air to the cornea of the eye is an unconditioned stimulus that
produces a blinking response.
Unconditioned response
This type of response occurs to an unconditioned stimulus without prior conditioning. The
blinking response after a puff of air to the cornea of the eye is an example of an
unconditioned response.
Conditioned stimulus
A conditioned stimulus in Pavlovian conditioning is an initially neutral stimulus that is
paired with the unconditioned stimulus. For example, a tone sounded just prior to the puff
of air being delivered to the cornea of the eye. Without prior training, the tone does not
elicit an eye blink; however, after a number of tone-puff pairings, the tone alone comes to
elicit the blinking response.
Conditioned response
A conditioned response in Pavlovian conditioning is the response that the conditioned
stimulus elicits after it has been repeatedly paired with an unconditioned stimulus. The
conditioned response may be similar in form to the unconditioned response. For example,
the eye blink to the tone conditioned stimulus may involve the same bodily musculature as
the eye blink to the puff of air to the cornea.
Positive Patterning
The positive patterning procedure involves presenting simultaneous compound stimuli that
are paired with reinforcement (AB+) while withholding reinforcement when a single
stimulus is presented (A-, B-). Positive patterning produces an increase in responding to
the compound stimuli and a decrease in responding to a single stimulus.
An example of positive patterning is present in the daily medicinal routine of an asthmatic.
To prevent an asthma attack, the person has to take two medicines: one that opens the
patient's lungs and another to enhance breathing capabilities. When the medicines are
taken simultaneously, the asthmatic's lungs are opened, breathing is improved, and
therefore an asthma attack is prevented. However, if each medicine is taken alone an
asthma attack may still occur. As a result of the medicines preventing an attack when
taken simultaneously, the patient is more likely to take the drugs together than separately.
Positive Punisher
A positive punisher is an aversive event whose presentation follows an operant response.
The positive punisher decreases the likelihood of the behavior occurring again under the
same circumstances.
If you stroke a cat's fur in a manner that the cat finds unpleasant, the cat may attempt to
bite you. Therefore, the presentation of the cat's bite will act as a positive punisher and
decrease the likelihood that you will stroke the cat in that same manner in the future.
Positive Punishment
In an attempt to decrease the likelihood of a behavior occurring in the future, an operant
response is followed by the presentation of an aversive stimulus. This is positive
punishment.
If you stroke a cat's fur in a manner that the cat finds unpleasant, the cat may attempt to
bite you. Therefore, the presentation of the cat's bite will act as a positive punisher and
decrease the likelihood that you will stroke the cat in that same manner in the future.
Positive Reinforcement
In an attempt to increase the likelihood of a behavior occurring in the future, an operant
response is followed by the presentation of an appetitive stimulus. This is positive
reinforcement.
If you stroke a cat's fur in a manner that is pleasing to the cat it will purr. The cat's purring
may act as a positive reinforcer, causing you to stroke the cat's fur in the same manner in
the future.
Punisher
A behavior (operant response) is sometimes less likely to occur in the future as a result of
the consequences that follow that behavior. Events that decrease the likelihood of a
behavior occurring in the future are called punishers.
Punishment
Punishment is defined as a consequence that follows an operant response and that
decreases (or is intended to decrease) the likelihood of that response occurring in the future.
Puzzle Box
The puzzle box is the laboratory device that E. L. Thorndike invented in order to study
instrumental or operant conditioning in cats. Hungry cats were individually placed into a
box that could be opened by the animal via a device such as a latch. Once outside of the
box, the cats gained access to food (a positive reinforcer). Thorndike found that the cats
took less and less time to get out of the box as more training trials were given. He
referred to this reinforcement of latch opening as evidence for the Law of Effect.
This was the first experimental apparatus designed to study operant behavior and was
later followed by the invention of the Skinner box.
Ratio Schedule
Ratio schedules require a certain number of operant responses (e.g., 10 responses) to
produce the next reinforcer. The required number of responses may be fixed from one
reinforcer to the next (Fixed Ratio schedule) or it may vary from one reinforcer to the next
(Variable Ratio schedule).
Fixed Ratio schedules support a high rate of response until a reinforcer is received, after
which a discernible pause in responding may be seen, especially with large ratios. Sales
people who are paid on a "commission" basis may work feverishly to reach their sales
quota, after which they take a break from sales for a few days.
Variable Ratio schedules support a high and steady rate of response. The power of this
schedule of reinforcement is illustrated by the gambler who persistently inserts coins and
pulls the handle of a "one-armed bandit."
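Like interval schedules, a ratio schedule can be sketched as a rule deciding whether each response is reinforced. The names below are illustrative, and the Variable Ratio draw is one simple choice for varying the requirement around its mean.

```python
# Sketch of ratio schedules as reinforcement rules (illustrative names).
# Every `required`-th response produces a reinforcer; under a Variable
# Ratio schedule the required count is redrawn after each reinforcer.
import random

class RatioSchedule:
    def __init__(self, mean_ratio, variable=False):
        self.mean = mean_ratio
        self.variable = variable
        self.count = 0
        self.required = self._next_ratio()

    def _next_ratio(self):
        # Fixed Ratio: constant; Variable Ratio: varies around the mean.
        return random.randint(1, 2 * self.mean - 1) if self.variable else self.mean

    def respond(self):
        """Return True if this response is reinforced."""
        self.count += 1
        if self.count >= self.required:
            self.count = 0
            self.required = self._next_ratio()
            return True
        return False

fr10 = RatioSchedule(mean_ratio=10)  # FR 10: every 10th response pays off
results = [fr10.respond() for _ in range(20)]
print(results.count(True))  # prints 2: exactly 2 reinforcers in 20 responses
```

Here only the count of responses matters, not their timing, which is the defining contrast with interval schedules.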
Reflex
A response may be produced with very high probability after a specific stimulus. This type
of stimulus-response relation -- or reflex -- does not require prior learning.
The reflex is the building block of Pavlovian conditioning. The unconditioned stimulus and
unconditioned response together comprise the reflex.
The eye blink to a puff of air to the cornea is an example of a reflex.
Reinforcement
Reinforcement is defined as a consequence that follows an operant response that
increase (or attempts to increase) the likelihood of that response occurring in the future.
Reinforcer
A behavior (operant response) is sometimes more likely to occur in the future as a result
of the consequences that follow that behavior. Events that increase the likelihood of a
behavior occurring in the future are called reinforcers.
Reinforcing Stimulus
A reinforcing stimulus is one that increases the occurrence of behaviors that it follows.
For instance, the receipt of a trophy may increase the chances of a young girl competing
in a yearly road race.
Relative Validity
If multiple discriminative stimuli are differentially correlated with reinforcement, then
stimulus control may be stronger to those stimuli that more reliably signal the occurrence
and nonoccurrence of reinforcement. Such stimuli are often said to be more valid or
diagnostic cues.
For example, suppose that in Case 1, compound discriminative stimuli AX and BX are
explicitly associated with reinforcement and extinction, respectively. We would expect
behavioral control by the A and B elements to exceed that by the X element.
Now, suppose that in Case 2, the compound discriminative stimuli AX and BX are equally
often associated with reinforcement and extinction. Here, we would expect that control by
the A and B elements would be similar to that by the X element.
These expectations have been borne out by behavioral evidence in both human beings
and laboratory animals. Critically, behavioral control by the X element in Case 1 is much
lower than in Case 2, despite the fact that the X element is equally often associated with
reinforcement and extinction in each case.
This relative validity effect shows that the behavioral control that is exerted by one
element of a compound discriminative stimulus (here X) depends on the relative
discriminative validity of other stimuli (here A and B) with which it occurs in compound.
Loosely speaking, organisms learn to attend to the stimuli that are the most diagnostic of
events of importance to them.
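The two cases above can also be sketched with the Rescorla-Wagner learning rule, a common formal account that is not part of this glossary's definitions. One caveat, stated as an assumption here: the rule reproduces the relative validity effect only if reinforced trials carry a larger learning rate than nonreinforced trials, a parameter choice Rescorla and Wagner themselves used; the labels and values are illustrative.

```python
# Sketch of the relative validity design using the Rescorla-Wagner rule
# (an assumed model). V[s] is the associative strength of stimulus s.
# Reinforced trials use a larger learning rate (beta_r) than nonreinforced
# trials (beta_n); with equal rates the model predicts no difference.

def train(trials, beta_r=0.2, beta_n=0.1):
    V = {"A": 0.0, "B": 0.0, "X": 0.0}
    for stimuli, lam in trials:
        beta = beta_r if lam > 0 else beta_n
        error = lam - sum(V[s] for s in stimuli)  # shared prediction error
        for s in stimuli:
            V[s] += beta * error
    return V

# Case 1: AX is always reinforced, BX never (A and B are the valid cues).
case1 = train([(("A", "X"), 1.0), (("B", "X"), 0.0)] * 100)

# Case 2: AX and BX are each reinforced on half of their trials.
case2 = train([(("A", "X"), 1.0), (("A", "X"), 0.0),
               (("B", "X"), 1.0), (("B", "X"), 0.0)] * 50)

# X is paired with reinforcement equally often in both cases, yet it
# acquires less control in Case 1, where its companions are more valid.
print(round(case1["X"], 2), round(case2["X"], 2))
```

In Case 1 the valid elements A and B absorb most of the predictive burden, leaving less strength for X than in Case 2.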
Respondent
A respondent is a response that is elicited by an antecedent stimulus, as in the reflex or in
Pavlovian conditioning (see unconditioned response). Respondent behavior contrasts with
operant behavior, which is modified by its consequences.
Response-Dependent Reinforcement
Response-dependent reinforcement requires the organism to perform an operant
response before receiving reinforcement.
An example of response-dependent reinforcement involves a vending machine. If after
inserting your money into the machine you don't push the proper button to deliver your
candy bar, then you will not receive the reinforcer.
Response-Independent Reinforcement
Response-independent reinforcement delivers reinforcers to an organism regardless of its
behavior (see yoked control).
Mass mailings of trial-size products, such as a new cereal, provide a reinforcer to the
recipient regardless of the recipient's behavior.
Schedules of Reinforcement
Schedules of reinforcement are the precise rules that are used to present (or to remove)
reinforcers (or punishers) following a specified operant behavior. These rules are defined
in terms of the time and/or the number of responses required in order to present (or to
remove) a reinforcer (or a punisher). Different schedules of reinforcement
produce distinctive effects on operant behavior.
Shaping
Shaping modifies behavior by reinforcing behaviors that progressively approximate the
target behavior (operant response). Shaping can be used to train organisms to perform
behaviors that would rarely if ever occur otherwise.
For example, to teach a child to write his or her first name, you initially give praise for
writing the first letter correctly. After the child has mastered that first step, letter-by-letter
you give praise until the entire name is correctly written.
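The successive-approximation idea can be sketched as a moving criterion: reinforce any response close enough to the target, then demand a closer approximation. The function, the paw-height numbers, and the tolerance values below are an illustrative toy, not a standard training algorithm.

```python
# Sketch of shaping as a moving criterion (illustrative, hypothetical
# values): reinforce any response within `tolerance` of the target, then
# tighten the tolerance after each reinforced response.

def shape(responses, target, initial_tolerance, step=1.0):
    """Return the responses that are reinforced as the criterion tightens."""
    tolerance = initial_tolerance
    reinforced = []
    for r in responses:
        if abs(r - target) <= tolerance:
            reinforced.append(r)
            # Demand a closer approximation next time.
            tolerance = max(0.0, tolerance - step)
    return reinforced

# A dog's paw height (in cm) while learning to "shake": early on, any small
# lift earns a treat, but later only full lifts toward 10 cm do.
heights = [2, 1, 4, 6, 3, 8, 9, 7, 10, 10]
print(shape(heights, target=10, initial_tolerance=8, step=2))
# prints [2, 4, 6, 8, 10, 10]
```

Each reinforced response raises the bar, so only ever-closer approximations to the target continue to be reinforced, which is the essence of shaping.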
Skinner Box
Prior to the work of Skinner, instrumental learning was typically studied using a maze or a
puzzle box. These settings lend themselves to examining discrete trials or episodes of
behavior rather than the continuous stream of behavior. The Skinner Box is an
experimental environment better suited to examining this more natural flow of behavior.
(The Skinner Box is also referred to as an operant conditioning chamber.)
A Skinner Box is often a small chamber that is used to conduct operant conditioning
research with animals. Within the chamber, there is usually a lever (for rats) or a key (for
pigeons) that an individual animal can operate to obtain food or water as a reinforcer. The
chamber is connected to electronic equipment that records
the animal's lever pressing or key pecking, thus allowing for the precise quantification of
behavior.
Stimuli
Stimuli are events in the environment that influence behavior. A single stimulus can serve
many different functions. Listed below are several functions that a stimulus can serve.
Discriminative Stimulus
A discriminative stimulus influences the occurrence of an operant response because of
the contingencies of schedules of reinforcement or paradigms of
reinforcement/punishment that are or have been associated with that response. Many
authors further suggest that discriminative stimuli provide information to the organism,
allowing it to respond appropriately in the presence of different stimuli. An observing
response is sometimes necessary for presentation of the discriminative stimulus/stimuli.
For example, different individuals can serve as discriminative stimuli in a joke-telling
situation. The jokes that you tell your priest are probably different from the jokes that you
tell your best friend because of your past history of telling jokes to both people.
Experimentally, we can observe the discrimination of stimuli by associating different
discriminative stimuli with different schedules of reinforcement or paradigms of
reinforcement/punishment.
For instance, in the laboratory, a pigeon could be required in the presence of a steady
chamber light to peck a key on a Fixed Interval schedule to produce food, whereas it could
be required in the presence of a blinking chamber light to pull a chain on a Variable Ratio
schedule to turn off a loud noise. The discriminative stimuli clarify the "rules of the game,"
making each prevailing three-term contingency unambiguous.
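The pigeon example can be sketched as a small simulation in which the discriminative stimulus selects which schedule is in force. The schedule rules below (a Fixed Interval clock and a probabilistic approximation of a Variable Ratio) are simplified illustrations, and the stimulus names are taken from the example rather than from any real apparatus.

```python
import random

def fixed_interval(interval_s):
    """Fixed Interval: the first response after interval_s has elapsed
    since the last reinforcer is reinforced."""
    last = 0.0
    def check(t):
        nonlocal last
        if t - last >= interval_s:
            last = t
            return True
        return False
    return check

def variable_ratio(mean_ratio, rng=random.Random(0)):
    """Variable Ratio (approximated): each response is reinforced with
    probability 1/mean_ratio, so on average every mean_ratio-th response pays."""
    def check(_t):
        return rng.random() < 1.0 / mean_ratio
    return check

# The discriminative stimulus clarifies the "rules of the game":
# it selects which schedule currently applies.
schedules = {"steady_light": fixed_interval(30.0),
             "blinking_light": variable_ratio(5)}

fi = schedules["steady_light"]
print(fi(10.0), fi(31.0), fi(40.0))  # False True False
```

A response at 10 s goes unreinforced, the first response after the 30-second interval pays off, and the clock then restarts, which is exactly what makes the steady light informative to the animal.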
Stimulus Control
We speak of stimulus control when a discriminative stimulus changes the likelihood of an
operant response. The controlling relation between the discriminative stimulus and the
operant response (what Skinner called attention) comes about because of the
reinforcer/punisher that has followed the operant response in the presence of that
discriminative stimulus. Thus, the three-term contingency lies at the root of stimulus
control.
Superstitious behavior
Superstitious behavior arises when the delivery of a reinforcer or punisher happens to occur close in time (temporal contiguity) to a behavior that did not produce it. The behavior is thereby accidentally reinforced or punished, changing the likelihood of that behavior occurring again.
For example, you walk under a ladder and a minute later you trip and fall. It is easy to
attribute your accident to "bad luck" and the irrelevant ladder. The association is easy to form because your cultural belief that walking under a ladder brings bad luck is seemingly confirmed by the fall that occurred soon after you walked under it.
Temporal Contiguity
Temporal contiguity occurs when two stimuli are experienced close together in time and, as a result, an association may be formed. In Pavlovian conditioning, the strength of the
association between the conditioned stimulus (CS) and the unconditioned stimulus (US) is
largely affected by temporal contiguity. In operant conditioning, the association between
the operant behavior and the reinforcer/punisher is also largely affected by temporal
contiguity. Superstitious behavior occurs as a result of the temporal contiguity between a
behavior and a reinforcer/punisher that is independent of that behavior.
You utilize aspects of temporal contiguity each time you make a causal judgment. For
example, when your stomach becomes upset, you typically attempt to figure out why.
When you are trying to find the cause of your stomach ache, it is more likely that you will
place the blame on the Chinese food you ate earlier that afternoon rather than on the fast food you ate the week before. This is because you ate the Chinese food at a time that
was closer to the occurrence of your stomach ache.
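The idea that shorter delays support stronger associations can be captured with a toy weighting rule. The exponential form and the time constant below are assumptions chosen for illustration; the glossary itself does not commit to any particular quantitative function.

```python
import math

def contiguity_weight(delay_s, tau=10.0):
    """Toy rule: association strength falls off exponentially with the
    delay between behavior and consequence (tau is an assumed constant)."""
    return math.exp(-delay_s / tau)

# A consequence 1 s after the behavior supports a far stronger association
# than the same consequence delayed by 60 s.
print(round(contiguity_weight(1.0), 3))   # 0.905
print(round(contiguity_weight(60.0), 3))  # 0.002
```

Under this sketch, the Chinese food eaten hours before the stomach ache earns far more "blame" than the fast food eaten a week earlier, matching the causal-judgment example above.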
Three-Term Contingency
The famous behavioral scientist B. F. Skinner believed that, in order to experimentally
analyze human and animal behavior, each behavioral act can be broken down into three
key parts. These three parts constitute his three-term contingency: discriminative stimulus,
operant response, and reinforcer/punisher. The three-term contingency is fundamental to
the study of operant conditioning.
To illustrate the operation of behavioral analysis, the behavior of leaving class when the
school day is over can be broken down into the parts of the three-term contingency. The
bell, which serves as the discriminative stimulus, is sounded at the end of the school day.
Page 17 of 19
Glossary of Behavioral Terms
When the bell rings, students exit the classroom. Exiting the classroom is the operant
response. Reinforcement of leaving the classroom at the proper time results from the
other behaviors in which students can engage now that the school day is over.
But, if the same behavior of exiting the classroom occurs prior to the bell's ring (that is, in
the absence of the discriminative stimulus), a student now faces punishment. Punishment
of leaving class early would result because this behavior violates school rules and leads to
a variety of adverse consequences, like staying after school.
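The classroom example maps directly onto the three parts of the contingency, which can be made explicit with a small data structure. This encoding is a hypothetical illustration, not a notation Skinner himself used.

```python
from dataclasses import dataclass

@dataclass
class ThreeTermContingency:
    """Hypothetical encoding of one three-term contingency."""
    discriminative_stimulus: str
    operant_response: str
    consequence: str  # the reinforcer or punisher that follows the response

leaving_on_time = ThreeTermContingency(
    discriminative_stimulus="bell rings",
    operant_response="exit classroom",
    consequence="after-school activities (reinforcer)")

leaving_early = ThreeTermContingency(
    discriminative_stimulus="no bell",
    operant_response="exit classroom",
    consequence="detention (punisher)")

print(leaving_on_time)
```

Note that the operant response is identical in both cases; it is the discriminative stimulus that determines whether the consequence is reinforcing or punishing.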
Pavlovian Conditioning
Pavlovian conditioning is an important form of learning that involves the pairing of stimuli
independent of an organism's behavior. The key stimulus and response elements of
Pavlovian conditioning are:
Unconditioned stimulus
This type of stimulus unconditionally elicits a response, also referred to as a respondent.
For example, a puff of air to the cornea of the eye is an unconditioned stimulus that
produces a blinking response.
Unconditioned response
This type of response occurs to an unconditioned stimulus without prior conditioning. The
blinking response after a puff of air to the cornea of the eye is an example of an
unconditioned response.
Conditioned stimulus
A conditioned stimulus in Pavlovian conditioning is an initially neutral stimulus that is
paired with the unconditioned stimulus. For example, a tone sounded just prior to the puff
of air being delivered to the cornea of the eye. Without prior training, the tone does not
elicit an eye blink; however, after a number of tone-puff pairings, the tone alone comes to
elicit the blinking response.
Conditioned response
A conditioned response in Pavlovian conditioning is the response that the conditioned
stimulus elicits after it has been repeatedly paired with an unconditioned stimulus. The
conditioned response may be similar in form to the unconditioned response. For example,
the eye blink to the tone conditioned stimulus may involve the same bodily musculature as
the eye blink to the puff of air to the cornea.
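The gradual growth of the conditioned response over repeated pairings can be illustrated with a simple learning-rule sketch. The glossary does not specify any quantitative model; the update below is a Rescorla-Wagner-style rule with assumed parameters, used here only to show the qualitative shape of acquisition.

```python
def pavlovian_pairings(n_pairings, alpha=0.3, lam=1.0):
    """Rescorla-Wagner-style update (assumed parameters): associative
    strength v climbs toward the asymptote lam with each CS-US pairing."""
    v = 0.0
    for _ in range(n_pairings):
        v += alpha * (lam - v)  # larger prediction error -> larger update
    return v

print(round(pavlovian_pairings(1), 3))   # 0.3
print(round(pavlovian_pairings(10), 3))  # 0.972
```

After one tone-puff pairing the tone's associative strength is weak; after ten pairings it is near asymptote, mirroring the observation that the tone alone eventually elicits the blink.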
Yoked Control
Experimental control is essential in operant conditioning. Without baseline measures, such
as the operant level, it is difficult to draw conclusions about the reasons for any observed
changes in operant response rates.
In the yoked control procedure, the rate of responding by an experimental subject is compared to that of a control subject; the latter is yoked to the former in terms of the receipt of reinforcement (or punishment). Because the two subjects receive reinforcers (or punishers) at the same times, this comparison helps to confirm that changes in operant responding by the experimental subject are due to the contingency between its behavior and its consequences, rather than to reinforcer delivery per se.
In the yoked control procedure, then, responding by the experimental subject results in response-dependent reinforcement. In contrast, the control subject receives response-independent reinforcement.
For example, parents often give their child a treat as a reinforcer for good grades.
Frequently, if the child has a sibling, parents will give the sibling a gift as well so that child
will not feel ignored. The gift is not dependent on any previous behavior of the sibling.
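The defining feature of yoking, that both subjects receive reinforcers at identical times while only one's deliveries depend on its behavior, can be sketched as a toy simulation. The 0.3 delivery probability and step count are arbitrary assumptions for the example.

```python
import random

def yoked_session(n_steps=1000, rng=None):
    """Toy yoked-control sketch: the experimental subject earns a reinforcer
    with probability 0.3 at each step (response-dependent); the control
    subject then receives a reinforcer at exactly the same steps,
    independent of its own behavior (response-independent)."""
    rng = rng or random.Random(1)
    experimental = [t for t in range(n_steps) if rng.random() < 0.3]
    control = list(experimental)  # yoked: identical delivery times
    return experimental, control

exp_deliveries, ctl_deliveries = yoked_session()
print(exp_deliveries == ctl_deliveries)  # True: only the contingency differs
```

Because delivery times are identical for both subjects, any difference in their response rates can be attributed to the response-reinforcer contingency itself.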