Founding Father B. F. Skinner
Abstract
I chose to write about B. F. Skinner because I am interested in observing human
behavior. I realize that we as individuals have some control over other people, but only if we
strategize and practice our strategy can we become better at it. Humans like gifts, possessions,
and rewards, and these same things are ways humans today are being controlled, shaped,
and molded. I am excited and pleased by the details and information that I found during
this research. I hope you enjoy the content and perhaps draw a better understanding of
human behaviorism.
Table of Contents
Introduction
History of B. F. Skinner
Method utilized by B. F. Skinner
Discussing the theory and steps in the theory
Conclusion
Introduction
B. F. Skinner once said, “Give me a child and I'll shape him into anything.” Skinner
studied operant conditioning by conducting experiments using animals, which he placed in a
“Skinner box” similar to Thorndike’s puzzle box. For the record, Skinner is regarded as the
father of this theory. For those unfamiliar with the term, operant conditioning roughly means
changing behavior through reinforcement given after the desired response (McLeod, 2015); it
also distinguishes between a reflex and a response (Swain, 1972). Skinner’s ability to study both
human and animal behavior was remarkable, and his methods and theories are still in use in the
21st century. In this paper, I will discuss the history of B. F. Skinner, the method he utilized for
his theory, and his theory and the steps within it.
History of B. F. Skinner
Burrhus Frederic (without a k) Skinner, also known as B. F. Skinner, was born in
Susquehanna, Pennsylvania, on March 20, 1904. His mother nearly died giving birth to him. He
was raised under the roof of both parents and had one brother. Skinner’s father was an
attorney and his mother a retired typist. Skinner was also known to his family as Fred. He grew
up with toys that stimulated his mind; he bolted together devices from Meccano, and he
experimented with a working miniature steam engine (Doc, 2016).
When he was a little older, he carried out his own chemistry and electrical experiments.
He designed but did not build a perpetual motion machine. He built a steam cannon. In eighth
grade, Skinner became a devotee of the works of William Shakespeare and also bought into the
conspiracy theory that the true author of Shakespeare’s works was Francis Bacon. He read
further works by Bacon, including his famous Novum Organum describing the Baconian Method
of doing science (Doc, 2016).
At Susquehanna High School, mathematics was his favorite subject. He graduated
second in his class, just as both of his parents had done before him at the very same school. In
September 1922, at age 18, Skinner enrolled at Hamilton College in Clinton, New York. His
favorite freshman course was English Composition, and he began publishing his poetry in the
college magazine. During the Easter vacation of 1923, his younger brother Edward fell violently ill
and died; this left Skinner and his family in shock for a long time. At the end of his freshman
year, Skinner wrote an essay about the year his brother died, an essay he did not enjoy writing
(Doc, 2016).
In 1926, Skinner graduated from college with a Bachelor of Arts degree in English
Language and Literature, intending to become a novelist. In November 1927, he decided
to abandon literature and study psychology. He became a graduate student in Harvard
University’s psychology department in the fall of 1928, at 24 years of age. Three years
later he graduated with a Ph.D. in psychology. Skinner carried out research at Harvard until
1936, then lectured at the University of Minnesota, becoming chair of Indiana University’s
psychology department in 1946. He returned to Harvard as a professor in 1948 and spent the
rest of his career there (Doc, 2016).
Method utilized by B. F. Skinner
A scientific study of behavior reveals techniques to use in designing a utopian society in which
man can only behave in modes beneficial to both himself and the society. Skinner claims to have
made such a scientific study of behavior, and he believes an absolutely predictable society can be
designed. The term most associated with Skinner's brand of behaviorism is operant conditioning.
Although he was influenced by earlier S-R (stimulus-response) theories, within his theoretical
framework Skinner has a particular focus. The emphasis is upon the response and not upon the
stimulus. His intent was to show that the cause of behavior is the consequence that follows the
behavior, while avoiding notions of purposive behavior. A complicated terminology is created
with which to state the theory (Swain, 1972).
Skinner Box
Skinner built on the behaviorist theories of Ivan Pavlov and John Watson as he studied
the connection between stimuli and observable behavior in rats, which led to his eponymous
Skinner box. The Skinner box is unique in that it is intentionally designed to be relatively small
in size, containing only devices essential to the experiment, but allowing the subject room to
move freely. Many are also soundproof in an added effort to minimize outside noise influencing
the subject (study.com, 2003-2018). With its levers and food pellets, the box allowed precise
measurement and control of experimental conditions. Skinner's animal research underscored the
importance of consequences (i.e., rewards or punishments), and of breaking tasks into smaller
parts and rewarding success on these small parts, in creating behavior change. He believed the
methods could be used to train humans by presenting new subject matter in a series of graduated
steps with feedback at each step (Greengrass, 2004).
Operant Behavior
As mentioned earlier, in his effort to avoid all notions of purposive behavior, Skinner
employs a peculiar language which is, at times, exceedingly difficult to understand. Consider the
following example. He is arguing that behavior which might be called "following a plan" or
"applying a rule" cannot be observed when behavior is considered to be a product of the
contingencies alone. He admits that rules may be formulated from reinforcing contingencies, and
once formulated they may be used as guides; yet, the direct effect of contingencies is different
(Swain, 1972).
● Reinforcement (Positive and Negative)
Skinner identified reinforcement as any event that strengthens the
behavior it follows. The two types of reinforcement he identified were positive reinforcement
(favorable outcomes such as reward or praise) and negative reinforcement (the removal of
unfavorable outcomes) (Cherry, 2017). Both types of reinforcement strengthen behavior,
increasing the probability that the behavior will recur. Positive reinforcement strengthens the
probability of a behavior through the addition of something, while in negative reinforcement the
behavior or response is intensified by the removal of something (Sincero, 2011).
● Punishment (Positive and Negative)
Punishment can also play an important role in the operant conditioning process. According to
Skinner, punishment is the application of an adverse outcome that decreases or weakens the
behavior it follows. Positive punishment involves presenting an unfavorable outcome (prison,
spanking, scolding) while negative punishment involves removing a favorable outcome
following a behavior (taking away a favorite toy, getting grounded) (Cherry, 2017).
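To make the four consequence types above concrete, here is a minimal Python sketch of my own (it is not drawn from Skinner's writings or the cited sources, and the function name and examples are illustrative only). It simply maps whether a stimulus is added or removed, and whether the behavior is strengthened or weakened, onto the four labels just discussed.

# Minimal illustrative sketch: classifying a consequence in operant conditioning.
# The function name and the examples are illustrative, not taken from Skinner's work.

def classify_consequence(stimulus_is_added: bool, behavior_is_strengthened: bool) -> str:
    """Map (stimulus added/removed, behavior strengthened/weakened) to the four labels."""
    if behavior_is_strengthened:
        # Reinforcement always strengthens the behavior it follows.
        return "positive reinforcement" if stimulus_is_added else "negative reinforcement"
    # Punishment always weakens the behavior it follows.
    return "positive punishment" if stimulus_is_added else "negative punishment"

# Examples matching the descriptions above:
print(classify_consequence(True, True))    # praise or reward -> positive reinforcement
print(classify_consequence(False, True))   # removing an unfavorable outcome -> negative reinforcement
print(classify_consequence(True, False))   # scolding or spanking -> positive punishment
print(classify_consequence(False, False))  # taking away a favorite toy -> negative punishment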
Discussing the theory and steps in the theory
In this section, we bring together the methods behind Skinner’s theory that were
mentioned earlier. Skinner was inspired to create his operant conditioning chamber as an
extension of the puzzle boxes that Edward Thorndike famously used in his research on the law of
effect. Skinner himself did not refer to this device as a Skinner box, instead preferring the term,
"lever box." The design of Skinner boxes can vary depending upon the type of animal and the
experimental variables (Cherry, 2017).
B. F. Skinner proposed his theory of operant conditioning by conducting various
experiments on animals. He used a special box, known as the “Skinner box,” for his experiments
on rats. As the first step of his experiment, he placed a hungry rat inside the Skinner box. The rat
was initially inactive inside the box, but gradually, as it began to adapt to the environment of the
box, it started to explore. Eventually, the rat discovered a lever which, when pressed, released
food inside the box. After it satisfied its hunger, it started exploring the box again, and
after a while it pressed the lever a second time as it grew hungry again
(Psychestudy, 2018).
This phenomenon continued for the third, fourth, and fifth time, and after a while
the hungry rat pressed the lever immediately once it was placed in the box. At that point the
conditioning was deemed complete. Here, the action of pressing the lever is the operant
response/behavior, and the food released inside the chamber is the reward. The experiment is
also known as instrumental conditioning, as the response is instrumental in getting
food. This experiment also illustrates the effect of positive reinforcement: upon pressing the
lever, the hungry rat was served food, which satisfied its hunger; hence, it is an example of
positive reinforcement (Psychestudy, 2018).
Figure: Photo of a Skinner box experiment.
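As a rough illustration of the conditioning process just described, the following short Python simulation is my own sketch under assumed, made-up parameter values; it is not Skinner's data or procedure. Each time a simulated "lever press" is followed by food, the probability of pressing again is increased, so presses become more frequent across blocks of trials, mirroring the account above.

import random

# Minimal illustrative simulation of the rat-and-lever scenario described above.
# The probabilities and the learning increment are made-up values, not measured data.

random.seed(1)  # fixed seed so the toy run is repeatable

press_probability = 0.05   # the untrained rat rarely presses the lever at first
learning_increment = 0.10  # how much each food reward strengthens the response
presses_per_block = []

for block in range(5):      # five blocks of 40 opportunities to respond
    presses = 0
    for _ in range(40):
        if random.random() < press_probability:
            presses += 1
            # The press is reinforced with food, which strengthens the behavior.
            press_probability = min(1.0, press_probability + learning_increment)
    presses_per_block.append(presses)

# Press counts per block; reinforcement makes pressing more and more likely.
print(presses_per_block)

The exact numbers are arbitrary; the point is only that it is the consequence following the response that drives the change in behavior.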
Later, Skinner examined what behavior patterns developed in pigeons using the box. The
pigeons pecked at a disc to gain access to food. From these studies, Skinner came to the
conclusion that some form of reinforcement was crucial in learning new behaviors. Skinner
demonstrated that trained pigeons housed in compartments within a missile could guide it onto a
target. Lenses projected an image of the target onto a screen, which the pigeons pecked, keeping
the missile on track to hit the target. Although it might sound bizarre, Skinner’s invention
actually worked. In the end, however, the military decided to utilize radar and other technologies
to guide their missiles rather than Skinner’s kamikaze pigeons (Doc, 2016).
Subsequently, Skinner described reinforcement as a fundamental concept of operant
conditioning whose major objective is to increase the rate at which a desired behavior occurs
again. Reinforcement can be further classified into two major types, positive reinforcement and
negative reinforcement. The major purpose of both reinforcement types is to increase the rate of
a certain behavior, although they have many similarities and differences. By introducing
reinforcement, an individual is encouraged to perform the behavior repeatedly, whether to avoid
an undesirable stimulus or to receive the desirable reinforcement or reward again
(Psychestudy, 2018).
Reinforcement is much more than “being rewarded”; a prevailing probability of
reinforcement, particularly under various intermittent schedules, is the important variable. In
other words, we no longer look at behavior and environment as separate things or events but at
the interrelations among them. We look at the contingencies of reinforcement. We can then
interpret behavior more successfully (Skinner, 1969).
Reinforcements may be scheduled in many ways. Each schedule, with given values of the
parameters, generates a characteristic performance:
a. Fixed interval. A response is reinforced only when it occurs after the passage of a period of
time (for example, five minutes). Another period begins immediately after reinforcement.
b. Fixed ratio. Every nth response is reinforced.
c. Variable interval or ratio. The interval or number in (a) and (b) need not be fixed but may vary
over a given range around some average value.
d. Multiple schedules. One schedule is in force in the presence of one stimulus, and a different
schedule in the presence of another stimulus. For example, a fixed interval prevails when the key
is red, and a variable interval when the key is green. (A characteristic performance is obtained
under each stimulus.)
e. Differential reinforcement of rate of responding. A response is reinforced only if it follows the
preceding response after a specified interval of time (DRL) or before the end of a given interval
(DRH). In DRL the interval might be, for example, 3 minutes; in DRH, one half second
(Skinner, 1969).
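To make these schedules concrete, here is a short Python sketch of my own (an illustration under assumed parameter values, not code from Skinner or the cited sources). For a toy list of response times, it decides which responses would be reinforced under a fixed-interval schedule (assuming a five-minute interval) and under a fixed-ratio schedule (assuming every fifth response).

# Illustrative sketch: deciding which responses earn reinforcement under
# a fixed-interval (FI) and a fixed-ratio (FR) schedule. Parameter values are assumed.

def fixed_interval(response_times, interval=300.0):
    """FI: reinforce the first response occurring after `interval` seconds have elapsed
    since the previous reinforcement; the clock then restarts."""
    reinforced = []
    last_reinforcement = 0.0
    for t in response_times:
        if t - last_reinforcement >= interval:
            reinforced.append(t)
            last_reinforcement = t
    return reinforced

def fixed_ratio(response_times, n=5):
    """FR: reinforce every nth response, regardless of timing."""
    return [t for i, t in enumerate(response_times, start=1) if i % n == 0]

# Response times in seconds, as a toy example:
responses = [10, 150, 290, 310, 320, 500, 620, 640, 900, 1210]
print("FI-300 reinforced at:", fixed_interval(responses))
print("FR-5 reinforced at:  ", fixed_ratio(responses))

Variable-interval and variable-ratio schedules would simply draw the interval or count from a range around an average value, and a multiple schedule would switch between such rules depending on which stimulus (for example, a red or green key) is present, as described in (c) and (d) above.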
As we have seen, contingencies of reinforcement are not the most conspicuous aspects of
life, and the experimental analysis which has revealed their nature and their effects is of recent
origin. Yet the fact is that reinforcement is extraordinarily important. That is why it is reassuring
to recall that its place was once taken by the concept of purpose; no one is likely to object to a
search for purpose in every human act. The difference is that we are now in a position to search
effectively (Skinner, 1969).
Conclusion
In closing, Skinner’s theories of operant conditioning have been borne out: his methods
really worked. They show that animals and human beings alike can be taught, shaped, and
controlled using both positive and negative reinforcement. Additionally, while many of his behavioral
theories have fallen out of favor, Skinner's identification of the importance of reinforcement
remains a critical discovery. He believed that positive reinforcement was a great tool for shaping
behavior, an idea still valued in numerous settings including schools today
(B. F. Skinner Biography, 2016).
Works Cited
B. F. Skinner. (2014). Conferences OMICS Publishing Group. Retrieved February 13, 2018.
B. F. Skinner Biography. (2016, January 20). Retrieved from the Biography.com website:
https://www.biography.com/people/bf-skinner-9485671
Cherry, K. (2017, August 25). A Closer Look at Skinner's Life and Legacy. History and Biographies.
Retrieved February 2018, from www.verywellmind.com/b-f-skinnerbiography-1904-1990275543
Cherry, K. (2017, December 14). Skinner Box or Operant Conditioning Chamber. Very Well Mind.
Retrieved February 10, 2018, from https://www.verywellmind.com/what-is-a-skinnerbox2795875
Doc, T. (2016, November 19). "Famous Scientist." Retrieved February 6, 2018, from
www.famousscientist.org/b-f-skinner/
Greengrass, M. (2004, March). 100 years of B. F. Skinner. American Psychological Association, Vol. 35,
No. 3, 80. Retrieved February 13, 2018.
McLeod, S. A. (2015). Skinner: Operant conditioning. Simply Psychology.
Psychestudy. (2018). Skinner's Theory on Operant Conditioning. Psychestudy. Retrieved from
www.psychestudy.com/behavioral/learning-memory/operant-conditioning/skinner
Sincero, S. M. (2011, May 10). Explorable. Retrieved February 8, 2018, from
https://explorable.com/operant-conditioning
Skinner, B. F. (1969). Contingencies of Reinforcement: A Theoretical Analysis. United States: Meredith
Corporation. Retrieved February 17, 2018.
Study.com. (2003-2018). Study.com. (C. Clause, Editor). Retrieved February 13, 2018, from
www.study.com/academy/lesson/skinner-box-experiment-theory-quiz.html
Swain, E. E. (1972, July 24). B. F. Skinner and Carl Rogers on Behavior and Education. Oregon ASCD
Curriculum Bulletin, V28, 48p. Retrieved February 16, 2018.