www.studyguide.pk
www.aspsychology101.wordpress.com
Contents
Section A: Obedience
1.1 Obedience
1.2 Milgram’s Study of Obedience (1963)
1.3 Evaluation of Milgram’s Study of Obedience
1.4 Variations of the Milgram Experiment
1.5 Meeus and Raaijmakers (1986)
1.6 Agency Theory
1.7 Hofling et al. (1966)
Questions: Obedience
Section B: Prejudice
1.8 Social Identity Theory as an Explanation of Prejudice
1.9 Tajfel et al. (1970, 1971)
1.10 Sherif et al. (1954)
1.11 Reicher and Haslam (2006)
1.12 Asch (1951, 1952, 1956)
Questions: Prejudice
Unit 1 Key Issues
Unit 1 Revision Notes
An introduction to the themes of the cognitive approach to psychology
THE COGNITIVE APPROACH
Cognition refers to all those processes by which
sensory input is transformed, reduced,
elaborated, stored, recovered and used
The Cognitive Approach is the study of how we take in information from
our world, organise it, and use it to help us function successfully. It is
concerned with the internal operation of the mind, and seeks to understand
the role of mental processes in determining human behaviour.
Research into cognitive psychology can have many benefits to
society. The cognitive approach’s research has helped to improve
educational methods, and has also discovered ways to improve
the reliability of eye-witness testimonies, and other things such as
police interviews.
The diagram below outlines only a small selection of the mental
processes which operate inside our minds. Psychologists who
study cognition may be particularly interested in perception,
looking at why we pick up on certain things but not others; or
language, using the tools of our thoughts and communicating
them to others; or even memory, the encoding, storing and
retrieval of information within the brain. Other areas might
include attention, thinking, problem-solving and reasoning, all
cognitive processes.
[Diagram: mental processes – problem-solving, attention, memory, perception, thoughts and language]
Many psychologists in the area argue that cognition involves
absolutely everything a human being does.
KEY TERMS AND ASSUMPTIONS OF THE APPROACH
 information processing: involves the input, manipulation and output of information
 memory: the ability to retain and reproduce mental or sensory impressions; it involves encoding, storing and retrieving information
 forgetting: losing or putting away information from the memory
 storing: the way in which information is retained within the brain after it has been registered
 retrieving: the act of locating and extracting stored information from the brain
There are two key assumptions to the cognitive approach; the first one is information processing (above). The focus of
the approach is on information processing and how this might affect behaviour. The flow of information is described as:
INPUT → PROCESS → OUTPUT
The approach assumes information processing to be linear. This means that information flows through the brain in a
logical way, as a straightforward staged process like the one above. One example of this is the multi-store model
(Atkinson and Shiffrin, 1968).
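The linear flow described above can be sketched as a simple pipeline. This is a purely illustrative Python sketch, not part of the approach itself; the stage functions are hypothetical stand-ins for real cognition.

```python
# Illustrative sketch: the cognitive approach's linear information flow,
# modelled as three stages applied one after another (INPUT -> PROCESS -> OUTPUT).
# The stage functions are hypothetical examples, not real cognitive operations.

def take_input(stimulus):
    # INPUT: the senses register a stimulus
    return stimulus.lower()

def process(information):
    # PROCESS: cognition manipulates the information
    return information.split()

def produce_output(processed):
    # OUTPUT: behaviour results from the processed information
    return len(processed)

def linear_flow(stimulus):
    # information passes through each stage in a fixed, staged order
    return produce_output(process(take_input(stimulus)))

print(linear_flow("The Cat Sat"))  # -> 3
```

The point of the sketch is only that each stage's output is the next stage's input, with no branching or feedback, which is what "linear" means here.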
The second key assumption of the approach is the computer analogy (see 2.2 The Computer Analogy).
Strengths and weaknesses of the comparison of our brains to a computer
In the cognitive approach, there are two main assumptions. The first is information processing. This involves the pattern
of encoding, storing and retrieving data being linear. The second is the computer analogy. This assumes that the brain
functions similarly to a computer. As with ICT, with the Input > Process > Output system, human information processing
assumes a similar system:
INPUT (senses) → PROCESSING (cognition) → OUTPUT (behaviour), with STORAGE (memory) alongside
However, there are limitations to the assumption. Here are some of the differences between the two:
 A computer receives all input (e.g. via a keyboard), whereas the brain only pays attention to a very small amount of the information input
 A computer can do the same calculations repeatedly, whereas the brain can only perform certain calculations at different times and speeds
 A computer cannot lose information (unless data becomes corrupt or damaged), whereas the brain can easily misplace information and experience difficulty recalling it
 You can choose to delete certain information from a computer permanently, but you cannot deliberately push something unpleasant from your mind
 A computer is emotionless, whereas emotions have a strong impact on the way our minds function
 A computer only knows as much as the information which has been input, whereas the brain can try to piece together memories and fill in the gaps
Atkinson and Shiffrin (1968)
One of the key assumptions of the cognitive approach is information processing, and part of this assumption is that
processing of information is linear. One example of a model which is based on this idea is the multi-store model of
memory. The multi-store model was proposed by Atkinson and Shiffrin (1968).
Researchers for the model looked into three areas:
 capacity – the size of the store
 duration – how long information remains in the store
 mode of representation – the form in which information is stored
The researchers chose to investigate:
 encoding – how memories are encoded (which means how they are registered as memories)
 storage – how memories are stored (which means how they remain memories after being registered)
 retrieval – how we retrieve memories when the output is needed
The model is shown below:
Sensory register (information comes in) → Short-term memory (information is rehearsed or lost) → Long-term memory (information is stored as it comes from short-term memory)
Sensory register: this can last up to around 2 seconds. Information is taken in (input) by our senses. If the information is
not attended to, it will be permanently lost
Short-term memory: this lasts only temporarily, and it is common to rehearse the information. For example, if you are
looking up a phone number and remembering it for the short time it takes to dial it, you will say “01294…” to yourself
several times as you walk to the phone. This type of memory is mainly auditory and has a limited capacity
Long-term memory: this can last for years and supposedly has an
unlimited storage timeframe. It is mainly encoded in terms of meaning
(semantically-encoded memory). Procedural long-term memory is
often the most difficult to fathom. It is associated with highly-automated
processes, such as tying a shoe lace
Procedural Long-Term Memory: the memory used to remember highly-automated tasks which are done procedurally rather than thoughtfully, such as walking or tying a shoe lace
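The three stores and the role of attention and rehearsal can be sketched as a toy simulation. This is a minimal illustrative sketch, not Atkinson and Shiffrin's own formulation; the class, its names and its rules are simplified assumptions.

```python
# Illustrative sketch of the multi-store model (Atkinson and Shiffrin, 1968).
# Simplifying assumptions: unattended items decay from the sensory register,
# attended items enter short-term memory, and only rehearsed items transfer
# to long-term memory. All names and rules here are a toy approximation.

class MultiStoreMemory:
    def __init__(self):
        self.sensory_register = []  # lasts up to ~2 seconds; lost if not attended to
        self.short_term = []        # temporary, mainly auditory, limited capacity
        self.long_term = []         # can last for years, mainly semantic

    def sense(self, item, attended):
        self.sensory_register.append(item)
        if attended:                          # attention moves the item to STM
            self.short_term.append(item)
        self.sensory_register.remove(item)    # the sensory trace decays quickly

    def rehearse(self, item):
        # rehearsal keeps an item in STM and transfers it to the long-term store
        if item in self.short_term:
            self.long_term.append(item)

memory = MultiStoreMemory()
memory.sense("phone number", attended=True)
memory.sense("background noise", attended=False)
memory.rehearse("phone number")
print(memory.long_term)  # -> ['phone number']
```

The "background noise" item never reaches long-term memory because it was not attended to, which is exactly the linear, staged flow the model proposes.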
EVALUATION
Strengths:
 There have been many lab experiments which support
the model, such as Glanzer and Cunitz (see below),
because the primacy and recency effects are explained
by it
 Case studies provide physiological support, such as
that of Clive Wearing, which identified an area of the
brain (the hippocampus) which, when damaged,
prevents new memories from being laid down
Glanzer and Cunitz (1966)
Glanzer and Cunitz carried out a laboratory study
using word lists. They found that the first words in a list
were remembered well, as were the last, but the words in
the middle of the list weren’t remembered quite so well.
They said that the first words were well-rehearsed and in
the long-term memory (primacy effect); the last words
were still in the consciousness of the memory (recency
effect), whereas the middle words were in neither
Weaknesses:
 Even though case studies like Clive Wearing have suggested an area of the brain for short-term memory,
another case study (Shallice and Warrington, 1970) showed that a victim of a motorbike accident was able to
add long-term memories even though his short-term memory was damaged. This goes against the multi-store model
 The experiments that give evidence for the model use artificial tasks, which means that the results might not be
valid
 Craik and Lockhart (1972) proposed their levels-of-processing framework, which they said better explained
primacy and recency effects, as their model was designed as an improvement on the multi-store model
Craik and Lockhart (1972)
Craik and Lockhart (1972) put forward the levels of processing framework for memory because of problems which they
found with the multi-store model. They suggested that memory was actually dependent upon the level of processing of
the information, rather than being in different stores with different features.
Their framework suggests that information is more readily transferred to the long-term memory if it is considered,
understood and related to past memories to gain meaning (rather than merely repeated). The degree of consideration of
information was given the term depth of processing: the deeper the information is processed, the longer the
memory trace will last.
Craik and Lockhart gave three examples of levels at which verbal information can be processed:
 structural – this is shallow processing, looking only at what the words look like
 phonetic – processing the sound of the word
 semantic – this is deep processing, considering the meaning of the word
MODIFICATIONS TO THE FRAMEWORK
Many researchers became interested in exactly what produced deep processing:
 elaboration: Craik and Tulving (1975) found complex semantic processing produced better recall
e.g. “The great bird swooped down and carried off the struggling ___” produced better recall than “She cooked the
___”
 distinctiveness: Eysenck and Eysenck (1980) found that even words processed phonetically were better
remembered when they were distinctive or unusual
 effort: Tyler et al. (1979) found there was better recall with difficult anagrams than simple anagrams
e.g. “ORTODC” had better recall than “DOCTRO”
The table below shows a summary of the framework:
 Memory trace: comes with depth of processing or degree of elaboration; no depth of processing means no memory trace
 Deeper analysis: leaves a more persistent memory trace
 Rehearsal in primary memory: holds information but leaves no memory trace
 When attention is diverted: information is lost at a rate that depends on the level of analysis
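The framework's central prediction, that deeper processing leaves a stronger trace, can be sketched numerically. This is an illustrative sketch only; the depth scores are hypothetical numbers chosen purely to show the ordering, not values from Craik and Lockhart.

```python
# Illustrative sketch of the levels-of-processing framework: the deeper the
# processing, the more persistent the memory trace. The numeric "depths" are
# hypothetical, chosen only to express the ordering structural < phonetic < semantic.

DEPTH = {"structural": 1, "phonetic": 2, "semantic": 3}

def trace_strength(level):
    # no processing -> no memory trace; deeper analysis -> more persistent trace
    return DEPTH.get(level, 0)

# three words processed at different levels; deepest-processed predicted best recalled
words = [("TABLE", "structural"), ("chair", "phonetic"), ("bread", "semantic")]
ranked = sorted(words, key=lambda w: trace_strength(w[1]), reverse=True)
print([w for w, _ in ranked])  # -> ['bread', 'chair', 'TABLE']
```

The sketch encodes only the qualitative claim of the framework: recall is predicted to follow the depth ordering, not the number of repetitions.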
EVALUATION OF THE FRAMEWORK
Strengths:
 There is evidence for the framework, such as Craik and Tulving (see 2.5 Craik and Tulving (1975))
 It links research into memory with research into perception and selective attention; it focuses on information
processing and the whole process; this means it is a stronger explanation than the multi-store model, because
more studies can be explained by it
Weaknesses:
 Depth of processing also tends to mean more time spent processing; it might be that it is the length of time spent
processing which affects the memory trace, not the depth of processing
 There may be more effort involved in “deeper” processing and the greater effort might account for the better
recall; the term “deep” is not well defined by Craik and Lockhart – it could be time spent processing, effort involved
or using past experiences and adding meaning
Craik and Tulving (1975)
Aim: To test the levels of processing framework by looking at trace durability
The levels of processing framework suggests that material which has been processed
semantically (deeply and for meaning) is that which will be best recalled. Craik and
Tulving (1975) carried out a study to test the framework, by testing to see if the
durability of a trace was affected by the depth of processing.
Durability: the durability of a trace is how long it lasts
Forgetting occurs when the memory trace has gone. The aim of the study was to see whether material which had been
more deeply processed would be recalled better. This would mean a greater degree of semantic processing, involving
meaningful processing.
PROCEDURE [THE BASIC STUDY]
1 The participants were put into situations where they used different depths of processing:
- shallow processing involved asking questions about the words themselves (structural processing)
- intermediate processing involved questions about rhyming words (phonemic processing)
- deep processing involved whether a word fit into a particular semantic category (semantic processing)
2 After this encoding phase, there was an unexpected recognition or recall task
3 All ten experiments used the same basic procedure. Participants were tested individually, and were told that the
experiments were about perception and reaction time. A tachistoscope was used, which flashed words onto a
screen
4 Different words were shown, one at a time, for 0.2 seconds. Before the word was shown, participants were
asked a question about the word, which would lead to different levels of processing, from the list above
5 After being asked the question, the participant looked into the tachistoscope and the word was flashed
6 They gave a “yes” response with one hand and a “no” response with the other; the questions were designed
so that half were answered “yes” and half “no”
7 After all the words had been completed, the participants were given an unexpected recognition test, assessing
the researchers’ hypothesis that ‘memory performance would vary systematically with depth of processing’
EXPERIMENTAL DETAILS
In Experiment 1, structural, phonemic and semantic
processing were measured, as well as whether or not a
particular word was present. Words were presented at
2-second intervals over the tachistoscope. There were
40 words and 10 conditions. Five questions were asked:
Do the words rhyme?
Is the word in capitals?
Does the word fit into this category?
Does the word fit into this sentence?
Is there a word present or not?
Tachistoscope: a device which allows an image to be displayed upon a screen, used here by the experimenter to flash letters or other stimuli onto the screen for a short time in sequence
Each question had “yes” and “no” responses, making
ten conditions overall. The results are shown below:

Proportion of words recognised correctly, by level of processing (from least deep, 1, to deepest, 5) and response type:

Level of processing                        Yes    No
1 Is there a word?                         0.22   N/A
2 Is the word in capitals?                 0.18   0.14
3 Does the word rhyme?                     0.78   0.36
4 Does the word fit into this category?    0.93   0.63
5 Does the word fit into this sentence?    0.96   0.83
CONCLUSIONS
Deeper encoding (when the participants had to consider whether a word fitted into a particular category or sentence)
took longer and gave higher levels of performance. Questions answered “Yes” also produced higher recall
rates than those answered “No”.
It was concluded that the enhanced performance was because of qualitatively different processing, not just because of
extra time studying. Craik and Tulving say “manipulation of levels of processing at the time of input is an extremely
powerful determinant of retention of word events”. It is interesting that “Yes” and “No” answers took the same amount
of processing time, but “Yes” answers led to better recognition rates. This does not seem to be just about levels of
processing and so needs further investigation.
EVALUATION
 The experiments were designed carefully with clear
controls and operationalisation of variables. The
study can therefore be replicated and the findings
are likely to be reliable. In fact, by carrying out so
many experiments, Craik and Tulving (1975) have
replicated their own work
 The framework is clear and the study takes the ideas
and tests them directly, subsequently feeding back to
the framework: for example, the researchers
recognised that measuring deep processing as
meaningful processing is a circular argument, so they
focused on depth of processing needing longer
processing – focusing on a criticism of the framework
strengthened their study
 One weakness is how to test “depth” – it can be very
vague (there is a circular argument of “deep” meaning
“meaningfully processed” and “meaningfully
processed” means “deep”)
 The tasks are artificial. They involve processing words
in artificial ways and then trying to recognise them.
This is not something that would be done in real life,
so the study could be said to lack validity
Baddeley and Hitch (1974, 2000)
Baddeley and Hitch used the multi-store model of memory as the basis for the working memory model. They were
dissatisfied with the multi-store model, but used the idea of the short-term memory and long-term store. This model is
an improvement on the short-term memory of the multi-store model.
The original model consists of three elements: the central executive (which supervises the system and controls the flow of information), the phonological loop (consisting of the articulatory loop, the “inner voice”, and the primary acoustic store, the “inner ear”) and the visuo-spatial scratch pad (the “inner eye”, which deals with visual and spatial information).

Spatial Information: information about where things are physically located
Visual Information: information about the shapes, sizes, colours and details of objects and images

The central executive takes information from several sources and puts it into one episode. The main function of the central executive is also to control the other components of the working memory.
The phonological loop consists of two separate components which work together. These are called the primary
acoustic store (the “inner ear”) and the articulatory loop (the “inner voice”). The primary acoustic store (also
sometimes known as the short-term phonological store) holds auditory memory traces, which decay very rapidly – they
last for around two seconds. The articulatory loop revives memory traces by rehearsing them.
The visuospatial scratchpad holds the information we see and manipulates spatial information (such as shapes, colours
and positioning of objects). Anything which uses spatial awareness, such as finding your way through a building, uses
the visuospatial scratchpad. The scratchpad is sometimes said to be divided into visual, spatial and kinaesthetic
(movement) parts, and is located in the right hemisphere of the brain.
The reason for having separate visuospatial and phonological systems in the working memory model is that it is
difficult to perform two tasks from the same component simultaneously, but separately they can work together. This
means that two auditory tasks cannot be carried out 100% successfully together (such as listening to two different
people speak at the same time); similarly, two visual tasks cannot (you cannot watch two different videos at the
same time and see every detail) – however, you can perform one visual and one phonological task simultaneously, such
as watching someone speak and hearing them.
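The dual-task prediction above can be sketched in a few lines. This is an illustrative sketch, not part of Baddeley and Hitch's model; the task names and the lookup table are hypothetical examples.

```python
# Illustrative sketch of the working memory model's dual-task prediction:
# two tasks interfere if they need the same slave system (both phonological,
# or both visuospatial), but one of each can run together.
# The example tasks and their component assignments are hypothetical.

COMPONENT = {
    "listen to speech": "phonological loop",
    "recite a number": "phonological loop",
    "watch a video": "visuospatial scratchpad",
    "navigate a building": "visuospatial scratchpad",
}

def can_do_together(task_a, task_b):
    # the model predicts interference only within a single component
    return COMPONENT[task_a] != COMPONENT[task_b]

print(can_do_together("listen to speech", "watch a video"))    # -> True
print(can_do_together("listen to speech", "recite a number"))  # -> False
```

This is exactly the logic behind the dual-task paradigm described below: performance breaks down only when both tasks compete for the same component.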
In 2000, Baddeley returned to the model, unsatisfied that it was complete. He added an episodic buffer which provides
time sequencing for visual, spatial and verbal information (for example, the chronological ordering of words or pictures
in a film). It is often considered that the buffer brings in information from the long-term store.
Evidence for the episodic buffer existing comes from people with amnesia, who cannot lay down new memories in the
long-term store, but can recall stories in the short-term store that contained a lot of information. This information was
more than could be retained in the phonological loop.
The improved model adds the episodic buffer (Baddeley, 2000). The model remains as before in that the visual and phonological systems are separated and these two systems cannot have two similar tasks carried out together at the same time. A test which requires an individual to perform two tasks simultaneously is called a dual-task paradigm.

[Diagram: the improved model – the central executive controls the phonological loop (articulatory loop and primary acoustic store), the episodic buffer, and the visuo-spatial scratch pad (inner eye); these link to language, episodic LTM and visual semantics in the long-term store]
EVIDENCE FOR THE WORKING MEMORY MODEL
Evidence for the phonological loop is that word lists are better remembered when the words sound nothing alike than
when they all sound the same. If participants are asked to learn a list of words and, at the same time, to say something
aloud, then they will find learning difficult. This is said to be because they are already using the phonological loop. In
this case, the articulatory loop (inner voice) is being used to say something aloud, and is therefore not available to
rehearse the other information.
Visual Cache: one part of the visuospatial scratch pad which stores information about form and colour
Inner Scribe: one part of the visuospatial scratch pad which deals with spatial information and movement, but also rehearses all visuospatial information to be transferred to the central executive

Evidence for the visuospatial scratchpad (separated into two
parts: the visual cache and the inner scribe) is that when
someone tries to perform two spatial tasks simultaneously, it
becomes difficult, but undertaking one visual task and one spatial
task together is possible. Evidence which supports the idea that
the inner scribe and visual cache are separated within the
scratchpad comes from neurophysiological brain scans, which
have shown that visual objects activate an area in the left
hemisphere of the brain, and spatial tasks the right.
Further evidence for the model as a whole, and that the visuospatial scratchpad and phonological loop are separate
systems comes from patients who suffer from agnosia. This causes a loss of the ability to recognise objects (the visual
cache), persons, sounds, shapes and even smells. This is often associated with brain injury or neurological illness.
Sufferers will be unable to recognise an object they are presented with, but can still copy a drawing of that object (for
example, if presented with a toy car, they cannot name it as a “car” but can look at it and draw it). This suggests that the
spatial component remains present and intact.
EVALUATION OF THE MODEL
Strengths
 The model is an expansion of the multi-store model; it shows why some dual tasks are different, and why you
cannot successfully undertake two different visual or verbal tasks simultaneously
 There is much research supporting the model, including psychological lab experiments and neurophysiological
research, such as brain scans showing the differences in brain activity
 Patients with agnosia support the model’s separation of visuospatial components
De Groot (2006) looked at expert chess players, who were no better at recalling where chess pieces had been randomly
placed on the chess board than non-players. However, when the pieces were placed in their correct positions, the chess
players had a (predictably) outstanding memory of where they should be. This supports the idea of the long-term store
being used to help interpret information in the working memory (short-term).
Weaknesses
 Because the episodic buffer was added 26 years after the original model was published, it suggests that the
original model was incomplete, therefore the model may not serve as an explanation of the working memory
 The model doesn’t account for all senses (it only relies on sound and sight), and much of the lab support for the
model uses artificial tasks which lack validity: because the tasks are not true-to-life, you cannot guarantee that the
other senses might have been used in real life
Bartlett (1932)
The key idea upon which Bartlett based this theory was that memory is not like a tape recorder. Bartlett, and many
other psychologists, have suggested that a memory is not perfectly formed, perfectly encoded and perfectly retrieved.
This is somewhat supported by the levels of processing framework, which states that encoding and retrieval depend on
how well an event is processed.
Schema: an idea or script about the world (for example an “attending a lesson” or “going to the cinema” script) which paints a certain expectation of the event and outlines rules of what to do
Bartlett started by thinking that the past and current experiences of the individual affect
how an event is remembered. He noted that there would be input, which is the
perception of the event. This is followed by the processing, which includes the
perception and interpretation of the event; this involves previous experiences and
schemata.
The memory of an event derives from information from specific memory traces which
were encoded at the time of the event, and ideas that a person has from knowledge,
expectations, beliefs and attitudes. Remembering involves retrieving knowledge that has
been altered to fit with knowledge that the person already has.
War of the Ghosts
The origins of Bartlett’s theory came from a game of Chinese whispers. He decided to construct his own experiment,
based around the idea of the game. He used a Native American folk story called War of the Ghosts. He chose
such a story because it was very unfamiliar to his participants, being in a different style and from a different culture,
and therefore not slotting into their usual schemata. First of all, Bartlett would read the participants the story, and then
ask them to repeat the story back to him, which prompted several different accounts.
There were several more occasions where Bartlett met with the participants to hear what they could remember of the
folk tale. They were able to recall less and less each time as time went on, so the story became shorter. However, it
tended to make more sense, compared to the original story, which to them made no sense whatsoever.
After about six recall sessions, the participants’ average stories had shortened from 330 words to 180 words. Bartlett
noticed that people had rationalised the story in parts that made no sense to them, and filled in their memories so that
what they were recalling seemed sensible to them.
This means that the participants had reconstructed their memories of the story. Bartlett hereby concluded that memory
is reconstructive, not reproductive.
Elizabeth Loftus
A leading psychologist and expert on human memory, Loftus has done a lot of work in
the area of reconstructive memory. She agrees with Bartlett and has taken his ideas
one stage further in her work.
Loftus’ work includes looking into the reliability of eyewitness testimonies. As a
leading expert in criminology and psychology, eyewitness testimonies were a
particular interest, and her work has influenced legal practice today, such as
guiding the police not to use leading questions.
You will see more about eyewitness testimonies in the Key Issue for the Cognitive Approach.
Rationalisation: altering something to make it make sense to you
Confabulation: making up certain parts to fill in a memory to make it make sense
[Diagram: a picnic schema, as given by Cohen (1993). The PICNIC schema has slots for PLACE, FOOD, PEOPLE and ACTIVITIES; each slot holds default values (e.g. woods, sandwiches, family, games) and optional values from the particular event (e.g. park, Sue & John, walk)]

Cohen pointed out five ways in which schemata can help influence memory – by providing or aiding selection and
storage, abstraction, integration and interpretation, normalisation and retrieval. This means that there are going to be
both advantages and disadvantages of schemata in memory. One of the main advantages is that they enable us to store
the central meaning or gist of new information without necessarily remembering the precise details (abstraction,
selection and storage), unless perhaps the details were particularly unusual: this saves memory resources. Schemata
also help us to understand new information more readily (integration and interpretation, normalisation), and to fill in or
guess missing parts of it using default values (retrieval). This makes the world more coherent and predictable.
However, some drawbacks arise: information that does not fit in well with our normal schemata may be
forgotten or ignored, especially the minor details (selection and storage), or distorted (normalisation), so as to make
better sense to us. This links back to Bartlett’s War of the Ghosts. This may cause inaccurate, stereotyped and
prejudiced remembering.
RECONSTRUCTIVE MEMORY THEORY: AN EVALUATION
 The theory is backed by much support, including Bartlett’s War of the Ghosts Chinese whisper-style experiment, as
well as the work of Elizabeth Loftus, who has studied the unreliability of eyewitness testimonies
 It can be tested by experimental method because the independent variable can be operationalised and measured:
a story can have features that can be counted each time it is recalled and the changes recorded, so up to a point,
the theory can be scientifically tested
 The study which used War of the Ghosts had a Native American folk story which made no sense to the participants,
therefore it might be argued that they altered the story to make it make sense because they were being asked to
retell the story
 There could also have been demand characteristics in the study, where the participants anticipate what the
intended answer is and try to give that: this would make the findings unreliable
 It does not explain how memory is reconstructive: this is a theory of description, not an explanation
Questions on Units 2.1 – 2.7 on Models of Memory
1
(a) Complete the table below to outline the definitions of the following cognitive terms.
information processing: involves the input, manipulation and output of information
memory: …………………………………………………………………………………………
storing: …………………………………………………………………………………………
encoding: …………………………………………………………………………………………
retrieving: …………………………………………………………………………………………
forgetting: …………………………………………………………………………………………
(5 marks)
(b) Explain the computer analogy.
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
(6 marks)
Total: 11 marks
2
Craik and Lockhart (1972) proposed the Levels of Processing Framework.
(a) Outline the main idea of the framework.
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
(3 marks)
(b) Choose one study which tests the framework and outline it.
Name of study: …………………………………………………………………………………
Outline of study: ………………………………………………………………………………..
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
(8 marks)
There are a number of other memory models which have been based around the multi-store model of
memory (Atkinson and Shiffrin, 1968) and the Levels of Processing Framework (Craik and Lockhart,
1972).
(c) Choose one of the memory models below; outline and evaluate the model.
The working memory model (Baddeley and Hitch, 1974)
Reconstructive memory (Bartlett, 1932)
The spreading-activation theory of semantic processing (Collins and Loftus, 1975)
Name of model: ………………………………………………………………………………...
Outline of model: ……………………………………………………………………………….
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
(8 marks)
Total: 19 marks
The cue-dependent theory of forgetting (Tulving, 1975)
This theory of forgetting applies to long-term memory, not the short-term store. Tulving (1975) stated that memory is
dependent on the right cues being available for retrieval, and that forgetting occurs when those cues are absent.
Tulving’s theory states that there are two things necessary for recall:
• a memory trace (information is laid down and retained in a store as a result of the original perception of an event)
• a retrieval cue (information present in the individual’s cognitive environment at the time of retrieval that matches
the environment at the time of learning)
For Tulving, forgetting is about the memory trace being intact, but memory failing because the cognitive environment
has changed. There is no appropriate cue to activate the trace. The most noticeable experience of this cue-dependent
forgetting is the Tip of the Tongue Phenomenon (Brown and McNeill, 1966). This refers to knowing a memory exists but
being temporarily unable to recall it.
Cues have been differentiated into:
• context-dependent cues – the situation or context
• state-dependent cues – the person’s state or mood
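The account above (an intact trace that only a matching cue can activate) can be sketched as a toy model. This is purely illustrative: the contexts, words and function names are invented for the sketch, not part of Tulving's theory.

```python
# Toy model of cue-dependent forgetting (illustrative sketch, not
# Tulving's own formulation): traces are stored with the context
# present at encoding, and retrieval succeeds only when the recall
# context matches that cue.

def encode(store, item, context):
    """Lay down a memory trace tagged with its encoding context."""
    store.setdefault(context, set()).add(item)

def recall(store, context):
    """Return the traces whose cue matches the current context."""
    return store.get(context, set())

store = {}
encode(store, "whale", context="underwater")
encode(store, "anchor", context="underwater")
encode(store, "tree", context="on land")

# Matching cue: the traces are accessible.
print(sorted(recall(store, "underwater")))       # ['anchor', 'whale']
# No matching cue: the traces still exist, but nothing activates
# them, so the items are "forgotten" (cue-dependent forgetting).
print(sorted(recall(store, "in a lecture hall")))  # []
```

The point of the sketch is that forgetting here is a retrieval failure: the "underwater" traces are never deleted, they are simply unreachable without the right cue.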
Below are some brief outlines of studies which support the cue-dependency theory…
CONTEXT-DEPENDENCY FORGETTING

Smith (1985)
Smith gave 54 participants a list of words to learn and immediately recall either in quiet, with Mozart, or with jazz
playing in the background. Two days later, they were asked to recall the words again, either in quiet, with Mozart or
with jazz playing. This made nine different conditions, as shown in the table:

Group   Learning   Recalling
1       Quiet      Quiet
2       Quiet      Mozart
3       Quiet      Jazz
4       Mozart     Quiet
5       Mozart     Mozart
6       Mozart     Jazz
7       Jazz       Quiet
8       Jazz       Mozart
9       Jazz       Jazz

Forgetting occurred when the background music was not the same, demonstrating that, without the same music as a
context cue, recall is impaired.

Baker et al. (2004)
This study looked at whether chewing gum when learning and recalling material produces a similar context effect.
83 students aged 18-46 took part, randomly assigned to one of four conditions. In all conditions they were given two
minutes to learn fifteen words, and were asked to recall the words immediately and 24 hours later. The four
conditions were:
• gum-gum (chew gum when learning and recalling)
• gum-no gum (chew gum when learning but not recalling)
• no gum-gum (don’t chew gum when learning, do when recalling)
• no gum-no gum (don’t chew gum when learning or recalling)

In the immediate recall, there were only small differences between the average numbers of words correctly recalled;
however, 24 hours later the differences were significant, with an average of 11 words recalled in the gum-gum
condition and only 7 in the gum-no gum condition. In both conditions where the gum was present or absent at both
learning and recall, more words were recalled than when the gum was present at only learning or recall. This
suggests that chewing gum when learning and recalling information significantly aids memory due to
context-dependency effects.
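Both studies above use a factorial crossing of learning and recall conditions. Smith's nine groups, for instance, are just the 3 × 3 crossing of background sound at learning and at recall, which can be generated directly (a sketch; the group numbering is an assumption matching the table above):

```python
from itertools import product

# Smith (1985): 3 learning backgrounds x 3 recall backgrounds = 9 groups.
backgrounds = ["Quiet", "Mozart", "Jazz"]
conditions = list(product(backgrounds, repeat=2))

for group, (learning, recalling) in enumerate(conditions, start=1):
    note = "context cue matches" if learning == recalling else "context cue changed"
    print(f"Group {group}: learn with {learning}, recall with {recalling} ({note})")
```

Only three of the nine groups keep the same background at learning and recall, and cue-dependency theory predicts the least forgetting in exactly those three.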
STATE-DEPENDENCY FORGETTING
Lang et al. (2001)
This investigated the role of emotion as a state cue by inducing fear. 54 students who were fearful of snakes and
spiders had their fear induced again whilst learning a list of words. They found that when the fear was induced for
recall, the scared students were able to recall more learnt words than when they were in a relaxed state.
Experimental research seems to support anecdotal evidence that places, objects, smells and emotions can all be
triggers to aid recall, but without these cues present we are liable to experience difficulty remembering
Miles and Hardman (1998)
Miles and Hardman used aerobic exercise to produce a physiological state in order to test state-dependent recall.
24 undergraduates were required to learn a list of three syllable words whilst on an exercise bicycle. All participants
repeated four combinations of learning and recall whilst pedalling or resting. They concluded that aerobic exercise
did have a significant positive effect on recall when used as a state cue
EVALUATION OF CUE-DEPENDENCY THEORY
The theory is supported by much anecdotal evidence (personal experiences – most people have experienced the “Tip of
the Tongue Phenomenon” where you cannot quite recall what you know exists). There is also a great deal of
experimental evidence (provided by studies) which support the theory. A further strength is that the theory has
practical applications, which are related to cognition and improving memory and ability to recall information. Also, the
theory can be tested, unlike theories such as trace-decay theory. Experiments can test the importance of cues as they
are tangible and measurable, unlike memory traces.
However, one major weakness is that the tasks from all studies supporting the theory are artificial: most often learning
words lists. Also, it is only an explanation for forgetting from long-term memory, it does not include anything about the
short-term store. The theory may not be a complete explanation either, as it cannot explain why emotionally-charged
memories (such as those in posttraumatic stress disorder, or PTSD) can be really vivid even without a cue. It is also
hard to prove whether a memory has been revived by the cue or by the memory trace simply being activated, which
makes the theory difficult to falsify.
Godden and Baddeley (1975)
Aim: To investigate cue-dependency theory using divers in wet and dry recall conditions
Godden and Baddeley wanted to test cue-dependency theory by investigating the effect of environment on recall. This
was looking at context cues because it was to do with external environment, not the individual.
PROCEDURE
Divers were asked to learn words both on land and underwater. The words were then recalled both on land (dry)
and underwater (wet). This made four conditions:
“dry” learning and “dry” recall
“dry” learning and “wet” recall
“wet” learning and “dry” recall
“wet” learning and “wet” recall
There were 18 divers from a diving club, and the lists had 36 unrelated words of two or three syllables chosen at
random from a word book. The word lists were recorded on tape. There was equipment to play the word lists under
the water. There was also a practice session to teach the divers how to breathe properly with good timing, so as not
to cloud their hearing of the words being read out. Each list was read twice; the second reading was followed by
fifteen numbers which had to be written down by the divers, to clear the words from their short-term memory.
Each diver did all four conditions, making it a repeated measures design. There were 24 hours between each
condition. Every condition was carried out in the evening, at the end of a diving day. When on land, the divers still
had to wear their diving gear.
RESULTS
As predicted, words learned underwater were best recalled underwater, and words learned on land were best
recalled on land. The results are shown in the table below; the figures are the mean number of words remembered
in each condition:
                     Recall environment
Study environment    Dry       Wet
Dry                  13.5      8.6
Wet                  8.5       11.4
The mean numbers of words remembered for conditions with the same environment for learning and recall (13.5
out of 36 for dry/dry and 11.4 for wet/wet) were much higher than those with dissimilar locations. As the hypothesis
stated, more words were remembered when recall took place in the same environment as learning: this is the effect
of context-dependent cues.
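The context effect in the results can be checked with a few lines of arithmetic. The four means are the figures reported above; the dictionary layout and variable names are just an illustration.

```python
# Mean words recalled (out of 36) in Godden and Baddeley (1975),
# keyed by (study environment, recall environment).
means = {
    ("dry", "dry"): 13.5,
    ("dry", "wet"): 8.6,
    ("wet", "dry"): 8.5,
    ("wet", "wet"): 11.4,
}

matching = [v for (study, recall), v in means.items() if study == recall]
mismatching = [v for (study, recall), v in means.items() if study != recall]

# Average recall is markedly higher when the contexts match.
print(round(sum(matching) / len(matching), 2))        # 12.45
print(round(sum(mismatching) / len(mismatching), 2))  # 8.55
```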
CONCLUSIONS AND EVALUATION
Godden and Baddeley identified some problems with the study, including that the divers were volunteers on a
diving holiday, so the setting could not be controlled: each day's condition was carried out in a different place.
A further difficulty is that there could have been cheating underwater (because the researchers were unable to
observe the participants); however, the researchers concluded that no cheating took place, since cheating would
have consistently produced better results underwater, which, as the table shows, was not the case.
Also, when the location of learning and recalling the words was different, the divers had to move from one
situation to the other, whereas when the locations were the same, this did not happen: it is possible that this
disruption led to the poorer recall. Godden and Baddeley chose to investigate this factor further by running a
second study with two separate groups. There were 18 divers, who each did the disrupted and non-disrupted
conditions. The disrupted condition involved going into and out of the water between learning and recall even
when the environments were the same. The study produced results of 8.44 words for the non-disrupted condition
and 8.69 words for the disrupted condition. Because these numbers were so similar, it was concluded that
disruption did not cause the difference in results of the primary study.
Despite these weaknesses, the study did have strong controls, which makes it replicable, so reliability can be
tested. Also, even though the task was artificial, the participants were all divers who had experience with
performing tasks under the water, so the environment was not unfamiliar to them; therefore the experiment
retained a degree of ecological validity.
The theory of displacement affecting forgetting within the short-term memory
Displacement is based on the idea that the short-term memory has a limited capacity for information. Miller (1956)
argued that the short-term memory capacity is approximately 7±2 items of information. These can be “chunked”
together to increase capacity, but there is a fixed number of slots.
The red blocks below represent pieces of information within our short-term memory
If the short-term memory is full and new information is registered, then some information is going to be pushed out.
There are two options in this case: information can either be forgotten, or moved into the long-term memory where it is
encoded and stored. The information pushed out in either way is then overwritten with this new data. The key idea is
that information will be lost unless rehearsed enough to be moved into the long-term memory.
The diagram shows that as the new piece of information enters the short-term memory, the one on the end is forced
out: this will either go into the long-term memory (if rehearsed enough) or will be forgotten
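The push-out mechanism described above can be sketched with a fixed-capacity buffer. The capacity of seven follows Miller's 7±2; treating rehearsal as a simple flag that copies an item into long-term memory is a deliberate simplification for the sketch.

```python
from collections import deque

CAPACITY = 7                          # Miller (1956): about 7 +/- 2 items

short_term = deque(maxlen=CAPACITY)   # a full deque displaces its oldest item
long_term = set()

def attend(item, rehearsed=False):
    """Register a new item in short-term memory."""
    if rehearsed:
        long_term.add(item)           # rehearsal moves the item into LTM
    short_term.append(item)           # may push the oldest item out

for n in range(1, 10):                # nine items into a seven-slot store
    attend(n, rehearsed=(n <= 2))     # only the first two are rehearsed

print(list(short_term))               # [3, 4, 5, 6, 7, 8, 9]
print(sorted(long_term))              # [1, 2]
```

Items 1 and 2 survive only because they were rehearsed into long-term memory before being displaced; unrehearsed displaced items are simply lost, which is the theory's account of short-term forgetting.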
There is much evidence for the theory of displacement. The multi-store model of memory supports the theory with
primacy and recency effects. A primacy effect derives from information which is learnt first and so is quite
well-remembered, because that information has most likely been moved into the long-term memory. Recency effects,
meanwhile, come from information which is learnt last (most recently), and which will therefore still be in the
rehearsal loop of the short-term memory, and so is also remembered well.
Information which is taken in from a list displays a good example of these effects. The blocks below represent pieces of
information from a list of seven items which have been presented to a participant.
The ones on the left were the items at the top of the list, and the ones on the right the items at the bottom. When the
list is taken away and the participant is asked to recall as many items as they can, it is not uncommon for them to
remember only those highlighted green (primacy effect), as these were taken in first, and those in blue (recency
effect), as these will still be in the short-term memory. Those shown in red, from the middle, will be forgotten.
This is because, due to primacy and recency effects, information in the middle of the list is not so well remembered:
it has neither been processed into the long-term memory nor remains in the rehearsal loop, so it is forgotten.
WAUGH AND NORMAN (1965)
Waugh and Norman decided to test this idea. They read participants a list of sixteen digits, like the one below:
7 0 8 4 1 6 0 9 5 5 3 7 2 4 7 8
The participants were then given a probe digit and had to state the digit which followed it in the list. For example, if
the probe (the digit given to the participant) is 6, the correct recall is 0. However, between the probe and the final
digit (the second 8), there is a time gap in which more digits have been called out to the participant, making it
unlikely that they will remember the correct digit. Primacy and recency effects are displayed in this experiment:
7 0 8 4 1 6 0 9 5 5 3 7 2 4 7 8
The results of the study found what was expected: it was easier for participants to recall the digits following probes
from early in the list (primacy) and the most recent (recency). Those in the middle were forgotten, as the information
had been lost.
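The probe task itself is simple to express in code. This sketch uses the digit list above and the rule described (report the digit that followed the probe); probing the first occurrence of a repeated digit is an assumption of the sketch.

```python
# Waugh and Norman's probe-digit task: given a probe digit, report
# the digit that immediately followed it in the list.
digits = [7, 0, 8, 4, 1, 6, 0, 9, 5, 5, 3, 7, 2, 4, 7, 8]

def correct_response(probe):
    position = digits.index(probe)    # first occurrence of the probe
    return digits[position + 1]       # the digit that followed it

print(correct_response(6))   # 0, matching the example in the text
```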
Waugh and Norman tested to see whether it was indeed displacement, or decay, that was causing forgetting. They did
this by altering the experiment slightly: they ran it again with two variations. In one variation, the digits were read
slowly (one digit per second), and in the other, fast (four digits per second):
• Displacement theory suggests that information is lost because it is replaced as new information is taken in;
therefore displacement theory predicts that the speed of reading would not affect participants’ recall
• Decay theory suggests that information is lost as the memory trace fades over time; therefore decay theory
predicts that when the digits are all read out more quickly, recall would improve, as there is less time for the
information to decay from the short-term memory
They ran each of these conditions three times, placing the probe in a different place along the number line each time.
Both decay and displacement theories suggest that recall will improve as the probe moves closer to the final digit.
What Waugh and Norman found from these variations was a slight, but not significant, improvement in recall when
the digits were read out fast. This could suggest that the conclusions of the original experiment were wrong and that
decay was causing the forgetting, but because the difference was so small, this is unlikely. However, there was a clear
improvement in recall when the probe was closer to the end of the number line, which both theories predict: this
supports both theories.
EVALUATION OF THE DISPLACEMENT THEORY OF FORGETTING
• The theory has been tested by scientific experiments with good controls, which have shown cause-and-effect
relationships (for example, Waugh and Norman)
• These controls mean that the experiments can be replicated, so the findings appear to be reliable
• The theory fits with the multi-store model of memory and the working memory model, both of which are
supported by a great deal of evidence, and this helps to support the theory
• It is difficult to operationalise the theory and measure displacement: what is taken as displacement could simply
be decay (loss of information due to lack of rehearsal over time) or interference (when something already learned
interferes with current learning, or when something learned later gets in the way of what was learned previously)
• Tasks used in the experiments are artificial, so the findings may lack ecological validity (for example, the Waugh
and Norman studies were not real-life tasks)
Questions on Units 2.8 – 2.10 on Theories of Forgetting
1
(a) Brown and McNeill (1966) popularised the Tip of the Tongue Phenomenon.
(i)
Explain the Tip of the Tongue Phenomenon.
……………………………………………………………………………………………
……………………………………………………………………………………………
……………………………………………………………………………………………
(2 marks)
(ii) The phenomenon fits into the cue-dependency theory of forgetting (Tulving, 1975).
Explain this theory of forgetting, using psychological terms.
……………………………………………………………………………………………
……………………………………………………………………………………………
……………………………………………………………………………………………
……………………………………………………………………………………………
……………………………………………………………………………………………
……………………………………………………………………………………………
……………………………………………………………………………………………
……………………………………………………………………………………………
……………………………………………………………………………………………
……………………………………………………………………………………………
……………………………………………………………………………………………
……………………………………………………………………………………………
(6 marks)
(b) Explain the two different types of cue-dependent forgetting.
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
(3 marks)
(c) Below are four studies which support the cue-dependency theory of forgetting. Choose one study
from the list, and outline the study.
Smith (1985)
Baker et al. (2004)
Lang et al. (2001)
Miles and Hardman (1998)
Name of study: …………………………………………………………………………………
Outline of study: ………………………………………………………………………………..
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
(5 marks)
Total: 16 marks
2
Godden and Baddeley (1975) wanted to test the cue-dependency theory. They used divers to recall
word lists underwater and on land to test context-dependent cues.
(a) In the table below, write either T (true) or F (false) in each empty box.
True or false
The study used an independent groups design
There were four conditions used in the study
The study took place in the same place every day
The study took place at the same time every day
There were 24 hours between each condition
(5 marks)
(b) Complete the table below to show the results of Godden and Baddeley’s main study.
                       Recall environment
                       (mean number of words remembered out of 36)
Study environment      Dry        Wet
Dry
Wet                    8.5
(3 marks)
(c) Godden and Baddeley noticed some problems with their initial study.
(i)
State and explain one issue they found with their study.
……………………………………………………………………………………………
……………………………………………………………………………………………
……………………………………………………………………………………………
(2 marks)
(ii) Explain how they overcame this problem.
……………………………………………………………………………………………
……………………………………………………………………………………………
……………………………………………………………………………………………
(2 marks)
Total: 12 marks
3
The displacement theory of forgetting suggests that the short-term memory has a limited capacity.
(a) How many “chunks” of information did Miller (1956) suggest it could hold?
…………………………………………………………………………………………………
(1 mark)
(b) What happens when the capacity of the short-term memory is full, and new information is taken in,
according to the theory?
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
…………………………………………………………………………………………………
(5 marks)
(c) Waugh and Norman (1965) tested the theory of displacement, to see if it was displacement or
decay which causes forgetting in the short-term memory.
Their original study (to test only displacement theory) used participants who had to remember a
list of digits and report the digit which followed a probe digit read out by the researcher.
(i)
Explain how they decided to test which of the two theories caused forgetting.
……………………………………………………………………………………………
……………………………………………………………………………………………
……………………………………………………………………………………………
……………………………………………………………………………………………
……………………………………………………………………………………………
(3 marks)
(ii) State the results of this experiment, and explain what they meant.
……………………………………………………………………………………………
……………………………………………………………………………………………
……………………………………………………………………………………………
……………………………………………………………………………………………
……………………………………………………………………………………………
(3 marks)
Total: 12 marks
You have to study at least one key issue for each approach to psychology
This is an important issue because of the number of cases where people are found guilty of crimes with no other
evidence except for eyewitness testimonies.
An eyewitness is a witness to a crime who must give their account of the event, and possibly identify the criminal
in an identity parade or appear in court. This can lead to a conviction, so if the eyewitness testimony is wrong,
someone may be convicted of a crime they did not commit.
Elizabeth Loftus is a leading expert in the area and has done a lot of research into the reliability of eyewitness
testimonies. She has identified many factors that affect reliability. For example, eyewitnesses can be swayed by
identity parades (likely because they want to help, so feel they must answer, or because they assume the criminal
has to be in the line-up). They will be looking for the nearest match to the person they saw, not necessarily the
actual person, and this can lead to wrongful convictions. For example, if the eyewitness saw a black person commit
a crime, and the line-up consisted of five white men and one black man, the black man may be chosen as guilty.
A real-life example is that of Bobby Joe Leaster. In 1970, he was picked up by the police for murdering a shop-owner
and was taken to the shop, where the deceased’s wife identified him as the murderer from the back of the police
car. He was sentenced to life in prison. In 1986, after Leaster had spent sixteen years in prison, the bullet that killed
the victim was matched to a gun linked to two robbers active at the time of the murder, and Leaster was released,
having served sixteen years for a crime he did not commit. An analysis of 69 cases of wrongful conviction found
that 29 of them (42%) were due to mistaken identification by eyewitnesses.
Application of concepts and ideas:
• Bartlett (1932) discussed the idea that memory is not like a tape recorder, and showed how schemata affect
remembering
• Loftus and Ketcham (1991) found that innocent individuals were wrongly convicted 45% of the time by
eyewitness testimonies in the police cases they studied
• Chartier (1977) suggests that an eyewitness’ memory of events will be vague, so they will try to fill in the gaps
to make more sense of them, which fits the theory of memory being reconstructive
The term flashbulb memory refers to a memory of an event so powerful that it is as though the memory is a
photograph which the person can relive in detail long after the event has taken place. Researchers use the phrase
“now print” to describe these memories, because it is as if the whole episode was a snapshot, imprinted in memory
as such.
Flashbulb memories are vivid and can potentially last for the person’s entire lifetime. They tend to be of events
which bear emotional significance: for example, the death of Princess Diana or 9/11 would be remembered as
flashbulb memories.
Researchers would like to explain how flashbulb memories are stored, partly out of interest, but also partly to see
whether this helps to explain how we remember in general. The key issue is what leads to “flashbulb” memories and
how they can be explained.
Application of concepts and ideas:
• Brown and Kulik (1977) described the idea of flashbulb memory and pointed out that such memories are
special and long-lasting; they also found that 75% of black people who were asked about the assassination of
Martin Luther King could recall it, compared to only 33% of white people
• Colgrove (1899) found that most people could remember precisely what they were doing when they heard
Lincoln had been assassinated
2.1 Cognitive Psychology
The cognitive approach is all about how we take in information from our world, organise it, and use it to help us function
successfully. It is concerned with the internal operation of the mind, and seeks to understand the role of mental processes
in determining human behaviour. Psychologists argue that cognition is involved in everything human beings do.
There are five key terms in the approach:
• information processing: the input, manipulation and output of information
• memory: the ability to retain and reproduce mental impressions; it involves encoding, storing and retrieving information
• forgetting: the loss of, or inability to retrieve, information from memory
• storing: the way in which information is retained within the brain after it has been registered
• retrieving: the act of locating and extracting stored information from the brain
There are two key assumptions to the cognitive approach; the first one is information processing (above). The focus of the
approach is on information processing and how this might affect behaviour. The flow of information is described as:
INPUT → PROCESS → OUTPUT
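The flow can be shown as a minimal pipeline; the stimulus and the particular "process" step are invented purely to illustrate the input → process → output shape.

```python
# INPUT -> PROCESS -> OUTPUT, the information-processing assumption.
# Every name and operation here is illustrative.

def take_input(stimulus):
    return stimulus.lower()           # input: the stimulus is registered

def process(information):
    return information[::-1]          # process: some mental manipulation

def output(result):
    return f"response: {result}"      # output: behaviour is produced

print(output(process(take_input("HELLO"))))   # response: olleh
```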
2.2 The Computer Analogy
The second key assumption is the computer analogy, which assumes the brain functions similarly to a computer.
The flow of information for a computer
The flow of information in the human brain
However, there are some limitations to the assumption. There are a number of ways in which our brains differ from a PC:
• A computer receives all input, whereas the brain only pays attention to a very small amount of the information input
• A computer cannot lose information (unless data becomes corrupt or damaged), whereas the brain can easily misplace
information and experience difficulty recalling it
• You can choose to delete certain information from a computer permanently, but you cannot deliberately push
something unpleasant from your mind
• A computer is emotionless, whereas emotions have a strong impact on the way our minds function
• A computer only knows as much as the information which has been input, whereas the brain can try to piece together
memories and fill in the gaps
2.3 The Multi-Store Model of Memory
The multi-store model is based upon the linear idea of information processing. Its researchers, Atkinson and Shiffrin
(1968), chose to investigate the capacity of each store, the duration of storage, and the method of representation:

Sensory register (information comes in) → Short-term memory (information is rehearsed or lost) → Long-term memory
(information is stored as it arrives from short-term memory)
Sensory register: this lasts up to around 2 seconds. Information is taken in by our senses; if it is not attended to, it will
be permanently lost
Short-term memory: this lasts only temporarily, and it is common to rehearse the information. For example, if you are
looking up a phone number, you will repeat “01294…” to yourself several times as you walk to the phone to dial it. This
type of memory is mainly auditory and has a limited capacity
Long-term memory: this can last for years and supposedly has unlimited capacity. It is mainly encoded in terms of
meaning (semantically encoded memory)
• There have been many lab experiments which support the model, such as Glanzer and Cunitz (1966), because the
primacy and recency effects are explained by it
• Case studies, such as that of Clive Wearing, point to an area of the brain (the hippocampus) which, when damaged,
prevents new memories from being laid down – this provides physiological support
• Even though case studies like that of Clive Wearing have suggested an area of the brain for short-term memory,
another case study (Shallice and Warrington, 1970) showed that the victim of a motorbike accident was able to add
long-term memories even though his short-term memory was damaged. This goes against the multi-store model
• The experiments that give evidence for the model use artificial tasks, which means that the results might not be valid
2.4 Levels of Processing Framework
This model was put forward by Craik and Lockhart (1972) as an improvement on the multi-store model. They suggested that
memory actually depends on the depth of processing, not on being in different stores. Their levels of processing framework
suggests that information is more readily transferred into the long-term memory (LTM) if it is processed semantically (deep
processing, involving considering, understanding and relating to past memories to gain meaning). If information is merely
repeated, they said, it is less likely to go into the LTM. Craik and Lockhart suggested three levels of processing:
• shallow processing: structural processing – when remembering words, looking at what they look like
• intermediate processing: phonemic (or phonetic) processing – looking at the sound of the word
• deep processing: semantic processing – considering the meaning of the word
The table below outlines a summary of their framework:

Feature                        Explanation
Memory trace                   Comes with depth of processing or degree of elaboration: no depth of processing
                               means no memory trace
Deeper analysis                Leaves a more persistent memory trace
Rehearsal in primary memory    Holds information but leaves no memory trace
When attention is diverted     Information is lost at a rate that depends on the level of analysis
• There is much evidence for the framework, including many studies (see that of Craik and Tulving below)
• It links research into memory with research into perception and selective attention, and focuses on information
processing as a whole process; this makes it a stronger explanation than the multi-store model, because more
studies can be explained by it
• It is unclear whether it is really the depth of processing which affects the strength of the memory trace: it may be
time spent processing, since deeper processing also involves spending more time processing
• There may be more effort involved in deeper processing, which means that the greater effort may be what
produces better recall (better memory)
2.5 Craik and Tulving (1975)
KEY STUDY
Aim: To test the levels of processing framework by looking at trace durability
The levels of processing framework suggests that material which has been processed deeply (semantically) will be recalled the
best. Craik and Tulving tested this in 1975 by looking at trace durability (how long the trace lasts) and how it is affected by the
depth of processing. When the memory trace has gone, forgetting occurs. Participants remembered material which had been processed at each of the different levels, to see how the level of processing affected their recall performance
PROCEDURE
1 The participants were put into situations where they used different depths of processing:
- shallow processing involved asking questions about the words themselves (structural processing)
- intermediate processing involved questions about rhyming words (phonemic processing)
- deep processing involved whether a word fit into a particular semantic category (semantic processing)
2 All ten experiments used the same basic procedure. Participants were tested individually, and were told that the
experiments were about perception and reaction time. A tachistoscope was used, which flashed words onto a screen
3 Different words were shown, one at a time, for 0.2 seconds. Before the word was shown, participants were asked a
question about the word, which would lead to different levels of processing, from the list above
4 They gave a “yes” response with one hand and a “no” response with the other. The questions were designed so that half would be answered “yes” and half “no”
5 After all the words had been shown, the participants were given an unexpected recognition test
In Experiment 1, structural, phonemic and semantic processing was measured, as well as whether or not a particular word
was present. Words were presented at 2-second intervals over the tachistoscope. There were 40 words and 10 conditions.
Five questions were asked: Do the words rhyme? Is the word in capitals? Does the word fit into this category? Does the
word fit into this sentence? Is there a word present or not? Each question had “yes” and “no” responses, making ten
conditions overall
FINDINGS and CONCLUSIONS
Proportion of words recognised correctly, by level of processing from least deep (1) to deepest (5):

Response   1 Is there   2 Is the word   3 Does the    4 Does the word      5 Does the word
Type       a word?      in capitals?    word rhyme?   fit this category?   fit this sentence?
Yes        0.22         0.18            0.78          0.93                 0.96
No         N/A          0.14            0.36          0.63                 0.83
Deeper encoding (when the participants had to consider whether a word fitted into a particular category or sentence) took longer and gave higher levels of performance. Questions answered “Yes” also produced higher recognition rates than those answered “No”. Interestingly, “Yes” and “No” answers took the same amount of processing time, but “Yes” answers led to better recognition rates
It was concluded that the enhanced performance was because of qualitatively different processing, not just because of
extra time studying. Craik and Tulving say “manipulation of levels of processing at the time of input is an extremely
powerful determinant of retention of word events”
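The “Yes” advantage in these figures can be shown with a quick average (a minimal Python sketch of the recognition proportions reported above; the question labels are shorthand, not the study’s wording):

```python
# Recognition proportions from Craik and Tulving (1975), Experiment 1,
# keyed by (question, response). The N/A cell (no word shown) is omitted.
recognition = {
    ("word present?", "yes"): 0.22,
    ("capitals?", "yes"): 0.18, ("capitals?", "no"): 0.14,
    ("rhyme?", "yes"): 0.78,    ("rhyme?", "no"): 0.36,
    ("category?", "yes"): 0.93, ("category?", "no"): 0.63,
    ("sentence?", "yes"): 0.96, ("sentence?", "no"): 0.83,
}

def mean_for(response):
    """Mean recognition proportion across all questions for one response type."""
    vals = [v for (q, r), v in recognition.items() if r == response]
    return round(sum(vals) / len(vals), 2)

print(mean_for("yes"))  # 0.61 - "Yes" responses were recognised better overall
print(mean_for("no"))   # 0.49
```

The deeper questions (category and sentence) dominate both averages, matching the study’s conclusion that depth of processing, not response type alone, drives retention.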
EVALUATION
 The experiments were designed carefully, with clear controls and operationalisation of variables. The study can therefore be replicated and the findings are likely to be reliable
 The framework is clear, and the study takes its ideas and tests them directly, feeding the results back into the framework
 One weakness is how to test “depth” – it can be very vague – it could be effort or time spent processing which affected the recall performance
 The tasks are artificial: they involve processing words in artificial ways and then trying to recognise them. This is not something that would be done in real life, so the study could be said to lack validity
2.6 Working Memory Model
Baddeley and Hitch (1974) used the multi-store model of memory as the basis for the working memory model. They were dissatisfied with the multi-store model, but kept its ideas of a short-term memory and a long-term store. Their model is an improvement on the multi-store model’s account of the short-term memory
The original model is shown here. The central executive controls the other components of the working memory, and combines information from those sources into one episode

[Diagram: the central executive, connected to the articulatory loop (inner voice), the primary acoustic store (inner ear) and the visuo-spatial scratchpad (inner eye)]

The phonological loop consists of the articulatory loop (or inner voice) and the primary acoustic store (or inner ear). The inner ear receives auditory memory traces, which decay very rapidly. The inner voice revives these traces by rehearsing them

The visuospatial scratchpad manipulates spatial information, such as shapes, colours and the positioning of objects. It is divided into two parts: the visual cache and the inner scribe. The cache stores information about form and colour; the scribe deals with spatial information and movement, and also rehearses the information held in the scratchpad to be transferred to the executive
The model treats the phonological loop and the visuospatial scratchpad as two separate elements because it is difficult to perform two similar tasks simultaneously and successfully. For example, you cannot perform two visual tasks together well, nor two auditory tasks, but you can do one visual and one auditory task together
Further evidence supports the idea of the two systems being separate, for example patients with agnosia. Agnosia causes a loss of the ability to recognise objects (implicating the visual cache), persons, sounds, shapes and even smells, and is often associated with brain injury or neurological illness. Sufferers are unable to recognise an object they are presented with, but can still copy a drawing of that object (for example, if presented with a toy car, they cannot name it as a “car” but can look at it and draw it). This suggests that the spatial component remains intact
De Groot (2006) looked at expert chess players, who were no better at recalling where chess pieces had been randomly placed
on the chess board than non-players. However, when the pieces were placed in their correct positions, the chess players had a
(predictably) outstanding memory of where they should be. This supports the idea of the long-term store being used to help
interpret information in the working memory (short-term).
 The model is an expansion of the multi-store model; it shows why some dual tasks differ, and why you cannot successfully undertake two visual or two verbal tasks simultaneously
 There is much research supporting the model, including psychological lab experiments and neurophysiological research, such as brain scans showing differences in brain activity
 Patients with agnosia support the model’s separation of visuospatial components
 Because the episodic buffer was added 26 years after the original model was published, the original model was arguably incomplete, so the model may not serve as a full explanation of the working memory
 The model doesn’t account for all senses (it relies only on sound and sight), and much of the lab support for the model uses artificial tasks which lack validity: because the tasks are not true-to-life, you cannot rule out that other senses would be involved in real life
2.7 Reconstructive Memory
The key idea on which Bartlett based this theory is that memory is not like a tape recorder. Bartlett, and many other psychologists, have suggested that a memory is not perfectly formed, perfectly encoded and perfectly retrieved

Bartlett started from the idea that the individual’s past and current experiences affect how an event is remembered. He noted that there would be input, which is the perception of the event. This is followed by processing, which includes the perception and interpretation of the event; this involves previous experiences and schemata (ideas or scripts about the world, for example an “attending a lesson” or “going to the cinema” script, which set up certain expectations of the event and outline rules of what to do)
War of the Ghosts
The origins of Bartlett’s theory came from the game of Chinese whispers. He constructed his own experiment based around the idea of the game, using a Native American folk story called War of the Ghosts. He chose this story because it was very unfamiliar to his participants, being in a different style and from a different culture, and therefore not slotting into their usual schemata. First, Bartlett would read the participants the story, and then ask them to repeat it back to him, which prompted several different accounts. On several further occasions Bartlett met with the participants to hear what they could remember of the folk tale. They recalled less and less each time, so the story became shorter. However, it tended to make more sense than the original story, which to them had made no sense whatsoever
After about six recall sessions, the participants’ average stories had shortened from 330 words to 180 words. Bartlett noticed
that people had rationalised the story in parts that made no sense to them, and filled in their memories so that what they
were recalling seemed sensible to them
Rationalisation: altering something so it makes sense to you
Confabulation: making up certain parts to fill in a memory so it makes sense
 The theory is backed by much support, including Bartlett’s War of the Ghosts Chinese-whispers-style experiment, as well as the work of Elizabeth Loftus, who has studied the unreliability of eyewitness testimonies
 It can be tested by the experimental method because the independent variable can be operationalised and measured: a story can have features that can be counted each time it is recalled and the changes recorded, so up to a point the theory can be scientifically tested
 The study used War of the Ghosts, which made no sense to the participants, so it might be argued that they altered the story to make it make sense simply because they were being asked to retell it
 There could also have been demand characteristics, where participants anticipate the intended answer and try to give it: this would make the findings unreliable
 It does not explain how memory is reconstructive: this is a theory of description, not an explanation
2.8 Cue-Dependent Theory of Forgetting
Tulving (1975) proposed this theory of forgetting for the long-term memory. He suggests that memory is dependent upon the right cues being available; forgetting occurs when they are not. Two things are required for recall: a memory trace (information stored as a result of the original perception of the event), and a retrieval cue (information present in the individual’s cognitive environment at the time of retrieval that matches information encoded at the time of learning)
Everyone has experienced the Tip of the Tongue Phenomenon (proposed by Brown and McNeill, 1966). This refers to knowing
a memory exists, but not having the right cues to access it. This is an example of cue-dependent forgetting
Retrieval cues have been separated into two groups: context cues (the situation or context) and state cues (the individual’s state or mood at the time). Below is an example study for each
Baker et al. (2004)
This study looked at whether chewing gum when learning and recalling material produces a context effect. 83 students aged 18-46 took part, being randomly assigned to one of four conditions. In all conditions they were given two minutes to learn fifteen words, and were asked to recall the words immediately and 24 hours later. The conditions were:
 gum-gum (chew gum when learning and recalling)
 gum-no gum (chew gum when learning but not recalling)
 no gum-gum (don’t chew gum when learning, do when recalling)
 no gum-no gum (don’t chew gum when learning or recalling)
In the two conditions where the gum was present or absent at both learning and recall, more words were recalled than when the gum was present at only learning or only recall. This suggests that chewing gum at both learning and recall significantly aids memory due to context-dependency effects

Lang et al. (2001)
This investigated the role of emotion as a state cue by inducing fear. 54 students who were fearful of snakes and spiders had their fear induced whilst learning a list of words. The researchers found that when the fear was induced again at recall, the scared students recalled more of the learnt words than when they were in a relaxed state. Experimental research thus seems to support anecdotal evidence that places, objects, smells and emotions can all be triggers to aid recall; without these cues present, we are liable to have difficulty remembering
The theory is supported by much anecdotal evidence (personal experience – most people have experienced the Tip of the Tongue Phenomenon, where you cannot quite recall something you know you have stored). There is also a great deal of experimental evidence which supports the theory. A further strength is that the theory has practical applications relating to improving memory and the ability to recall information. Also, the theory can be tested, unlike theories such as trace-decay theory
However, one major weakness is that the tasks in all the studies supporting the theory are artificial: most often learning word lists. Also, it is only an explanation of forgetting from long-term memory; it says nothing about the short-term store. The theory may not be a complete explanation either, as it cannot explain why emotionally charged memories can be really vivid even without a cue (as in posttraumatic stress disorder, or PTSD). It is also hard to prove whether a memory has been revived by the cue or by the memory trace simply being activated, which makes the theory hard to refute
2.9 Godden and Baddeley (1975)
KEY STUDY
Aim: To investigate cue-dependency theory using divers in wet and dry recall conditions
PROCEDURE
Divers were asked to learn words both on land and underwater. The words were then recalled both on land (dry) and underwater (wet). This made four conditions: “dry” learning and “dry” recall; “dry” learning and “wet” recall; “wet” learning and “dry” recall; and “wet” learning and “wet” recall

There were 18 divers from a diving club, and the lists had 36 unrelated words of two or three syllables chosen at random from a word book. The word lists were recorded on tape, and there was equipment to play them under the water. There was also a practice session to teach the divers how to breathe with good timing, so as not to cloud their hearing of the words being read out. Each list was read twice; the second reading was followed by fifteen numbers which had to be written down by the divers, to remove the words from their short-term memory. Each diver did all four conditions, with 24 hours between each. When on land, the divers still had to wear their diving gear

FINDINGS and CONCLUSIONS
As predicted, words learned underwater were best recalled underwater, and words learned on land were best recalled on land. The table below shows the mean number of words remembered in each condition:

                       Recall environment
Study environment      Dry       Wet
Dry                    13.5      8.6
Wet                    8.5       11.4

The mean numbers of words remembered in the conditions with the same environment for learning and recall (13.5 out of 36 for dry/dry and 11.4 for wet/wet) were much higher than in those with dissimilar locations
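The context-dependency effect can be seen by averaging the matching and mismatching conditions (a minimal Python sketch using the means reported above):

```python
# Mean words recalled (out of 36) in Godden and Baddeley (1975),
# keyed by (study environment, recall environment).
recall = {
    ("dry", "dry"): 13.5, ("dry", "wet"): 8.6,
    ("wet", "dry"): 8.5,  ("wet", "wet"): 11.4,
}

# Average recall when the environments match vs when they differ
match = sum(v for (s, r), v in recall.items() if s == r) / 2
mismatch = sum(v for (s, r), v in recall.items() if s != r) / 2

print(round(match, 2))     # 12.45 - same environment at learning and recall
print(round(mismatch, 2))  # 8.55  - different environments
```

The roughly four-word advantage for matching conditions is the cue-dependency effect the study set out to demonstrate.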
EVALUATION
 There were strong controls, which makes the study replicable, so its findings are likely to be reliable
 Even though the tasks were artificial, all of the participants were divers with experience of performing tasks under the water, so the environment was not unfamiliar to them – this gives the study a degree of ecological validity
 The divers were all volunteers on a diving holiday, so the setting was not controlled: it changed location each day
 There could have been cheating underwater, as the researchers could not observe the participants (although it was assumed cheating did not happen, as it would have produced higher recall underwater, which was not found)
 There was a longer gap between study and recall when the conditions differed, because the divers had to get in or out of the water to swap – this could have contributed to the lower recall in those conditions
2.10 Displacement Theory of Forgetting
Displacement is based on the idea that the short-term memory has a limited capacity for information. Miller (1956) argued
that the short-term memory capacity is approximately 7±2 items of information. These can be “chunked” together to increase
capacity, but there is a fixed number of slots
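Chunking can be illustrated with a short sketch (a hypothetical example, not from Miller’s paper): twelve digits exceed the 7±2 limit as individual items, but grouped into four-digit chunks they occupy only three slots:

```python
def chunk(digits, size):
    """Group a digit string into fixed-size chunks, illustrating Miller's
    idea that chunking increases effective short-term memory capacity."""
    return [digits[i:i + size] for i in range(0, len(digits), size)]

# 12 separate digits would overload a 7+/-2 store; 3 chunks fit easily
print(chunk("194519692001", 4))  # ['1945', '1969', '2001']
```

Recoding the digits as three familiar years is exactly the kind of chunking that lets the same fixed number of slots hold more raw information.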
If the short-term memory is full and new information is registered, then some information is going to be pushed out. There are
two options in this case: information can either be forgotten, or moved into the long-term memory where it is encoded and
stored. The information pushed out in either way is then overwritten with this new data. The key idea is that information will
be lost unless rehearsed enough to be moved into the long-term memory
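A toy model of this pushing-out process (an illustrative sketch, not a cognitive simulation) is a fixed-size buffer that discards its oldest item when full:

```python
from collections import deque

# Short-term store modelled as 7 slots: when an 8th item arrives,
# the oldest item ("A") is displaced, i.e. forgotten unless it had
# already been rehearsed into the long-term memory.
stm = deque(maxlen=7)
for item in ["A", "B", "C", "D", "E", "F", "G", "H"]:
    stm.append(item)

print(list(stm))  # ['B', 'C', 'D', 'E', 'F', 'G', 'H'] - "A" has been displaced
```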
There is much evidence for the theory of displacement. The multi-store model of memory supports the theory with primacy and recency effects. A primacy effect derives from information which is learnt first, and so is quite well remembered, because that information has most likely been moved into the long-term memory. Recency effects come from information which is learnt last (most recently), which will still be in the rehearsal loop of the short-term memory, and so is also remembered well

For example, if a participant studies a list of words and the list is then taken away, it is not uncommon for them to recall only the items from the top of the list (primacy effect), as these were taken in first, and the items from the bottom (recency effect), as these will still be in the short-term memory. Items from the middle of the list tend to be forgotten, because they have neither been processed into the long-term memory nor remain in the rehearsal loop
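The predicted serial-position pattern can be sketched as a toy function (hypothetical, for illustration only): only the first few items (primacy) and last few items (recency) of a studied list are recalled, and the middle is lost:

```python
def predicted_recall(items, primacy=3, recency=3):
    """Toy serial-position sketch: items from the start of the list
    (primacy) and the end (recency) are recalled; the middle is lost."""
    return items[:primacy] + items[-recency:]

words = ["cat", "pen", "sun", "map", "cup", "dog", "hat", "box", "key"]
print(predicted_recall(words))  # ['cat', 'pen', 'sun', 'hat', 'box', 'key']
```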
Waugh and Norman decided to test this idea. They read participants a list of sixteen digits. The participants were then given a probe digit and had to state the digit which followed it in the list. For example, with the list below, if the probe is 6, the correct recall is 0. However, between the probe digit’s appearance in the list and the final digit (the second 8), time passes and more digits are called out, making it unlikely that the participant will remember the answer. Primacy and recency effects were displayed in this experiment:

7 0 8 4 1 6 0 9 5 5 3 7 2 4 7 8

The results of the study found what was expected: it was easier for participants to recall the digits which followed probes from early in the list (primacy) and from near the end (recency). Those in the middle were forgotten, as the information had been lost
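The probe task itself can be sketched in a few lines (an illustrative sketch; for simplicity it assumes the probe’s first occurrence in the list is the relevant one):

```python
def probe_recall(digits, probe):
    """Waugh and Norman's probe task: given a probe digit, return the
    digit that followed its (first) occurrence in the list."""
    i = digits.index(probe)
    return digits[i + 1]

trial = [7, 0, 8, 4, 1, 6, 0, 9, 5, 5, 3, 7, 2, 4, 7, 8]
print(probe_recall(trial, 6))  # 0 - the digit that followed the probe 6
```

A participant must hold the whole sequence in the short-term store to answer correctly, which is why probes from the displaced middle of the list fail.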
Waugh and Norman tested to see if it was indeed displacement, or decay that was causing forgetting. They did this by altering
the experiment slightly. They did it again, this time with two variations. In one variation, the numbers were read slowly (one
digit per second), and the other variation fast (four digits per second):
 Displacement theory suggests that information is lost because it is replaced as new information is taken in; it would therefore predict that the speed of reading does not affect participants’ recall
 Decay theory suggests that information is lost as the memory trace fades over time; it would therefore predict that recall improves when the digits are read out more quickly, as there is less time for the information to decay from the short-term memory
They ran each of these conditions three times, placing the probe in a different place along the number line each time. Both
decay and displacement theories suggest that recall will improve as the probe moves closer to the final digit
What Waugh and Norman found from these variations was that there was only a slight improvement in recall when the digits were read out fast. A large improvement would have suggested that the original conclusions were wrong and that decay was causing the forgetting; because the difference was so small, decay is unlikely to be the main cause. However, there was a clear improvement in recall when the probe was closer to the end of the number line, which both theories predict, so this finding supports both theories
 The theory has been tested by scientific experiments, leading to cause-and-effect conclusions
 The experiments have strong controls, so they are replicable and their findings are likely to be reliable
 The theory fits well with both the multi-store model and the working memory model, both of which are supported by their own evidence
 It is difficult to operationalise the theory and measure displacement (what appears to be displacement might actually be decay)
 Tasks used in the experiments to test the theory are artificial and not everyday tasks, therefore they lack validity