PSY 369: Psycholinguistics
Review for Exam 1 (1 week from today)
Covers Chapters 1, 2, 3, 5 and lectures from weeks 1, 2, 3, 4

Week 1 - What is language?
- Language compared to communication
- Do animals use language?
- What is psycholinguistics?

Week 1 & Chapter 1 terms and concepts: Cognitive science, linguistics, semantics, syntax, phonology, pragmatics, Wilhelm Wundt, behaviorism, B. F. Skinner, Roger Brown, Noam Chomsky, associative chain theory, rationalism, empiricism, tacit knowledge, explicit knowledge, communication, Charles Hockett's features of language, animals and language, language features

Week 2 - Crash course in linguistics
- Different levels of analysis
- The parts and the rules

Week 2 & Chapter 2 terms and concepts: Word order, phonology, duality of patterning, phones & phonemes, minimal pairs, distinctive features, categorical perception, morphology, free and bound morphemes, derivational & inflectional rules, Wug test, syntax & grammar, linguistic productivity, phrase structure rules, Noam Chomsky, recursive rules, arbitrariness, observational adequacy, descriptive adequacy, explanatory adequacy, syntactic ambiguity, deep structure, surface structure, transformational rules, psychological reality of grammar, derivational theory of complexity, centrality of syntax, semantics, sense and reference, pragmatics, lexical semantics, compositional semantics, allomorphs

Week 3 - Crash course in cognitive psychology
- Mental structures and processes

Week 3 & Chapter 3 terms and concepts: Atkinson & Shiffrin model, sensory stores, short-term memory, working memory, long-term memory, declarative memory, procedural memory, attention, top-down processing, bottom-up processing, automatic processing, controlled processing, semantic memory, episodic memory, serial processing, parallel processing, modularity, cognitive psychology, the mind-as-a-computer analogy, George Sperling, George Miller, chunking, limited resources theory, feature integration theory, working memory capacity, Ebbinghaus, Bahrick, dual task procedure, visual search experiments

Week 4 - Representing language
- What properties of language are mentally represented, and how?

Week 4 & Chapter 5 terms and concepts: Internal (mental) lexicon, lexical access, tip-of-the-tongue, syntactic category, inflectional morphemes, derivational morphemes, sense and reference, synonym, hyponymy & hypernymy, semantic network, hierarchical network, Collins and Quillian model, lexical decision task, Collins and Loftus model, word frequency, spreading activation models, semantic priming, intersection search, lexical ambiguity, semantic verification, lexical primitives, cognitive economy, lexical organization, typicality effect, speech errors, Forster search model, prior context effects, Morton logogen model, Marslen-Wilson cohort model, recognition point

Psycholinguistics: a brief history
(Timeline figure, 1900-2000.) Multidisciplinary origins:
- philosophy (e.g., theories of meaning)
- physiology (e.g., brain trauma effects on language)
- linguistics (e.g., historical vs. descriptive approaches, Noam Chomsky)
- psychology (e.g., behaviorist vs. cognitive approaches)
- computer science (e.g., artificial intelligence)

What is communication?
Any means by which two (or more) individuals exchange information.
- Paralinguistic techniques - non-verbal communication: hand signals, facial expressions, body language, nods, smiles, winks, etc.; also tone of voice, tempo, volume, etc.
- Non-linguistic communication - communication that does involve vocalization but is not language: grunts, groans, snorts, sighs, whimpers, etc.
Not all produced sounds are intended to convey messages, so they aren't communication (e.g., snoring).

Features of Language (Hockett, 1960)
Arbitrariness, displacement, productivity, discreteness, semanticity, duality of patterning.
(Slide illustrations: "dog," "labrador," "perro," "hund" as different labels for the same kind of animal; "Last week my dog escaped the backyard and dug in the neighbor's garden" as displacement; "dog" linked to a meaning - four-legged animal, common pet, fur, chases cats, barks, etc.; and duality of patterning - the word/morpheme "dog" carries meaning, while its phonemes /d/ /o/ /g/ individually do not.)
Hockett (1960) is available for download in the 'optional readings' on Blackboard.

Animals and language?
(Table from lecture: parrot, dog, bird song, and the bee dance compared with human language on Hockett's features - arbitrariness, displacement, productivity, discreteness, semanticity, duality of patterning - with question marks for the features each animal system may or may not show.)

What is language? Some generally agreed upon conclusions
- Symbolic: elements are used to represent something other than themselves
- Voluntary: language use is under our individual control
- Systematic: there is hierarchical structure that organizes linguistic elements
Modalities: spoken, written, signed (sign language). Assumed primacy of speech - it came first.

Week 2 - Crash course in linguistics
- Different levels of analysis
- The parts and the rules

Week 2 & Chapter 2 terms and concepts: Word order, phonology, duality of patterning, phones & phonemes, minimal pairs, distinctive features, categorical perception, morphology, free and bound morphemes, derivational & inflectional rules, Wug test, syntax & grammar, linguistic productivity, phrase structure rules, Noam Chomsky, recursive rules, arbitrariness, observational adequacy, descriptive adequacy, explanatory adequacy, syntactic ambiguity, deep structure, surface structure, transformational rules, psychological reality of grammar, derivational theory of complexity, centrality of syntax, semantics, sense and reference, pragmatics, lexical semantics, compositional semantics, allomorphs

Levels of analysis
(Diagram: language divides into its medium of transmission (phonetics), structure/grammar (phonology, morphology, syntax), meaning/semantics (lexicon), and use/pragmatics (discourse).)

Phonology
The sounds of a language.
- Phonemes - abstract (mental) representations of the sound units in a language. Minimal pairs: pie, buy, tie, die, sigh, lie, my, guy, why, shy. Articulatory features.
- Allophones - different sounds that get categorized as the same phoneme
- Phones - a general term for the sounds used in languages
Rules about how to put the sounds together: 'spill' vs. 'pill'
Rule: if /p/ is used in word-initial position, add aspiration (a puff of air); if word-internal, don't aspirate.

Morphology
Morpheme - the smallest unit that conveys meaning. Productivity.
- Free morphemes: can stand alone as words
- Bound morphemes: cannot stand alone as words
- Inflectional rules: used to express grammatical contrasts in sentences
- Derivational rules: construct new words, or change grammatical class
- Allomorphs: different variations of the same morpheme (e.g., the plural morpheme in English)
Language differences: isolating, inflecting, agglutinating languages.

Psychological reality of morphology: speech errors
- Stranding errors: the free morpheme typically moves, but the bound morpheme stays in the same location ("they are Turking talkish")
- Morpheme substitutions: "Where's the fire distinguisher?"
- Morpheme shift: "I haven't satten down and writ__ it"

Wug test (Gleason, 1958): "Here is a wug. Now there are two of them. There are two _______."
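The Wug test works because the plural is a productive rule rather than a memorized list: the allomorph depends on the stem's final sound. Here is a minimal Python sketch of that rule (not from the lecture; the sound groupings are deliberately simplified assumptions), showing why a child who has internalized the rule pluralizes a novel stem like "wug" with /z/.

```python
# A minimal sketch of the English plural allomorph rule. The sound groupings
# below are simplified stand-ins for phoneme classes; treat this as an
# illustration of the rule, not a full phonological analysis.

SIBILANTS = {"s", "z", "sh", "zh", "ch", "j"}   # bus, buzz, dish, church, judge
VOICELESS = {"p", "t", "k", "f"}                # cat, book, cliff, cup

def plural_allomorph(final_sound: str) -> str:
    """Choose the plural allomorph from the stem's final sound."""
    if final_sound in SIBILANTS:
        return "/ɪz/"   # dishes, buses
    if final_sound in VOICELESS:
        return "/s/"    # cats, books
    return "/z/"        # dogs, and novel stems like "wug"

# A child who has learned the rule (rather than memorizing word forms)
# can apply it to a brand-new stem, which is the point of the Wug test:
print(plural_allomorph("g"))   # /z/, as in "wugz"
```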
Syntax: the ordering of the words
- The underlying structural position, rather than the surface linear position, is what matters.
(Tree diagrams: phrase structure trees for "a dog bites a man" and "a man bites a dog," contrasting the NP in subject position with the NP in object position.)
- Not just the linear ordering: it is the underlying set of syntactic rules.
(Tree diagrams: two structures for "shot an elephant in my pajamas," with the PP "in my pajamas" attached either to the NP "an elephant" or to the VP.)

Generative grammar
The pieces:
- Grammatical features of words
- Phrase structure rules - these tell us how to build legal structures:
  S --> NP VP
  NP --> (A) (ADJ) N (PP)
  PP --> Prep NP
- Recursion: you can embed structures within structures
The result is an infinite number of syntactic structures from a finite set of pieces.

Chomsky's linguistics
Chomsky proposed that grammars could be evaluated at three levels: observational adequacy, descriptive adequacy, explanatory adequacy.

Transformational grammar
Two stages of phrase structure for a sentence: build the deep structure, then convert it to the surface structure.

Psychological reality of syntax
Derivational theory of complexity: the more transformations, the more complex.
- "The boy was bitten by the wolf" vs. "The boy was bitten." (the second involves a deletion)
- No evidence for more processing of the second sentence
- Some recent evidence for reactivation of a moved constituent at the trace position

Evidence for syntax: syntactic priming
Bock (1986). Task: if you hear a sentence, repeat it; if you see a picture, describe it.
a: The ghost sold the werewolf a flower / b: The ghost sold a flower to the werewolf
a: The girl gave the teacher the flowers / b: The girl gave the flowers to the teacher

Semantics
The study of meaning.
Arbitrariness: "What's in a name? that which we call a rose / By any other name would smell as sweet."
Words are not the same as meaning. Words are symbols linked to mental representations of meaning (concepts). Even if we changed the name of a rose, we wouldn't change the concept of what a rose is.

Separation of word and meaning
Concepts and words are different things.
- Translation argument: every language has words without meanings, and meanings without words
- Imperfect mapping: multiple meanings of words (e.g., transmogrify, wheedle, scalawag; e.g., ball, bank, bear)
- Elasticity of meaning: meanings of words can change with context (e.g., newspaper)

Semantics: philosophy of meaning
Sense and reference: "The world's most famous athlete" and "The athlete making the most endorsement income" are 2 distinct senses with 1 reference (now). Over time the senses typically stay the same, while the references may change (e.g., the referent now vs. the referent in the 90's).

Semantics: two levels of analysis (and two traditions of psycholinguistic research)
- Word level (lexical semantics): How do we store words? How are they organized? What is meaning? How do words relate to meaning?
- Sentence level (compositional semantics): How do we construct higher-order meaning? How do word meanings and syntax interact?

Lexical semantics: word level
The (mental) lexicon: the words we know. The average person knows ~60,000 words. How are these words represented and organized? Dictionary definitions? Necessary and sufficient features? Lists of features? Networks?

Lexical ambiguity
What happens when we use ambiguous words in our utterances?
"Oh no, Lois has been hypnotized and is jumping off the bank!" - money "bank" or river "bank"?

Lexical ambiguity
Psycholinguistic evidence suggests that multiple meanings are considered.
Debate: how do we decide which meaning is correct? Based on frequency and context ("Hmm... 'bank' usually means the financial institution, but Lois was going fishing with Jimmy today...").

Compositional semantics: phrase and sentence level
Some of the theories:
- Truth-conditional semantics: meaning is a logical relationship between an utterance and a state of affairs in the world
- Jackendoff's semantics: concepts are lists of features, images, and procedural knowledge; conceptual formation rules
- Cognitive grammar
- Mental models: mental simulations of the world

Pragmatics
Sentences do more than just state facts; they are uttered to perform actions. How to Do Things with Words (J. L. Austin, 1955 lectures). Using registers, conversational implicatures, speech acts.

Pragmatics: registers
How we modify conversation when addressing different listeners; registers determine our choice of wording or interpretation based on different contexts and situations. Speech directed at babies, at friends, at bosses, at foreigners.

Pragmatics: conversational implicatures
Speakers are cooperative. Grice's conversational maxims:
- Quantity: say only as much as is needed
- Quality: say only what you know is true
- Relation: say only relevant things
- Manner: avoid ambiguity, be as clear as possible

Pragmatics: speech acts
How language is used to accomplish various ends.
- Direct speech acts: "Open the window please." "Clean up your room!"
- Indirect speech acts: "It is hot in here." "Your room is a complete mess!"
- Non-literal language use, e.g., metaphors and idioms

Psycholinguistics and pragmatics: three-stage theory
- Stage 1: compute the literal interpretation of the utterance
- Stage 2: evaluate the interpretation against assumptions (Grice's conversational maxims)
- Stage 3: if the interpretation doesn't seem correct, derive (or retrieve) a non-literal interpretation

Psycholinguistics and pragmatics: one-stage approaches
Evaluate the utterance at multiple levels simultaneously and select the appropriate one; use context to derive the single most likely interpretation.

Week 3 - Crash course in cognitive psychology
- Mental structures and processes

Week 3 & Chapter 3 terms and concepts: Atkinson & Shiffrin model, sensory stores, short-term memory, working memory, long-term memory, declarative memory, procedural memory, attention, top-down processing, bottom-up processing, automatic processing, controlled processing, semantic memory, episodic memory, serial processing, parallel processing, modularity, cognitive psychology, the mind-as-a-computer analogy, George Sperling, George Miller, chunking, limited resources theory, feature integration theory, working memory capacity, Ebbinghaus, Bahrick, dual task procedure, visual search experiments

Mind as computer analogy
Limitations of the analogy:
- Computers: fast, serial (mostly), digital, relatively few connections
- Minds (brains??): slow, parallel, analog, trillions of connections
Other analogies out there: mind as a brain (connectionism), mind as a body (embodied cognition).

The 'standard model': the multistore model
Information 'flows' from one memory buffer to the next.

The sensory store
George Sperling's full and partial report experiments.
Properties:
- sensory specific - one for vision, one for audition, etc.
- high capacity
- extremely fast decay

Short-term memory
Serial position recall experiments (e.g., Peterson & Peterson), STM span experiments, chunking.
Properties:
- rapid access (about 35 milliseconds per item)
- limited capacity (7 +/- 2 chunks; George Miller, 1956)
- fast decay, about 12 seconds (longer if rehearsed or elaborated)

Working memory
Working memory instead of STM:
- Allocates attentional resources to the subcomponents; directs elaboration/manipulation of information
- Visual/spatial component: stores and manipulates visual and spatial information, directly from perception or indirectly from imagery
- Phonological rehearsal mechanism and phonological store: very limited capacity; rehearsal maintains information in the store

Long-term memory
Properties:
- Capacity: unlimited?
- Duration: decay/interference, retrieval difficulty
- Organization: multiple subsystems for different types of memory; associative networks (more on these next week)

Long-term memory: capacity
How much can we remember? Lots - there are no known limits to how much memory storage we have. The more important issues concern encoding and retrieval.
- Encoding - getting memories into LTM. What gets in? Rehearsal; depth of processing (organization, distinctiveness, effort, elaboration)
- Retrieval - getting memories out of LTM. What gets out? Exact memories or reconstructed memories?

Long-term memory: duration
How long do our memories last?
- Ebbinghaus (1885/1913): memorized lists of nonsense syllables until he could recall them perfectly, then later measured how much less effort it took to relearn the lists to the same criterion; this reduction was called the "savings."
- Bahrick (1984): a number of studies asking people about memories for things (e.g., Spanish, faces of classmates) that they learned as much as 50 years in the past. He has found evidence that at least some memories stick around a really long time.

Long-term memory: organization
The multiple memory stores theory suggests that there are different memory components, each storing different kinds of information.
- Declarative: episodic (memories about events) and semantic (knowledge of facts)
- Procedural: memories about how to do things (e.g., the thing that makes you improve at riding a bike with practice)

Attention
The major tool of the central executive: a limited capacity resource, with filtering capabilities and an integration function.

Attention: limited resource
We only have so much 'energy' to make things go, so we need to divide it and allocate it to processes.
- Single pool (e.g., Kahneman, 1973): a central bank of resources available to all tasks that need it
- Multiple pools (e.g., Navon & Gopher, 1979): several banks of specialized resources, divided up in terms of input/output modalities and stages of information processing (perception, memory, response output)
Dual task experiments.

Attention: an information filter
Information bottleneck: there is so much information that only some is let through, while the rest is filtered out.
- Early selection (e.g., Broadbent, 1958; Treisman, 1964)
- Late filters (Deutsch & Deutsch): everything gets in; the bottleneck comes at the response level (we can only respond to a limited number of things)
Cocktail party effect, dichotic listening.

Attention: integration
Attention is used to 'glue' features together. Feature integration theory & visual search experiments ("Where's Waldo").
(Demo displays: "Find the X" - a target that differs from the distractors by a single feature pops out; when the target shares features with the distractors, search is slow.)

Attention: how do we control it?
Attention as a 'spotlight': move it around, make it focused or diffuse. Is it 'aimed' or 'pulled'?

Automaticity
- Controlled processes: require resources, under some volitional direction, slow, effortful
- Automatic processes: require little attention, obligatory, fast

Bottom-up & top-down
The terms come from computer science.
- Bottom-up (data driven): relies upon evidence that is physically present, building larger units based on smaller ones
- Top-down (knowledge driven): using higher-level information to support lower-level processes
(In-class demos: ambiguous letter displays and the "doing the laundry" story.)

Week 4 - Representing language
- What properties of language are mentally represented, and how?

Week 4 & Chapter 5 terms and concepts: Internal (mental) lexicon, lexical access, tip-of-the-tongue, syntactic category, inflectional morphemes, derivational morphemes, sense and reference, synonym, hyponymy & hypernymy, semantic network, hierarchical network, Collins and Quillian model, lexical decision task, Collins and Loftus model, word frequency, spreading activation models, semantic priming, intersection search, lexical ambiguity, semantic verification, lexical primitives, cognitive economy, lexical organization, typicality effect, speech errors, Forster search model, prior context effects, Morton logogen model, Marslen-Wilson cohort model, recognition point

Storing linguistic information
Tale of the tape:
- High capacity: 40,000 - 60,000 words
- Fast: recognition in as little as 200 ms (often before the word ends)
How do we search that many, that fast? This suggests a high amount of organization, or something much more complex.
"The world's largest data bank of examples in context is dwarfed by the collection we all carry around subconsciously in our heads." E. Lenneberg (1967)
Excellent reading: Words in the Mind, Aitchison (1987, 2003).

Lexical primitives
- Word primitives (horse, horses, barn, barns): need a lot of representations; fast retrieval
- Morpheme primitives (horse, barn, -s): economical - fewer representations; slow retrieval - some assembly required (decomposition during comprehension, composition during production)

Lexical primitives: the lexical decision task (e.g., Taft, 1981)
See a string of letters and, as fast as you can, determine whether it is a real English word or not: "yes" if it is, "no" if it isn't. Typically speed and accuracy are the dependent measures.
Examples: table (yes), vanue (no), daughter (yes), tasp (no), cofef (no), hunter (yes).
This evidence supports the morphemes-as-primitives view:
- daughter - pseudo-suffixed (daught + -er) - takes longer
- hunter - multimorphemic (hunt + -er)

Lexical primitives: may depend on other factors
- What kind of morpheme: inflectional (e.g., singular/plural, past/present tense) vs. derivational (e.g., drink --> drinkable, infect --> disinfect)
- Frequency of usage: high-frequency multimorphemic words (in particular with derivational morphology) may get represented as a single unit (e.g., impossible vs. imperceptible)
- Compound words: semantically transparent (buttonhole) vs. semantically opaque (butterfly)

Lexical organization
Factors that affect organization: phonology, frequency, imageability/concreteness/abstractness, grammatical class, semantics.

Lexical organization: phonology
Words that sound alike may be stored "close together."
Brown and McNeill (1966): the tip-of-the-tongue phenomenon (TOT). What word means to formally renounce the throne?
Abdicate. Look at the words people think of that aren't right, e.g., "abstract," "abide," "truncate."
(Graph: % of matches to the target word by letter position - the first three letters and the last three letters - for similar-sounding vs. similar-meaning words.)
People are more likely to approximate target words with similar-sounding words than with similar-meaning words.
The "bathtub effect": sounds at the beginnings and ends of words are remembered best (Aitchison, 2003).

Lexical organization: frequency
Typically the more common a word, the faster (and more accurately) it is named and recognized. Typical interpretation: high-frequency words are easier to retrieve (or activate).
However, Balota and Chumbley (1984) showed that frequency effects depend on the task:
- Lexical decision: big effect
- Naming: small effect
- Category verification ("A canary is a bird. T/F"): no effect

Lexical organization: imageability, concreteness, abstractness
Concrete, imageable words (umbrella, lantern, apple) are more easily remembered and more easily accessed than abstract words (freedom, knowledge, evil).

Lexical organization: grammatical class
Grammatical class constrains substitution errors:
- "she was my strongest propeller" (proponent)
- "the nation's dictator has been exposed" (deposed)
In word association tasks, the associate is typically of the same grammatical class.
- Open class words: content words (nouns, verbs, adjectives, adverbs)
- Closed class words: function words (determiners, prepositions, ...)

Lexical organization: semantics
Free associations (see the "cat" demo in an earlier lecture): most associates are semantically related (rather than, for example, phonologically related).
Semantic priming task: for the following letter strings, decide whether each is or is not an English word.
- Related prime: nurse -> doctor
- Unrelated prime: shoes -> doctor
The related pair is responded to faster: the "priming effect."

Lexical organization: multiple levels
Another possibility is that there are multiple levels of representation, with different organizations at each level: meaning-based representations, grammar-based representations, and sound-based representations.

Semantic networks
Words can be represented as an interconnected network of sense relations. Each word is a particular node, and connections among nodes represent semantic relationships.
Collins and Quillian (1969): lexical entries with semantic features - animal: has skin, can move around, breathes; bird: has feathers, can fly, has wings; fish: has fins, can swim, has gills.

Collins and Quillian hierarchical network model
Lexical entries are stored in a hierarchy. The representation permits cognitive economy: it reduces redundancy of semantic features.

Collins and Quillian (1969): testing the model
Semantic verification task: "An A is a B," true or false? ("An apple has teeth.") Use verification times to map out the structure of the lexicon.
Example hierarchy: animal (has skin, can move around, breathes) -> bird (has feathers, can fly, has wings) -> robin (eats worms, has a red breast).

Sentence                  Verification time
"Robins eat worms"        1310 msec
"Robins have feathers"    1380 msec
"Robins have skin"        1470 msec

Participants do an intersection search.

Collins and Quillian (1969): problems with the model
- The effect may be due to frequency of association: "A robin breathes" is less frequent than "A robin eats worms"
- The assumption that all lexical entries at the same level are equal: the typicality effect. "A whale is a fish" vs. "A horse is a fish." Which is a more typical bird, ostrich or robin?
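To see concretely what the hierarchical model predicts, here is a toy sketch of a Collins & Quillian-style network (not from the lecture; the code and the use of "levels traversed" as a proxy for verification time are illustrative assumptions, though the node names and properties follow the lecture example). It shows how cognitive economy produces the 1310/1380/1470 msec ordering above, and also why the model, as stated, cannot distinguish robin from ostrich.

```python
# A toy Collins & Quillian-style hierarchical network. Node names and
# properties follow the lecture example; the implementation and the
# "levels traversed" stand-in for verification time are illustrative.

network = {
    "animal":  {"isa": None,     "props": {"has skin", "can move around", "breathes"}},
    "bird":    {"isa": "animal", "props": {"has feathers", "can fly", "has wings"}},
    "robin":   {"isa": "bird",   "props": {"eats worms", "has a red breast"}},
    "ostrich": {"isa": "bird",   "props": {"has long legs", "can't fly"}},
}

def property_levels(concept, prop):
    """Climb the isa-links until the property is found. Cognitive economy:
    shared properties are stored once, at the highest node they apply to,
    so more levels traversed should mean longer verification times."""
    levels, node = 0, concept
    while node is not None:
        if prop in network[node]["props"]:
            return levels
        node, levels = network[node]["isa"], levels + 1
    return None  # not stored anywhere: verify as false

def category_levels(concept, category):
    """Count isa-links from concept up to category (a toy 'An A is a B' check)."""
    levels, node = 0, concept
    while node is not None:
        if node == category:
            return levels
        node, levels = network[node]["isa"], levels + 1
    return None

print(property_levels("robin", "eats worms"))    # 0 levels ("Robins eat worms": fastest)
print(property_levels("robin", "has feathers"))  # 1 level  ("Robins have feathers")
print(property_levels("robin", "has skin"))      # 2 levels ("Robins have skin": slowest)

# The typicality problem: robin and ostrich sit at the same depth below bird,
# so the model predicts identical times for "a robin is a bird" and
# "an ostrich is a bird", which is not what participants show.
print(category_levels("robin", "bird"), category_levels("ostrich", "bird"))  # 1 1
```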
In the Collins and Quillian (1969) network, robin and ostrich occupy the same relationship to bird: bird (has feathers, can fly, has wings) branches to robin (eats worms, has a red breast) and to ostrich (has long legs, is fast, can't fly). Yet verification times show "a robin is a bird" is faster than "an ostrich is a bird."

Semantic networks: an alternative account
Store feature information with the most "prototypical" instance (Eleanor Rosch, 1975).
Task: rate on a scale of 1 to 7 whether these are good examples of the category Furniture: TV, bed, chair, table, refrigerator, couch, desk.
(Results: chair and sofa are rated the best examples, followed by couch and table; desk and bed fall toward the middle; TV and refrigerator rank near the bottom.)
Prototypes: some members of a category are better instances of the category than others (fruit: apple vs. pomegranate).
What makes a prototype? More central semantic features. What type of dog is a prototypical dog? What are its features?
We are faster at retrieving prototypes of a category than other members of the category.

Spreading activation models
Collins & Loftus (1975): words are represented in the lexicon as a network of relationships.
(Diagram: a web of interconnected nodes, e.g., street, vehicle, car, truck, bus, fire engine, fire, house, red, orange, blue, apple, pear, roses, tulips, flowers, fruit.)
The organization is a web of interconnected nodes in which connections can represent categorical relations, degree of association, and typicality.
Retrieval of information: spreading activation. There is a limited amount of activation to spread, and verification times depend on the closeness of two concepts in the network.

Advantages of the Collins and Loftus model:
- Recognizes the diversity of information in a semantic network
- Captures the complexity of our semantic representation (at least some of it)
- Consistent with results from priming studies
More recent spreading activation models are probably the dominant class of models currently used; they typically have multiple levels of representation.

Lexical access
Up until this point we've focused on the structure of the lexicon, but the evidence is all inferred from usage: speech errors, priming studies, verification, lexical decision. While structure is important, so are the processes involved in activating and retrieving the information. We've seen this already a little with intersection searches and spreading activation.
How do we retrieve linguistic information from long-term memory? What factors are involved in accessing (activating and/or retrieving) information from the lexicon? These questions motivate models of lexical access.

Recognizing a word
(Diagram: input "cat" -> search for a match among candidates such as dog, cap, wolf, tree, yarn, claw, fur, hat -> select the word "cat" -> retrieve its lexical information: noun; animal, pet, meows, furry, purrs, etc.)

Factors affecting lexical access
Frequency, semantic priming, role of prior context, phonological structure, morphological structure, lexical ambiguity.

Role of prior context
Swinney (1979). Hear: "Rumor had it that, for years, the government building has been plagued with problems.
The man was not surprised when he found several spiders, roaches and other bugs in the corner of his room."
Lexical decision task on a probe presented at "bugs":
- Context appropriate: ant
- Context inappropriate: spy
- Context unrelated: sew
Results and conclusions: within 400 msec of hearing "bugs", both ant and spy are primed; after 700 msec, only ant is primed.

Morphological structure
Snodgrass and Jarvella (1972): do we strip off the prefixes and suffixes of a word for lexical access?
Lexical decision task: response times are greater for affixed words than for words without affixes. The evidence suggests that there is a stage where prefixes are stripped.

Models of lexical access
- Serial comparison models: Search model (Forster, 1976, 1979, 1987, 1989)
- Parallel comparison models: Logogen model (Morton, 1969); Cohort model (Marslen-Wilson, 1987, 1990)

Logogen model (Morton, 1969)
(Diagram: auditory and visual stimuli undergo auditory/visual analysis and feed the logogen system, which also receives semantic attributes from the context system; available responses pass through an output buffer to produce responses.)
- The lexical entry for each word comes with a logogen; the lexical entry only becomes available once the logogen 'fires'
- When does a logogen fire? When you read or hear the word. Think of a logogen as a 'strength-o-meter' at a fairground: when the bell rings, the logogen has 'fired'
- What happens once the logogen has fired? Access to the lexical entry
- How does this help explain the frequency effect? High-frequency words have a lower threshold for firing (e.g., 'cat' [kæt] vs. the low-frequency 'cot' [kot], which takes longer)
- Spreading activation from 'doctor' [doktə] lowers the threshold for 'nurse' [nə:s] to fire, so nurse takes less time to fire (a spreading activation network)

Search model
(Diagram: visual input and auditory input (e.g., /kat/) map onto access codes; entries are searched in order of decreasing frequency, with pointers into the mental lexicon, e.g., mat, cat, mouse.)

Cohort model
Specifically for auditory word recognition. Speakers can recognize a word very rapidly, usually within 200-250 msec.
Three stages of word recognition:
1) Activate a set of possible candidates
2) Narrow the search to one candidate. Recognition point (uniqueness point): the point at which a word is unambiguously different from other words and can be recognized
3) Integrate the single candidate into the semantic and syntactic context
Example, with the prior context "I took the car for a ...":
/s/    -> soap, spinach, psychologist, spin, spit, sun, spank, ...
/sp/   -> spinach, spin, spit, spank, ...
/spi/  -> spinach, spin, spit, ...
/spin/ -> spin

Comparing the models
Each model can account for the major findings (e.g., frequency, semantic priming, context), but they do so in different ways.
- The search model is serial and bottom-up
- The logogen model is parallel and interactive (information flows up and down)
- The cohort model is bottom-up and parallel initially, then interactive at a later stage
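As a closing illustration of cohort narrowing and the recognition (uniqueness) point, here is a minimal sketch (not from the lecture; the toy lexicon follows the /s/ ... /spin/ example above, and using spelled letters in place of phonemes is a simplifying assumption).

```python
# Minimal sketch of cohort narrowing in a Marslen-Wilson-style model.
# Toy lexicon; letters stand in for phonemes, so "psychologist" never
# joins the /s/ cohort here even though it would phonologically.

LEXICON = ["soap", "spinach", "spin", "spit", "sun", "spank", "psychologist"]

def cohorts(word):
    """Yield the shrinking candidate set after each successive segment."""
    for i in range(1, len(word) + 1):
        prefix = word[:i]
        yield prefix, [w for w in LEXICON if w.startswith(prefix)]

for prefix, cohort in cohorts("spinach"):
    print(f"{prefix!r:10} cohort = {cohort}")
    if len(cohort) == 1:
        # Recognition (uniqueness) point: only one candidate survives,
        # so the word can be recognized before it is finished.
        print(f"recognized at {prefix!r}")
        break
```

Note that for the target "spin" itself, purely bottom-up matching still leaves "spinach" in the cohort at /spin/; this is where the prior context "I took the car for a ..." and the model's third, integrative stage come in.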