Give me several hundred more milliseconds: temporal dynamics of verb prediction in Japanese
Shota Momma1, Hiromu Sakai2, Colin Phillips1
1University of Maryland, 2Hiroshima University
Prediction in language
Comprehenders can predict upcoming words…
I would like my coffee with cream and _____
… very quickly: Online prediction ~= Offline prediction*
*Altmann & Kamide, 1999; DeLong & Kutas, 2005; Kamide et al., 2003; Van Berkum et al., 2005, a.o.
Roadmap
Temporary divergence between online and offline prediction
Goal: learn about how we make predictions
• Part 1 (Empirical): Background & main result
– Close relation between prediction & N400.
– Slow prediction: transient online/offline divergence (main result – simple!)
• Part 2 (Theoretical): Why slow? Making how claims based on when data.
– Lexical prediction as a semantic memory search problem
Part I: Prediction & N400
Prediction & N400
Amplitude of the N400 closely tracks the predictability (= offline cloze probability) of a word.
Low cloze: There was nothing wrong with the car. (<0.03)
High cloze: He mailed the letter without a stamp. (0.92)
Kutas & Hillyard (1984), among many others.
Failure to predict?
Some aspects of context fail to influence the N400.
Argument role reversal:
At the breakfast the boy would eat …
?At the breakfast the egg would eat …
No N400 difference between the canonical and role-reversed sentences.
Kuperberg et al. (2007); Kim & Osterhout (2005); Kolk et al. (2003); Chow et al. (2012); van Herten et al. (2006); Bornkessel & Schlesewsky (2006); Kolk & Chwilla (2007); Hoeks et al. (2004).
Slow prediction?
Not 'failure' – some types of information (e.g., argument role) are harder to use, leading to slow prediction?
So, does extra time before the verb help?
Chow et al. (2015a,b)
Wing Yee Chow (UMD 2013; now at UCL)
Current Study
Japanese
– Role reversal by case-marker reversal (NOM vs. ACC)
– Two-word sentences (one argument dropped)
Canonical vs. Role-reversed:
Bee-NOM STING vs. ?Bee-ACC STING
Fish-ACC CATCH vs. ?Fish-NOM CATCH
Scholar-NOM STUDY vs. ?Scholar-ACC STUDY
God-ACC WORSHIP vs. ?God-NOM WORSHIP
Design
Factor 1: Plausibility
– Plausible: Bee-NOM sting / Fish-ACC catch
– Implausible: Bee-ACC sting / Fish-NOM catch
Factor 2: SOA (trial sequence: fixation + → 蜂が 'bee-NOM' → 刺す 'sting')
– Short: 800 ms
– Long: 1200 ms
Method details
N = 24, all right-handed native Japanese speakers
Delayed plausibility judgment
160 experimental sentences (40 per condition) + 160 fillers; Latin-square design
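The counterbalancing works out neatly: 160 item sets rotated over 4 conditions give 40 items per condition per list. A toy Latin-square sketch in Python (the rotation scheme is an illustration, not the authors' actual scripts):

# Latin-square assignment sketch (illustrative only).
# 160 item sets x 4 conditions -> 4 lists, 40 items per condition per list.
N_ITEMS, N_COND = 160, 4

def condition_for(item, list_id):
    # Each list sees every item exactly once, in rotating conditions.
    return (item + list_id) % N_COND

for list_id in range(N_COND):
    counts = [0] * N_COND
    for item in range(N_ITEMS):
        counts[condition_for(item, list_id)] += 1
    print(list_id, counts)   # -> every list: [40, 40, 40, 40]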
Result – Verb
[ERP waveforms at the verb: Canonical (bee-NOM sting) vs. Reversed (bee-ACC sting), Short SOA and Long SOA panels]
[Bar plot: N400 effect (amplitude difference), average of midline electrodes — n.s. at Short SOA, significant at Long SOA]
Result – Control
Plausible: Turtle-NOM swim / Apple-ACC eat
Implausible: Turtle-NOM gets-cold / Apple-ACC shave
[Bar plot: N400 effect (amplitude difference) — significant at both SOAs]
Case marker difference not encoded? – No.
('I don't care what those tiny morphemes tell me. I am going to ignore them.')
[Plots: at the verb, Bee-NOM sting vs. Bee-ACC sting shows an N400 effect only at the Long SOA, while the control Apple-ACC eat vs. Apple-ACC shave shows effects at both SOAs; at the noun, NOM vs. ACC ERPs differ]
Whatever the source of this difference, it is due to something that happens before the verb.
-> Slow prediction, not a failure to predict or a failure of the N400 to reflect prediction.
This is probably not due to people's inattentiveness to the case morphology.
Same pattern in verb-final structures in Chinese:
Last week policeman BA suspect arrest…
Last week suspect BA policeman arrest…
Policeman BA suspect ZAI last week arrest…
Suspect BA policeman ZAI last week arrest…
Similar cloze difference, different N400 pattern:
… which customer the waitress had served… (25.4%)
… which waitress the customer had served… (zero)
… which illustrator the author had hired… (27.7%)
… which readers the author had hired… (zero)
Chow et al. (2015a,b); see Kukona et al. (2011) for related findings.
Kukona et al. (2011):
Tobi arrested the crook. vs. Tobi was arrested by the policeman.
Anticipatory looks to both the crook (target) and the policeman (lure); fewer anticipatory looks to the lure only at a later time window.
Part II: What's going on? A preliminary model of (lexical) prediction
Prediction as a memory search
Lexical prediction ~= a semantic memory search problem
Given bee-ACC… what does it take to predict swat (or whatever verb is appropriate) instead of sting?
– Search for a predicate that typically takes bee as a patient.
– Can we spot this item in semantic memory in one direct step? Probably not.
Where’s Waldo?
[+red-and-white stripe]
[+wearing glasses]
[+wearing a bobble hat]
[+having a walking stick]
[+male]
….
Prediction as a search
Intuitive strategy:
1) Search for things with red-and-white stripes ('pop-out'/parallel feature).
2) Among them, search for a male with a walking stick, etc.
Serial search is necessary at stage 2.
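This two-stage strategy can be written as a toy procedure. A minimal Python sketch with an invented scene and feature inventory (nothing here is from the study): stage 1 is a cheap parallel filter on the pop-out feature, stage 2 a serial scan over the survivors.

# Toy two-stage search; scene and features are made up for illustration.
scene = [
    {"name": "Wenda",     "stripes": True,  "male": False, "stick": False},
    {"name": "beachball", "stripes": True,  "male": False, "stick": False},
    {"name": "Waldo",     "stripes": True,  "male": True,  "stick": True},
    {"name": "dog",       "stripes": False, "male": True,  "stick": False},
]

# Stage 1: parallel 'pop-out' on a single feature (one cheap pass).
candidates = [x for x in scene if x["stripes"]]

# Stage 2: serial check of the conjunctive features; cost grows with
# the number of surviving candidates.
steps = 0
for x in candidates:
    steps += 1
    if x["male"] and x["stick"]:
        print(x["name"], "found after", steps, "serial checks")
        break

The point of the sketch: stage 1 shrinks the set in one step, but stage 2 pays a per-item cost, so anything that lengthens the candidate queue slows the search.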
Spreading activation as search space reduction
[Network figure: bee activates STING, BUZZ, INSECT, HONEY, HIVE, PAIN, HURT, SWAT]
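Read as a process, spreading activation amounts to one parallel retrieval step that returns a ranked neighborhood of the cue, so the rest of the lexicon never has to be considered. A sketch with invented association weights (not norms from the study):

# Spreading activation as search-space reduction (weights invented).
ASSOCIATES = {
    "bee": {"sting": 0.9, "buzz": 0.8, "insect": 0.7, "honey": 0.6,
            "hive": 0.5, "pain": 0.4, "hurt": 0.3, "swat": 0.2},
}

def activated_neighborhood(cue):
    # One parallel step: retrieve the cue's associates, strongest first.
    weights = ASSOCIATES.get(cue, {})
    return sorted(weights, key=weights.get, reverse=True)

print(activated_neighborhood("bee"))
# ['sting', 'buzz', 'insect', 'honey', 'hive', 'pain', 'hurt', 'swat']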
Spotting the right items
[Activated candidates: Sting, Honey, Insect, Pain, Buzz, Hive, Swat]
Given Bee-ACC, need to find items that take bee as a patient (i.e., need to reject items that take bee as an agent).
Why is argument role slow to affect prediction in role-reversal sentences?
Search order follows activation strength: Sting (strong, searched early) > Buzz > Swat (weak, searched late).
Prediction is slow when an attractive lure (sting) outranks the appropriate target (swat).
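The rank-to-latency relation can be made concrete: the target is accepted only after every higher-activation candidate has been checked and rejected. A sketch with invented rankings:

# Serial position of the target predicts how slow the prediction is.
def serial_steps(ranked_candidates, target):
    # Number of serial checks before the target is accepted.
    return ranked_candidates.index(target) + 1

search_order = ["sting", "buzz", "swat"]   # strongest association first

# bee-NOM: the strongest associate IS the appropriate verb -> fast.
print(serial_steps(search_order, "sting"))  # 1
# bee-ACC: 'sting' becomes a lure and 'swat' must wait its turn -> slow.
print(serial_steps(search_order, "swat"))   # 3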
Model Summary
• Lexical prediction = semantic memory search
– Two-stage: a spreading-activation-like process (use of association in some form) + subsequent serial search.
– Serial search: spotting an item that satisfies the complex features specified by the context.
Conclusion
Empirical (when data): N400's insensitivity to role reversal can be remedied by simply giving several hundred more milliseconds. Argument role information is slow at affecting verb prediction.
SOA    Plausible        Implausible
Short  Bee-NOM sting    Bee-ACC sting
Long   Bee-NOM sting    Bee-ACC sting
-> N400 effect only in the Long SOA conditions.
Theoretical (how claim): Lexical prediction is (1) a two-step process, which involves (2) reduction of search space via a spreading-activation-like mechanism and (3) subsequent serial (slow) checking of each candidate.
Prediction as a memory search
Proposal: lexical prediction = reduction of search space using a pop-out feature + serial search.
Given bee-ACC…
1) Search for bee-related words.
2) Among them, search for verbs.
3) Among them, search for one that takes bee as a patient.
Claim: Step (3) requires serial search (= slow). (A toy sketch follows below.)
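Putting the three steps together, a minimal end-to-end sketch; the mini-lexicon, association weights, and role knowledge are invented placeholders, an illustration of the claim rather than the authors' implementation:

# Three-step search for a verb to predict, given bee-ACC (toy data).
ASSOCIATES = {"bee": {"sting": 0.9, "buzz": 0.8, "insect": 0.7,
                      "honey": 0.6, "hive": 0.5, "swat": 0.2}}
IS_VERB = {"sting", "buzz", "swat"}
PATIENT_OF = {"sting": {"person"}, "buzz": set(), "swat": {"bee", "fly"}}

def predict_verb(noun):
    weights = ASSOCIATES[noun]
    # Step 1 (parallel): noun-related words via spreading activation.
    related = sorted(weights, key=weights.get, reverse=True)
    # Step 2 (cheap filter): keep only verbs.
    verbs = [w for w in related if w in IS_VERB]
    # Step 3 (serial, slow): check world knowledge item by item.
    for step, verb in enumerate(verbs, start=1):
        if noun in PATIENT_OF[verb]:
            return verb, step
    return None, len(verbs)

print(predict_verb("bee"))  # ('swat', 3): found, but only on the 3rd check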
Prediction as a memory search (cont.)
Why serial? Suggestion: you are searching lexical space, but you are looking for a word that satisfies a non-lexical property, [takes-bee-as-a-patient]. The input to the prediction is words and the output is also words, but a non-lexical feature (world knowledge) must be used in between.
Representation mismatch = serial = slow.
Spotting the right words
[Activated candidates: Sting, Honey, Insect, Pain, Buzz, Hive, Swat]
Given Bee-ACC, need to find [+verb]. Does this feature 'pop out' (parallel)? Maybe.
Prediction as a memory search
Proposal: lexical prediction = searching semantic memory in two steps for items that satisfy multiple properties.
1) Reduction of search space by exploiting semantic association (directly encoded in the semantic memory network), i.e., spreading activation.
2) Serial (often slow) checking of whether an attended item satisfies the complex property specified by the context.