Contextual Vocabulary Acquisition:
From Algorithm to Curriculum
Michael W. Kibby
Department of Learning & Instruction and The Reading Center
William J. Rapaport
Department of Computer Science & Engineering
Department of Philosophy, and Center for Cognitive Science
Karen M. Wieland
Department of Learning & Instruction, The Reading Center,
and The Nichols School
NSF ROLE Grant REC-0106338
Definition of “CVA”
“Contextual Vocabulary Acquisition” =def
• the acquisition of word meanings from text
– “incidental”
– “deliberate”
• by reasoning about
– contextual clues
– background knowledge (linguistic, factual, commonsense)
• Including hypotheses from prior encounters (if any) with the
word
• without external sources of help
– No dictionaries
– No people
CVA: From Algorithm to Curriculum
1. Computational theory of CVA
   – Based on:
     • algorithms developed by Karen Ehrlich (1995)
     • verbal protocols (case studies)
   – Implemented in a semantic-network-based knowledge-representation & reasoning system
     • SNePS (Stuart C. Shapiro & colleagues)
2. Educational curriculum to teach CVA
   – Based on our algorithms & protocols
   – To improve vocabulary & reading comprehension
   – Joint work with Michael Kibby & Karen Wieland
     • Center for Literacy & Reading Instruction
People Do “Incidental” CVA
• We know more words than explicitly taught
– Average high-school grad knows ~45K words
  ⇒ learned ~2.5K words/year (over 18 yrs.)
– But only taught ~400/school-year
  • ~4,800 in 12 years of school (~10% of total)
⇒ Most word meanings learned from context
  – including oral & perceptual contexts
  – “incidentally” (unconsciously)
• How?
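A quick check of the arithmetic behind these estimates (worked out here from the rounded figures on this slide):

\[
\frac{45{,}000 \text{ words}}{18 \text{ years}} = 2{,}500 \text{ words/year};
\qquad 400 \times 12 = 4{,}800 \text{ words};
\qquad \frac{4{,}800}{45{,}000} \approx 10.7\%
\]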
People Also Do “Deliberate” CVA
• You’re reading;
• You understand everything you read, until…
• You come across a new word
• Not in dictionary
• No one to ask
• So, you try to “figure out” its meaning from “context”
• How?
  – guess? derive? infer? deduce? educe? construct? predict? …
  – our answer:
    • Compute it from inferential search of “context”, including background knowledge
What does ‘brachet’ mean?
(From Malory’s Morte D’Arthur [page # in brackets])
1. There came a white hart running into the hall with a white brachet next to him, and thirty couples of black hounds came running after them. [66]
2. As the hart went by the sideboard, the white brachet bit him. [66]
3. The knight arose, took up the brachet and rode away with the brachet. [66]
4. A lady came in and cried aloud to King Arthur, “Sire, the brachet is mine”. [66]
10. There was the white brachet which bayed at him fast. [72]
18. The hart lay dead; a brachet was biting on his throat, and other hounds came behind. [86]
Computational cognitive theory of
how to learn word meanings
• From context
– I.e., text + grammatical info + reader’s prior knowledge
• With no external sources (human, on-line)
– Unavailable, incomplete, or misleading
• Domain-independent
– But more prior domain-knowledge yields better
definitions
• “definition” = hypothesis about word’s meaning
– Revisable each time word is seen
Cassie learns what “brachet” means:
Background info about:
harts, animals, King Arthur, etc.
No info about:
brachets
Input: formal-language (SNePS) version of simplified English
A hart runs into King Arthur’s hall.
• In the story, B12 is a hart.
• In the story, B13 is a hall.
• In the story, B13 is King Arthur’s.
• In the story, B12 runs into B13.
A white brachet is next to the hart.
• In the story, B14 is a brachet.
• In the story, B14 has the property “white”.
• Therefore, brachets are physical objects.
(deduced while reading;
Cassie believes that only physical objects have color)
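To make the deduction concrete, here is a minimal sketch in plain Python (not SNePS, and not Cassie’s actual representation; the triple encoding, fact names, and rule are invented for illustration):

```python
# Toy re-creation of the inference above: given the story assertions and the
# background belief that only physical objects have color, conclude that
# brachets are physical objects.  (Illustrative only; not SNePS syntax.)

background = {("isa-kind", "hart", "deer"), ("isa-kind", "deer", "mammal")}

story = {
    ("isa", "B12", "hart"),        # In the story, B12 is a hart.
    ("isa", "B14", "brachet"),     # In the story, B14 is a brachet.
    ("property", "B14", "white"),  # In the story, B14 has the property "white".
}

colors = {"white"}                 # background: white is a color

def infer_physical_object_classes(beliefs):
    """If x has a color property and x is a member of class c,
    then c is a kind of physical object (cf. rule m16 later in the talk)."""
    inferred = set()
    for rel, x, value in beliefs:
        if rel == "property" and value in colors:
            for rel2, y, cls in beliefs:
                if rel2 == "isa" and y == x:
                    inferred.add(("isa-kind", cls, "physical object"))
    return inferred

print(infer_physical_object_classes(background | story))
# prints {('isa-kind', 'brachet', 'physical object')}
```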
--> (defineNoun "brachet")
Definition of brachet:
Class Inclusions: phys obj,
Possible Properties: white,
Possibly Similar Items:
animal, mammal, deer,
horse, pony, dog,
I.e., a brachet is a physical object that can be white
and that might be like an animal, mammal, deer,
horse, pony, or dog
A hart runs into King Arthur’s hall.
A white brachet is next to the hart.
The brachet bites the hart’s buttock.
--> (defineNoun "brachet")
Definition of brachet:
Class Inclusions: animal,
Possible Actions: bite buttock,
Possible Properties: white,
Possibly Similar Items: mammal, pony,
A hart runs into King Arthur’s hall.
A white brachet is next to the hart.
The brachet bites the hart’s buttock.
The knight picks up the brachet.
The knight carries the brachet.
--> (defineNoun "brachet")
Definition of brachet:
Class Inclusions: animal,
Possible Actions: bite buttock,
Possible Properties: small, white,
Possibly Similar Items: mammal, pony,
A hart runs into King Arthur’s hall.
A white brachet is next to the hart.
The brachet bites the hart’s buttock.
The knight picks up the brachet.
The knight carries the brachet.
The lady says that she wants the brachet.
--> (defineNoun "brachet")
Definition of brachet:
Class Inclusions: animal,
Possible Actions: bite buttock,
Possible Properties: valuable, small,
white,
Possibly Similar Items: mammal, pony,
A hart runs into King Arthur’s hall.
A white brachet is next to the hart.
The brachet bites the hart’s buttock.
The knight picks up the brachet.
The knight carries the brachet.
The lady says that she wants the brachet.
The brachet bays at Sir Tor.
[background knowledge: only hunting dogs bay]
--> (defineNoun "brachet")
Definition of brachet:
Class Inclusions: hound, dog,
Possible Actions: bite buttock, bay, hunt,
Possible Properties: valuable, small, white,
I.e., a brachet is a hound (a kind of dog) that can bite, bay, and hunt, and that may be valuable, small, and white.
General Comments
• System’s behavior ≈ human protocols
• System’s definition ≈ OED’s definition:
  – A brachet is “a kind of hound which hunts by scent”
Computational cognitive theory of how to learn word meanings from context (cont.)
• 3 kinds of vocabulary acquisition:
– Construct new definition of unknown word
• What does ‘brachet’ mean?
– Fully revise definition of misunderstood word
• Does “smiting” entail killing?
– Expand definition of word used in new sense
• Can you “dress” (i.e., clothe) a spear?
• Initial hypothesis;
Revision(s) upon further encounter(s);
Converges to stable, dictionary-like definition;
Subject to revision
State of the Art: Computational Linguistics
• Information extraction systems
• Autonomous intelligent agents
• There can be no complete lexicon
• Such systems/agents shouldn’t have
to stop to ask questions
State of the Art: Computational Linguistics
• Granger 1977: “Foul-Up”
– Based on Schank’s theory of “scripts” (schema theory)
– Our system not restricted to scripts
• Zernik 1987: self-extending phrasal lexicon
– Uses human informant
– Our system is really “self-extending”
• Hastings 1994: “Camille”
– Maps unknown word to known concept in ontology
– Our system can learn new concepts
• Word-Sense Disambiguation:
– Given ambiguous word & list of all meanings, determine the
“correct” meaning
• Multiple-choice test
– Our system: given new word, compute its meaning
• Essay question
State of the Art: Vocabulary Learning (I)
• Elshout-Mohr/van Daalen-Kapteijns 1981,1987:
– Application of Winston’s AI “arch” learning theory
– (Good) reader’s model of new word = frame
• Attribute slots, default values
• Revision by updating slots & values
– Poor readers update by replacing entire frames
– But EM & vDK used:
• Made-up words
• Carefully constructed contexts
– Presented in a specific order
Elshout-Mohr & van Daalen-Kapteijns
Experiments with neologisms in 5 artificial contexts
• When you are used to a view it is depressing when you live in a
room with kolpers.
– Superordinate information
• At home he had to work by artificial light because of those kolpers.
• During a heat wave, people want kolpers, so sun-blind sales
increase.
– Contexts showing 2 differences from the superordinate
• I was afraid the room might have kolpers, but plenty of sunlight
came into it.
• This house has kolpers all summer until the leaves fall out.
– Contexts showing 2 counterexamples due to the 2 differences
State of the Art: Psychology
• Johnson-Laird 1987:
– Word understanding ≠ knowing a definition
– Definitions aren’t stored
– “During the Renaissance, Bernini
cast a bronze of a mastiff eating
truffles.”
State of the Art: Psychology
• Sternberg et al. 1983, 1987:
  – Cues to look for (= slots for frame):
    • Spatiotemporal cues
    • Value cues
    • Properties
    • Functions
    • Cause/enablement information
    • Class memberships
    • Synonyms/antonyms
  – To acquire new words from context:
    • Distinguish relevant/irrelevant information
    • Selectively combine relevant information
    • Compare this information with previous beliefs
Sternberg
• The couple there on the blind date was
not enjoying the festivities in the least.
An acapnotic, he disliked her smoking;
and when he removed his hat, she, who
preferred “ageless” men, eyed his
increasing phalacrosis and grimaced.
State of the Art: Vocabulary Learning (II)
Some dubious contributions:
• Mueser 1984: “Practicing Vocabulary in Context”
  – BUT: “context” = definition !!
• Clarke & Nation 1980: a “strategy” (algorithm?)
  1. Look at word & context; determine POS
  2. Look at grammatical context
     • E.g., “who does what to whom”?
  3. Look at wider context
     • [E.g., search for Sternberg-like clues]
  4. Guess the word; check your guess
CVA: From Algorithm to Curriculum
• “guess the word” = “then a miracle occurs”
• Surely, we computer scientists can “be more explicit”!
CVA: From algorithm to curriculum …
• Treat “guess” as a procedure call (“subroutine”)
– Fill in the details with our algorithm
– Convert the algorithm into a curriculum
• To enhance students’ abilities to use deliberate CVA strategies
– To improve reading comprehension
… and back again!
• Use knowledge gained from CVA case studies to
improve the algorithm
• I.e., use Cassie to learn how to teach humans
& use humans to learn how to teach Cassie
Why not use a dictionary?
Because:
• People are lazy (!)
• Dictionaries are not always available
• Dictionaries are always incomplete
• Dictionary definitions are not always useful
– ‘chaste’ =df clean, spotless / “new dishes are chaste”
– ‘college’ =df a body of clergy living together and
supported by a foundation
• Most words are learned via incidental CVA,
not via dictionaries
• Most importantly:
– Dictionary definitions are just more contexts!
How Does Our System Work?
• Uses a semantic network computer system
– semantic networks = “concept maps”
– serves as a model of the reader
– represents:
• reader’s prior knowledge
• the text being read
– can reason about the text and the reader’s knowledge
Fragment of reader’s prior knowledge:
m3 = In “real life”, white is a color
m6 = In “real life”, harts are deer
m8 = In “real life”, deer are mammals
m11 = In “real life”, halls are buildings
m12 = In “real life”, b1 is named “King Arthur”
m14 = In “real life”, b1 is a king
(etc.)
m16 = if v3 has property v2
      & if v2 is a color
      & if v3 is a (member of) v1
      then v1 is a kind of physical object
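Spelled out in ordinary predicate-logic notation (our paraphrase of m16, not SNePS syntax; the predicate names are ours):

\[
\forall v_1, v_2, v_3\;\bigl[\,\mathit{HasProperty}(v_3, v_2) \wedge \mathit{IsColor}(v_2) \wedge \mathit{Isa}(v_3, v_1) \rightarrow \mathit{IsKindOf}(v_1, \mathit{PhysicalObject})\,\bigr]
\]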
Reading the story:
m17 = In the story, b2 is a hart
m24 = In the story, the hart runs into b3
(b3 is King Arthur’s hall) – not shown
(harts are deer) – not shown
The entire network showing the reader’s mental context
consisting of prior knowledge, the story, & inferences.
The definition algorithm searches this network & abstracts
parts of it to produce a (preliminary) definition of ‘brachet’.
Implementation
• SNePS (Stuart C. Shapiro & SNeRG):
– Intensional, propositional semantic-network
knowledge-representation & reasoning system
– Node-based & path-based reasoning
• I.e., logical inference & generalized inheritance
– SNeBR belief revision system
• Used for revision of definitions
– SNaLPS natural-language input/output
– “Cassie”: computational cognitive agent
How It Works
• SNePS represents:
– background knowledge + text information
in a single, consolidated semantic network
• Algorithms deductively search network for
slot-fillers for definition frame
• Search is guided by desired slots
– E.g., prefers general info over particular info,
but takes what it can get
Noun Algorithm
Find or infer:
• Basic-level class memberships (e.g., “dog”, rather than “animal”)
  – else most-specific-level class memberships
  – else names of individuals
• Properties of Ns (else, of individual Ns)
• Structure of Ns (else …)
• Functions of Ns (else …)
• Acts that Ns perform (else …)
• Agents that perform acts w.r.t. Ns & the acts they perform (else …)
• Ownership
• Synonyms
Else do: “syntactic/algebraic manipulation”
• “Al broke a vase” ⇒ a vase is something Al broke
  – Or: a vase is a breakable physical object
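A minimal sketch of the slot-preference idea (plain Python, not the SNePS defineNoun implementation; the KB format and slot names are simplified assumptions):

```python
# Fill each definition-frame slot with the most general information available,
# but "take what it can get".

def define_noun(noun, kb):
    """Build a definition frame for `noun` from a toy knowledge base `kb`."""
    frame = {"definition of": noun}
    # Class inclusions: prefer basic-level classes, else most-specific classes,
    # else names of individuals.
    frame["class inclusions"] = (
        kb.get("basic-level classes")
        or kb.get("most-specific classes")
        or kb.get("individual names")
        or []
    )
    for slot in ("properties", "structure", "functions", "possible actions",
                 "agents & acts", "ownership", "synonyms"):
        frame[slot] = kb.get(slot, [])
    return frame

# Toy KB for 'brachet' after reading the whole passage:
kb = {
    "most-specific classes": ["hound", "dog"],
    "possible actions": ["bite buttock", "bay", "hunt"],
    "properties": ["valuable", "small", "white"],
}
print(define_noun("brachet", kb))
```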
Verb Algorithm
• Find or infer:
– Predicate structure:
• Categorize arguments/cases
– Results of V’ing:
• Effects, state changes
– Enabling conditions for V
• Future work:
– Classification of verb-type
– Synonyms
• [Also: preliminary work on adjective algorithm]
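For comparison, a sketch of the verb definition frame as a plain data structure (illustrative field names and example values, not SNePS output):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class VerbDefinition:
    verb: str
    argument_cases: List[str] = field(default_factory=list)       # predicate structure
    results: List[str] = field(default_factory=list)              # effects, state changes
    enabling_conditions: List[str] = field(default_factory=list)  # what must hold for V'ing

smite = VerbDefinition(
    verb="smite",
    argument_cases=["agent (person)", "object (person)", "time"],
    results=["object is hit", "(possibly) object is dead"],
    enabling_conditions=["agent can reach object"],
)
print(smite)
```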
Belief Revision
• Used to revise definitions of words with different sense from current meaning hypothesis
• SNeBR (ATMS; Martins & Shapiro 88):
  – If inference leads to a contradiction, then:
    1. SNeBR asks user to remove culprit(s)
    2. & automatically removes consequences inferred from culprit
• SNePSwD (SNePS w/ Defaults; Martins & Cravo 91)
  – Currently used to automate step 1, above
• AutoBR (Johnson & Shapiro, in progress) & new default reasoner (Bhushan & Shapiro, in progress)
  – Will replace SNePSwD
Revision & Expansion
• Removal & revision being automated via SNePSwD by ranking all propositions with kn_cat (from most certain to least certain):
  – intrinsic: info re: language; fundamental background info (“before” is transitive)
  – story: info in text (“King Lot rode to town”)
  – life: background info w/o variables or inference (“dogs are animals”)
  – story-comp: info inferred from text (King Lot is a king, rode on a horse)
  – life-rule.1: everyday commonsense background info (BearsLiveYoung(x) ⇒ Mammal(x))
  – life-rule.2: specialized background info (x smites y ⇒ x kills y by hitting y)
  – questionable: already-revised life-rule.2; not part of input
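A sketch of how the kn_cat ranking can drive culprit selection during belief revision (plain Python, not SNePSwD; the propositions are toy examples):

```python
# When a set of propositions is jointly inconsistent, prefer to retract the
# one whose kn_cat is least certain.

KN_CAT_RANK = {          # smaller number = more certain
    "intrinsic": 0,
    "story": 1,
    "life": 2,
    "story-comp": 3,
    "life-rule.1": 4,
    "life-rule.2": 5,
    "questionable": 6,
}

def choose_culprit(conflicting_propositions):
    """Return the least-certain proposition among those jointly inconsistent."""
    return max(conflicting_propositions, key=lambda p: KN_CAT_RANK[p["kn_cat"]])

conflict = [
    {"content": "King Arthur drew Excalibur", "kn_cat": "story"},
    {"content": "King Arthur is dead", "kn_cat": "story-comp"},
    {"content": "smiting implies killing", "kn_cat": "life-rule.2"},
]
print(choose_culprit(conflict))   # → the life-rule.2 proposition is the culprit
```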
Belief Revision: “smite”
• Misunderstood word; 2-stage “subtractive” revision
• Background knowledge includes:
  (*) smite(x,y,t) ⇒ hit(x,y,t) & dead(y,t) & cause(hit(x,y,t), dead(y,t))
P1: King Lot smote down King Arthur
D1: If person x smites person y at time t, then x hits y at t, and y is dead at t
Q1: What properties does King Arthur have?
R1: King Arthur is dead.
P2: King Arthur drew Excalibur.
Q2: When did King Arthur do this?
• SNeBR is invoked:
  – KA’s drawing E is inconsistent with being dead
  – (*) replaced: smite(x,y,t) ⇒ hit(x,y,t) & [dead(y,t) ⇒ cause(hit, dead)]
D2: If person x smites person y at time t, then x hits y at t & possibly (y is dead at t)
P3: [another passage in which ~(smiting ⇒ death)]
D3: If person x smites person y at time t, then x hits y at t
Belief Revision: “dress”
• “additive” revision
• Bkgd info includes:
  (1) dresses(x,y) ⇒ ∃z[clothing(z) & wears(y,z)]
  (2) Spears don’t wear clothing
  (both kn_cat = life-rule.1)
P1: King Arthur dressed himself.
D1: A person can dress itself; result: it wears clothing.
P2: King Claudius dressed his spear.
  [Cassie infers: King Claudius’s spear wears clothing.]
Q2: What wears clothing?
• SNeBR is invoked:
  – KC’s spear wears clothing is inconsistent with (2).
  – (1) replaced: dresses(x,y) ⇒ ∃z[clothing(z) & wears(y,z)] ∨ NEWDEF
  – Replace (1), not (2), because of the verb in the antecedent of (1) (Gentner)
P3: [other passages in which dressing spears precedes fighting]
D2: A person can dress a spear or a person;
    result: person wears clothing or person is enabled to fight
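A toy illustration of the two revision patterns just shown (plain Python, not SNeBR; the rule encoding is invented): “subtractive” revision weakens the consequent of the ‘smite’ rule, while “additive” revision widens the consequent of the ‘dress’ rule.

```python
def subtractive_revision(rule, doubtful_conjunct):
    """Drop a conjunct from the rule's consequent."""
    rule = dict(rule)
    rule["consequent"] = [c for c in rule["consequent"] if c != doubtful_conjunct]
    return rule

def additive_revision(rule, new_alternative):
    """Add an alternative (disjunct) to the rule's consequent."""
    rule = dict(rule)
    rule["alternatives"] = rule.get("alternatives", []) + [new_alternative]
    return rule

smite = {"antecedent": "x smites y at t",
         "consequent": ["x hits y at t", "y is dead at t"]}
dress = {"antecedent": "x dresses y",
         "consequent": ["y wears clothing"]}

print(subtractive_revision(smite, "y is dead at t"))
print(additive_revision(dress, "y is enabled to fight"))
```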
Figure out meaning of word from what?
• context (i.e., the text)?
– Werner & Kaplan 52, McKeown 85, Schatz & Baldwin 86
• context and reader’s background knowledge?
– Granger 77, Sternberg 83, Hastings 94
• context including background knowledge?
– Nation & Coady 88, Graesser & Bower 90
• Note:
– “context” = text ⇒ context is external to reader’s mind
• Could also be spoken/visual/situative (still external)
– “background knowledge”: internal to reader’s mind
• What is (or should be) the “context” for CVA?
95
QuickTime™ and a
GIF decompressor
are needed to see this picture.
Some Proposed Preliminary Definitions
(to extract order out of confusion)
• Unknown word for a reader =def
– Word or phrase that reader has never seen before
– Or only has vague idea of its meaning
• Different levels of knowing meaning of word
– Notation: “X”
Proposed preliminary definitions
• Text =def
– (written) passage
– containing X
– single phrase or sentence … several
paragraphs
Proposed preliminary definitions
• Co-text of X in some text =def
– The entire text “minus” X; i.e., entire text surrounding X
– E.g., if X = ‘brachet’, and text =
• “There came a white hart running into the hall with a white brachet
next to him, and thirty couples of black hounds came running after
them.”
Then X’s co-text in this text =
• “There came a white hart running into the hall with a white ______
next to him, and thirty couples of black hounds came running after
them.”
– Cf. “cloze” tests in psychology
• But, in CVA, reader seeks meaning or definition
– NOT a missing word or synonym: There’s no “correct” answer!
– “Co-text” is what many mean by “context”
• BUT: they shouldn’t!
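A small sketch of computing a co-text by masking the unknown word (plain Python; the naive regex-based masking is an assumption for illustration only):

```python
import re

def co_text(text, unknown_word):
    """Return the text 'minus' the unknown word, leaving a blank in its place."""
    return re.sub(rf"\b{re.escape(unknown_word)}\b", "______", text)

passage = ("There came a white hart running into the hall with a white brachet "
           "next to him, and thirty couples of black hounds came running after them.")
print(co_text(passage, "brachet"))
```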
Proposed preliminary definitions
• The reader’s prior knowledge =def
– the knowledge that the reader has when s/he
begins to read the text
– and is able to recall as needed while reading
• “knight picks up & carries brachet” ⇒ small?
• Warnings:
  – “knowledge” ⇒ truth
    • so, “prior beliefs” is better
– “prior” vs. “background” vs. “world”, etc.
• See next slide!
Proposed preliminary definitions
• Possible synonyms for “prior knowledge”,
each with different connotation:
– Background knowledge:
• Can use for information that author assumes reader to have
– World knowledge:
• General factual knowledge about things other than the text’s
topic
– Domain knowledge:
• Specialized, subject-specific knowledge about the text’s topic
– Commonsense knowledge:
• Knowledge “everyone” has
– E.g., CYC, “cultural literacy” (Hirsch)
• These overlap:
– PK should include some CSK, might include some DK
– BK might include much DK
Steps towards a Proper Definition of “Context”
• Step 1:
  – The context of X for a reader =def
    1. The co-text of X
    2. “+” the reader’s prior knowledge
• Both are needed!
  – After reading:
    • “the white brachet bit the hart in the buttock”
      most subjects infer that brachets are (probably) animals, from:
    • That text, plus:
    • Available PK premise: “If x bites y, then x is (probably) an animal.”
  – Inference is not an enthymeme!
    • (argument with missing premise)
Proper definition of “context”:
• (inference not an enthymeme because):
  – When you read, you “internalize” the text
    • You “bring it into” your mind
    • Gärdenfors 1997, 1999; Jackendoff 2002
    • “Missing” premise might be in reader’s mind!
  – This “internalized text” is more important than the actual words on paper:
    • Text: “I’m going to put the cat out”
    • Misread as: “I’m going to put the car out”
      – leads to different understanding of “the text”
  – What matters is what the reader thinks the text is,
    • Not what the text actually is
    • Therefore …
Proper definition of “context”:
• Step 2:
– The context of X for a reader =def
• A single KB, consisting of:
1. The reader’s internalized co-text of X
2. “+” the reader’s prior knowledge
Proper definition of “context”:
• But: What is “+”?
– Not: mere conjunction or union!
– Active readers make inferences while reading.
• From text = “a white brachet”
& prior commonsense knowledge = “only physical objects have color”,
reader might infer that brachets are physical objects
• From “The knight took up the brachet and rode away with the brachet.”
& prior commonsense knowledge about size,
reader might infer that brachet is small enough to be carried
– Whole > sum of parts:
• inference from [internalized text + PK] ⇒ new info not in text or in PK
• I.e., you can learn from reading!
Proper definition of “context”:
• But: Whole < sum of parts!
– Reader can learn that some prior beliefs were mistaken
• Or: reader can decide that text is mistaken (less likely)
• Reading & CVA need belief revision!
• operation “+”:
– input: PK & internalized co-text
– output: “belief-revised integration” of input, via:
• Expansion:
– addition of new beliefs from ICT into PK, plus new inferences
• Revision:
– retraction of inconsistent prior beliefs together with inferences from them
• Consolidation:
– eliminate further inconsistencies
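A minimal sketch of the “+” operation as expansion, revision, and consolidation (plain Python, not SNePS/SNeBR; the inference step and contradiction test are stand-ins for real reasoning):

```python
def integrate(pk, ict, infer, contradicts):
    # Expansion: add the internalized co-text (ICT) to prior knowledge (PK),
    # plus new inferences.
    kb = set(pk) | set(ict)
    kb |= infer(kb)
    # Revision: retract prior beliefs that contradict the (preferred) text.
    kb -= {p for p in pk if any(contradicts(p, t) for t in ict)}
    # Consolidation: eliminate any remaining inconsistencies (here, naively).
    for p in sorted(kb):
        if any(contradicts(p, q) for q in kb if q != p):
            kb.discard(p)
    return kb

# Toy example: the text overrides a mistaken prior belief.
pk = {"brachets are birds"}
ict = {"the brachet bit the hart", "brachets are animals"}
infer = lambda kb: {"brachets can bite"} if "the brachet bit the hart" in kb else set()
contradicts = lambda a, b: {a, b} == {"brachets are birds", "brachets are animals"}
print(integrate(pk, ict, infer, contradicts))
```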
[Figures: building the belief-revised (B-R) integrated KB]
A sequence of diagrams shows the reader’s prior knowledge (PK1–PK4) and the text (T1, T2, T3). As each sentence Ti is read, it is internalized as I(Ti) into a single integrated KB alongside the prior knowledge; inference then adds new propositions (P5, P6, P7) linking the internalized text with the prior knowledge, yielding the B-R integrated KB.
Note: All “contextual” reasoning is done in this “context”: the B-R integrated KB.
Proper definition of “context”:
• The context that reader R should use to
hypothesize a meaning for R’s internalization of
unknown word X as it occurs in text T =def
– The belief-revised integration of R’s prior
knowledge with R’s internalization of the
co-text T–X.
This definition agrees with…
• Cognitive-science & reading-theoretic views of
text understanding
– Schank 1982, Rumelhart 1985, etc.
• & AI techniques for text understanding:
– Reader’s mind modeled by KB of prior knowledge
• Expressed in AI language (for us: SNePS)
– Computational cognitive agent reads the text,
• “integrating” text info into its KB, and
• making inferences & performing belief revision along the
way
– When asked to define a word,
• Agent deductively searches this single, integrated KB for
information to fill slots of a definition frame
– Agent’s “context” for CVA = this single, integrated KB
Some Open Questions
• Roles of spoken/visual/situative contexts
• Relation of CVA “context” to formal
theories of context (e.g., McCarthy, Guha…)
• Relation of I(T) to prior-KB; e.g.:
– Is I(Ti) true in prior-KB?
• It is “accepted pro tem”.
– Is I(T) a “subcontext” of pKB or B-R KB?
• How to “activate” relevant prior
knowledge.
• Etc.
Problem in Converting
Algorithm into Curriculum
• “A knight picks up a brachet and carries it away …”
• Cassie:
– Has “perfect memory”
– Is “perfect reasoner”
– Automatically infers that brachet is small
• People don’t always realize this:
– May need prompting: How big is the brachet?
– May need relevant background knowledge
– May need help in drawing inferences
• Teaching CVA =? teaching general reading comprehension
– Vocabulary knowledge correlates with reading comprehension
Research Methodology
• AI team:
– Develop, implement, & test better computational
theories of CVA
– Translate into English for use by reading team
• Reading team:
– Convert algorithms to curriculum
– Think-aloud protocols
• To gather new data for use by AI team
• As curricular technique (case studies)
Web Page
http://www.cse.buffalo.edu/~rapaport/cva.html