From: Proceedings of the Eleventh International FLAIRS Conference. Copyright © 1998, AAAI (www.aaai.org). All rights reserved.
The Giant: An Agent-based Approach to Knowledge Construction & Sharing

Thomas R. Reichherzer, Alberto J. Cañas, Kenneth M. Ford, Patrick J. Hayes
Institute for Human & Machine Cognition
The University of West Florida
11000 University Parkway
Pensacola, FL 32514
treichhe@ai.uwf.edu
Abstract

The purpose of this paper is to report on a research project of a software agent, called the Giant, to support knowledge construction and sharing among learners. The Giant is embedded in an educational software tool used by students when collaborating across classrooms, throughout schools and countries, on a subject of study. Based on propositions being shared among the students, it automatically draws tentative conclusions and displays them to the user. At the user's request the Giant presents explanations of its reasoning. It is anticipated that through the interaction with this software agent students will review and refine their own knowledge.
Introduction
Over the past three years, The University of West Florida and IBM Latin America have established a communications network of K-12 schools, which includes not only the telecommunications technology, but most importantly a pedagogical methodology, educational software tools, and curriculum material. This network enables students throughout schools in Latin America to collaborate in the construction and sharing of models of their beliefs [Cañas et al. 1996]. The partnership, called Project Quorum, has provided specific tools that enable students to perform this collaboration. One of them, the Concept Map Editor (CMap), is based on the idea of using concept maps as a visual representation of concepts and their relationships as known by an individual [Novak & Gowin 1984]. Concept maps follow the idea of assimilation theory [Ausubel et al. 1978], where meaningful learning, as opposed to rote learning, results from linking new information with relevant preexisting concepts or propositions in the learner's cognitive structure.

CMap's graphical interface offers simple point-and-click commands to construct concept maps. The system automatically derives propositions encoded in the concept
maps. Optionally, the user can extract propositions manually. All propositions are local to the student's workspace and cannot be seen by others. However, the system enables the user to publish propositions in what is called a "knowledge soup"--a collection of propositions made by all students. The process of publishing propositions makes claims of an individual visible to others, providing a basis for collaboration in the classroom. Students can see claims from their classmates that are related to their own propositions. (The system has some built-in heuristics to decide about relatedness among propositions.) Furthermore, they can attach messages to propositions made by others, thereby intensifying collaboration in the classroom. At no time does the system reveal any information about ownership of propositions. We believe that this aspect of anonymity will encourage people who are less confident to share their ideas (see [Cañas et al. 1995] for a description of the knowledge sharing tools).
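The paper does not describe the built-in relatedness heuristics, so the following is purely an illustrative sketch under one plausible assumption: two propositions count as related when they share a concept. All names here are hypothetical.

```python
# Hypothetical relatedness check for the 'knowledge soup'.  The actual
# heuristics are not specified in the paper; this sketch simply treats
# two propositions as related when they mention a common concept.

def concepts(proposition):
    """A proposition is a (left concept, link, right concept) triplet."""
    left, _, right = proposition
    return {left.lower(), right.lower()}

def related(p1, p2):
    # Related iff the two propositions share at least one concept.
    return bool(concepts(p1) & concepts(p2))

# A student's proposition matched against claims from the soup.
mine = ("plants", "have", "leaves")
soup = [("leaves", "make", "photosynthesis"), ("cars", "need", "fuel")]
matches = [p for p in soup if related(mine, p)]
```

Under this assumption, only the proposition about leaves would be shown to the student as related.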
The environment is suitable for a software agent that interacts individually with the student, generating tentative conclusions about topics of study using the students' propositions and a simple set of rules. These claims often represent different threads of thought than those followed by the student. We refer to this agent as the Giant because it 'knows' a great deal, but its conclusions may be unrealistic and its questions therefore silly but amusing. The Giant displays its own claims to the user and lets him/her decide on their correctness. Allowing the student to make these decisions puts the learner in the position of a teacher to the Giant, prompting him/her to review personal propositions, propositions from others, and the Giant's claims. The system is not guaranteed to draw rational conclusions from the concept map (similar to what a human being would do) or to verify the student's propositions; such an intention is not pursued in our study.
Preprocessing

The Giant obtains propositions from both the student and the 'knowledge soup.' Propositions are simple sentences typically constructed from two concepts and a link between them. An analysis of the constituents of the sentences is essential for the reasoning process of the Giant. For that purpose we provided the system with a grammar (see Figure 1 below) that closely describes the structure of the sentences derived from the students' concept maps. Our grammar is defined for the Spanish language (most countries in Latin America speak Spanish) with respect to the structure of sentences found in concept maps.
  S   -> NP VP
  NP  -> [Article | Quantifier | Cardinal] Noun [Adjective]
  NP  -> [Article | Quantifier | Cardinal] [Adjective] Noun
  VP  -> [Quantifier | Cardinal] VB DO [IDO]
  VB  -> [Negation] Verb [Adverb]
  VB  -> [Negation] [Adverb] Verb
  VB  -> [Modal | Auxiliary] Verb [Adverb]
  VB  -> Negation [Adverb] [Auxiliary | Modal] Verb
  DO  -> NP
  IDO -> Preposition NP

Figure 1: Grammar used by the Giant when parsing propositions.
The grammar also addresses the fact that propositions in a concept map may not always follow the grammar of a natural language. In Figure 2 both propositions "plants some are green" and "plants for example trees" do not build correct English sentences. However, concept maps do not impose any restrictions on how to encode propositions. Therefore, examples similar to the one in Figure 2 cannot be avoided in a classroom setting.
[Figure 2 omitted: a concept map with the concept "plants" linked by "some are" to "green" and by "for example" to "trees."]

Figure 2: Propositions in a concept map.

The proposed grammar does not match every possible proposition which can be written in a concept map. Our objective is to acquire necessary information from a proposition for the reasoning process of the Giant and not to correctly parse sentences of a natural language. In our grammar, the non-terminal symbols Article, Quantifier, Cardinal, Noun, Adjective, Adverb, Verb, Auxiliary, Modal, and Preposition denote the corresponding classes of words. The symbol Negation represents the Spanish word for negating a proposition.

Using the grammar, we constructed a parser that finds the structure of a proposition and identifies the constituents. The parser takes a proposition as a triplet T = (X, Y, Z) with X as the left concept, Y as the link, and Z as the right concept or the right concept followed by another link-concept sequence. The latter case occurs when the proposition extends over more than two concepts and a link (see Figure 3). Each part of the triplet is parsed individually using transition networks [Allen 1987].

[Figure 3 omitted: a proposition extending over several concepts x, y, and z, joined by links such as "are," "during," and "produce," decomposed into triplets.]

Figure 3: Triplet structure of propositions.

The parser uses a lexicon to perform stemming and tagging of the words in the proposition. The result of the parsing process is a table that contains everything essential for the Giant's reasoning. The table reveals information about keywords such as quantifiers, adjectives, adverbs, and verbs found in the link between two concepts. Figure 4 shows an example of a table produced by the parsing process from the sentence "all steamboat captains have moustaches."

  nouns in the left concept                          steamboat captains
  quantifier, cardinal applied to the left concept   all
  adjective in the left concept                      --
  modal/auxiliary verb                               --
  linking verb                                       have
  linking verb (infinitive form)                     have
  adverb                                             --
  right concept                                      moustaches
  proposition negated                                no
  number of the left concept                         plural
  voice of the proposition                           active
  part of speech of the right concept                noun

Figure 4: Result of the parsing process.
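The parsing step can be sketched roughly as follows. This is a minimal illustration, not the transition-network parser the system actually uses: the toy English lexicon, the crude plural test, and all function names are assumptions made for the sake of the sketch.

```python
# Minimal sketch of turning a proposition triplet into a feature table,
# assuming a toy lexicon; the real system uses transition networks and
# a Spanish lexicon with proper stemming.

LEXICON = {
    "all":        ("quantifier", "all"),
    "some":       ("quantifier", "some"),
    "steamboat":  ("noun", "steamboat"),
    "captains":   ("noun", "captain"),
    "have":       ("verb", "have"),       # stem stands in for the infinitive
    "moustaches": ("noun", "moustache"),
    "no":         ("negation", "no"),
}

def parse_triplet(left, link, right):
    """Parse a triplet T = (X, Y, Z) into a feature table (cf. Figure 4)."""
    table = {
        "nouns in the left concept": [],
        "quantifier/cardinal applied to the left concept": None,
        "linking verb": None,
        "linking verb (infinitive form)": None,
        "right concept": right,
        "proposition negated": "no",
        "number of the left concept": "singular",
        "part of speech of the right concept": None,
    }
    for word in left.split():
        tag, _ = LEXICON[word]
        if tag == "noun":
            table["nouns in the left concept"].append(word)
            if word.endswith("s"):  # crude plural test, illustration only
                table["number of the left concept"] = "plural"
        elif tag == "quantifier":
            table["quantifier/cardinal applied to the left concept"] = word
    for word in link.split():
        tag, stem = LEXICON[word]
        if tag == "verb":
            table["linking verb"] = word
            table["linking verb (infinitive form)"] = stem
        elif tag == "negation":
            table["proposition negated"] = "yes"
    table["part of speech of the right concept"] = LEXICON[right.split()[-1]][0]
    return table

t = parse_triplet("all steamboat captains", "have", "moustaches")
```

Applied to "all steamboat captains have moustaches," the sketch recovers the quantifier "all," the plural left concept, and the linking verb, as in Figure 4.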
The Giant's behavior and appearance

As a motivational device for children, we have given the Giant the 'personality' of a friendly and eager learner who tries to capture the student's attention by displaying its own 'understanding' of the world. This understanding is based on ideas from the student and the shared knowledge in the classroom. The limited knowledge and the simple reasoning methods utilized by the Giant often lead to less intelligent, sometimes conceptually wrong conclusions. Hence, the Giant appears as a silly, funny, and overall helpless being which sometimes 'knows' some surprising things about the world.
Students may provide clarification to the Giant by telling it with which conclusions he or she agrees or disagrees. Conclusions which are not reasonable at all may be refuted by sending the message 'silly' to the Giant. In some cases the Giant requests new concepts from the user in order to complete propositions (Figure 5 depicts such a request).

[Figure 5 omitted: screenshot of a question dialog from the Giant.]

Figure 5: Questions from the Giant. The entry field stating "Cactuses" was completed by the student.
Students may control the Giant's behavior by selecting from the two different settings in the system--the Giant's activity and curiosity. Among these settings, students may choose their Giant to be active or lazy, curious or cautious. An active Giant is eager to produce as many propositions as possible whereas an inactive Giant provides a small number of propositions that reflect its less active status. The Giant appears curious about the student's work when it applies the entire set of rules to the propositions, yet does not insist on complete fulfillment of the premises of the rules. This behavior setting allows the Giant to derive conclusions that are not necessarily implied in the student's propositions. As a result the Giant might explore new facts about the subject of study. However, in this state there is a good chance that the Giant derives silly statements. The cautious state tries to avoid this by executing rules only if their premises are completely fulfilled.
The Giant's curiosity and excitement about learning from the student, and its appreciation when it receives confirmation on its own propositions, are underlined by facial expressions in the form of comical cartoon faces.
The Giant has a general idea of how certain words affect concepts or their relationships. These words, called keywords, are used in the Giant's reasoning process. Since the Giant does not have any commonsense knowledge or semantic understanding, it cannot draw the line between interesting and boring propositions. Instead, the Giant randomly selects propositions from the student or the 'knowledge soup.' This may cause it to sometimes appear ignorant or moody and to neglect propositions that might be appealing to question.
The Giant never intrudes upon the student. It works in the background producing its own propositions and waits for the student to call it. When asked by the student, it presents an explanation of its reasoning. This is when the interaction between student and Giant begins. The student has the choice either to supply information to the Giant or to send the Giant into the background. The Giant welcomes the student's advice and uses it to 'clarify' its knowledge about the subject of study. Such behavior is in contrast to the role of an authority that judges and corrects the student's work.
The Giant's reasoning

For reasoning, the Giant uses a set of rules and the parsed propositions to produce conclusions. The effect of the rules is to generate propositions that are at a tangent to the student's thread of thoughts and therefore may help the student to enrich the concept map by adding new concepts or links to it.

We can split the rules into three categories. The first category applies transitivity to propositions coming either from the student or the 'knowledge soup.' The next category explores keywords that characterize the left concept or the action in the proposition. Such keywords are quantifiers, cardinals, or adjectives associated with the left concept, adverbs associated with the verb in the link, or verbs that express causal dependencies among the concepts in the concept map. The last category of rules explores the structure of the student's concept maps, which may reveal information about commonalities among concepts.
We use the following notation to refer to the information in the table that the parser creates from a proposition p:

  leftCon(p)       the left concept
  rightCon(p)      the right concept
  linkVerbInf(p)   the linking verb (infinitive form)
  linkVerb(p)      the linking verb
  voice(p)         the voice of the proposition
  modAuxVerb(p)    the modal/auxiliary verbs
  quanCardAdj(p)   a quantifier, cardinal, or adjective associated with the left concept
  adverb(p)        an adverb
  negated(p)       indicates if p contains the negating word 'no'
  speechRCon(p)    part of speech of the right concept
Transitivity

Suppose p1 and p2 are two propositions either from the student or the 'knowledge soup.' The two transitivity rules are:

  if   rightCon(p1) = leftCon(p2) and
       linkVerbInf(p1) = linkVerbInf(p2) and
       voice(p1) = voice(p2)
  then newProposition = leftCon(p1) linkVerb(p2) rightCon(p2)

  if   rightCon(p1) = leftCon(p2) and
       linkVerbInf(p1) ≠ linkVerbInf(p2) and
       voice(p1) = voice(p2) and
       [(linkVerbInf(p1), linkVerbInf(p2)) ∈ keyword list or status = curious]
  then newProposition = leftCon(p1) linkVerb(p2) rightCon(p2)

In both rules, the right concept of p1 has to match the left concept of p2 and the voices have to be identical. The difference in the voice matters because in a passive voice proposition the active component is the right concept, while in an active voice proposition the active component is the left concept. Hence, we can only apply transitivity to propositions with identical voice.

The first rule matches all propositions p1 and p2 with the same root form of the linking verb. The second rule matches propositions with different root forms of the linking verb only if the linking verbs in their root form are in a keyword list or if the Giant is in a curious status. For example, given the two propositions "fish are animals" and "animals breathe air" the Giant concludes that "fish breathe air."

Keywords

The Giant has access to a list of words, or word pairs, known to the system as keywords that signal the plausibility of a conclusion. The list is fixed; i.e., neither the Giant nor the user can delete or add information in the list. Later systems will explore the potentiality of adapting the list as the Giant's reasoning proceeds.

The construction of the new sentence is influenced by certain words that may occur in p. (For simplicity, we omit details of sentence construction since they are not relevant in the current study.) Given a proposition p, the following three rules match if a keyword is found or if the Giant is in a curious status:

  if   [quanCardAdj(p) ∈ keyword list or status = curious]
  then newProposition = there are leftCon(p) that linkVerb(p) [no(t)] rightCon(p)

  if   [adverb(p) ∈ keyword list or status = curious]
  then newProposition = leftCon(p) linkVerb(p) not always rightCon(p)

  if   [linkVerbInf(p) ∈ keyword list or status = curious]
  then newProposition = the ability to linkVerbInf(p) rightCon(p) requires something

A cautious Giant uses the rules only if a conclusion is plausible due to the existence of a keyword in the proposition. For example, from "some plants have leaves" the Giant may conclude that "there are plants that have no leaves," or "the ability to have leaves requires something," or "plants have not always leaves." The latter inference is an example of the Giant's limited capabilities of sentence understanding and construction.
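The transitivity and quantifier-keyword rules can be rendered as a short sketch. Propositions are assumed here to be dictionaries mirroring the parser's feature table; the keyword list contents and the reduction of the curious/cautious status to a boolean flag are illustrative assumptions.

```python
# Sketch of the Giant's transitivity and quantifier-keyword rules.
# The keyword collections below are stand-ins for the system's fixed lists.

QUANTIFIER_KEYWORDS = {"some", "few", "many"}

def transitivity(p1, p2, verb_pairs=(), curious=False):
    """Chain p1 and p2 when p1's right concept is p2's left concept."""
    if p1["rightCon"] != p2["leftCon"] or p1["voice"] != p2["voice"]:
        return None
    same_root = p1["linkVerbInf"] == p2["linkVerbInf"]
    # Different root forms are chained only for keyword pairs or a curious Giant.
    if same_root or (p1["linkVerbInf"], p2["linkVerbInf"]) in verb_pairs or curious:
        return f'{p1["leftCon"]} {p2["linkVerb"]} {p2["rightCon"]}'
    return None

def quantifier_rule(p, curious=False):
    """E.g. 'some plants have leaves' -> 'there are plants that have no leaves'."""
    if p["quanCardAdj"] in QUANTIFIER_KEYWORDS or curious:
        neg = "" if p["negated"] else "no "
        return f'there are {p["leftCon"]} that {p["linkVerb"]} {neg}{p["rightCon"]}'
    return None

fish = {"leftCon": "fish", "rightCon": "animals", "linkVerb": "are",
        "linkVerbInf": "be", "voice": "active"}
breathe = {"leftCon": "animals", "rightCon": "air", "linkVerb": "breathe",
           "linkVerbInf": "breathe", "voice": "active"}
plants = {"leftCon": "plants", "rightCon": "leaves", "linkVerb": "have",
          "quanCardAdj": "some", "negated": False}

c1 = transitivity(fish, breathe, curious=True)
c2 = quantifier_rule(plants)
```

With a curious Giant, the sketch reproduces the paper's example "fish breathe air," and the quantifier rule yields "there are plants that have no leaves."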
Classification

The classification rule uses the structure of a concept map to perform reasoning. It asserts the presence of a classifier for a set of different concepts when these concepts themselves are linked with identical linking verbs to a common concept. The identical linking verbs provide a hint that an enumeration of facts may have occurred. Figure 6 shows an example of a concept map to which the rule applies.

[Figure 6 omitted: a concept map in which "plants" is linked by "have" to "leaves" and by "some have" to "roots."]

Figure 6: Search for classifier concepts.

This concept map enumerates some parts of plants. A possible classifier for the concepts "roots" and "leaves" is "eatable." Thus, the concept map may be extended with "roots and leaves are eatable." The classification rule concludes that "there is a common identifier for roots and leaves." Conditions in the rule for any given propositions p1 and p2 are a common left concept, identical linking verbs, and identical voices of the two propositions.
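A minimal sketch of the classification rule, under the same assumption that propositions are feature dictionaries; the output follows the "common identifier" phrasing, and the field names are illustrative.

```python
# Sketch of the classification rule: two propositions with a common left
# concept, identical linking verb (root form), and identical voice suggest
# a common classifier for their two right concepts.

def classification(p1, p2):
    if (p1["leftCon"] == p2["leftCon"]
            and p1["linkVerbInf"] == p2["linkVerbInf"]
            and p1["voice"] == p2["voice"]
            and p1["rightCon"] != p2["rightCon"]):
        return (f'there is a common identifier for '
                f'{p1["rightCon"]} and {p2["rightCon"]}')
    return None

roots = {"leftCon": "plants", "rightCon": "roots",
         "linkVerbInf": "have", "voice": "active"}
leaves = {"leftCon": "plants", "rightCon": "leaves",
          "linkVerbInf": "have", "voice": "active"}

conjecture = classification(roots, leaves)
```

Applied to the Figure 6 example, the rule conjectures a common identifier for "roots" and "leaves."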
Results

The tables below show the performance of the system on two small school projects with concept maps built in classroom settings by sixth-grade students. In particular, we were interested in what kind of propositions and questions the Giant produces and how students interact with the Giant. The 'knowledge soup' contained propositions derived from all the concept maps. The domains for the two selected projects are "the nature of plants" and "automobiles." Tables 1 and 2 present a subset of the student's and the Giant's propositions and consequent questions or explanations by the Giant.

  Student's propositions:
    plants need sunlight
    plants need minerals
    some plants have leaves
    plants need water

  Giant's propositions:
    there are plants that don't have leaves
    plants make photosynthesis
    plants release water
    plants don't always have leaves

  Giant's questions:
    Do you know which plants do not have leaves?
    Do you know when plants don't have leaves?

  Giant's explanations:
    "plants have leaves and leaves make photosynthesis; thus, plants make photosynthesis"

Table 1: Test results of project "the nature of plants."

  Student's propositions:
    automobiles are vehicles
    automobiles need oil
    automobiles need gasoline
    automobiles have windows
    automobiles have wheels
    automobiles produce fumes

  Giant's propositions:
    there are automobiles that have no wheels
    there are automobiles that do not produce fumes
    automobiles produce noise

  Giant's questions:
    Do you know when automobiles do not produce fumes?
    Do you know automobiles that have not always wheels?

  Giant's explanations:
    "automobiles have engines and engines produce noise; thus, automobiles produce noise"

Table 2: Test results of project "automobiles."

In both tests, the student was collaborating with the Giant actively, trying to answer all the Giant's questions. On average, the student refuted two out of ten questions with "silly" and supplied new concepts for three out of six questions.

Summary

We have developed a software agent for Project Quorum, called the Giant, with working versions in English, Spanish, and Portuguese. The Giant's task is to draw conclusions from the student's propositions and knowledge that is shared in the classroom, display them to the student, and ask for verification. Our tests have shown that the Giant produces claims that mostly are interesting and smart but sometimes are boring and silly. Most important, however, is that the Giant's claims can suggest new ways of thinking which arise from other propositions. As a consequence, students are prompted to analyze their own or other people's propositions, leading them to refine and broaden their own knowledge. We anticipate that this process will result in richer and broader concept maps.

We have introduced a relatively small set of inference rules with surprising results. The effort remains to enlarge significantly the set of rules that perform the automatic reasoning and to incorporate learning into the system that would enable the agent to test new facts against those discovered in previous sessions.

References

Allen, J. (1987). Natural Language Processing. New York: The Benjamin/Cummings Publishing Company.

Ausubel, D. P., Novak, J. D., & Hanesian, H. (1978). Educational Psychology: A Cognitive View (2nd ed.). New York: Holt, Rinehart & Winston. Reprinted, 1986. New York: Warbel & Peck.

Cañas, A. J., Ford, K., Hayes, P. J., Brennan, J., & Reichherzer, T. (1995). Knowledge Construction and Sharing in Quorum. Proceedings of AI-ED 95, Washington DC, pp. 218-225.

Cañas, A. J., Ford, K., Hayes, P. J., Brennan, J., Reichherzer, T., & Suri, N. (1996). An Environment for the Construction and Sharing of Knowledge. Proceedings of the Florida Artificial Intelligence Research Symposium (FLAIRS 96). Key West, pp. 238-242.

Novak, J. D. & Gowin, D. B. (1984). Learning How to Learn. New York: Cambridge University Press.