From: AAAI Technical Report SS-99-01. Compilation copyright © 1999, AAAI (www.aaai.org). All rights reserved.
A Comprehensive Approach to Argumentation
Philip N. Judson and Jonathan Vessey
LHASA Ltd
School of Chemistry
University of Leeds
Leeds LS2 9JT
UK
judson@dircon.co.uk
jonv@chem.leeds.ac.uk
Abstract

A model has been developed which implements the key principles of the Logic of Argumentation - a method of reasoning based on the human process of reaching conclusions by comparing arguments for and against propositions. The model is being used to improve reasoning about the potential toxicity of novel compounds.

Introduction

There is a need to make predictions about toxicities of novel substances, perhaps even before they have been synthesised. Experienced toxicologists are moderately good at doing this, even when the available information is very uncertain. How can computers assist them?

One approach seeks to imitate human experts in knowledge-based expert systems (KBS). There have been some successes in the use of KBS in toxicology. For example, the DEREK system is reported to be quite good at predicting the potential of chemicals as skin sensitizers (Dearden et al. 1997). Current KBS nevertheless fall a long way short of the reasoning capabilities of a human being.

The logic of argumentation (LA) (Krause et al. 1995) was used as the basis of a prototype improved system for predicting toxicity (Fox et al. 1996), but with a restricted model. This paper outlines a more complete model.

LA formalises the way that people reach decisions by identifying the arguments for and against a proposition, and then weighing them against each other. Some arguments relate directly to the decision or proposition of interest. Other arguments may be about the grounds of the direct ones, or about the reliability of the arguments themselves. So the reasoning process can be represented by a tree graph.

Arguments in a chain of reasoning in a complicated domain are rarely conclusive. Indeed, if there is a conclusive argument for or against a proposition, then the matter is decided and all other arguments are irrelevant (save in the case of a conclusive argument in opposition to the first, which is the special case of contradiction).

Reasoning about Toxicity

Three examples illustrate inter-relationships between arguments which a model for reasoning about potential toxicity needs to incorporate.

It is generally thought that chemicals showing carcinogenicity in rats should not automatically be expected to be similarly toxic to other species (including man) if they promote peroxisome proliferation, because this mechanism is peculiar to the rat. Of course, a chemical may be carcinogenic for different reasons, and so it is not valid to state, or to imply, that a compound that is carcinogenic to rats but causes peroxisome proliferation is safe to man - or even that the evidence argues in favour of safety. A reasoning model needs to be able to deal correctly with evidence that undercuts an argument, rather than negates it, like this.

Certain substructural features in chemicals are commonly associated with carcinogenicity. In some cases, this connection is not strictly correct, the reality being that positive Ames test results have been linked to the presence of the sub-structural features. The Ames test is an indirect indicator of potential carcinogenicity, but this distinction has been lost somewhere along the line between the original researchers and, ultimately, the popular press. A computer reasoning system should construct the overall prediction of carcinogenicity from the basic steps, but it should be able to report its reasoning in full, so that a user is properly informed.

Electrophiles with reactivities within a particular range cause skin sensitization. They must be sufficiently reactive to combine with proteins, leading to the raising of antibodies, but not so reactive that, becoming completely bound to the skin surface, they never reach the site of action. Such reactivity can be assumed with reasonable confidence for a substance that has certain substructural features (for example, an alpha-beta unsaturated aldehyde, which acts as a Michael acceptor). Substances containing the right features do not all provoke sensitization, for a variety of reasons. One is thought to be the failure of a substance to penetrate the skin because of its physico-chemical properties. A computer system needs to be able to modify its predictions about a chemical that meets the criteria for electrophilicity, after taking into account physical properties thought to affect skin penetration.
An Outline of the Model
The model we have developed will be described in detail elsewhere (Judson and Vessey, in press), and only an outline is given here. Figure 1 illustrates the main elements of a tree for reasoning in LA.

Figure 1 - A simple tree
The capital letters at the nodes of the graph represent assertions and the arcs represent arguments that connect them. The graph is a directed one, from left to right. So, for example, the tree indicates that the likelihood of C may be deduced from the likelihood of A. The lower case letters represent degrees of persuasiveness of the arguments themselves. So, for example, "a" is how likely C would be if A were proven true (or how unlikely, if the truth of A argues for the falseness of C).

Reasoning along the chain from A to E is like the prediction of carcinogenicity from a substructural feature associated with positive Ames test results. Step A to C might be "if such-and-such sub-structural feature is present in the molecule, a positive Ames test result is probable". Step C to E might be "if an Ames test result is positive, then carcinogenicity is plausible". A more complete implementation would further sub-divide the chain to include the implicit steps from "Ames test positive" to "genotoxic" and from "genotoxic" to "carcinogenic".
The persuasiveness of an argument can be subject to other influences, as in the case of predicting toxicity in man from observed toxicity in the rat, where the argument is weakened if peroxisome proliferation is also seen. This is the purpose of arcs leading towards the values of other arcs, such as the one leading from "F" to "a".
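
As an illustration only (a sketch of ours, not the authors' implementation; the class and field names are hypothetical), the chain from A to E, together with an arc bearing on the force of another argument, might be held in a structure such as the following:

    from dataclasses import dataclass
    from typing import Union

    @dataclass
    class Assertion:
        name: str                # e.g. "Ames test result is positive"
        belief: str = "open"     # current level of belief in the assertion

    @dataclass
    class Argument:
        grounds: Assertion                      # what the argument rests on
        target: Union["Assertion", "Argument"]  # a proposition, or the force of another argument
        confers: str                            # how strongly the target follows if the grounds are proven

    # The chain discussed above: sub-structural feature -> positive Ames test -> carcinogenic.
    A = Assertion("such-and-such sub-structural feature is present")
    C = Assertion("Ames test result is positive")
    E = Assertion("compound is carcinogenic")
    F = Assertion("another observation bearing on the first argument")

    a = Argument(A, C, confers="probable")    # step A to C
    b = Argument(C, E, confers="plausible")   # step C to E
    f = Argument(F, a, confers="doubted")     # an argument aimed at another argument's force, like "F" to "a" in Figure 1 (illustrative value)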
In some domains, there may be a case for allowing arguments of the kind "If A is plausible then C is true", which would require the notion of a threshold value for the argument linking A and C. "a'" represents this threshold value. The concept is included for completeness but it will not be discussed further in this paper.
In LA an argument for a proposition is considered to have no direct effect on belief that the proposition is false, and vice versa. For example, observed toxicity to an animal may support the proposition that a chemical may be toxic in man: absence of observed toxicity in the animal does not support the proposition that the chemical is non-toxic in man; there is simply a lack of evidence to support the proposition that it is.
This does not mean that negative reasoning is excluded, only that it must be explicit. To assert that the absence of toxicity in laboratory animals makes toxicity in man unlikely, it is necessary to write a rule such as, "If no toxic effects are observed in laboratory animals then it is doubted that the chemical will be toxic in man".
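
To make this concrete (a purely illustrative sketch of ours; the rule wording is the paper's, but the dictionary representation and keys are not), such a negative rule is written as an argument in its own right, conferring a value from the Against scale described later:

    # An explicit negative rule: absence of toxicity in laboratory animals confers
    # 'doubted' (a value from the Against scale) on the proposition that the
    # chemical will be toxic in man.
    negative_rule = {
        "if":      "no toxic effects are observed in laboratory animals",
        "then":    "the chemical will be toxic in man",
        "confers": "doubted",
    }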
In the implementation of the model which we favour, if there are several arguments in support of a conclusion, the level of belief in the conclusion is equal to the level conferred by the strongest argument. For example, it does not matter how many independent arguments support the view that a conclusion is plausible, it remains plausible. No number of reasons for thinking something is plausible should make it a certainty, to use an extreme illustration.
If the weights of argument for and against a conclusion are equal, belief in the conclusion is classed as equivocal. This is distinguished from the case where there is no evidence either way, which is classed as open. For example, given the information that a chemical shows carcinogenicity in the rat but also causes peroxisome proliferation, it may be appropriate to class the likelihood that the chemical will be carcinogenic in man as open, whereas toxicological concern would be equivocal for a chemical with the right structural features for a skin sensitizer but with physical properties on the borderline for skin penetration.
Given the statement "if A is true then the level of belief in C is a", what should happen if A is less than true? In the model, the level of C becomes whichever is the weaker - the level of belief in A or the value of a. Suppose, for illustration, that 'probable' means more certain than 'plausible' and 'certain' means completely certain. Given the statement "If A is certain then C is probable": C can never be predicted to be more than probable on the basis of evidence about A; if A is probable, then so is C; if A is plausible, then C is also only plausible.
As commented above, if A is proven false, nothing can be said about C on the basis of this evidence. It is not valid to conclude that C is also false, because it may be true for some other reason. More generally, if evidence about the truth of A falls anywhere in the realm of doubt rather than belief, nothing can be concluded about C.
The separation of arguments for and against raises the possibility of contradiction. Theoretically, there might be an argument that proves a conclusion to be true and an independent argument that proves it to be false. Such a situation can arise in practice, too - as the result of erroneous observations, for example, or because of genuine scientific contradiction that has not yet been resolved.
In the implementation of the model which we are using, contradiction dominates among arguments relating directly to the same proposition, unless it is over-ruled by an independent argument that the proposition is true or false: it is usual scientific practice when confronted with apparent contradiction to seek out alternative evidence to resolve the difficulty. Contradiction is not transmitted as contradiction from one level to the next in the reasoning process, but is interpreted as the open state. Given that "if A is true then C is plausible", for example, contradiction in A means that A is either true or false. If it is true then C is plausible. As explained above, if A is false then C is open, which includes the possibility of its being plausible. So, overall, C is open.
The Basic Model
The central components of the model are the methods for aggregating arguments relating to the same proposition (e.g. the combination of degrees of belief in E conferred via the arcs from C and D in Figure 1) and for determining the degree of belief conferred by each argument (e.g. the belief transmitted to C from A). They are as follows.
Combining Arguments
The aggregate value for a node, T, is determined by:

    T = Resolve[Max{For(Ca.x, Cb.y, ...)}, Max{Against(Ca.x, Cb.y, ...)}]

where: Resolve[] returns the single value which is the resolution of any pair of opposing forces in a resolution matrix (see below); the sets For and Against are the sets of arguments supporting and opposing T, respectively; Ca.x, Cb.y, ... are the forces of those arguments; Max{...} returns the member of the set upon which it operates which is highest in an ordered list For or Against, as appropriate (see below).
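
A minimal sketch of this aggregation step (our code, not the authors'; the function names are hypothetical, and Resolve[] is left as a lookup to be supplied, for example from Table 1):

    FOR_ORDER = ["certain", "contradicted", "probable", "plausible", "equivocal", "open"]         # strongest first
    AGAINST_ORDER = ["impossible", "contradicted", "improbable", "doubted", "equivocal", "open"]  # strongest first

    def strongest(forces, order):
        """Max{...}: the member of the set ranking highest in the ordered list ('open' if the set is empty)."""
        return min(forces, key=order.index) if forces else "open"

    def aggregate(for_forces, against_forces, resolve):
        """T = Resolve[Max{For(...)}, Max{Against(...)}] for a single proposition."""
        return resolve(strongest(for_forces, FOR_ORDER),
                       strongest(against_forces, AGAINST_ORDER))

    # Example: supporting forces ['plausible', 'probable'] and opposing force ['doubted']
    # reduce to Resolve[probable, doubted], which Table 1 resolves to 'plausible'.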
Reasoning along a Chain
The value transmitted along an arc in the decision tree for the statement, "if a is x0 then C is x", is determined as follows:

    if x0 is exclusive then
        if a = x0 then
            C = x
        else
            C = open
        endif
    else if x ∈ Subset(¬a) then
        C = open
    else if a = Min{a, x0} then
        C = Min{a, x}
    else
        C = x
    endif

where: Subset(¬a) is the set of terms, For or Against, which does not contain the value 'a'; Min{a, x} is the term ranking lower in the ordered list, For or Against, to which a and x belong. The term 'exclusive' is explained below.
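
Read as runnable code, the pseudocode might look like the sketch below (our reading, not the authors' implementation). Following the discussion of contradiction above, the sketch also lets exclusive values of a propagate only on an exact match, so that contradiction in A is passed on as open:

    FOR_ORDER = ["certain", "contradicted", "probable", "plausible", "equivocal", "open"]         # strongest first
    AGAINST_ORDER = ["impossible", "contradicted", "improbable", "doubted", "equivocal", "open"]  # strongest first
    EXCLUSIVE = {"contradicted", "equivocal", "open"}

    def weaker(u, v, order):
        """Min{u, v}: the term ranking lower in the ordered list to which u and v belong."""
        return max(u, v, key=order.index)

    def transmit(a, x0, x):
        """Value passed along an arc for the rule 'if a is x0 then C is x'."""
        if x0 in EXCLUSIVE or a in EXCLUSIVE:
            # Exclusive terms propagate only on an exact match; e.g. contradiction in A yields open.
            return x if a == x0 else "open"
        order = FOR_ORDER if a in FOR_ORDER else AGAINST_ORDER
        if x0 not in order or x not in order:
            # The rule refers to the set not containing 'a' (Subset(¬a)): nothing can be concluded.
            return "open"
        if a == weaker(a, x0, order):
            return weaker(a, x, order)   # C takes the weaker of the belief in A and the arc value
        return x

    # For the rule "if A is certain then C is probable":
    #   transmit("probable", "certain", "probable")  -> 'probable'
    #   transmit("plausible", "certain", "probable") -> 'plausible'
    #   transmit("doubted", "certain", "probable")   -> 'open'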
The Resolution Matrix
For the equation above to operate it is necessary to define a matrix for the resolution of pairs of levels of belief. The matrix which we are using is shown in Table 1.
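
For concreteness, the matrix can be held as a simple lookup (our transcription of Table 1, which appears at the end of the paper; rows are For values and columns are Against values):

    AGAINST = ["impossible", "contradicted", "improbable", "doubted", "equivocal", "open"]

    # Resolution matrix transcribed from Table 1 (rows: For values; columns: Against values).
    TABLE_1 = {
        "certain":      ["contradicted", "certain",      "certain",      "certain",      "certain",      "certain"],
        "contradicted": ["impossible",   "contradicted", "contradicted", "contradicted", "contradicted", "contradicted"],
        "probable":     ["impossible",   "contradicted", "equivocal",    "plausible",    "probable",     "probable"],
        "plausible":    ["impossible",   "contradicted", "doubted",      "equivocal",    "plausible",    "plausible"],
        "equivocal":    ["impossible",   "contradicted", "improbable",   "doubted",      "equivocal",    "equivocal"],
        "open":         ["impossible",   "contradicted", "improbable",   "doubted",      "equivocal",    "open"],
    }

    def resolve(for_value, against_value):
        """Resolve[]: the resolution of a pair of opposing levels of belief."""
        return TABLE_1[for_value][AGAINST.index(against_value)]

    # e.g. resolve("probable", "doubted") -> 'plausible'; resolve("certain", "impossible") -> 'contradicted'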
Ordered Sets For and Against

We are using the following sets:

    For            Against
    certain        impossible
    contradicted   contradicted
    probable       improbable
    plausible      doubted
    equivocal      equivocal
    open           open
The operations described above are only valid for terms that can be ordered and resolved unambiguously. For the purposes of this implementation of the model the terms are defined as follows (note that in some cases the definitions are more restrictive than the meanings of the words in common usage):

certain       there is proof that the proposition is true
probable      there is at least one strong argument that the proposition is true and there are no arguments against it
plausible     the weight of evidence supports the proposition
equivocal     there is an equal weight of evidence for and against the proposition
improbable    there is at least one strong argument that the proposition is false and there are no arguments that it is true
impossible    there is proof that the proposition is false
contradicted  there is apparently proof that the proposition is both true and false
open          there is no evidence that supports or opposes the proposition

Exclusivity

When some terms from the above sets are attached to a given argument or proposition, they imply the terms below them in the list. For example, if something is known to be certain, then the answer to the question "is it plausible?" is "yes". Some terms exclude all others. For example, the state of knowledge about something cannot be open and simultaneously have some other value. The latter terms are described as 'exclusive'. The members of the sets above are categorised as follows:

Exclusive terms: contradicted, equivocal, open.
Non-exclusive terms: impossible, improbable, doubted, plausible, probable, certain.

Conclusion

This paper outlines a model which we believe can be used for reasoning under uncertainty in any domain. Our current interest is in using it to improve the performance of knowledge-based systems for toxicology.

Acknowledgements

We wish particularly to thank John Fox for his vision in promoting the basic concepts of LA and Paul Krause for his advice and comments during the development of the model described here.
References

Dearden, J. C.; Barratt, M. D.; Benigni, R.; Bristol, D. W.; Combes, R. D.; Cronin, M. T. D.; Judson, P. N.; Payne, M. P.; Richard, A. M.; Tichy, M.; Worth, A. P.; and Yourick, J. J. 1997. The Development and Validation of Expert Systems for Predicting Toxicity. ATLA 25:223-252.

Judson, P. N. and Vessey, J. 1999. A Comprehensive Approach to Argumentation. Submitted to the IMA Journal of Mathematics Applied in Business and Industry.

Krause, P. J.; Ambler, S. J.; Elvang-Gøransson, M.; and Fox, J. 1995. A Logic of Argumentation for Reasoning Under Uncertainty. Computational Intelligence 11(1):113-121.
Table 1 - The resolution matrix (rows: For values; columns: Against values)

              impossible    contradicted  improbable    doubted       equivocal     open
certain       contradicted  certain       certain       certain       certain       certain
contradicted  impossible    contradicted  contradicted  contradicted  contradicted  contradicted
probable      impossible    contradicted  equivocal     plausible     probable      probable
plausible     impossible    contradicted  doubted       equivocal     plausible     plausible
equivocal     impossible    contradicted  improbable    doubted       equivocal     equivocal
open          impossible    contradicted  improbable    doubted       equivocal     open