Uncertainty

Chapter 14
Uncertainty
Evolution of an intelligent agent: problem solving,
planning, uncertainty
Uncertainty is an unavoidable problem in the real world.
An agent must act under uncertainty.
To make decisions under uncertainty, we need:
Probability theory
Utility theory
Decision theory
Sources of uncertainty
No access to the whole truth
No categorical answer
Incompleteness
The qualification problem - impossible to explicitly
enumerate all conditions
Incorrectness of information about conditions
The rational decision depends on both the
relative importance of the various goals and the
likelihood that they will be achieved.
Handling uncertain knowledge
Difficulties in using FOL to cope with uncertain knowledge (UK)
A dental diagnosis system using FOL
Reasons
Laziness - too much work to list every condition
Theoretical ignorance - we don't know everything about the domain
Practical ignorance - even with complete rules, we may lack all the facts about a particular case
Represent UK with a degree of belief
The tool for handling UK is probability theory
Probability provides a way of summarizing the
uncertainty that comes from our laziness and
ignorance - how wonderful it is!
Probability expresses a degree of belief in the truth of a sentence
1 - true, 0 - false, 0<P<1 - intermediate degrees of
belief in the truth of the sentence
Degree of truth (fuzzy logic) vs. degree of belief
Alternatives to probability theory?
All probability statements must indicate
the evidence with respect to which the probability is
being assessed.
Prior or unconditional probability before
evidence is obtained
Posterior or conditional probability after new
evidence is obtained
Uncertainty & rational decisions
Without uncertainty, decision making is simple - either the goal is achieved or it is not
With uncertainty, the choice is no longer clear-cut - e.g., the three
plans A90, A120, and A1440 (leave for the airport 90, 120, or 1440 minutes before the flight)
We first need preferences between the
different possible outcomes of the plans
 Utility theory is used to represent and reason with
preferences.
Rationality
Decision theory = probability theory + utility theory
The Maximum Expected Utility (MEU) principle defines
rationality
An agent is rational iff it chooses the action that
yields the highest expected utility, averaged over all possible
outcomes of the action
A decision-theoretic agent (Fig 14.1, p 419)
Is it any different from the other agents we have learned about?
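A minimal sketch of the MEU principle in code; only the plan names A90, A120, and A1440 come from the slides, and the outcome probabilities and utilities are invented purely for illustration:

```python
# Sketch: choose the action with the highest expected utility,
# averaged over the possible outcomes of each action.
# The (probability, utility) pairs below are made up for illustration.

actions = {
    "A90":   [(0.95, 60), (0.05, -1000)],
    "A120":  [(0.99, 50), (0.01, -1000)],
    "A1440": [(0.9999, -200), (0.0001, -1000)],
}

def expected_utility(outcomes):
    """Average the utility over all possible outcomes of an action."""
    return sum(p * u for p, u in outcomes)

for name, outcomes in actions.items():
    print(name, expected_utility(outcomes))

# The rational (MEU) choice is the action with the highest expected utility.
best = max(actions, key=lambda name: expected_utility(actions[name]))
print("MEU choice:", best)
```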
Basic probability notation
Prior probability
Proposition - P(Sunny)
Random variable - P(Weather=Sunny)
Each RV has a domain, e.g., (sunny, rain, cloudy, snow)
Probability distribution P(Weather) = <0.7, 0.2, 0.08, 0.02>
Joint probability P(A^B)
probabilities of all combinations of the values of a set
of RVs
more on this later
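As a concrete illustration, the distribution P(Weather) above could be represented as a simple mapping from domain values to probabilities; this is a sketch of one possible representation, not anything prescribed by the text:

```python
# P(Weather) as a mapping from each value in the domain to its probability.
P_weather = {"sunny": 0.70, "rain": 0.20, "cloudy": 0.08, "snow": 0.02}

assert abs(sum(P_weather.values()) - 1.0) < 1e-9   # a distribution sums to 1
print(P_weather["sunny"])                           # P(Weather = sunny) = 0.7
```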
Conditional probability
P(A|B) = P(A^B)/P(B)
Product rule - P(A^B) = P(A|B)P(B)
Probabilistic inference does not work like logical
inference
“P(A|B)=0.6” != “whenever B is true, P(A) is 0.6”
As evidence accumulates, we must condition on all of it:
P(A), then P(A|B), then P(A|B,C), ...
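A small sketch of the definition and the product rule; the values of P(A^B) and P(B) are arbitrary numbers assumed for illustration:

```python
# Definition: P(A|B) = P(A ^ B) / P(B). The numbers are arbitrary.
P_A_and_B = 0.12
P_B = 0.30

P_A_given_B = P_A_and_B / P_B      # 0.4
print(P_A_given_B)

# Product rule: P(A ^ B) = P(A|B) P(B)
assert abs(P_A_given_B * P_B - P_A_and_B) < 1e-9
```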
The axioms of probability
All probabilities are between 0 and 1
Necessarily true (valid) propositions have probability 1,
necessarily false (unsatisfiable) propositions have probability 0
The probability of a disjunction
P(AvB)=P(A)+P(B)-P(A^B)
A Venn diagram illustration
The joint probability distribution
The Joint completely specifies an agent’s
probability assignments to all propositions
in the domain
A probabilistic model consists of a set of
random variables (X1, …,Xn).
An atomic event is an assignment of
particular values to all the variables.
Joint probabilities
An example of two Boolean variables:

              Toothache   !Toothache
  Cavity         0.04        0.06
  !Cavity        0.01        0.89
• Observations: the atomic events are mutually exclusive and
collectively exhaustive (the entries sum to 1)
• What are P(Cavity), P(Cavity v Toothache),
P(Cavity|Toothache)?
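One way to answer these questions is to read them off the joint table directly; the sketch below uses the entries from the table above (the dictionary representation is just an illustration):

```python
# Joint distribution over (Cavity, Toothache), from the table above.
joint = {
    (True,  True):  0.04,   # Cavity ^ Toothache
    (True,  False): 0.06,   # Cavity ^ !Toothache
    (False, True):  0.01,   # !Cavity ^ Toothache
    (False, False): 0.89,   # !Cavity ^ !Toothache
}

# P(Cavity): sum over the atomic events in which Cavity is true.
P_cavity = sum(p for (c, t), p in joint.items() if c)                 # 0.10

# P(Cavity v Toothache): sum over the events in which either is true.
P_cavity_or_tooth = sum(p for (c, t), p in joint.items() if c or t)   # 0.11

# P(Cavity | Toothache) = P(Cavity ^ Toothache) / P(Toothache)
P_tooth = sum(p for (c, t), p in joint.items() if t)                  # 0.05
P_cavity_given_tooth = joint[(True, True)] / P_tooth                  # 0.80

print(P_cavity, P_cavity_or_tooth, P_cavity_given_tooth)
```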
Joint (2)
Impractical to specify all the entries for
the Joint over n Boolean variables (2^n entries).
If there is a Joint, we can read off any
probability we need.
Sidestep the Joint and work directly with
conditional probability
Bayes’ rule
Deriving the rule via the product rule
P(B|A) = P(A|B)P(B)/P(A)
A more general case is P(X|Y) = P(Y|X)P(X)/P(Y)
Bayes’ rule conditionalized on evidence E
P(X|Y,E) = P(Y|X,E)P(X|E)/P(Y|E)
Applying the rule to medical diagnosis
meningitis (P(M)=1/50,000), stiff neck (P(S)=1/20),
P(S|M)=0.5; what is P(M|S)?
Why is this kind of inference useful?
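A worked sketch of the calculation with the numbers given above:

```python
# Bayes' rule for the meningitis example: P(M|S) = P(S|M) P(M) / P(S).
P_M = 1 / 50000       # prior probability of meningitis
P_S = 1 / 20          # prior probability of a stiff neck
P_S_given_M = 0.5     # probability of a stiff neck given meningitis

P_M_given_S = P_S_given_M * P_M / P_S
print(P_M_given_S)    # 0.0002 -- meningitis remains very unlikely
```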
Applying Bayes’ rule
Relative likelihood
Comparing the relative likelihood of meningitis and
whiplash given a stiff neck: which is more likely?
P(M|S)/P(W|S) = (P(S|M)P(M)) / (P(S|W)P(W))
Avoiding direct assessment of the prior P(S)
P(M|S) = ? P(!M|S) = ? Since P(M|S) + P(!M|S) = 1, we
need P(S|M) and P(S|!M), but never P(S) itself
Normalization - P(Y|X) = αP(X|Y)P(Y)
How to normalize (Ex 14.7)?
Make the entries in the table P(Y|X) sum to 1 (α is the normalizing constant)
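A sketch of this normalization for the meningitis example; note that P(S|!M) is not given in the slides, so the value below is an assumption chosen only to make the arithmetic concrete:

```python
# Normalization: P(M|S) = alpha * P(S|M) P(M) and P(!M|S) = alpha * P(S|!M) P(!M),
# where alpha = 1/P(S) is fixed by requiring the two posteriors to sum to 1,
# so P(S) itself is never assessed directly.
P_M = 1 / 50000
P_S_given_M = 0.5
P_S_given_not_M = 0.05        # assumed value, not given in the slides

unnormalized = [P_S_given_M * P_M, P_S_given_not_M * (1 - P_M)]
alpha = 1 / sum(unnormalized)
P_M_given_S, P_not_M_given_S = (alpha * u for u in unnormalized)

print(P_M_given_S, P_not_M_given_S)     # the two posteriors sum to 1
```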
Using Bayes’ rule
Combining evidence
from P(Cavity|Toothache) and P(Cavity|Catch) to
P(Cavity|Toothache,Catch)
Bayesian updating
from P(Cavity|T) = P(Cavity)P(T|Cavity)/P(T)
to P(Cavity|T,Catch) = P(Cavity|T) P(Catch|T,Cavity)/P(Catch|T)
Independent events A, B
P(B|A)=P(B), P(A|B)=P(A), P(A,B)=P(A)P(B)
Conditional independence (X and Y are independent given Z)
P(X|Y,Z)=P(X|Z)
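A sketch of combining the two pieces of evidence under this conditional independence assumption; all of the numbers below are invented for illustration:

```python
# Combining evidence with conditional independence:
# P(Cavity | T, Catch) is proportional to P(Cavity) P(T|Cavity) P(Catch|Cavity),
# assuming T and Catch are conditionally independent given Cavity.
# All numbers below are made up for illustration.
P_cavity_prior = {True: 0.10, False: 0.90}
P_T_given_cavity = {True: 0.80, False: 0.05}       # P(Toothache | Cavity = c)
P_catch_given_cavity = {True: 0.90, False: 0.20}   # P(Catch | Cavity = c)

unnormalized = {c: P_cavity_prior[c] * P_T_given_cavity[c] * P_catch_given_cavity[c]
                for c in (True, False)}
alpha = 1 / sum(unnormalized.values())
posterior = {c: alpha * v for c, v in unnormalized.items()}

print(posterior[True])   # P(Cavity | Toothache, Catch), about 0.89 with these numbers
```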
Where do probabilities come from?
There are three positions:
The frequentist - numbers can come only from experiments
The objectivist - probabilities are real aspects of the universe
The subjectivist - characterizing an agent’s belief
What’s the probability that the sun will still exist
tomorrow? (P 430)
The reference class problem
The doctor categorizes patients - an example
Summary
Uncertainty exists in the real world.
This is both good (it allows for laziness) and bad (we
need new tools)
Priors, posteriors, and the joint distribution
Bayes' rule - the basis of Bayesian inference
Conditional independence allows Bayesian
updating to work effectively with many pieces
of evidence.
But ...