Logic 1

CSM6120
Introduction to Intelligent Systems
Propositional and Predicate Logic
rkj@aber.ac.uk
Logic and AI

We would like our AI to have knowledge about the world, and to draw conclusions from it logically

Search algorithms generate successors and evaluate them,
but do not “understand” much about the setting

Example question: is it possible for a chess player to have
8 pawns and 2 queens?

Search algorithm could search through tons of states to see if
this ever happens…
Types of knowledge


There are four basic sorts of knowledge (different sources may vary as to number and definitions)

Declarative
  Facts
  E.g. Joe has a car

Procedural
  Knowledge exercised in the performance of some task
  E.g. what to check if the car won't start
    First check a, then b, then c, until the car starts
Types of knowledge

Meta
  Knowledge about knowledge
  E.g. knowing how an expert system came to its conclusion

Heuristic
  Rules of thumb, empirical
  Knowledge gained from experience, not proven
    i.e. it could be wrong!
Methods of representation

Object-attribute-value

Say you have an oak tree. You want to encode the fact that it is a tree, and that its species is oak

There are three pieces of information in there:
  It's a tree
  It has (belongs to) a species
  The species is oak

The fact that all trees have a species might be obvious to you, but it's not obvious to a computer
  We have to encode it
How to encode?

Maybe something like this:
  (species tree oak)
  species(tree, oak)
  tree(species, oak)

Or some other similar method that your program can parse and understand
What if you’re not sure?

You could include an uncertainty factor

The uncertainty factor is a number which can be taken into
account by your program when making its decisions

The final conclusion of any program where uncertainty
was used in the input is likely to also have an uncertainty
factor

If you're not sure of the facts, how can the program be
sure of the result?
Encoding uncertainty

To encode uncertainty, we may do something like this:


species(tree, oak, 0.8)
This would mean we were only 80% sure that the tree
concerned was an oak tree
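A minimal sketch (not from the slides) of how such object-attribute-value facts with a certainty factor might be stored and queried; the tuple format, the lookup helper and the 0.8 value are illustrative assumptions:

```python
# Hypothetical sketch: object-attribute-value facts with a certainty factor.
# The representation (tuples in a list) and the values are illustrative only.

facts = [
    ("tree", "species", "oak", 0.8),    # only 80% sure the tree is an oak
    ("tree", "type", "deciduous", 1.0), # certain
]

def lookup(obj, attr):
    """Return (value, certainty) pairs recorded for an object's attribute."""
    return [(value, cf) for (o, a, value, cf) in facts if o == obj and a == attr]

print(lookup("tree", "species"))  # [('oak', 0.8)]
```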
Rules

A knowledge base (KB) may have rules

IF premises THEN conclusion

May be more than one premise

May contain logical functions
  AND, OR, NOT

If the premises evaluate to TRUE, the rule fires
  IF tree(species, oak) THEN tree(type, deciduous)
More complex rules
IF tree is a beech
AND the season is summer
THEN tree is green

IF tree is a beech
AND the season is summer
AND tree-type is red beech
THEN tree is red

This example illustrates a problem whereby two rules could both fire, but lead to different conclusions

Strategies to resolve this include firing the most specific rule first
  The second rule is more specific
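One way this conflict-resolution strategy could be sketched in Python; the rule representation as (premises, conclusion) pairs and the fire helper are assumptions for illustration only:

```python
# Illustrative sketch: resolve conflicts by firing the most specific rule
# (the one with the most matching premises) first.

rules = [
    ({"tree is a beech", "season is summer"}, "tree is green"),
    ({"tree is a beech", "season is summer", "tree-type is red beech"}, "tree is red"),
]

def fire(working_memory):
    # Keep only rules whose premises all hold, then prefer the most specific one.
    applicable = [r for r in rules if r[0] <= working_memory]
    if not applicable:
        return None
    premises, conclusion = max(applicable, key=lambda r: len(r[0]))
    return conclusion

print(fire({"tree is a beech", "season is summer", "tree-type is red beech"}))  # tree is red
```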
What is a Logic?

A language with concrete rules
  No ambiguity in representation (may be other errors!)
  Allows unambiguous communication and processing
  Very unlike natural languages, e.g. English

Many ways to translate between languages
  A statement can be represented in different logics
  And perhaps differently in the same logic

Expressiveness of a logic
  How much can we say in this language?

Not to be confused with logical reasoning
  Logics are languages, reasoning is a process (which may use logic)
Syntax and semantics

Syntax
  Rules for constructing legal sentences in the logic
  Which symbols we can use (English: letters, punctuation)
  How we are allowed to combine symbols

Semantics
  How we interpret (read) sentences in the logic
  Assigns a meaning to each sentence

Example: "All lecturers are seven foot tall"
  A valid sentence (syntax)
  And we can understand the meaning (semantics)
  This sentence happens to be false (there is a counterexample)
Propositional logic

Syntax
  Propositions, e.g. "it is wet"
  Connectives: and, or, not, implies, iff (equivalent)
  Brackets (), T (true) and F (false)

Semantics (classical/Boolean)
  Defines how connectives affect truth
    "P and Q" is true if and only if P is true and Q is true
  Use truth tables to work out the truth of statements
A story


Your Friend comes home; he is completely wet
You know the following things:
  Your Friend is wet
  If your Friend is wet, it is because of rain, sprinklers, or both
  If your Friend is wet because of sprinklers, the sprinklers must be on
  If your Friend is wet because of rain, your Friend must not be carrying the umbrella
  The umbrella is not in the umbrella holder
  If the umbrella is not in the umbrella holder, either you must be carrying the umbrella, or your Friend must be carrying the umbrella
  You are not carrying the umbrella

Can you conclude that the sprinklers are on?

Can AI conclude that the sprinklers are on?
Knowledge base for the story

FriendWet
FriendWet → (FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers)
FriendWetBecauseOfSprinklers → SprinklersOn
FriendWetBecauseOfRain → NOT(FriendCarryingUmbrella)
UmbrellaGone
UmbrellaGone → (YouCarryingUmbrella OR FriendCarryingUmbrella)
NOT(YouCarryingUmbrella)
Syntax

What do well-formed sentences in the knowledge base look like?

A BNF grammar:
  Symbol := P, Q, R, …, FriendWet, …
  Sentence := True | False | Symbol | NOT(Sentence) | (Sentence AND Sentence) | (Sentence OR Sentence) | (Sentence → Sentence)

We will drop parentheses sometimes, but formally they should always be there
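As an illustration (not the slides' notation), the same grammar could be mirrored in Python by representing a sentence as a symbol string or a nested tuple headed by its connective:

```python
# Illustrative encoding of the grammar: a sentence is either a symbol (string),
# True/False, or a tuple whose first element names the connective.

FriendWet = "FriendWet"
FWBOR = "FriendWetBecauseOfRain"
FWBOS = "FriendWetBecauseOfSprinklers"

# FriendWet → (FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers)
sentence = ("IMPLIES", FriendWet, ("OR", FWBOR, FWBOS))
print(sentence)
```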
Semantics

A model specifies which of the propositional symbols are true and which are false
  Given a model, I should be able to tell you whether a sentence is true or false

Truth table defines semantics of the operators:

  a      b      NOT(a)   a AND b   a OR b   a → b
  false  false  true     false     false    true
  false  true   true     false     true     true
  true   false  false    false     true     false
  true   true   false    true      true     true

• Given a model, can compute the truth of a sentence recursively with these
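A minimal sketch of that recursive computation, assuming the tuple encoding from the earlier sketch; a model is a dict mapping symbol names to truth values:

```python
# Evaluate a sentence (nested tuples as above) in a model: dict symbol -> bool.
def holds(sentence, model):
    if sentence is True or sentence is False:
        return sentence
    if isinstance(sentence, str):            # propositional symbol
        return model[sentence]
    op, *args = sentence
    if op == "NOT":
        return not holds(args[0], model)
    if op == "AND":
        return holds(args[0], model) and holds(args[1], model)
    if op == "OR":
        return holds(args[0], model) or holds(args[1], model)
    if op == "IMPLIES":
        return (not holds(args[0], model)) or holds(args[1], model)
    raise ValueError(op)

print(holds(("IMPLIES", "P", "Q"), {"P": True, "Q": False}))  # False
```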
Tautologies

A sentence is a tautology if it is true for any setting of its propositional symbols

  P      Q      P OR Q   NOT(P) AND NOT(Q)   (P OR Q) OR (NOT(P) AND NOT(Q))
  false  false  false    true                true
  false  true   true     false               true
  true   false  true     false               true
  true   true   true     false               true

• (P OR Q) OR (NOT(P) AND NOT(Q)) is a tautology
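Assuming the holds function from the earlier sketch is in scope, a tautology check can be sketched by enumerating every setting of the symbols:

```python
from itertools import product

def is_tautology(sentence, symbols):
    # True iff the sentence holds in every assignment of truth values to symbols.
    return all(holds(sentence, dict(zip(symbols, values)))
               for values in product([False, True], repeat=len(symbols)))

s = ("OR", ("OR", "P", "Q"), ("AND", ("NOT", "P"), ("NOT", "Q")))
print(is_tautology(s, ["P", "Q"]))  # True
```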
Is this a tautology?

(P → Q) OR (Q → P)
Logical equivalences

Two sentences are logically equivalent if they have the same truth value for every setting of their propositional variables

  P      Q      P OR Q   NOT(NOT(P) AND NOT(Q))
  false  false  false    false
  false  true   true     true
  true   false  true     true
  true   true   true     true

• P OR Q and NOT(NOT(P) AND NOT(Q)) are logically equivalent. Tautology = logically equivalent to True
Famous logical equivalences

(a OR b) ≡ (b OR a)                              commutativity
(a AND b) ≡ (b AND a)                            commutativity
((a AND b) AND c) ≡ (a AND (b AND c))            associativity
((a OR b) OR c) ≡ (a OR (b OR c))                associativity
NOT(NOT(a)) ≡ a                                  double-negation elimination
(a → b) ≡ (NOT(b) → NOT(a))                      contraposition
(a → b) ≡ (NOT(a) OR b)                          implication elimination
NOT(a AND b) ≡ (NOT(a) OR NOT(b))                De Morgan
NOT(a OR b) ≡ (NOT(a) AND NOT(b))                De Morgan
(a AND (b OR c)) ≡ ((a AND b) OR (a AND c))      distributivity
(a OR (b AND c)) ≡ ((a OR b) AND (a OR c))       distributivity
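Any of these equivalences can be checked the same way; the sketch below (again assuming the earlier holds function is in scope) tests one De Morgan law by comparing truth values in every model:

```python
from itertools import product

def equivalent(s1, s2, symbols):
    # Same truth value in every setting of the propositional variables.
    return all(holds(s1, m) == holds(s2, m)
               for values in product([False, True], repeat=len(symbols))
               for m in [dict(zip(symbols, values))])

# De Morgan: NOT(a AND b) is equivalent to NOT(a) OR NOT(b)
print(equivalent(("NOT", ("AND", "a", "b")),
                 ("OR", ("NOT", "a"), ("NOT", "b")), ["a", "b"]))  # True
```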
Entailment

A set of premises A logically entails a conclusion B (written as A |= B) if and only if every model/interpretation that satisfies the premises also satisfies the conclusion, i.e. A → B is valid
  i.e. B is a valid consequence of A

{p} |= (p ∨ q)
{p} ⊭ (p ∧ q)
  (counterexample: p=T, q=F)
{p, q} |= (p ∧ q)
Entailment

We have a knowledge base (KB) of things that we know are true
  FriendWetBecauseOfSprinklers
  FriendWetBecauseOfSprinklers → SprinklersOn

Can we conclude that SprinklersOn?
We say SprinklersOn is entailed by the KB if, for every setting of the propositional variables for which the KB is true, SprinklersOn is also true

  FWBOS  SprinklersOn   Knowledge base
  false  false          false
  false  true           false
  true   false          false
  true   true           true

SprinklersOn is entailed! (KB |= SprinklersOn)
Inconsistent knowledge bases

Suppose we were careless in how we specified our knowledge base:
  PetOfFriendIsABird → PetOfFriendCanFly
  PetOfFriendIsAPenguin → PetOfFriendIsABird
  PetOfFriendIsAPenguin → NOT(PetOfFriendCanFly)
  PetOfFriendIsAPenguin

No setting of the propositional variables makes all of these true

Therefore, technically, this knowledge base implies/entails anything…
  e.g. TheMoonIsMadeOfCheese
Satisfiability

A sentence is satisfiable if it is true in some model
  If a sentence α is true in model m, then m satisfies α, or we say m is a model of α
  A KB is satisfiable if a model m satisfies all clauses in it

How do you verify if a sentence is satisfiable?
  Enumerate the possible models, until one is found to satisfy it
  Can order the models so as to evaluate the most promising ones first

Many problems in Computer Science can be reduced to a satisfiability problem
  E.g. a CSP is all about verifying whether the constraints are satisfiable by some assignment
Inference

Inference is the process of deriving a specific sentence from a KB (where the sentence must be entailed by the KB)
  KB |-i a = sentence a can be derived from KB by procedure i

"KBs are a haystack"
  Entailment = needle in haystack
  Inference = finding it

If the KB is true in the real world, then any sentence derived from the KB by a sound inference procedure is also true in the real world
Simple algorithm for inference

Want to find out if sentence a is entailed by the knowledge base…

For every possible setting of the propositional variables,
  If the knowledge base is true and a is false, return false
Return true

Not very efficient: 2^(number of propositional variables) settings
  There can be many propositional variables among the premises that are irrelevant to the conclusion. Much wasted work!
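A sketch of this enumeration algorithm, assuming the holds function and tuple encoding from the earlier sketches; the knowledge base is just a list of sentences:

```python
from itertools import product

def entails(kb, query, symbols):
    # KB |= query iff query is true in every model that makes every KB sentence true.
    for values in product([False, True], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if all(holds(s, model) for s in kb) and not holds(query, model):
            return False   # found a model where the KB holds but the query fails
    return True

kb = ["FWBOS", ("IMPLIES", "FWBOS", "SprinklersOn")]
print(entails(kb, "SprinklersOn", ["FWBOS", "SprinklersOn"]))  # True
```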
Inference by enumeration
Proof methods


Proof methods divide into (roughly) two kinds:

Model checking (inference by enumeration)
  Truth table enumeration (always exponential in n)
  Improved backtracking, e.g. Davis-Putnam-Logemann-Loveland (DPLL)
  Heuristic search in model space (sound but incomplete)

Application of inference rules/reasoning patterns
  Legitimate (sound) generation of new sentences from old
  Proof = a sequence of inference rule applications
  Can use inference rules as operators in a standard search algorithm
  Typically requires transformation of sentences into a normal form
Reasoning patterns

Obtain new sentences directly from some other sentences in the knowledge base according to reasoning patterns

E.g. if we have sentences a and a → b, we can correctly conclude the new sentence b
  This is called modus ponens

E.g. if we have a AND b, we can correctly conclude a

All of the logical equivalences from before also give reasoning patterns
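A tiny illustrative forward-chaining loop that applies modus ponens to simple (premise, conclusion) rules; the rule format is an assumption, not part of the slides:

```python
# Illustrative: repeatedly apply modus ponens to (premise, conclusion) rules.
def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premise, conclusion in rules:
            if premise in facts and conclusion not in facts:
                facts.add(conclusion)   # modus ponens: from a and a -> b, add b
                changed = True
    return facts

rules = [("FriendWetBecauseOfSprinklers", "SprinklersOn")]
print(forward_chain({"FriendWetBecauseOfSprinklers"}, rules))
```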
Formal proof that the sprinklers are on

Knowledge base:
1. FriendWet
2. FriendWet → (FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers)
3. FriendWetBecauseOfSprinklers → SprinklersOn
4. FriendWetBecauseOfRain → NOT(FriendCarryingUmbrella)
5. UmbrellaGone
6. UmbrellaGone → (YouCarryingUmbrella OR FriendCarryingUmbrella)
7. NOT(YouCarryingUmbrella)
Formal proof that the sprinklers are on
8.  YouCarryingUmbrella OR FriendCarryingUmbrella (modus ponens on 5 and 6)
9.  NOT(YouCarryingUmbrella) → FriendCarryingUmbrella (equivalent to 8)
10. FriendCarryingUmbrella (modus ponens on 7 and 9)
11. NOT(NOT(FriendCarryingUmbrella)) (equivalent to 10)
12. NOT(NOT(FriendCarryingUmbrella)) → NOT(FriendWetBecauseOfRain) (equivalent to 4 by contraposition)
13. NOT(FriendWetBecauseOfRain) (modus ponens on 11 and 12)
14. FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers (modus ponens on 1 and 2)
15. NOT(FriendWetBecauseOfRain) → FriendWetBecauseOfSprinklers (equivalent to 14)
16. FriendWetBecauseOfSprinklers (modus ponens on 13 and 15)
17. SprinklersOn (modus ponens on 16 and 3)
Reasoning about penguins

Knowledge base:
1.  PetOfFriendIsABird → PetOfFriendCanFly
2.  PetOfFriendIsAPenguin → PetOfFriendIsABird
3.  PetOfFriendIsAPenguin → NOT(PetOfFriendCanFly)
4.  PetOfFriendIsAPenguin

5.  PetOfFriendIsABird (modus ponens on 4 and 2)
6.  PetOfFriendCanFly (modus ponens on 5 and 1)
7.  NOT(PetOfFriendCanFly) (modus ponens on 4 and 3)
8.  NOT(PetOfFriendCanFly) → FALSE (equivalent to 6)
9.  FALSE (modus ponens on 7 and 8)
10. FALSE → TheMoonIsMadeOfCheese (tautology)
11. TheMoonIsMadeOfCheese (modus ponens on 9 and 10)
Evaluation of deductive inference

Sound
  Yes, because the inference rules themselves are sound
  (This can be proven using a truth table argument)

Complete
  If we allow all possible inference rules, we're searching in an infinite space, hence not complete
  If we limit the inference rules, we run the risk of leaving out the necessary one…

Monotonic
  If we have a proof, adding information to the KB will not invalidate the proof
Getting more systematic

Any knowledge base can be written as a single formula in conjunctive normal form (CNF)
  CNF formula: (… OR … OR …) AND (… OR …) AND …
  … can be a symbol x, or NOT(x) (these are called literals)
  Multiple facts in the knowledge base are effectively ANDed together

FriendWet → (FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers)
becomes
(NOT(FriendWet) OR FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers)
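The rewrite used here is implication elimination; a sketch of that single step, using an assumed clause representation (a clause is a frozenset of (symbol, positive?) literals). Full CNF conversion would also need De Morgan and distributivity, omitted here:

```python
# Illustrative clause representation: a clause is a set of literals,
# where a literal is a (symbol, positive?) pair.
def implication_to_clause(premises, conclusions):
    """a1 AND a2 ... -> c1 OR c2 ...  becomes  NOT(a1) OR NOT(a2) ... OR c1 OR c2 ..."""
    return frozenset({(p, False) for p in premises} | {(c, True) for c in conclusions})

clause = implication_to_clause(["FriendWet"],
                               ["FriendWetBecauseOfRain", "FriendWetBecauseOfSprinklers"])
print(sorted(clause))
```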
Converting story to CNF

FriendWet
  becomes FriendWet

FriendWet → (FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers)
  becomes NOT(FriendWet) OR FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers

FriendWetBecauseOfSprinklers → SprinklersOn
  becomes NOT(FriendWetBecauseOfSprinklers) OR SprinklersOn

FriendWetBecauseOfRain → NOT(FriendCarryingUmbrella)
  becomes NOT(FriendWetBecauseOfRain) OR NOT(FriendCarryingUmbrella)

UmbrellaGone
  becomes UmbrellaGone

UmbrellaGone → (YouCarryingUmbrella OR FriendCarryingUmbrella)
  becomes NOT(UmbrellaGone) OR YouCarryingUmbrella OR FriendCarryingUmbrella

NOT(YouCarryingUmbrella)
  becomes NOT(YouCarryingUmbrella)
Unit resolution

Unit resolution: if we have
  l1 OR l2 OR … OR li OR … OR lk
and
  NOT(li)
we can conclude
  l1 OR l2 OR … OR li-1 OR li+1 OR … OR lk

Basically modus ponens
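On the clause representation assumed above, unit resolution is just removal of the complementary literal; a minimal sketch:

```python
def unit_resolve(clause, unit_literal):
    """Given a clause and a unit clause (a single literal), drop the complementary literal."""
    symbol, positive = unit_literal
    complement = (symbol, not positive)
    if complement in clause:
        return frozenset(clause) - {complement}
    return frozenset(clause)   # unchanged if no complementary literal is present

c = frozenset({("UmbrellaGone", False), ("YouCarryingUmbrella", True),
               ("FriendCarryingUmbrella", True)})
# Resolve with NOT(YouCarryingUmbrella):
print(sorted(unit_resolve(c, ("YouCarryingUmbrella", False))))
```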
Applying resolution to story problem
1.  FriendWet
2.  NOT(FriendWet) OR FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers
3.  NOT(FriendWetBecauseOfSprinklers) OR SprinklersOn
4.  NOT(FriendWetBecauseOfRain) OR NOT(FriendCarryingUmbrella)
5.  UmbrellaGone
6.  NOT(UmbrellaGone) OR YouCarryingUmbrella OR FriendCarryingUmbrella
7.  NOT(YouCarryingUmbrella)
8.  NOT(UmbrellaGone) OR FriendCarryingUmbrella (6,7)
9.  FriendCarryingUmbrella (5,8)
10. NOT(FriendWetBecauseOfRain) (4,9)
11. NOT(FriendWet) OR FriendWetBecauseOfSprinklers (2,10)
12. FriendWetBecauseOfSprinklers (1,11)
13. SprinklersOn (3,12)
Resolution (general)

General resolution allows a complete inference mechanism (search-based) using only one rule of inference

Resolution rule:
  Given: P1 OR P2 OR P3 OR … OR Pn and NOT(P1) OR Q1 OR … OR Qm
  Conclude: P2 OR P3 OR … OR Pn OR Q1 OR … OR Qm
  Complementary literals P1 and NOT(P1) "cancel out"

Why it works:
  Consider 2 cases: P1 is true, and P1 is false
Applying resolution
1.  FriendWet
2.  NOT(FriendWet) OR FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers
3.  NOT(FriendWetBecauseOfSprinklers) OR SprinklersOn
4.  NOT(FriendWetBecauseOfRain) OR NOT(FriendCarryingUmbrella)
5.  UmbrellaGone
6.  NOT(UmbrellaGone) OR YouCarryingUmbrella OR FriendCarryingUmbrella
7.  NOT(YouCarryingUmbrella)
8.  NOT(FriendWet) OR FriendWetBecauseOfRain OR SprinklersOn (2,3)
9.  NOT(FriendCarryingUmbrella) OR NOT(FriendWet) OR SprinklersOn (4,8)
10. NOT(UmbrellaGone) OR YouCarryingUmbrella OR NOT(FriendWet) OR SprinklersOn (6,9)
11. YouCarryingUmbrella OR NOT(FriendWet) OR SprinklersOn (5,10)
12. NOT(FriendWet) OR SprinklersOn (7,11)
13. SprinklersOn (1,12)
Systematic inference…

General strategy: if we want to see if sentence a is entailed, add NOT(a) to the knowledge base and see if it becomes inconsistent (we can derive a contradiction)
  = proof by contradiction

The CNF formula for the modified knowledge base is satisfiable if and only if sentence a is not entailed
  Satisfiable = there exists a model that makes the modified knowledge base true = the modified knowledge base is consistent
Resolution algorithm

Given a formula in conjunctive normal form, repeat:
  Find two clauses with complementary literals,
  Apply resolution,
  Add the resulting clause (if not already there)

If the empty clause results, the formula is not satisfiable
  It must have been obtained from P and NOT(P)

Otherwise, if we get stuck (and we will eventually), the formula is guaranteed to be satisfiable
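A compact, unoptimised sketch of this saturation loop over clauses as frozensets of (symbol, positive?) literals; it returns False when the empty clause appears and True once no new clauses can be derived:

```python
def resolve(c1, c2):
    # All resolvents of two clauses: for each complementary pair, merge the rest.
    resolvents = []
    for (sym, pos) in c1:
        if (sym, not pos) in c2:
            resolvents.append((c1 - {(sym, pos)}) | (c2 - {(sym, not pos)}))
    return resolvents

def satisfiable(clauses):
    clauses = {frozenset(c) for c in clauses}
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a == b:
                    continue
                for r in resolve(a, b):
                    if not r:
                        return False       # empty clause derived: unsatisfiable
                    new.add(frozenset(r))
        if new <= clauses:
            return True                    # stuck: satisfiable
        clauses |= new

# Entailment check by refutation: add NOT(SprinklersOn) and test satisfiability.
kb = [{("FWBOS", True)}, {("FWBOS", False), ("SprinklersOn", True)}]
print(satisfiable(kb + [{("SprinklersOn", False)}]))  # False => SprinklersOn entailed
```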
Example

Our knowledge base:
1) FriendWetBecauseOfSprinklers
2) NOT(FriendWetBecauseOfSprinklers) OR SprinklersOn

Can we infer SprinklersOn? We add:
3) NOT(SprinklersOn)

From 2) and 3), get
4) NOT(FriendWetBecauseOfSprinklers)

From 4) and 1), get the empty clause, so SprinklersOn is entailed
Exercises

In twos/threes, please!