
Topic 1

Propositional Logic
Definition: A proposition is a sentence with a definite truth value.
Examples:
Sentence                      | Declarative | True/False | Proposition?
The GCU campus is in Phoenix. | Yes         | T          | Yes
2 + 3 = 5                     | Yes         | T          | Yes
2 + 3 = 6                     | Yes         | F          | Yes
Is GCU in New Mexico?         | No          | NA         | No
Does 2 + 2 = 4?               | No          | NA         | No
Do your homework!             | No          | NA         | No
This sentence is false.       | Yes         | NA         | No¹

¹ If the sentence is true, then it must be false. If it is false, it must be true. So it must be neither!
Propositional Variables
Definition: A propositional variable is a variable, usually p, q, …, that stands in place of a proposition. The value of a propositional variable is either True (T) or False (F).
We use propositional variables to construct compound propositions with the logical
connectives “or” (“∨”), “and” (“∧”), “implies” (“→”), and “not” (“¬”).
Example: Let p = “You have COVID.” and q = “Your COVID test is positive.”
Compound Proposition                                       | Notation | Name
You do not have COVID.                                     | ¬p       | negation
You have COVID or your COVID test is positive.             | p ∨ q    | disjunction
You have COVID and your COVID test is positive.            | p ∧ q    | conjunction
You have COVID implies your COVID test is positive.        | p → q    | conditional
You have COVID if and only if your COVID test is positive. | p ↔ q    | bi-conditional
Variants
There are many variants to be aware of. For example, the following are all ways of expressing “p → q”:
p implies q
p is sufficient for q
q provided that p
If p, then q
q is necessary for p
q whenever p
p only if q
q follows from p
q unless ¬p
Similar alternatives exist for the other connectives. Your homework will cover some of these.
Truth tables for the logical connectives
“not p” is true when p is false and false when p is true.
p | ¬p
T | F
F | T
Truth tables for the logical connectives
“p and q” is true when p is true and q is true and otherwise false.
p | q | p ∧ q
T | T | T
T | F | F
F | T | F
F | F | F
Truth tables for the logical connectives
“p or q” is true when p is true or q is true or both. (inclusive or)
p | q | p ∨ q
T | T | T
T | F | T
F | T | T
F | F | F
In natural languages, like English, context often determines if the current use of “or” should
be inclusive. For example, “You might get back quarters or dimes for change.” You might
very well get both quarters and dimes back, so this is an inclusive or.
Truth tables for the logical connectives
“p xor q” is true when p is true or q is true but not both. (exclusive or)
p | q | p ⊕ q
T | T | F
T | F | T
F | T | T
F | F | F
For example, “You may have fries or a salad with your meal.” This would usually be interpreted to
mean “not both” and hence a use of exclusive or.
Truth tables for the logical connectives
“p implies q” is false only when the hypothesis p is true and the conclusion q is false.
p | q | p → q
T | T | T
T | F | F
F | T | T
F | F | T
This can be a little strange at first when the hypothesis is false. Both of the following sentences are true,
as in each case, the hypothesis is false:
● If the earth is flat, then the moon is round. (F → T)
● If the earth is flat, then the moon is square. (F → F)
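The connectives are easy to experiment with. The following is a minimal Python sketch (not from the slides) that prints a combined truth table for ∧, ∨, ⊕, →, and ↔; note that p → q is computed as (not p) or q.

from itertools import product

def implies(p, q):
    # p -> q is false only when p is true and q is false
    return (not p) or q

print("p q | p∧q p∨q p⊕q p→q p↔q")
for p, q in product([True, False], repeat=2):
    values = [p and q, p or q, p != q, implies(p, q), p == q]
    cells = ["T" if v else "F" for v in [p, q] + values]
    print(cells[0], cells[1], "|", "   ".join(cells[2:]))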
More on conditionals
Conditionals are so important that they deserve some additional comments. Given a conditional p → q
there are three related propositions
● (Contrapositive) ¬q → ¬p
● (Converse) q → p
● (Inverse) ¬p → ¬q
The main thing to remember about these is that the contrapositive and the original conditional are
(logically) equivalent¹, whereas the converse and inverse are different. This is easy to see in a truth table:
p | q | ¬p | ¬q | p → q | ¬q → ¬p | q → p | ¬p → ¬q
T | T | F  | F  | T     | T       | T     | T
T | F | F  | T  | F     | F       | T     | T
F | T | T  | F  | T     | T       | F     | F
F | F | T  | T  | T     | T       | T     | T

Example (assume below that a is a fixed integer):
▹ If a < 2, then a < 3. (conditional - true for all a)
▹ If a ≥ 3, then a ≥ 2. (contrapositive - true for all a)
▹ If a ≥ 2, then a ≥ 3. (inverse - true for some a)
▹ If a < 3, then a < 2. (converse - true for some a)
For which a are the third and fourth true?

¹ More on logical equivalence later.
Truth tables for the logical connectives
“p if and only if q” is true precisely when either both p and q are true or both p and q are false.
p | q | p ↔ q
T | T | T
T | F | F
F | T | F
F | F | T
It is common to use the terminology “p is necessary and sufficient for q.”
Truth tables for compound propositions
p | q | ¬p | ¬q | p → q | ¬q → ¬p | (p → q) ↔ (¬q → ¬p)
T | T | F  | F  | T     | T       | T
T | F | F  | T  | F     | F       | T
F | T | T  | F  | T     | T       | T
F | F | T  | T  | T     | T       | T

A truth table for a compound proposition with n propositional variables will have 2ⁿ rows.
Definition: If the value of a compound proposition is always T, then the proposition is
called a tautology. In the example above, (p → q) ↔ (¬q → ¬p) is a tautology. If the
value is always F, then the proposition is a contradiction. A proposition that is not a
contradiction is satisfiable. If a proposition is neither a tautology nor a contradiction,
then it is called a contingency.
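A brute-force classifier along these lines is easy to write. The sketch below (helper names are my own, not from the text) checks all 2ⁿ rows of a formula given as a Python function of Boolean arguments.

from itertools import product
from inspect import signature

def classify(f):
    n = len(signature(f).parameters)     # number of propositional variables
    values = [f(*row) for row in product([True, False], repeat=n)]
    if all(values):
        return "tautology"
    if not any(values):
        return "contradiction"
    return "contingency (in particular, satisfiable)"

# (p → q) ↔ (¬q → ¬p) from the table above
print(classify(lambda p, q: ((not p) or q) == ((not (not q)) or (not p))))   # tautology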
Satisfiability
From the previous slide it is clear that checking whether a proposition is satisfiable might require 2ⁿ steps if there are n propositional variables. This is exponential in the size of the problem. Sometimes filling out the entire table can be avoided. For example, show that ((p ∧ q) → r) ∧ ¬((p → r) ∧ (q → r)) is satisfiable.

To make the whole conjunction true we need (p ∧ q) → r true and ¬((p → r) ∧ (q → r)) true, that is, (p → r) ∧ (q → r) false. Make q → r false, which forces q = T and r = F, and make p → r true, which with r = F forces p = F. Then p ∧ q = F, so (p ∧ q) → r = T and the whole proposition is true:

p | q | r | p ∧ q | (p ∧ q) → r | p → r | q → r | (p → r) ∧ (q → r) | ¬((p → r) ∧ (q → r))
F | T | F | F     | T           | T     | F     | F                 | T

Note here that we made a choice. We could have made p → r false and q → r true just as well in order to make (p → r) ∧ (q → r) false.

What about ¬((p ∧ q) → r) ∧ ((p → r) ∧ (q → r))? To make it true we need (p ∧ q) → r false, which forces p = T, q = T, r = F, and we also need both p → r and q → r true. But with p = T and r = F, p → r is false:

p | q | r | p ∧ q | (p ∧ q) → r | p → r                 | q → r
T | T | F | T     | F           | F (but T is required) | F (but T is required)

We see here that this proposition is not satisfiable. It is a contradiction.
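The same conclusions can be checked by brute force; a small Python sketch (not part of the slides):

from itertools import product

def implies(a, b):
    return (not a) or b

f1 = lambda p, q, r: implies(p and q, r) and not (implies(p, r) and implies(q, r))
f2 = lambda p, q, r: (not implies(p and q, r)) and implies(p, r) and implies(q, r)

print([row for row in product([True, False], repeat=3) if f1(*row)])
# [(True, False, False), (False, True, False)] -- satisfiable, including the row found above
print([row for row in product([True, False], repeat=3) if f2(*row)])
# [] -- no satisfying assignment: a contradiction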
Abstraction
Any compound proposition with atomic propositional variables p1, ..., pn can be viewed as a function
f(p1, …, pn) mapping from {T, F}ⁿ into {T, F}. This is precisely what a truth table visualizes. To be specific, I will identify the truth table for a compound proposition with this function.
Example: Consider p ∧ (q ∨ r). A truth table might look like

p | q | r | q ∨ r | p ∧ (q ∨ r)
T | T | T | T     | T
T | T | F | T     | T
T | F | T | T     | T
T | F | F | F     | F
F | T | T | T     | F
F | T | F | T     | F
F | F | T | T     | F
F | F | F | F     | F

Viewed as a function, this is the map (T,T,T) ↦ T, (T,T,F) ↦ T, (T,F,T) ↦ T, (T,F,F) ↦ F, (F,T,T) ↦ F, (F,T,F) ↦ F, (F,F,T) ↦ F, (F,F,F) ↦ F. This is in essence what the truth table is.
Equivalence
Definition: Two compound propositions are equivalent if they have the same truth table.
Example:
p | q | r | q ∨ r | p ∧ (q ∨ r) | p ∧ q | p ∧ r | (p ∧ q) ∨ (p ∧ r)
T | T | T | T     | T           | T     | T     | T
T | T | F | T     | T           | T     | F     | T
T | F | T | T     | T           | F     | T     | T
T | F | F | F     | F           | F     | F     | F
F | T | T | T     | F           | F     | F     | F
F | T | F | T     | F           | F     | F     | F
F | F | T | T     | F           | F     | F     | F
F | F | F | F     | F           | F     | F     | F

Here we see that p ∧ (q ∨ r) and (p ∧ q) ∨ (p ∧ r) have the same truth table (truth function) and hence are equivalent propositions.
It should be clear that two compound propositions f(p,q,r) and g(p,q,r) are equivalent iff f(p,q,r) ↔ g(p,q,r) is a tautology.
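Checking equivalence this way is mechanical; here is a brute-force check of the example (a sketch, not from the slides):

from itertools import product

lhs = lambda p, q, r: p and (q or r)
rhs = lambda p, q, r: (p and q) or (p and r)

# Equivalent iff they agree on every row, i.e., lhs ↔ rhs is a tautology.
print(all(lhs(*row) == rhs(*row) for row in product([True, False], repeat=3)))   # True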
Truth Table (Truth Function) ⇨ Compound proposition
A natural question is “Can any truth function be represented as a compound
proposition?” The answer is easy: “Yes, in two different ways.”
Proof by example:
p | q | r | f(p,q,r)
T | T | T | T
T | T | F | T
T | F | T | T
T | F | F | F
F | T | T | F
F | T | F | F
F | F | T | F
F | F | F | F

Disjunctive normal form (DNF):
(p ∧ q ∧ r) ∨ (p ∧ q ∧ ¬r) ∨ (p ∧ ¬q ∧ r)
It is clear that this is T for the first three rows and false for the last five rows, so it has exactly this truth table.

Conjunctive normal form (CNF):
(¬p ∨ q ∨ r) ∧ (p ∨ ¬q ∨ ¬r) ∧ (p ∨ ¬q ∨ r) ∧ (p ∨ q ∨ ¬r) ∧ (p ∨ q ∨ r)
CNF is a little harder to see. The formula says, “I am true if and only if none of TFF, FTT, FTF, FFT, or FFF occur.” Equivalently, “I am true precisely when one of TTT, TTF, or TFT occurs.”
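The DNF and CNF constructions in the proof-by-example are completely mechanical: one conjunction per T-row and one disjunction ruling out each F-row. A short Python sketch (my own helper names, not from the text):

from itertools import product

def dnf(f, names):
    terms = []
    for row in product([True, False], repeat=len(names)):
        if f(*row):   # one conjunction per T-row
            terms.append("(" + " ∧ ".join(n if v else "¬" + n for n, v in zip(names, row)) + ")")
    return " ∨ ".join(terms)

def cnf(f, names):
    clauses = []
    for row in product([True, False], repeat=len(names)):
        if not f(*row):   # one disjunction ruling out each F-row
            clauses.append("(" + " ∨ ".join("¬" + n if v else n for n, v in zip(names, row)) + ")")
    return " ∧ ".join(clauses)

f = lambda p, q, r: (p and q) or (p and (not q) and r)   # T exactly on rows TTT, TTF, TFT
print(dnf(f, ["p", "q", "r"]))   # (p ∧ q ∧ r) ∨ (p ∧ q ∧ ¬r) ∨ (p ∧ ¬q ∧ r)
print(cnf(f, ["p", "q", "r"]))   # (¬p ∨ q ∨ r) ∧ ... ∧ (p ∨ q ∨ r)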
Using satisfiability to solve a logic puzzle
There are various sorts of logic puzzles in your text. One sort goes like this: There are treasures
in two of the trunks of three trees. Each tree has an inscription, in order, “This trunk is empty,”
“There is a treasure in trunk of the first tree,” and “There is a treasure in the trunk of the second
tree.” Can the queen who never lies state the following, and if so which tree trunks contain the
treasure?
1. All the inscriptions are false.
2. Exactly one inscription is true.
3. Exactly two inscriptions are true.
4. All three inscriptions are true.
A solution: Let pi be the proposition “There is treasure in the trunk of the ith tree.” Our given inscriptions are: ¬p1, p1, and p2. Translating the queen’s sentences gives:
1. ¬(¬p1) ∧ ¬p1 ∧ ¬p2 ≡ p1 ∧ ¬p1 ∧ ¬p2 ≡ F
2. (¬p1 ∧ ¬p1 ∧ ¬p2) ∨ (p1 ∧ p1 ∧ ¬p2) ∨ (p1 ∧ ¬p1 ∧ p2) ≡ (¬p1 ∧ ¬p2) ∨ (p1 ∧ ¬p2)
3. (¬p1 ∧ p1 ∧ ¬p2) ∨ (¬p1 ∧ ¬p1 ∧ p2) ∨ (p1 ∧ p1 ∧ p2) ≡ (¬p1 ∧ p2) ∨ (p1 ∧ p2)
4. ¬p1 ∧ p1 ∧ p2 ≡ F
Puzzle continued
Only cases where the treasure is in two of the three trunks need to be considered. The middle three columns below give “exactly one inscription is true” and the last three give “exactly two inscriptions are true.”

p1 | p2 | p3 | ¬p1 ∧ ¬p2 | p1 ∧ ¬p2 | (¬p1 ∧ ¬p2) ∨ (p1 ∧ ¬p2) | p1 ∧ p2 | ¬p1 ∧ p2 | (¬p1 ∧ p2) ∨ (p1 ∧ p2)
T  | T  | F  | F         | F        | F                        | T       | F        | T
T  | F  | T  | F         | T        | T                        | F       | F        | F
F  | T  | T  | F         | F        | F                        | F       | T        | T
The queen can speak “Exactly one inscription is true” when the treasure is in the trunk of the first
and third trees. In this case the one true inscription is “There is a treasure in trunk of the first tree.”
There are two ways that the queen can speak “Exactly two inscriptions are true.” Namely the
treasure can be in the first and second tree, so that the second and third inscriptions are true, or
the treasure can be in the second and third tree so that the first and second inscriptions are true.
Puzzle alternate solution
It occurred to me after I wrote the first solution that there is another, perhaps simpler, method. The previous method is more general, but for this problem the following works.
Let si be the ith inscription, so s1 = ¬p1, s2 = p1, s3 = p2. Again only consider cases where the treasure is in two of the three trunks.

p1 | p2 | p3 | s1 | s2 | s3
T  | T  | F  | F  | T  | T    Exactly two are true.
T  | F  | T  | F  | T  | F    Exactly one is true.
F  | T  | T  | T  | F  | T    Exactly two are true.
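Both solutions can be confirmed by brute force over the eight truth assignments; a small Python sketch (not from the slides):

from itertools import product

for p1, p2, p3 in product([True, False], repeat=3):
    if p1 + p2 + p3 != 2:          # treasure is in exactly two trunks
        continue
    s = [not p1, p1, p2]           # the three inscriptions
    print((p1, p2, p3), "true inscriptions:", sum(s))
# Output: exactly one inscription is true only for (T, F, T);
# exactly two are true for (T, T, F) and (F, T, T).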
Logic Circuits
As shown above any truth function/table can be represented simply in DNF or CNF form. This same
method makes it easy to find circuits that yield any desired boolean function.
Example: Find a circuit that generates the following boolean function. The binary input is read left-to-right, so, e.g., the input 001 means x1 = 0, x2 = 0, x3 = 1.
input | output
000   | 0
001   | 0
010   | 0
011   | 0
100   | 0
101   | 1
110   | 1
111   | 1

DNF: (x1 ∧ x2 ∧ x3) ∨ (x1 ∧ x2 ∧ ¬x3) ∨ (x1 ∧ ¬x2 ∧ x3)

[Circuit diagram: the inputs x1, x2, x3 feed AND and NOT gates forming the three terms above, whose outputs feed an OR gate producing f(x1x2x3).]
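Reading the DNF off the table immediately gives a two-level AND/OR/NOT circuit; the sketch below (not from the slides) simulates it gate by gate.

def AND(a, b): return a and b
def OR(a, b):  return a or b
def NOT(a):    return not a

def f(x1, x2, x3):
    # DNF read off the table: output is 1 exactly on inputs 101, 110, 111
    return OR(OR(AND(AND(x1, x2), x3),
                 AND(AND(x1, x2), NOT(x3))),
              AND(AND(x1, NOT(x2)), x3))

for bits in ["000", "001", "010", "011", "100", "101", "110", "111"]:
    x1, x2, x3 = (b == "1" for b in bits)
    print(bits, int(f(x1, x2, x3)))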
Complete sets of connectives.
We have seen by using CNF and DNF that the sets of connectives {∧,¬} and {∨,¬} are complete,
in that any truth table (truth function) can be realized by a proposition built from just atomic
propositions and those connectives. In terms of circuits, this means {AND, NOT} and {OR, NOT}
gates can create any boolean function.
Definition: NAND (|) and NOR(↓). Here p | q ≡ ¬(p ∧ q) and p ↓ q ≡ ¬(p ∨ q).
Claim: Both {|} and {↓} are complete. To show {|} is complete it suffices to show that both ∧ and ¬ can be defined from |. Showing that {↓} is complete is similar. (Section 1.3, Exercises 54 and 56.)
p  q  p|q
T  T  F
T  F  T
F  T  T
F  F  T

p  p|p
T  F
F  T
p | p ≡ ¬p

p  q  p|q  (p|q)|(p|q)
T  T  F    T
T  F  T    F
F  T  T    F
F  F  T    F
(p | q) | (p | q) ≡ p ∧ q
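The same two tables can be checked in code; the sketch below (not from the slides) also builds ∨ from NAND via De Morgan, although ¬ and ∧ already suffice for completeness.

from itertools import product

def nand(p, q):
    return not (p and q)

def not_(p):     return nand(p, p)                     # p | p ≡ ¬p
def and_(p, q):  return nand(nand(p, q), nand(p, q))   # (p|q)|(p|q) ≡ p ∧ q
def or_(p, q):   return nand(nand(p, p), nand(q, q))   # De Morgan: p ∨ q ≡ ¬(¬p ∧ ¬q)

assert all(not_(p) == (not p) for p in [True, False])
assert all(and_(p, q) == (p and q) for p, q in product([True, False], repeat=2))
assert all(or_(p, q) == (p or q) for p, q in product([True, False], repeat=2))
print("NAND alone defines ¬, ∧, and ∨")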
XOR is not complete
Consider any proposition built just out of XOR, like p ⊕ p, or p ⊕ q, or (p ⊕ q) ⊕ r, etc. Its truth table is either constant or has exactly half T’s and half F’s. Since ∧ has exactly one T and ∨ has exactly three T’s in their four-row truth tables, neither ∧ nor ∨ can be written in terms of ⊕.
To see this notice that there are some simple algebraic facts:
● p ⊕ q = q ⊕ p (commutative)
● p ⊕ (q ⊕ r) = (p ⊕ q) ⊕ r (associative)
● p ⊕ F = p = F ⊕ p (F is the ⊕-identity)
● p ⊕ p = F (every proposition is its own ⊕-inverse)
● p ⊕ T = ¬p
Consider a {⊕} proposition, for example, (p1 ⊕ p2) ⊕ ((p3 ⊕ p2) ⊕ p4). By associativity this can be rewritten unambiguously as p1 ⊕ p2 ⊕ p3 ⊕ p2 ⊕ p4. Then by commutativity the proposition can be rewritten as p1 ⊕ p2 ⊕ p2 ⊕ p3 ⊕ p4. Since p ⊕ p ≡ F, this is the same as p1 ⊕ F ⊕ p3 ⊕ p4, and by the identity law once more it is p1 ⊕ p3 ⊕ p4. This example can be turned into a proof that every such proposition either reduces to a constant (T or F) or can be written in the form p1 ⊕ p2 ⊕ p3 ⊕ ⋯ ⊕ pn or p1 ⊕ p2 ⊕ p3 ⊕ ⋯ ⊕ pn ⊕ T where each pi is distinct. Thus it suffices to look at truth tables for these propositions.
XOR is not complete
p  q  p⊕q
T  T  F
T  F  T
F  T  T
F  F  F

p  p⊕T
T  F
F  T

p  q  p⊕q⊕T
T  T  T
T  F  F
F  T  F
F  F  T

p  q  r  p⊕q  p⊕q⊕r  p⊕q⊕r⊕T
T  T  T  F    T      F
T  T  F  F    F      T
T  F  T  T    F      T
T  F  F  T    T      F
F  T  T  T    F      T
F  T  F  T    T      F
F  F  T  F    T      F
F  F  F  F    F      T
By induction, p1 ⊕ p2 ⊕ p3 ⊕ ⋯ ⊕ pn (⊕ T) has 2ⁿ⁻¹ T’s and the same number of F’s in its truth table.
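A quick numerical check of this parity claim (a sketch, not from the slides):

from itertools import product
from functools import reduce
from operator import xor

for n in range(1, 6):
    rows = list(product([True, False], repeat=n))
    t = sum(reduce(xor, row) for row in rows)     # count T's of p1 ⊕ ... ⊕ pn
    print(n, t, len(rows) - t)                    # prints n, 2**(n-1), 2**(n-1)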
Some important Logical equivalences
De Morgan’s Laws:
¬(p ∨ q) ≡ ¬p ∧ ¬q
¬(p ∧ q) ≡ ¬p ∨ ¬q

Distributive Laws:
p ∧ (q ∨ r) ≡ (p ∧ q) ∨ (p ∧ r)
p ∨ (q ∧ r) ≡ (p ∨ q) ∧ (p ∨ r)

Conditionals:
p → q ≡ ¬p ∨ q
(p → q) ∧ (p → r) ≡ p → (q ∧ r)
(p → q) ∨ (p → r) ≡ p → (q ∨ r)
(q → p) ∧ (r → p) ≡ (q ∨ r) → p
(q → p) ∨ (r → p) ≡ (q ∧ r) → p

Biconditional:
p ↔ q ≡ ¬p ↔ ¬q ≡ ¬(p ⊕ q)

Dual: If s is a proposition, the dual of s, denoted s*, is obtained by swapping ∧ with ∨ and T with F.
Fact¹: s ≡ t iff s* ≡ t*
Example:
s = p ∧ (q ∨ r) ⇨ s* = p ∨ (q ∧ r)
t = (p ∧ q) ∨ (p ∧ r) ⇨ t* = (p ∨ q) ∧ (p ∨ r)
The two Distributive Laws above are exactly s ≡ t and s* ≡ t*.

¹ The fact is not hard to prove. Suppose s ≡ t, so s ↔ t ≡ T and hence (s ↔ t)* ≡ T* ≡ F. Suppose we know ¬(s ↔ t)* ≡ s* ↔ t*; then s* ↔ t* ≡ ¬(s ↔ t)* ≡ ¬F ≡ T, and hence s* ≡ t*. This is what we wanted; the converse direction follows by applying this to s* and t*, since s** = s. To see that ¬(s ↔ t)* ≡ s* ↔ t* is a computation: s ↔ t ≡ (s ∧ t) ∨ (¬s ∧ ¬t), so (s ↔ t)* ≡ (s* ∨ t*) ∧ (¬s* ∨ ¬t*), and so ¬(s ↔ t)* ≡ (s* ∧ t*) ∨ (¬s* ∧ ¬t*) ≡ s* ↔ t*.
Using equivalences
Example 1: Show that (p ∨ q) ∧ (¬p ∨ r) → (q ∨ r) is a tautology.

(p ∨ q) ∧ (¬p ∨ r) → (q ∨ r)
≡ ¬((p ∨ q) ∧ (¬p ∨ r)) ∨ (q ∨ r)

Negate this:
¬(¬((p ∨ q) ∧ (¬p ∨ r)) ∨ (q ∨ r))
≡ ¬¬((p ∨ q) ∧ (¬p ∨ r)) ∧ ¬(q ∨ r)
≡ (p ∨ q) ∧ (¬p ∨ r) ∧ (¬q ∧ ¬r)
≡ ((p ∨ q) ∧ ¬q) ∧ ((¬p ∨ r) ∧ ¬r)
≡ ((p ∧ ¬q) ∨ (q ∧ ¬q)) ∧ ((¬p ∧ ¬r) ∨ (r ∧ ¬r))
≡ ((p ∧ ¬q) ∨ F) ∧ ((¬p ∧ ¬r) ∨ F)
≡ (p ∧ ¬q) ∧ (¬p ∧ ¬r)
≡ (p ∧ ¬p) ∧ ¬q ∧ ¬r
≡ F ∧ (¬q ∧ ¬r)
≡ F
So (p ∨ q) ∧ (¬p ∨ r) → (q ∨ r) ≡ T.

Example 1 with truth table: Try to make the proposition false; then (p ∨ q) ∧ (¬p ∨ r) must be true and (q ∨ r) false. Making q ∨ r false forces q = F and r = F. Then p ∨ q = T forces p = T, but then ¬p ∨ r = F, so (p ∨ q) ∧ (¬p ∨ r) cannot be true:

p | q | r | ¬p | p ∨ q | ¬p ∨ r         | q ∨ r
T | F | F | F  | T     | F (T required) | F

Assuming we could satisfy the negation we get a contradiction. This means that the assumption was false. So the negation is a contradiction and hence the original is a tautology.
Using equivalences
Example 2: Show that (q → p) ∧ (r → p) ≡ (q ∨ r) → p.

(q → p) ∧ (r → p)
≡ (¬q ∨ p) ∧ (¬r ∨ p)
≡ (¬q ∧ ¬r) ∨ p
≡ ¬(q ∨ r) ∨ p
≡ (q ∨ r) → p

Example 2 with truth table: Try to make the equivalence fail; then either (q → p) ∧ (r → p) is true and (q ∨ r) → p is false, or vice-versa.

Case 1: (q ∨ r) → p false forces q ∨ r = T and p = F. One of q or r must be T since q ∨ r is T; say q = T (the case r = T is the same). Then q → p = F, so (q → p) ∧ (r → p) = F, a contradiction:

p | q | r | q → p | r → p | q ∨ r | (q → p) ∧ (r → p) | (q ∨ r) → p
F | T | ? | F     | ?     | T     | F (T required)    | F

Case 2: (q → p) ∧ (r → p) false means q → p or r → p is false; say q → p = F (the other case is the same), so q = T and p = F. Then q ∨ r = T and p = F, so (q ∨ r) → p = F, again a contradiction:

p | q | r | q → p | r → p | q ∨ r | (q → p) ∧ (r → p) | (q ∨ r) → p
F | T | ? | F     | ?     | T     | F                 | F (T required)

Either way a contradiction is reached.
CNF revisited
p | q | r | f(p,q,r) | ¬f(p,q,r)
T | T | T | T        | F
T | T | F | T        | F
T | F | T | T        | F
T | F | F | F        | T
F | T | T | F        | T
F | T | F | F        | T
F | F | T | F        | T
F | F | F | F        | T

The DNF for ¬f(p,q,r) is
(p ∧ ¬q ∧ ¬r) ∨ (¬p ∧ q ∧ r) ∨ (¬p ∧ q ∧ ¬r) ∨ (¬p ∧ ¬q ∧ r) ∨ (¬p ∧ ¬q ∧ ¬r)
So we can negate this to get f(p,q,r):
¬((p ∧ ¬q ∧ ¬r) ∨ (¬p ∧ q ∧ r) ∨ (¬p ∧ q ∧ ¬r) ∨ (¬p ∧ ¬q ∧ r) ∨ (¬p ∧ ¬q ∧ ¬r))
≡ ¬(p ∧ ¬q ∧ ¬r) ∧ ¬(¬p ∧ q ∧ r) ∧ ¬(¬p ∧ q ∧ ¬r) ∧ ¬(¬p ∧ ¬q ∧ r) ∧ ¬(¬p ∧ ¬q ∧ ¬r)
≡ (¬p ∨ q ∨ r) ∧ (p ∨ ¬q ∨ ¬r) ∧ (p ∨ ¬q ∨ r) ∧ (p ∨ q ∨ ¬r) ∧ (p ∨ q ∨ r)
The last is the CNF for f(p,q,r).
Predicates/Propositional functions
Propositions are nice, but they fall short of what is required to express mathematical concepts or reason about them. For example, in calculus the concept of limit is central; it is expressed by
lim x→a f(x) = L iff ∀ε>0 ∃δ>0 ∀x(0 < |x - a| < δ → |f(x) - L| < ε).
Here we will study how to express this and then reason about such sentences using the predicate calculus.
Definition: A predicate P(x) is a statement involving variables such that if you replace a variable with a specific object, a, then P(a) becomes a proposition. In this sense, predicates are functions P: Objects → {T,F} and hence are also called propositional functions.
The objects that can replace variables might be explicitly stated, but are often implicit. The set of appropriate objects is called the universe of discourse.
Dependence on the universe of discourse
Example: P(x) = “x² > 0”
● This predicate depends on the universe of discourse. If x is a real number, then P(0) is false, but P(a) is true for any non-zero real. The new predicate Q(x) = “x ≠ 0 → P(x)” is always true.
● If x is meant to be a complex number, then z² > 0 holds only for z = a + bi where b = 0 and a ≠ 0.
● If x is meant to be a non-zero integer, then P(x) is true for all x.
Quantifiers
There are two primary quantifiers “there exists,” denoted “∃,” and “for all,” denoted “∀.”
● ∃xP(x) is read “There exists an x, such that P(x) holds.” This is true if for some a in the universe of discourse, P(a) is true.
● ∀xP(x) is read “For all x, P(x) holds.” This is true if for all a in the universe of discourse, P(a) is true.
∃xP(x) is very much like an “or” over the entire universe of discourse. For example, if the
universe was all positive integers, then
∃xP(x) ≡ P(1) ∨ P(2) ∨ P(3) ∨ P(4) ⋯
Similarly, ∀xP(x) is like an “and” over the entire universe. Again using positive integers,
∀xP(x) ≡ P(1) ∧ P(2) ∧ P(3) ∧ P(4) ⋯
Quantifiers over finite domains
If the domain of discourse is finite, say {a1,a2,...,an}, then the following are literally equivalent:
● ∃xP(x) ≡ P(a1) ∨ P(a2) ∨ ⋯ ∨ P(an)
● ∀xP(x) ≡ P(a1) ∧ P(a2) ∧ ⋯ ∧ P(an)
This allows us to get an intuition for certain rules for quantifiers; in particular, De Morgan’s rule tells us:
● ¬∃xP(x) ≡ ¬(P(a1) ∨ P(a2) ∨ ⋯ ∨ P(an)) ≡ ¬P(a1) ∧ ¬P(a2) ∧ ⋯ ∧ ¬P(an) ≡ ∀x¬P(x)
● ¬∀xP(x) ≡ ¬(P(a1) ∧ P(a2) ∧ ⋯ ∧ P(an)) ≡ ¬P(a1) ∨ ¬P(a2) ∨ ⋯ ∨ ¬P(an) ≡ ∃x¬P(x)
We also get something from the distributive laws:
● Q ∧ ∃xP(x) ≡ Q ∧ (P(a1) ∨ P(a2) ∨ ⋯ ∨ P(an)) ≡ (Q ∧ P(a1)) ∨ (Q ∧ P(a2)) ∨ ⋯ ∨ (Q ∧ P(an)) ≡ ∃x(Q ∧ P(x))
● Q ∨ ∀xP(x) ≡ Q ∨ (P(a1) ∧ P(a2) ∧ ⋯ ∧ P(an)) ≡ (Q ∨ P(a1)) ∧ (Q ∨ P(a2)) ∧ ⋯ ∧ (Q ∨ P(an)) ≡ ∀x(Q ∨ P(x))
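Over a finite domain these identities are exactly Python's any() and all(); a small sketch with a placeholder predicate P (not from the text):

domain = range(1, 11)          # universe of discourse {1, ..., 10}
P = lambda x: x % 2 == 0       # P(x) = "x is even"

exists = any(P(x) for x in domain)        # ∃x P(x)
forall = all(P(x) for x in domain)        # ∀x P(x)

# De Morgan for quantifiers: ¬∃x P(x) ≡ ∀x ¬P(x) and ¬∀x P(x) ≡ ∃x ¬P(x)
assert (not exists) == all(not P(x) for x in domain)
assert (not forall) == any(not P(x) for x in domain)
print(exists, forall)          # True False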
Free and bound variables (warning!)
A variable x may occur in the scope of a quantifier, as in ∃xφ(x) or ∀xφ(x). A variable in the scope of a quantifier is bound; otherwise it is free.
Bound variables are dummy variables: they can be renamed without changing the meaning of the predicate. For example,
∃xφ(x, y) = ∃uφ(u, y),
but
∃xφ(x, y) ≠ ∃xφ(x, u).
In particular, ψ(y, u) = ∃xφ(x, y) ∧ ∃xφ(x, u) is a formula with two free variables, and is satisfied by pairs (a, b) such that ∃xφ(x, a) and ∃xφ(x, b). However, ∃xφ(x, y) ∧ ∃xφ(x, u) ≡ ∃vφ(v, y) ∧ ∃wφ(w, u), since only bound variables are renamed.
This is important when applying quantifier rules:
∃xQ(x) ∧ ∃xP(x) ≡ ∃x(∃uQ(u) ∧ P(x)) ≡ ∃x∃u(Q(u) ∧ P(x))
It would be incorrect to get ∃x∃x(Q(x) ∧ P(x)) or ∃x(Q(x) ∧ P(x)).
Logical equivalence of quantified statements
Example: Show that ∃xQ(x) ∨ ∃xP(x) ≡ ∃x(Q(x) ∨ P(x))
Assume ∃xQ(x) ∨ ∃xP(x). This means there is an a such that Q(a) holds or there is a b such that P(b)
holds and so Q(a) ∨ P(a) or Q(b) ∨ P(b), and hence ∃x(Q(x) ∨ P(x)) holds.
Conversely, assume ∃x(Q(x) ∨ P(x)). There is an a so that Q(a) ∨ P(a) holds and so Q(a) holds or P(a)
holds, but then ∃xQ(x) holds or ∃xP(x) holds. Thus ∃xQ(x) ∨ ∃xP(x) holds.
Example: Show that it is not true that ∃xQ(x) ∧ ∃xP(x) ≡ ∃x(Q(x) ∧ P(x))
You might first try to prove the equivalence and see where the attempt breaks down. This “sort of works.” Assume ∃xQ(x) ∧ ∃xP(x); then ∃xQ(x) and ∃xP(x) hold. So there is an a and a b so that Q(a) and P(b).
But this does not yield an a such that both Q(a) and P(a) and this is the problem. But here we have
not actually provided a counterexample, rather we have just suggested a problem.
Let P(x) be “x = 0” and Q(x) be “x = 1.” Then clearly ∃xQ(x) ∧ ∃xP(x) holds where the universe is
integers. But ∃x(Q(x) ∧ P(x)) does not hold. Since ∃xQ(x) ∧ ∃xP(x) and ∃x(Q(x) ∧ P(x)) differ
in some universe, we know that it is not the case that ∃xQ(x) ∧ ∃xP(x) ≡ ∃x(Q(x) ∧ P(x)).
De Morgan’s Laws
Prove: ¬∃xP(x) ≡ ∀x¬P(x) and ¬∀xP(x) ≡ ∃x¬P(x)
Suppose ¬∃xP(x). Then for all a, P(a) fails. Thus ∀x¬P(x) holds. Conversely, assume ∀x¬P(x); then for all a, ¬P(a) holds, and thus for all a, P(a) fails. Thus ∃xP(x) fails and so ¬∃xP(x) holds.
The other law is proved the same way, or one can appeal to duality.
Duality: Suppose φ is a predicate and φ* is obtained from φ by making the replacements ∧ ⇔ ∨, ∃ ⇔ ∀, and T ⇔ F. As with propositions,
● φ ≡ ψ ⇒ φ* ≡ ψ*
● φ** = φ
So one De Morgan’s rule implies the other. The proof of this is as in the case of propositions.
Practice
Determine which of the following are equivalent to ∀xP(x) → ∀xQ(x). Each option is considered in turn below.
● ∀x(P(x) → Q(x))
● ∃x(P(x) → ∀xQ(x))
● ∀x(∀xP(x) → Q(x))
● ∀x(P(x) → ∀xQ(x))

First option, ∀x(P(x) → Q(x)):
We can see that ∀xP(x) → ∀xQ(x) does not imply ∀x(P(x) → Q(x)). Take an interpretation so that
∀xP(x) is false and so ∀xP(x) → ∀xQ(x) is trivially true. For example, let the universe be positive
integers and P(x) = “x is prime” and Q(x) = “x is composite” (essentially ¬P(x)), then ∀xP(x) → ∀xQ(x)
is true. Now take x = 2, then P(2) ∧ ¬Q(2) so ∀x(P(x) → Q(x)) is false. This is enough to see
that these are not equivalent, in particular: ¬[(∀xP(x) → ∀xQ(x)) → ∀x(P(x) → Q(x))]
We can do more, suppose ∀x(P(x) → Q(x)) and ∀xP(x). Then for any a, P(a) → Q(a) and P(a),
so Q(a), hence ∀xQ(x). Thus ∀x(P(x) → Q(x)) → (∀xP(x) → ∀xQ(x))
Arguing like this is like doing a satisfaction argument for propositional formulas.
Second option, ∃x(P(x) → ∀xQ(x)):
Probably the simplest thing to do here is to use
∀xP(x) → ∀xQ(x) ≡ ¬∀xP(x) ∨ ∀xQ(x)
≡ ∃x¬P(x) ∨ ∀xQ(x)
≡ ∃x(¬P(x) ∨ ∀xQ(x))
≡ ∃x(P(x) → ∀xQ(x))
We used here: ∃x(A(x) ∨ B) ≡ ∃xA(x) ∨ B (x not free in B)
Third option, ∀x(∀xP(x) → Q(x)):
Probably the simplest thing to do here is to use
∀xP(x) → ∀xQ(x) ≡ ¬∀xP(x) ∨ ∀xQ(x)
≡ ∀x(¬∀xP(x) ∨ Q(x))
≡ ∀x(∀xP(x) → Q(x))
Note that we used: ∀x(A(x) ∨ B) ≡ ∀xA(x) ∨ B (where x is not free in B)
Fourth option, ∀x(P(x) → ∀xQ(x)):
Suppose ∀x(P(x) → ∀xQ(x)). If ∀xP(x) is false, then trivially ∀xP(x) → ∀xQ(x) is true. So suppose ∀xP(x) is true. Fix an object a; then P(a) holds and P(a) → ∀xQ(x) holds, so ∀xQ(x) holds. Hence ∀xP(x) → ∀xQ(x) holds. So we have shown: ∀x(P(x) → ∀xQ(x)) → (∀xP(x) → ∀xQ(x))
Now suppose ∀xP(x) → ∀xQ(x), must it be the case that ∀x(P(x) → ∀xQ(x))? Fix an a, must it be
that P(a) → ∀xQ(x)? Consider our example from the first one again.
Again, this is sort of like a satisfaction argument. Basically, we must come up with an interpretation (a
model) that makes one of these true and the other false.
Nesting quantifiers
Universal quantifiers are often nested and even implicit. Consider commutativity or associativity, usually written like
● x + y = y + x, and
● x + (y + z) = (x + y) + z
More formally, these should be written:
● ∀x∀y(x + y = y + x), and
● ∀x∀y∀z(x + (y + z) = (x + y) + z)
Such implicit universal quantification is common.
Simple examples where the nesting is not all of one type would be the following:
∀x∃y(x + y = 0 ∧ y + x = 0) – “Existence of additive inverses.”
∃x∀y(y + x = y ∧ x + y = y) – “Existence of additive identity.”
More complicated nesting
Consider the definition of limit in calculus again:
● “f has a limit at a” would be ∃L∀ε>0∃δ>0∀x(0 < |x - a| < δ → |f(x) - L| < ε)
● “f has a limit for all a” would be ∀a∃L∀ε>0∃δ>0∀x(0 < |x - a| < δ → |f(x) - L| < ε)
● “f is continuous at a” is ∀ε>0∃δ>0∀x(0 < |x - a| < δ → |f(x) - f(a)| < ε)
● “f is continuous” is ∀a∀ε>0∃δ>0∀x(0 < |x - a| < δ → |f(x) - f(a)| < ε)
● “f is uniformly continuous” is ∀ε>0∃δ>0∀a∀x(0 < |x - a| < δ → |f(x) - f(a)| < ε)
It is useful to compare “f is continuous” and “f is uniformly continuous.” To verify that “f is continuous” we need a correspondence (a, ε) ↦ δ(a, ε) so that
∀x(0 < |x - a| < δ(a,ε) → |f(x) - f(a)| < ε)
For uniform continuity we need a correspondence ε ↦ δ(ε) so that
∀a∀x(0 < |x - a| < δ(ε) → |f(x) - f(a)| < ε)
Negating Nested Quantifiers
Again looking at calculus, a favorite example is to show f(x) = sin(1/x) has no limit at x = 0. This is
∀L¬(lim x→0 sin(1/x) = L) ≡ ∀L¬(∀ε>0∃δ>0∀x(0 < |x| < δ → |f(x) - L| < ε))
≡ ∀L∃ε>0∀δ>0∃x¬(0 < |x| < δ → |f(x) - L| < ε)
≡ ∀L∃ε>0∀δ>0∃x(0 < |x| < δ ∧ |f(x) - L| ≥ ε)
It turns out that taking ε = 1/2 works uniformly for all L, so we have
∀L∀δ>0∃x(0 < |x| < δ ∧ |f(x) - L| ≥ 1/2)
Practice
Using M for “me,” F(x,y) for “y is a friend of x,” and E(x,y) for “y is an enemy of x,” and making the reasonable assumption that F(x,y) → F(y,x) and E(x,y) → E(y,x), which of the following say “An enemy of all my enemies is my friend.”? Each option is considered in turn below.
● ∀y(∀z(E(M,z) ∧ E(y,z)) → F(M,y))
● ∀y(∀z(E(M,z) → E(y,z)) → F(M,y))
● ∀y(∃z(E(M,z) ∧ E(y,z)) ∨ F(M,y))
● ∀y(∃z(E(M,z) ∧ ¬E(y,z)) ∨ F(M,y))
First option, ∀y(∀z(E(M,z) ∧ E(y,z)) → F(M,y)):
This is not correct. Fix a person, say Bob. ∀z(E(M,z) ∧ E(Bob,z)) says that everyone is an enemy
of Bob and myself! This means that E(M,M) and E(M,Bob) as well! So there is no such y, in any
reasonable universe. Thus ∀z(E(M,z) ∧ E(y,z)) → F(M,y) is true for all y trivially and this does not
depend on F(M,y). In particular, ∀y(∀z(E(M,z) ∧ E(y,z)) → F(M,y)) is trivially true so long as “I
am not my own enemy.” Thus ∀y(∀z(E(M,z) ∧ E(y,z)) → F(M,y)) really has no content.
Second option, ∀y(∀z(E(M,z) → E(y,z)) → F(M,y)):
This is the correct one. Fix a y, again say Bob (an arbitrary Bob!); then ∀z(E(M,z) → E(y,z)) says “Anyone who is my enemy is Bob’s enemy.” If this holds, then F(M,Bob), so Bob is my friend. This is what we were trying to capture.
Third option, ∀y(∃z(E(M,z) ∧ E(y,z)) ∨ F(M,y)):
Again fix an arbitrary Bob, then this says that either Bob and I have a common enemy, or we are
friends.
Fourth option, ∀y(∃z(E(M,z) ∧ ¬E(y,z)) ∨ F(M,y)):
Finally, once again fix poor Bob. This says “Either there is someone who is my enemy but not
Bob’s enemy, or Bob and I are friends.” Is this what I want? Well, yes!
Formally,
∃z(E(M,z) ∧ ¬E(y,z)) ≡ ∃z¬(¬E(M,z) ∨ E(y,z)) ≡ ¬∀z(E(M,z) → E(y,z)),
so
∀y(∃z(E(M,z) ∧ ¬E(y,z)) ∨ F(M,y)) ≡ ∀y(¬∀z(E(M,z) → E(y,z)) ∨ F(M,y))
≡ ∀y(∀z(E(M,z) → E(y,z)) → F(M,y))
Valid Arguments
Definition: An argument is a sequence of propositions p1,p2,p3,...,pn,q. The pi are called the
premises and q is the conclusion. An argument is valid if the conclusion follows from the premises,
put another way, the argument is valid if (p1 ∧ p2 ∧ p3 ∧ … ∧ pn) → q is a tautology. An argument
is sound if it is valid and the premises are true.
Rules of inference: We will start with basic valid arguments called rules of inference and use them to
construct more sophisticated valid arguments.
Modus Ponens
Rule of Inference:
p → q
p
────
∴ q
Tautology: (p ∧ (p → q)) → q

Modus Tollens
Rule of Inference:
p → q
¬q
────
∴ ¬p
Tautology: (¬q ∧ (p → q)) → ¬p
Rules of Inference Continued
Simplification
Rule of Inference:
p ∧ q        p ∧ q
─────        ─────
∴ p          ∴ q
Tautologies: (p ∧ q) → p and (p ∧ q) → q

Conjunction
Rule of Inference:
p
q
────
∴ p ∧ q
Tautology: ((p) ∧ (q)) → (p ∧ q)

Disjunctive Syllogism
Rule of Inference:
p ∨ q
¬p
────
∴ q
Tautology: ((p ∨ q) ∧ ¬p) → q

Addition
Rule of Inference:
p
────
∴ p ∨ q
Tautology: p → (p ∨ q)
Rules of Inference Continued
Hypothetical Syllogism
Rule of Inference:
p → q
q → r
───
∴ p → r
Tautology: ((p → q) ∧ (q → r)) → (p → r)

Resolution
Rule of Inference:
p ∨ q
¬p ∨ r
───
∴ q ∨ r
Tautology: ((p ∨ q) ∧ (¬p ∨ r)) → (q ∨ r)

When trying to argue for p → q from premises r1, r2, …, rn it is common to add p as a premise and argue for q; this gives that ((r1 ∧ r2 ∧ … ∧ rn) ∧ p) → q is a tautology. Since
((r1 ∧ r2 ∧ … ∧ rn) ∧ p) → q ≡ (r1 ∧ r2 ∧ … ∧ rn) → (p → q),
we have a valid argument for p → q from premises r1, r2, …, rn. This is used in the following example.
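Each rule of inference above corresponds to a tautology, which can be confirmed by brute force; a Python sketch (not from the slides):

from itertools import product

def implies(a, b):
    return (not a) or b

rules = {
    "Modus Ponens":           lambda p, q: implies(p and implies(p, q), q),
    "Modus Tollens":          lambda p, q: implies((not q) and implies(p, q), not p),
    "Disjunctive Syllogism":  lambda p, q: implies((p or q) and (not p), q),
    "Addition":               lambda p, q: implies(p, p or q),
    "Hypothetical Syllogism": lambda p, q, r: implies(implies(p, q) and implies(q, r), implies(p, r)),
    "Resolution":             lambda p, q, r: implies((p or q) and ((not p) or r), q or r),
}

for name, f in rules.items():
    n = f.__code__.co_argcount
    print(name, all(f(*row) for row in product([True, False], repeat=n)))   # all True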
Example: The following example is due to Lewis Carroll (59). Prove that it is a valid argument.
1. All the dated letters in this room are written on blue paper.
2. None of them are in black ink, except those that are written in the third person.
3. I have not filed any of those that I can read.
4. None of those that are written on one sheet are undated.
5. All of those that are not crossed out are in black ink.
6. All of those that are written by Brown begin with “Dear Sir.”
7. All of those that are written on blue paper are filed.
8. None of those that are written on more than one sheet are crossed out.
9. None of those that begin with “Dear sir” are written in the third person.
Therefore, I cannot read any of Brown’s letters
p1: Letter is dated.
p2: Letter is on blue paper.
p3: Letter is in black ink.
p4: Letter is filed.
p5: I can read the letter.
p6: Letter is crossed out.
p7: Letter begins with “Dear Sir.”
p8: Letter is more than one sheet.
p9: Letter is written by Brown.
p10: Letter is in the 3rd person.
p11: Letter is on one sheet.
Symbolizing the premises:
1. p1 → p2
2. p3 → p10
3. p5 → ¬p4
4. p11 → p1
5. ¬p6 → p3
6. p9 → p7
7. p2 → p4
8. p6 → p11
9. p10 → ¬p7
∴ p9 → ¬p5

First proof: assume p9 and derive ¬p5.
From p9 and p9 → p7 (6), MP gives p7.
From p7 and p10 → ¬p7 (9), MT gives ¬p10.
From ¬p10 and p3 → p10 (2), MT gives ¬p3.
From ¬p3 and ¬p6 → p3 (5), MT gives ¬¬p6, i.e., p6.
From p6 and p6 → p11 (8), MP gives p11.
From p11 and p11 → p1 (4), MP gives p1.
From p1 and p1 → p2 (1), MP gives p2.
From p2 and p2 → p4 (7), MP gives p4.
From p4 and p5 → ¬p4 (3), MT gives ¬p5.
Thus p9 → ¬p5.
An alternative proof of p9 → ¬p5 chains contrapositives with hypothetical syllogism:
p9 → p7       Given (6)
p7 → ¬p10     Contrapositive of (9)
p9 → ¬p10     HypSyll
¬p10 → ¬p3    Contrapositive of (2)
p9 → ¬p3      HypSyll
¬p3 → p6      Contrapositive of (5)
p9 → p6       HypSyll
p6 → p11      Given (8)
p9 → p11      HypSyll
p11 → p1      Given (4)
p9 → p1       HypSyll
p1 → p2       Given (1)
p9 → p2       HypSyll
p2 → p4       Given (7)
p9 → p4       HypSyll
p4 → ¬p5      Contrapositive of (3)
p9 → ¬p5      HypSyll
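Since the argument is purely propositional, its validity can also be confirmed by brute force over all 2¹¹ assignments; a sketch (not from the slides):

from itertools import product

def implies(a, b):
    return (not a) or b

valid = True
for p in product([True, False], repeat=11):
    p1, p2, p3, p4, p5, p6, p7, p8, p9, p10, p11 = p   # p8 is unused in the symbolized premises
    premises = [
        implies(p1, p2), implies(p3, p10), implies(p5, not p4),
        implies(p11, p1), implies(not p6, p3), implies(p9, p7),
        implies(p2, p4), implies(p6, p11), implies(p10, not p7),
    ]
    if all(premises) and not implies(p9, not p5):
        valid = False
print(valid)   # True: the conclusion p9 → ¬p5 follows from the premises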
Rules of inference for Quantified statements
Universal Generalization
Rule of Inference
P(a) for arbitrary a
─────────
∴ ∀xP(x)
Tautology
If for all a P(a), then ∀xP(x)
Universal Instantiation
Rule of Inference
∀xP(x)
─────────
∴ P(a)
Tautology
For any a, if ∀xP(x), then P(a).
Rules of inference for Quantified statements
Existential Generalization
Rule of Inference
P(a) for some a
─────────
∴ ∃xP(x)
Tautology
If there is an a such that P(a),
then ∃xP(x)
Existential Instantiation
Rule of Inference
∃xP(x)
─────────
∴ P(a) for some a
Tautology
If ∃xP(x) holds, then for some a, P(a)
holds.
Example
Show that ∃x1∃x2∃x3∃x4(P(x1,x2) ∧ P(x2,x3) ∧ P(x3,x4)) follows from ∀x∃yP(x,y) and ∃x∃yP(x,y).

1. ∃x∃yP(x,y)                                     Given
2. ∃yP(a1,y)                                      ∃-Instantiation
3. P(a1,a2)                                       ∃-Instantiation
4. ∀x∃yP(x,y)                                     Given
5. ∃yP(a2,y)                                      ∀-Instantiation
6. P(a2,a3)                                       ∃-Instantiation
7. ∃yP(a3,y)                                      ∀-Instantiation
8. P(a3,a4)                                       ∃-Instantiation
9. P(a1,a2) ∧ P(a2,a3)                            Conjunction (3, 6)
10. P(a1,a2) ∧ P(a2,a3) ∧ P(a3,a4)                Conjunction (8, 9)
11. ∃x4(P(a1,a2) ∧ P(a2,a3) ∧ P(a3,x4))           ∃-Generalization
12. ∃x3∃x4(P(a1,a2) ∧ P(a2,x3) ∧ P(x3,x4))        ∃-Generalization
13. ∃x2∃x3∃x4(P(a1,x2) ∧ P(x2,x3) ∧ P(x3,x4))     ∃-Generalization
14. ∃x1∃x2∃x3∃x4(P(x1,x2) ∧ P(x2,x3) ∧ P(x3,x4))  ∃-Generalization
Comments on the example
If we added the premises ∀x∀y(P(x,y) → ¬P(y,x)) and ∀x∀y∀z(P(x,y) ∧ P(y,z) → ¬P(x,z)), then we
would also have that the x1,...,x4 are distinct. As it is P(x,y) = “x=y” is possible.
There is nothing special about 4 so we can get that for any positive integer n, there is a P-chain of
length n, that is: P(xi,xi+1) for 1 ≤ i < n.
Question: Does this mean that we can deduce that there must be an infinite P-chain? Can we actually
express this in the language?
Problem: For any non-negative integer n, show that “There are n things that satisfy P(x)” is expressible. Show that “There are ≤ n things that satisfy P(x)” is expressible in the predicate calculus.
Question: What about the proposition “There are finitely many things satisfying P(x).” Can this be
formulated in predicate calculus?