
CS 257: Advanced Topics in Formal Methods

Fall 2019
Lecture 2

Aleksandar Zeljić
(materials by Clark Barrett)
Stanford University
Outline

I Propositional Logic: Motivation


I Propositional Logic: Syntax
I Propositional Logic: Well-Formed Formulas
I Recognizing Well-Formed Formulas
I Propositional Logic: Semantics
I Truth Tables
I Satisfiability and Tautologies

Material is drawn from Chapter 1 of Enderton.


Propositional Logic: Motivation
Consider an electrical device having n inputs and one output. Assume that to
each input we apply a signal that is either 1 or 0, and that this uniquely
determines whether the output is 1 or 0.

The behavior of such a device is described by a Boolean function:

F (X1 , . . . , Xn ) = the output signal given the input signals X1 , . . . , Xn .

We call such a device a Boolean gate.


The most common Boolean gates are AND, OR, and NOT gates.

[Figure: the standard gate symbols for AND, OR, and NOT]
Propositional Logic: Motivation
The inputs and outputs of Boolean gates can be connected together to form a
combinational Boolean circuit.

[Figure: a combinational Boolean circuit with inputs A, B, C, intermediate signals D–H, and output I]

A combinational Boolean circuit corresponds to a directed acyclic graph (DAG) whose
leaves are inputs and each of whose nodes is labeled with the name of a
Boolean gate. One or more of the nodes may be identified as outputs.
A common question with Boolean circuits is whether it is possible to set an
output to true (e.g. when the output represents an error signal).
Suppose your job was to find out if the output of a large Boolean circuit could
ever be true. How would you do it?
Propositional Logic provides the formalism to answer such questions.
Propositional Logic: Motivation
Propositional (or Sentential) logic is simple but extremely important in
Computer Science

1. It is the basis for day-to-day reasoning (in programming, LSATs, etc.)


2. It is the theory behind digital circuits.
3. Many problems can be translated into propositional logic.
4. It is an important part of more complex logics (such as first-order logic,
also called predicate logic, which we’ll discuss later.)
What is Logic?
A formal logic is defined by its syntax and semantics.
Syntax

I An alphabet is a set of symbols.


I A finite sequence of these symbols is called an expression.
I A set of rules defines the well-formed expressions.

Semantics

I Gives meaning to well-formed expressions


I Formal notions of induction and recursion are required to provide a
rigorous semantics.
Propositional Logic: Syntax
Alphabet
( Left parenthesis Begin group
) Right parenthesis End group
¬ Negation symbol English: not
∧ Conjunction symbol English: and
∨ Disjunction symbol English: or (inclusive)
→ Conditional symbol English: if, then
↔ Bi-conditional symbol English: if and only if
A1 First propositional symbol
A2 Second propositional symbol
...
An nth propositional symbol
...
Propositional Logic: Syntax
Alphabet

I Propositional connective symbols: ¬, ∧, ∨, →, ↔.


I Logical symbols: ¬, ∧, ∨, →, ↔, (, ).
I Parameters or nonlogical symbols: A1 , A2 , A3 , . . .

The meaning of logical symbols is always the same. The meaning of nonlogical
symbols depends on the context.
Propositional Logic: Syntax
An expression is a sequence of symbols. A sequence is denoted explicitly by a
comma separated list enclosed in angle brackets: <a1 , . . . ,am >.
Examples
<(, A1 , ∧, A3 , )> (A1 ∧ A3 )
<(, (, ¬, A1 , ), →, A2 , )> ((¬A1 ) → A2 )
<), ), ↔, ), A5 > )) ↔)A5
For convenience, we will write these sequences as a simple string of symbols,
with the understanding that the formal structure represented is a sequence
containing exactly the symbols in the string.
The formal meaning becomes important when trying to prove things about
expressions.
Not all expressions make sense. Part of the job of defining a syntax is to
restrict the kinds of expressions that will be allowed.
Propositional Logic: Syntax
We define the set W of well-formed formulas (wffs) as follows.

(a) Every expression consisting of a single propositional symbol is in W .


(b) If α and β are in W , so are (¬α), (α ∧ β), (α ∨ β), (α → β), and (α ↔ β).
(c) No expression is in W unless forced by (a) or (b).

This definition is inductive: the set being defined is used as part of the
definition.
How would you use this definition to prove that )) ↔)A5 is not a wff ?
Item (c) is too vague for our purposes. To make it more precise we use
induction.
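Clauses (a) and (b) translate directly into an inductive datatype. Below is a minimal sketch in Python; the class names Var, Not, and BinOp and the operator tags are choices made for this sketch, not notation from the lecture.

```python
# A minimal sketch of wffs as an inductive datatype (hypothetical names).
from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class Var:                 # clause (a): a single propositional symbol
    name: str

@dataclass(frozen=True)
class Not:                 # clause (b): (¬α)
    arg: "Wff"

@dataclass(frozen=True)
class BinOp:               # clause (b): (α ∧ β), (α ∨ β), (α → β), (α ↔ β)
    op: str                # one of "and", "or", "implies", "iff"
    left: "Wff"
    right: "Wff"

Wff = Union[Var, Not, BinOp]

# ((¬A1) → A2), built bottom-up exactly as clause (b) allows:
example = BinOp("implies", Not(Var("A1")), Var("A2"))
```

Clause (c) corresponds to the fact that the only values of type Wff are those built by finitely many applications of these constructors.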
Propositional Logic: Well-Formed Formulas
We can use a formal inductive definition to define the set W of well-formed
formulas in propositional logic.

I U = the set of all expressions.


I B = the set of expressions consisting of a single propositional symbol.
I F = the set of formula-building operations:
I E¬ (α) = (¬α)
I E∧ (α, β) = (α ∧ β)
I E∨ (α, β) = (α ∨ β)
I E→ (α, β) = (α → β)
I E↔ (α, β) = (α ↔ β)
Induction
We can call the set generated from B by F simply C .
Now, given any inductive definition of a set, we can prove things about that set
using the following principle.
Induction Principle
If C is the set generated from B by F and S is a set which includes B and is
closed under F (i.e. S is inductive), then C ⊆ S.
Proof
Since S is inductive, and C is the intersection of all inductive sets, it follows
that C ⊆ S.
□
We often use the induction principle to show that an inductive set C has a
particular property. The argument looks like this: (i) Define S to be the subset
of U with some property P; (ii) Show that S is inductive.
This proves that C ⊆ S and thus all elements of C have property P.
Propositional Logic: Well-Formed Formulas
Given our inductive definition of well-formed formulas, we can use the induction
principle to prove things about the set W of well-formed formulas.
Example
Prove that any wff has the same number of left parentheses and right
parentheses.
Proof
Let l(α) be the number of left parentheses and r (α) the number of right
parentheses in an expression α. Let S be the set of all expressions α such that
l(α) = r (α). We wish to show that W ⊆ S. This follows from the induction
principle if we can show that S is inductive.
Base Case:
We must show that B ⊆ S. Recall that B is the set of expressions consisting of
a single propositional symbol. It is clear that for such expressions,
l(α) = r (α) = 0.
Propositional Logic: Well-Formed Formulas
Inductive Case:
We must show that S is closed under each formula-building operator in F .

I E¬
Suppose α ∈ S. We know that E¬ (α) = (¬α). It follows that
l(E¬ (α)) = 1 + l(α) and r (E¬ (α)) = 1 + r (α).
But because α ∈ S, we know that l(α) = r (α), so it follows that
l(E¬ (α)) = r (E¬ (α)), and thus E¬ (α) ∈ S.
I E∧
Suppose α, β ∈ S. We know that E∧ (α, β) = (α ∧ β). Thus
l(E∧ (α, β)) = 1 + l(α) + l(β) and r (E∧ (α, β)) = 1 + r (α) + r (β).
As before, it follows from the inductive hypothesis that E∧ (α, β) ∈ S.
I The arguments for E∨ , E→ , and E↔ are exactly analogous to the one for
E∧ .

□
Since S includes B and is closed under the operations in F , it is inductive. It
follows by the induction principle that W ⊆ S.
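The proof is structural recursion in disguise. Here is a small sketch, reusing the hypothetical Var/Not/BinOp classes from the earlier datatype sketch: the recursive counts function mirrors the base case and the closure argument for each formula-building operation.

```python
# Computing (l(α), r(α)) by recursion on the shape of a wff, assuming the
# Var/Not/BinOp sketch from earlier.
def counts(phi):
    if isinstance(phi, Var):                  # base case: l = r = 0
        return (0, 0)
    if isinstance(phi, Not):                  # E¬ adds one '(' and one ')'
        l, r = counts(phi.arg)
        return (1 + l, 1 + r)
    l1, r1 = counts(phi.left)                 # E∧, E∨, E→, E↔ each add one of each
    l2, r2 = counts(phi.right)
    return (1 + l1 + l2, 1 + r1 + r2)

print(counts(BinOp("implies", Not(Var("A1")), Var("A2"))))   # (2, 2)
```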
Propositional Logic: Well-Formed Formulas
Now we can return to the question of how to prove that an expression is not a
wff .
How do we know that )) ↔)A5 is not a wff ?
It does not have the same number of left and right parentheses.
It follows from the theorem we just proved that )) ↔)A5 is not a wff .
An Algorithm for Recognizing WFFs
Lemma
Let α be a wff . Then exactly one of the following is true.

I α is a propositional symbol.
I α = (¬β) where β is a wff .
I α = (β ◦ γ) where ◦ is one of {∧, ∨, →, ↔}, β is the first
parentheses-balanced initial segment of the result of dropping the first (
from α, and β and γ are wffs.

How would you prove this?


Induction, of course!
An Algorithm for Recognizing WFFs
Input: expression α Output: true or false (indicating whether α is a wff ).

0. Begin with an initial construction tree T containing a single node labeled


with α.
1. If all leaves of T are labeled with propositional symbols, return true.
2. Select a leaf labeled with an expression α1 which is not a propositional
symbol.
3. If α1 does not begin with ( return false.
4. If α1 = (¬β), then add a child to the leaf labeled by α1 , label it with β,
and goto 1.
5. Scan α1 until first reaching (β, where β is a nonempty expression having
the same number of left and right parentheses. If there is no such β,
return false.
6. If α1 = (β ◦ γ) where ◦ is one of {∧, ∨, →, ↔}, then add two children to
the leaf labeled by α1 , label them with β and γ, and goto 1.
7. Return false.
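A rough executable sketch of this procedure, working directly on strings in the official syntax (propositional symbols spelled A1, A2, ...; connectives ¬ ∧ ∨ → ↔; no spaces). The recursion plays the role of growing the construction tree; the symbol spelling and the name is_wff are assumptions of the sketch, not part of the lecture.

```python
# A rough recursive version of the recognizer, loosely following steps 1-7.
import re

SYMBOL = re.compile(r"A[1-9][0-9]*")     # assumed spelling of the symbols

def is_wff(a: str) -> bool:
    if SYMBOL.fullmatch(a):                              # a propositional symbol
        return True
    if not (a.startswith("(") and a.endswith(")")):      # step 3
        return False
    body = a[1:-1]
    if body.startswith("¬"):                             # step 4: α = (¬β)
        return is_wff(body[1:])
    depth = 0
    for i, c in enumerate(body):                         # step 5: find balanced β
        depth += (c == "(") - (c == ")")
        if depth == 0 and i + 1 < len(body) and body[i + 1] in "∧∨→↔":
            beta, gamma = body[: i + 1], body[i + 2 :]   # step 6: α = (β ◦ γ)
            return is_wff(beta) and is_wff(gamma)
    return False                                         # step 7

print(is_wff("((¬A1)→A2)"), is_wff("))↔)A5"))            # True False
```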
An Algorithm for Recognizing WFFs
Termination
How do we prove termination of this algorithm?
We can show that the sum of the lengths of all the expressions labeling leaves
decreases on each iteration of the loop.
Soundness
If the algorithm returns true when given input α, then α is a wff .
The proof is by induction on the tree T generated by the algorithm from the
leaves up to the root.
Completeness
If α is a wff , then the algorithm will return true.
Proof using the induction principle for the set of wffs.
Notational Conventions

I Larger variety of propositional symbols: A, B, C , D, p, q, r , etc.


I Outermost parentheses can be omitted: A ∧ B instead of (A ∧ B).
I The negation symbol binds more tightly than the binary connectives, and its scope is as
small as possible: ¬A ∧ B means ((¬A) ∧ B).
I {∧, ∨} bind more tightly than {→, ↔}: A ∧ B → ¬C ∨ D is
((A ∧ B) → ((¬C ) ∨ D))
I When one symbol is used repeatedly, grouping is to the right: A ∧ B ∧ C is
(A ∧ (B ∧ C ))

Note that conventions are only unambiguous for wffs, not for arbitrary
expressions.
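As a small illustration of the conventions, here is a sketch of a function that renders a wff (in the hypothetical Var/Not/BinOp representation used earlier) back into the official fully parenthesized syntax; the tree built below is one way to read the abbreviated A ∧ B → ¬C ∨ D.

```python
# Rendering a wff back into the official fully parenthesized syntax,
# assuming the Var/Not/BinOp classes from the earlier sketch.
OPS = {"and": "∧", "or": "∨", "implies": "→", "iff": "↔"}

def official(phi):
    if isinstance(phi, Var):
        return phi.name
    if isinstance(phi, Not):
        return f"(¬{official(phi.arg)})"
    return f"({official(phi.left)} {OPS[phi.op]} {official(phi.right)})"

# The abbreviated A ∧ B → ¬C ∨ D, parsed according to the conventions above:
abbreviated = BinOp("implies",
                    BinOp("and", Var("A"), Var("B")),
                    BinOp("or", Not(Var("C")), Var("D")))
print(official(abbreviated))   # ((A ∧ B) → ((¬C) ∨ D))
```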
Propositional Logic: Semantics
Intuitively, given a wff α and a value (either T or F) for each propositional
symbol in α, we should be able to determine the value of α.
How do we make this precise?
Let v be a function from B to {F, T}. We call this function a truth assignment.
Now, we define v̄, a function from W to {F, T}, as follows (we compute with F
and T as if they were 0 and 1, respectively).

I For each propositional symbol Ai, v̄(Ai) = v(Ai).
I v̄(E¬(α)) = T − v̄(α)
I v̄(E∧(α, β)) = min(v̄(α), v̄(β))
I v̄(E∨(α, β)) = max(v̄(α), v̄(β))
I v̄(E→(α, β)) = max(T − v̄(α), v̄(β))
I v̄(E↔(α, β)) = T − |v̄(α) − v̄(β)|

The recursion theorem and the unique readability theorem guarantee that v̄ is
well-defined (see Enderton).
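A sketch of v̄ as structural recursion, again assuming the hypothetical Var/Not/BinOp classes from earlier and representing a truth assignment v as a Python dict from symbol names to booleans (using booleans directly rather than the 0/1 arithmetic above).

```python
# v̄ as structural recursion, assuming the Var/Not/BinOp classes from earlier;
# a truth assignment v is a dict mapping symbol names to Python booleans.
def evaluate(phi, v):
    if isinstance(phi, Var):
        return v[phi.name]                        # v̄(Ai) = v(Ai)
    if isinstance(phi, Not):
        return not evaluate(phi.arg, v)           # v̄((¬α)) flips the value
    a, b = evaluate(phi.left, v), evaluate(phi.right, v)
    return {"and": a and b,
            "or": a or b,
            "implies": (not a) or b,
            "iff": a == b}[phi.op]

v = {"A1": True, "A2": False}
print(evaluate(BinOp("implies", Not(Var("A1")), Var("A2")), v))   # True
```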
Truth Tables
There are other ways to present the semantics which are less formal but
perhaps more intuitive.

α   ¬α             α   β   α ∧ β
T    F             T   T     T
F    T             T   F     F
                   F   T     F
                   F   F     F

α   β   α ∨ β        α   β   α → β        α   β   α ↔ β
T   T     T          T   T     T          T   T     T
T   F     T          T   F     F          T   F     F
F   T     T          F   T     T          F   T     F
F   F     F          F   F     T          F   F     T
Complex truth tables
Truth tables can also be used to calculate all possible values of v̄ for a given
wff: we associate a column with each propositional symbol and a column with
each propositional connective. There is a row for each possible truth
assignment to the propositional symbols.

A1  A2  A3  |  A1   ∨   A2   ∧   ¬A3
 T   T   T  |   T   T    T   F    F
 T   T   F  |   T   T    T   T    T
 T   F   T  |   T   T    F   F    F
 T   F   F  |   T   T    F   F    T
 F   T   T  |   F   F    T   F    F
 F   T   F  |   F   T    T   T    T
 F   F   T  |   F   F    F   F    F
 F   F   F  |   F   F    F   F    T

The columns on the right track the occurrences of symbols and connectives in
(A1 ∨ (A2 ∧ ¬A3)); the value of the whole wff is the column under ∨.
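A sketch of generating such a table by enumerating all truth assignments with itertools.product, reusing the evaluate function sketched earlier; only the value of the whole wff is printed, not the per-connective columns.

```python
# Enumerating all truth assignments for a given list of symbols.
from itertools import product

def truth_table(phi, symbols):
    """Yield (assignment, value of phi) for every truth assignment."""
    for values in product([True, False], repeat=len(symbols)):
        v = dict(zip(symbols, values))
        yield v, evaluate(phi, v)

phi = BinOp("or", Var("A1"), BinOp("and", Var("A2"), Not(Var("A3"))))
for v, value in truth_table(phi, ["A1", "A2", "A3"]):
    print(v, "->", value)
```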
Definitions
If α is a wff, then a truth assignment v satisfies α if v̄(α) = T.
A wff α is satisfiable if there exists some truth assignment v which satisfies α.
Suppose Σ is a set of wffs. Then Σ tautologically implies α, written Σ |= α, if every
truth assignment which satisfies each formula in Σ also satisfies α.
Particular cases:

I If ∅ |= α, then we say α is a tautology or α is valid and write |= α.
I If Σ is unsatisfiable, then Σ |= α for every wff α.
I If α |= β (shorthand for {α} |= β) and β |= α, then α and β are
tautologically equivalent.
I For finite Σ, Σ |= α if and only if (⋀ Σ) → α is valid, where ⋀ Σ denotes the
conjunction of the formulas in Σ.
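These definitions translate directly into brute-force checks over all truth assignments. A sketch, reusing the evaluate function from earlier; callers must pass a list of symbols covering every symbol that occurs in the formulas.

```python
# Brute-force versions of the definitions (exponential in len(symbols)).
from itertools import product

def assignments(symbols):
    for values in product([True, False], repeat=len(symbols)):
        yield dict(zip(symbols, values))

def satisfiable(phi, symbols):
    return any(evaluate(phi, v) for v in assignments(symbols))

def valid(phi, symbols):                       # tautology: |= φ
    return all(evaluate(phi, v) for v in assignments(symbols))

def entails(sigma, alpha, symbols):            # Σ |= α
    return all(evaluate(alpha, v) for v in assignments(symbols)
               if all(evaluate(s, v) for s in sigma))
```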
Examples

I (A ∨ B) ∧ (¬A ∨ ¬B) is satisfiable, but not valid.


I (A ∨ B) ∧ (¬A ∨ ¬B) ∧ (A ↔ B) is unsatisfiable.
I {A, A → B} |= B (A ∧ (A → B) ∧ (¬B))
I {A, ¬A} |= (A ∧ ¬A) (A ∧ (¬A) ∧ ¬(A ∧ ¬A))
I ¬(A ∧ B) is tautologically equivalent to ¬A ∨ ¬B
¬(¬(A ∧ B) ↔ (¬A ∨ ¬B))

Suppose you had an algorithm SAT which takes a wff α as input and returns
true if α is satisfiable and false otherwise. How would you use this algorithm
to verify each of the claims made above? (The parenthesized formulas are the
corresponding queries: each entailment or equivalence above holds if and only
if the formula next to it is unsatisfiable.)

Now suppose you had an algorithm CHECKVALID which returns true when α
is valid and false otherwise. How would you verify the claims given this
algorithm?
Satisfiability and validity are dual notions: α is unsatisfiable if and only if ¬α is
valid.
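A sketch of the reductions suggested by these questions, using the brute-force satisfiable function from earlier as a stand-in for the hypothetical SAT oracle; conj, entails_via_sat, and the other names are choices made for this sketch.

```python
# Reducing validity, entailment, and equivalence to (un)satisfiability.
from functools import reduce

def conj(formulas):                     # σ1 ∧ ... ∧ σk (left-associated)
    return reduce(lambda a, b: BinOp("and", a, b), formulas)

def valid_via_sat(alpha, symbols):
    # duality: α is valid iff ¬α is unsatisfiable
    return not satisfiable(Not(alpha), symbols)

def entails_via_sat(sigma, alpha, symbols):
    # Σ |= α iff σ1 ∧ ... ∧ σk ∧ (¬α) is unsatisfiable
    return not satisfiable(conj(list(sigma) + [Not(alpha)]), symbols)

def equivalent_via_sat(alpha, beta, symbols):
    # α and β are tautologically equivalent iff ¬(α ↔ β) is unsatisfiable
    return not satisfiable(Not(BinOp("iff", alpha, beta)), symbols)

A, B = Var("A"), Var("B")
print(entails_via_sat([A, BinOp("implies", A, B)], B, ["A", "B"]))        # True
print(equivalent_via_sat(Not(BinOp("and", A, B)),
                         BinOp("or", Not(A), Not(B)), ["A", "B"]))        # True
```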
Determining Satisfiability using Truth Tables
An Algorithm for Satisfiability
To check whether α is satisfiable, form the truth table for α. If there is a row
in which T appears as the value for α, then α is satisfiable. Otherwise, α is
unsatisfiable.

An Algorithm for Tautological Implication


To check whether {α1 , . . . , αk } |= β, check the satisfiability of
(α1 ∧ · · · ∧ αk ) ∧ (¬β). If it is unsatisfiable, then {α1 , . . . , αk } |= β, otherwise
{α1 , . . . , αk } ⊭ β.
Determining Satisfiability using Truth Tables
What is the complexity of this algorithm?
2^n, where n is the number of propositional symbols (one row per truth assignment).
Can you think of a way to speed up these algorithms?
In an upcoming lecture, we will discuss some of the applications and
best-known techniques for the SAT algorithm.
Some tautologies
Associative and Commutative laws for ∧, ∨, ↔
Distributive Laws

I (A ∧ (B ∨ C )) ↔ ((A ∧ B) ∨ (A ∧ C )).
I (A ∨ (B ∧ C )) ↔ ((A ∨ B) ∧ (A ∨ C )).

Negation

I ¬¬A ↔ A
I ¬(A → B) ↔ (A ∧ ¬B)
I ¬(A ↔ B) ↔ ((A ∧ ¬B) ∨ (¬A ∧ B))

De Morgan’s Laws

I ¬(A ∧ B) ↔ (¬A ∨ ¬B)


I ¬(A ∨ B) ↔ (¬A ∧ ¬B)
More Tautologies
Implication

I (A → B) ↔ (¬A ∨ B)

Excluded Middle

I A ∨ ¬A

Contradiction

I ¬(A ∧ ¬A)

Contraposition

I (A → B) ↔ (¬B → ¬A)

Exportation

I ((A ∧ B) → C ) ↔ (A → (B → C ))
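As a quick sanity check, the brute-force valid function sketched earlier can confirm a couple of these, for example contraposition and exportation (again using the hypothetical Var/Not/BinOp representation).

```python
# Verifying two of the tautologies above with the brute-force valid() sketch.
A, B, C = Var("A"), Var("B"), Var("C")
contraposition = BinOp("iff", BinOp("implies", A, B),
                              BinOp("implies", Not(B), Not(A)))
exportation = BinOp("iff", BinOp("implies", BinOp("and", A, B), C),
                           BinOp("implies", A, BinOp("implies", B, C)))
print(valid(contraposition, ["A", "B"]),
      valid(exportation, ["A", "B", "C"]))     # True True
```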
Propositional Connectives
We have five connectives: ¬, ∧, ∨, →, ↔. Would we gain anything by having
more? Would we lose anything by having fewer?
Example: Ternary Majority Connective #
E#(α, β, γ) = (#αβγ)
v̄((#αβγ)) = T iff the majority of v̄(α), v̄(β), and v̄(γ) are T.
What does this new connective do for us?
The extended language obtained by allowing this new symbol has the
same expressive power as the original language.
Every Boolean function can be realized by a wff which uses only the
connectives {¬, ∧, ∨}. We say that this set of connectives is complete.
In fact, we can do better. It turns out that {¬, ∧} and {¬, ∨} are complete as
well.
Why?
α ∨ β ↔ ¬(¬α ∧ ¬β)
α ∧ β ↔ ¬(¬α ∨ ¬β)
Using these identities, the completeness can be easily proved by induction.
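A sketch of that induction as a recursive rewriting function that eliminates ∨, →, and ↔ in favor of {¬, ∧}, using the identity above for ∨ together with the standard reductions of → and ↔ to ¬ and ∧ (assumptions of this sketch, consistent with the Negation tautologies earlier); it again assumes the Var/Not/BinOp classes.

```python
# Rewriting any wff into an equivalent wff over {¬, ∧} only.
def to_neg_and(phi):
    if isinstance(phi, Var):
        return phi
    if isinstance(phi, Not):
        return Not(to_neg_and(phi.arg))
    a, b = to_neg_and(phi.left), to_neg_and(phi.right)
    if phi.op == "and":
        return BinOp("and", a, b)
    if phi.op == "or":                       # α ∨ β  ↝  ¬(¬α ∧ ¬β)
        return Not(BinOp("and", Not(a), Not(b)))
    if phi.op == "implies":                  # α → β  ↝  ¬(α ∧ ¬β)
        return Not(BinOp("and", a, Not(b)))
    # phi.op == "iff":  α ↔ β  ↝  ¬(α ∧ ¬β) ∧ ¬(β ∧ ¬α)
    return BinOp("and",
                 Not(BinOp("and", a, Not(b))),
                 Not(BinOp("and", b, Not(a))))
```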
A formula is in DNF if it is a disjunction of formulas, each of which is a
conjunction of literals, where a literal is either a propositional symbol or its
negation.
Completeness of Propositional Connectives
Example
Let G be a 3-place Boolean function defined as follows:
G (F, F, F) = F
G (F, F, T) = T
G (F, T, F) = T
G (F, T, T) = F
G (T, F, F) = T
G (T, F, T) = F
G (T, T, F) = F
G (T, T, T) = T
There are four points at which G is true, so a DNF formula which realizes G is
(¬A1 ∧ ¬A2 ∧ A3 ) ∨ (¬A1 ∧ A2 ∧ ¬A3 ) ∨ (A1 ∧ ¬A2 ∧ ¬A3 ) ∨ (A1 ∧ A2 ∧ A3 ).
Note that another formula which realizes G is A1 ↔ A2 ↔ A3 . Thus, adding
additional connectives to a complete set may allow a function to be realized
more concisely.
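The DNF construction generalizes to any Boolean function: build one conjunction of literals per row on which the function is true, and take the disjunction. A sketch, assuming the Var/Not/BinOp classes and the evaluate function from earlier; dnf_for is a hypothetical helper, and G below is the 3-place function from the example (it happens to be the odd-parity function).

```python
# Building a DNF wff realizing an arbitrary Boolean function.
from itertools import product
from functools import reduce

def dnf_for(G, symbols):
    terms = []
    for values in product([False, True], repeat=len(symbols)):
        if G(*values):                                  # one term per true row
            literals = [Var(s) if val else Not(Var(s))
                        for s, val in zip(symbols, values)]
            terms.append(reduce(lambda a, b: BinOp("and", a, b), literals))
    return reduce(lambda a, b: BinOp("or", a, b), terms)

G = lambda x, y, z: (x + y + z) % 2 == 1                # the example's G
phi = dnf_for(G, ["A1", "A2", "A3"])

# phi realizes G: it agrees with G on every truth assignment.
print(all(evaluate(phi, dict(zip(["A1", "A2", "A3"], vals))) == G(*vals)
          for vals in product([False, True], repeat=3)))   # True
```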
