comp3610_slides pld
Peter Höfner
1
Section 0
Admin
2
Lecturer
Consultation
Thursday 12pm – 1pm, or by appointment
3
Co-Lecturer and Tutors
• Dr Fabian Muelboeck
[email protected]
• Abhaas Goyal
[email protected]
• Weiyou Wang
[email protected]
4
Lectures
• Wednesday, 3 pm – 5 pm
Thursday, 11 am – 12 pm
• Rm 5.02 Marie Reay, Bldg 155
• Q/A session in Week 12
• Etiquette
▶ engage
▶ feel free to ask questions
▶ we reject behaviour that strays into harassment,
no matter how mild
5
Tutorials
• Summary
▶ your chance to discuss problems
▶ discuss homework
▶ discuss additional exercises
6
Plan/Schedule I
Resources
web: https://cs.anu.edu.au/courses/comp3610/
wattle: https://wattlecourses.anu.edu.au/course/view.php?id=41142
edstem: https://edstem.org/
(you will be registered at the end of the week)
Workload
The average student workload is 130 hours for a six unit course.
That is roughly 11 hours/week.
https://policies.anu.edu.au/ppl/document/ANUP_000691
7
Plan/Schedule II
Assessment criteria
• Quiz: 0% (for feedback only)
• Assignments: 35%, 4 assignments (35 marks)
• Oral exam: 65% (65 marks) [hurdle]
• hurdle: minimum of 40% in the final exam
Assessments (tentative)
No Hand Out Hand In Marks
0 31/07 03/08 0
1 02/08 10/08 5
2 16/08 31/08 10
3 20/09 12/10 10
4 18/10 02/11 10
8
About the Course I
9
About the Course II
Topics (tentative)
The following schedule is tentative and likely to change.
Topic
0 Admin
1 introduction
2 IMP and its Operational Semantics
3 Types
4 Derivation and Proofs
5 Functions, Call-by-Value, Call-by-Name
6 Typing for Call-By-Value
7 Data Types and Subtyping
8 Denotational Semantics
9 Axiomatic Semantics
10 Concurrency
11 Formal Verification
10
About the Course IV
Disclaimer
This course has been redesigned fairly recently.
The material in these notes has been drawn from several different
sources, including the books and similar courses at some other
universities. Any errors are of course all the author’s own work.
As it is a newly designed course, changes in timetabling are quite likely.
Feedback (oral, email, survey, . . . ) is highly appreciated.
11
Academic Integrity
12
Reading Material
13
Section 1
Introduction
14
Foundational Knowledge of Disciplines
Mechanical Engineering
Students learn about torque
d(r × ω)/dt = r × dω/dt + dr/dt × ω
15
Foundational Knowledge of Disciplines
Electrical Engineering / Astro Physics
Students learn about complex impedance
e^(jωt) = cos(ωt) + j sin(ωt)
16
Foundational Knowledge of Disciplines
Civil Engineering / Surveying
Students learn about trigonometry
sin(θ + ϕ) = sin θ cos ϕ + cos θ sin ϕ
17
Foundational Knowledge of Disciplines
Software Engineering / Computer Science
Students learn about ???
Figure: First Ariane 5 Flight, 1996 [ESA] Figure: Heartbleed, 2014 [Wikipedia]
18
Programming Languages
19
Constituents
20
Use of Semantics
21
Style of Description (Syntax and Semantics)
• natural language
• definition ‘by’ compiler behaviour
• mathematically
22
Introductory Examples: C
23
Introductory Examples: C♯
In C♯ , what is the output of the following?
delegate int IntThunk();
class C {
  public static void Main() {
    IntThunk[] funcs = new IntThunk[11];
    for (int i = 0; i <= 10; i++)
    {
      funcs[i] = delegate() { return i; };
    }
    foreach (IntThunk f in funcs)
    {
      System.Console.WriteLine(f());
    }
  }
}
24
Introductory Examples: JavaScript
function bar(x) {
  return function () {
    var x = x;
    return x;
  };
}
var f = bar(200);
f()
25
About This Course
26
Use of formal, mathematical semantics
Implementation issues
Machine-independent specification of behaviour. Correctness of program
analyses and optimisations.
Language design
Can bring to light ambiguities and unforeseen subtleties in programming
language constructs. Mathematical tools used for semantics can suggest
useful new programming styles. (E.g. influence of Church’s lambda
calculus (circa 1934) on functional programming).
Verification
Basis of methods for reasoning about program properties and program
specifications.
27
Styles of semantics
Operational
Meanings for program phrases defined in terms of the steps of
computation they can take during program execution.
Denotational
Meanings for program phrases defined abstractly as elements of some
suitable mathematical structure.
Axiomatic
Meanings for program phrases defined indirectly via the axioms and
rules of some logic of program properties.
28
Section 2
IMP
and its Operational Semantics
29
‘Toy’ languages
30
Design choices, from Micro to Macro
• basic values
• evaluation order
• what is guaranteed at compile-time and run-time
• how effects are controlled
• how concurrency is supported
• how information hiding is enforceable
• how large-scale development and re-use are supported
• ...
31
IMP1 – Introductory Example
1 Basically the same as in Winskel 1993 (IMP) and in Hennessy 1990 (WhileL)
32
IMP – Syntax
Booleans b ∈ B = {true, false}
Integers (Values) n ∈ Z = {. . . , −1, 0, 1, . . . }
Locations l ∈ L = {l, l0 , l1 , l2 , . . . }
Operations op ::= + | ≥
Expressions
E ::= n | b | E op E |
l := E | !l |
skip | E ; E |
if E then E else E |
while E do E
33
Transition systems
34
IMP Semantics (1 of 4) – Configurations
35
Transitions – Examples
⟨l := 2 + !l , {l 7→ 3}⟩
−→ ⟨l := 2 + 3 , {l 7→ 3}⟩
−→ ⟨l := 5 , {l 7→ 3}⟩
−→ ⟨skip , {l 7→ 5}⟩
−→
̸
36
IMP Semantics (2 of 4) – Rules (basic operations)
(op+) ⟨n1 + n2 , s⟩ −→ ⟨n , s⟩ if n = n1 + n2
⟨E1 , s⟩ −→ ⟨E1′ , s′ ⟩
(op1)
⟨E1 op E2 , s⟩ −→ ⟨E1′ op E2 , s′ ⟩
⟨E2 , s⟩ −→ ⟨E2′ , s′ ⟩
(op2)
⟨v op E2 , s⟩ −→ ⟨v op E2′ , s′ ⟩
37
Rules (basic operations) – Examples
⟨(2 + 3) + (4 + 5) , ∅⟩
38
IMP Semantics (3 of 4) – Store and Sequencing
⟨E , s⟩ −→ ⟨E ′ , s′ ⟩
(assign2)
⟨l := E , s⟩ −→ ⟨l := E ′ , s′ ⟩
⟨E1 , s⟩ −→ ⟨E1′ , s′ ⟩
(seq2)
⟨E1 ; E2 , s⟩ −→ ⟨E1′ ; E2 , s′ ⟩
39
Store and Sequencing – Examples
40
Store and Sequencing – Examples
⟨l := 3 ; l := !l , {l 7→ 0}⟩ −→ ?
⟨42 + !l , ∅⟩ −→ ?
41
IMP Semantics (4 of 4) – Conditionals and While
⟨E1 , s⟩ −→ ⟨E1′ , s′ ⟩
(if3)
⟨if E1 then E2 else E3 , s⟩ −→ ⟨if E1′ then E2 else E3 , s′ ⟩
(while)
⟨while E1 do E2 , s⟩ −→ ⟨if E1 then (E2 ; while E1 do E2 ) else skip , s⟩
42
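The rules on the last few slides can be read directly as a program. The following is a minimal sketch (not part of the original slides), in Haskell, of a one-step reduction function for IMP; the names Expr, Store, step and run are mine, and evaluation order follows the left-to-right choice fixed by (op1)/(op2).

import qualified Data.Map as M

type Loc   = String
type Store = M.Map Loc Int

data Expr = N Int | B Bool | Plus Expr Expr | Geq Expr Expr
          | Assign Loc Expr | Deref Loc | Skip | Seq Expr Expr
          | If Expr Expr Expr | While Expr Expr
  deriving Show

isValue :: Expr -> Bool
isValue (N _) = True
isValue (B _) = True
isValue Skip  = True
isValue _     = False

-- one small step; Nothing if the configuration is a value or stuck
step :: (Expr, Store) -> Maybe (Expr, Store)
step (Plus (N n1) (N n2), s) = Just (N (n1 + n2), s)                               -- (op+)
step (Geq  (N n1) (N n2), s) = Just (B (n1 >= n2), s)                              -- (op>=)
step (Plus e1 e2, s)
  | not (isValue e1) = do { (e1', s') <- step (e1, s); Just (Plus e1' e2, s') }    -- (op1)
  | otherwise        = do { (e2', s') <- step (e2, s); Just (Plus e1 e2', s') }    -- (op2)
step (Geq e1 e2, s)
  | not (isValue e1) = do { (e1', s') <- step (e1, s); Just (Geq e1' e2, s') }
  | otherwise        = do { (e2', s') <- step (e2, s); Just (Geq e1 e2', s') }
step (Deref l, s)    = do { n <- M.lookup l s; Just (N n, s) }                     -- (deref)
step (Assign l (N n), s)
  | l `M.member` s   = Just (Skip, M.insert l n s)                                 -- (assign1)
step (Assign l e, s) = do { (e', s') <- step (e, s); Just (Assign l e', s') }      -- (assign2)
step (Seq Skip e2, s) = Just (e2, s)                                               -- (seq1)
step (Seq e1 e2, s)   = do { (e1', s') <- step (e1, s); Just (Seq e1' e2, s') }    -- (seq2)
step (If (B True)  e2 _ , s) = Just (e2, s)                                        -- (if1)
step (If (B False) _  e3, s) = Just (e3, s)                                        -- (if2)
step (If e1 e2 e3, s) = do { (e1', s') <- step (e1, s); Just (If e1' e2 e3, s') }  -- (if3)
step (While e1 e2, s) = Just (If e1 (Seq e2 (While e1 e2)) Skip, s)                -- (while)
step _                = Nothing

-- iterate to a normal form (may diverge for looping programs)
run :: (Expr, Store) -> (Expr, Store)
run cfg = maybe cfg run (step cfg)

For example, run (Assign "l" (Plus (N 2) (Deref "l")), M.fromList [("l", 3)]) replays the example transition sequence above and ends in (Skip, {l ↦ 5}).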
IMP – Examples
If
E = l2 := 0 ; while !l1 ≥ 1 do (l2 := !l2 + !l1 ; l1 := !l1 + −1)
s = {l1 7→ 3, l2 7→ 0}
then
⟨E , s⟩ −→∗ ?
43
Determinacy
Theorem (Determinacy)
If ⟨E , s⟩ −→ ⟨E1 , s1 ⟩ and ⟨E , s⟩ −→ ⟨E2 , s2 ⟩
then ⟨E1 , s1 ⟩ = ⟨E2 , s2 ⟩.
Proof.
later ⊔
⊓
44
Reminder
45
Language design I
Order of Evaluation
IMP uses left-to-right evaluation. For example
⟨E1 , s⟩ −→ ⟨E1′ , s′ ⟩
(op2’)
⟨E1 op v , s⟩ −→ ⟨E1′ op v , s′ ⟩
In this language
46
Language design II
Assignment results
Recall
(assign1) ⟨l := n , s⟩ −→ ⟨skip , s + {l 7→ n}⟩ if l ∈ dom(s)
(seq1’) ⟨v ; E2 , s⟩ −→ ⟨E2 , s⟩
47
Language design III
Store initialisation
Recall
(deref) ⟨!l , s⟩ −→ ⟨n , s⟩ if l ∈ dom(s) and s(l) = n
Assumes l ∈ dom(s).
48
Language design IV
Storable values
• our language only allows integer values (store: L ⇀ Z)
• could we store any value? Could we store locations, or even
programs?
• store is global and cannot create new locations
49
Language design V
50
Expressiveness
51
Section 3
Types
52
Type systems
53
Run-time errors
Trapped errors
Cause execution to halt immediately.
Examples: jumping to an illegal address, raising a top-level exception.
Innocuous?
Untrapped errors
May go unnoticed for a while and later cause arbitrary behaviour.
Examples: accessing data past the end of an array, security loopholes in
Java abstract machines.
Insidious!
54
Formal type systems
55
Types of IMP
Types of expressions
56
Type Judgement (1 of 3)
(int) Γ ⊢ n : int if n ∈ Z
(bool) Γ ⊢ b : bool if b ∈ B = {true, false}
Γ ⊢ E1 : int Γ ⊢ E2 : int
(op+)
Γ ⊢ E1 + E2 : int
Γ ⊢ E1 : int Γ ⊢ E2 : int
(op≥)
Γ ⊢ E1 ≥ E2 : bool
Γ ⊢ E1 : bool Γ ⊢ E2 : T Γ ⊢ E3 : T
(if)
Γ ⊢ if E1 then E2 else E3 : T
57
Type Judgement – Example
( INT ) ( INT )
{} ⊢ 3 : int {} ⊢ 4 : int
( BOOL ) ( INT ) ( OP+)
{} ⊢ false : bool {} ⊢ 2 : int {} ⊢ 3 + 4 : int
( IF )
{} ⊢ if false then 2 else 3 + 4 : int
58
Type Judgement (2 of 3)
Γ(l) = intref
(deref)
Γ ⊢ !l : int
59
Type Judgement (3 of 3)
Γ ⊢ E1 : unit Γ ⊢ E2 : T
(seq)
Γ ⊢ E1 ; E2 : T
Γ ⊢ E1 : bool Γ ⊢ E2 : unit
(while)
Γ ⊢ while E1 do E2 : unit
60
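As a companion to the typing rules, here is a minimal type-checker sketch in Haskell (not from the slides), reusing the Expr datatype from the interpreter sketch above. Γ only needs to record which locations exist, since the sole reference type is intref; the (assign) case is the standard rule, not shown on the excerpted slides.

import qualified Data.Map as M   -- reusing Expr, Loc from the interpreter sketch

data Ty = TInt | TBool | TUnit deriving (Eq, Show)
type TyEnv = M.Map Loc ()        -- Gamma: locations of type intref

typeOf :: TyEnv -> Expr -> Maybe Ty
typeOf _ (N _)  = Just TInt                                                            -- (int)
typeOf _ (B _)  = Just TBool                                                           -- (bool)
typeOf _ Skip   = Just TUnit                                                           -- (skip)
typeOf g (Plus e1 e2)  = do { TInt <- typeOf g e1; TInt <- typeOf g e2; Just TInt }    -- (op+)
typeOf g (Geq e1 e2)   = do { TInt <- typeOf g e1; TInt <- typeOf g e2; Just TBool }   -- (op>=)
typeOf g (If e1 e2 e3) = do { TBool <- typeOf g e1                                     -- (if)
                            ; t2 <- typeOf g e2; t3 <- typeOf g e3
                            ; if t2 == t3 then Just t2 else Nothing }
typeOf g (Deref l)     = do { () <- M.lookup l g; Just TInt }                          -- (deref)
typeOf g (Assign l e)  = do { () <- M.lookup l g; TInt <- typeOf g e; Just TUnit }     -- (assign)
typeOf g (Seq e1 e2)   = do { TUnit <- typeOf g e1; typeOf g e2 }                      -- (seq)
typeOf g (While e1 e2) = do { TBool <- typeOf g e1; TUnit <- typeOf g e2; Just TUnit } -- (while)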
Type Judgement – Properties
Theorem (Progress)
If Γ ⊢ E : T and dom(Γ) ⊆ dom(s) then either E is a value or there exist
E ′ and s′ such that ⟨E , s⟩ −→ ⟨E ′ , s′ ⟩.
61
Type Safety
62
Type checking, typeability, and type inference
Type inference is usually harder than type checking, for a type T needs
to be computed.
63
Properties
Moreover
Theorem (Uniqueness of typing)
If Γ ⊢ E : T and Γ ⊢ E : T ′ then T = T ′ .
64
Section 4
65
Why Proofs
66
(Mathematical) Induction
67
Natural Induction I
68
Natural Induction II
Theorem
∀n ∈ IN .Φ(n).
Proof.
Base case: show Φ(0)
Induction step: ∀k. Φ(k) =⇒ Φ(k + 1)
For that we fix an arbitrary k.
Assume Φ(k) derive Φ(k + 1). ⊔
⊓
Example: 0 + 1 + 2 + · · · + n = n·(n+1)/2.
69
Natural Induction III
Theorem
∀n ∈ IN .Φ(n).
Proof.
Base case: show Φ(0)
Induction step: ∀k. (∀i. 0 ≤ i ≤ k =⇒ Φ(i)) =⇒ Φ(k + 1)
For that we fix an arbitrary k.
Assume Φ(i) for all i ≤ k and derive Φ(k + 1). □
Example: Fn = (φ^n − ψ^n)/(φ − ψ),
where Fn is the n-th Fibonacci number, φ = (1 + √5)/2 (the golden ratio) and ψ = (1 − √5)/2.
70
Structural Induction I
71
Structural Induction II
72
Structural Induction over Expressions
73
Abstract Syntax
E ::= n | b | E op E |
l := E | !l |
if E then E else E |
skip | E ; E |
while E do E
74
Abstract Syntax Tree I
[Figure: abstract syntax tree with an if-then-else node over the subtrees !l ≥ 0, skip, and skip ; l := . . .]
75
Abstract Syntax Tree II
• ambiguity, e.g., (1 + 2) + 3 ̸= 1 + (2 + 3)
[Figure: the two abstract syntax trees for (1 + 2) + 3 and 1 + (2 + 3)]
Parentheses are only used for disambiguation – they are not part of
the grammar
76
Structural Induction (for abstract syntax)
Theorem
∀E ∈ IMP. Φ(E)
Proof.
Base case(s): show Φ(E) for each nullary tree constructor (leaf)
Induction step(s): show it for the remaining constructors
⊔
⊓
77
Structural Induction (syntax IMP)
78
Proving Determinacy – Outline
Theorem (Determinacy)
For all E, s, E1 , s1 , E2 and s2 ,
if ⟨E , s⟩ −→ ⟨E1 , s1 ⟩ and ⟨E , s⟩ −→ ⟨E2 , s2 ⟩
then ⟨E1 , s1 ⟩ = ⟨E2 , s2 ⟩.
Proof.
Choose
def
Φ(E) = ∀s, E ′ , s′ , E ′′ , s′′ .
(⟨E , s⟩ −→ ⟨E ′ , s′ ⟩ ∧ ⟨E , s⟩ −→ ⟨E ′′ , s′′ ⟩)
=⇒ ⟨E ′ , s′ ⟩ = ⟨E ′′ , s′′ ⟩
79
Proving Determinacy – Sketch
80
Proving Determinacy – auxiliary lemma
Proof.
• E is a value iff it is of the form n, b, skip
• By examination of the rules . . .
there is no rule with conclusion of the form ⟨E , s⟩ −→ ⟨E ′ , s′ ⟩ for E
a value
⊔
⊓
81
Inversion I
In proofs involving inductive definitions, one often needs an inversion
property.
Given a tuple in an inductively defined relation, it gives you a case
analysis of the possible “last rule” used.
Lemma (Inversion for −→)
If ⟨E , s⟩ −→ ⟨Ê , ŝ⟩ then either
1. (op+): there exist n1 , n2 and n such that E = n1 + n2 , Ê = n,
ŝ = s and n = n1 + n2
(note: the two +s have different meanings in this statement), or
2. (op1): there exist E1 , E2 , op and E1′ such that E = E1 op E2 ,
Ê = E1′ op E2 and ⟨E1 , s⟩ −→ ⟨E1′ , s′ ⟩, or
3. . . .
82
Inversion II
83
Determinacy – Intuition
[Figure: abstract syntax tree built from +, !l, 2 and 3, illustrating that at each step exactly one rule applies]
84
Rule Induction
85
Inductive Definition of −→
⟨E1 , s⟩ −→ ⟨E1′ , s′ ⟩
(op1)
⟨E1 op E2 , s⟩ −→ ⟨E1′ op E2 , s′ ⟩
86
Derivation Tree (Transition Relation) – Example
( OP+)
⟨2 + 2 , {}⟩ −→ ⟨4 , {}⟩
( OP 1)
⟨(2 + 2) + 3 , {}⟩ −→ ⟨4 + 3 , {}⟩
( OP 1)
⟨(2 + 2) + 3 ≥ 5 , {}⟩ −→ ⟨4 + 3 ≥ 5 , {}⟩
87
Derivation Tree (Typing Judgement) – Example
Γ(l) = intref
( DEREF ) ( INT )
Γ ⊢!l : int Γ ⊢ 2 : int
( OP +) ( INT )
Γ ⊢ !l + 2 : int Γ ⊢ 3 : int
( OP +)
Γ ⊢ (!l + 2) + 3 : int
88
Principle of Rule Induction I
For any property Φ(a) of elements a of A, and any set of rules which
define a subset SR of A, to prove
∀a ∈ SR . Φ(a)
89
Principle of Rule Induction II
For any property Φ(a) of elements a of A, and any set of rules which
define a subset SR of A, to prove
∀a ∈ SR . Φ(a)
h1 . . . hk
c
90
Proving Progress I
Theorem (Progress)
If Γ ⊢ E : T and dom(Γ) ⊆ dom(s) then either E is a value or there exist
E ′ and s′ such that ⟨E , s⟩ −→ ⟨E ′ , s′ ⟩.
Proof.
Choose
91
Proving Progress II
92
Proving Progress III
Case (op+):
Γ ⊢ E1 : int Γ ⊢ E2 : int
(op+)
Γ ⊢ E1 + E2 : int
93
Proving Progress IV
Case (op+) (cont’d):
• we have to show
∃E ′ , s′ . ⟨E1 + E2 , s⟩ −→ ⟨E ′ , s′ ⟩
(op+) ⟨n1 + n2 , s⟩ −→ ⟨n , s⟩ if n = n1 + n2
94
Proving Progress V
Lemma
For all Γ, E, T , if Γ ⊢ E : T , E is a value, and T = int
then there exists n ∈ Z with E = n.
95
Derivation Tree (Typing Judgement) – Example
Γ(l) = intref
( DEREF ) ( INT )
Γ ⊢ !l : int Γ ⊢ 2 : int
( OP +) ( INT )
Γ ⊢ !l + 2 : int Γ ⊢ 3 : int
( OP +)
Γ ⊢ (!l + 2) + 3 : int
96
Which Induction Principle to Use?
97
Structural Induction (Repetition)
98
Why care about Proofs?
99
Section 5
Functions
100
Functions, Methods, Procedures, . . .
101
Examples
add_one :: Int -> Int
add_one n = n + 1

public int add_one(int x) {
  return (x+1);
}

<script type="text/vbscript">
function addone(x)
  addone = x+1
end function
</script>
102
Introductory Examples: C♯
In C♯ , what is the output of the following?
delegate int IntThunk();
class C {
  public static void Main() {
    IntThunk[] funcs = new IntThunk[11];
    for (int i = 0; i <= 10; i++)
    {
      funcs[i] = delegate() { return i; };
    }
    foreach (IntThunk f in funcs)
    {
      System.Console.WriteLine(f());
    }
  }
}
In my opinion, the design was wrong.
103
Functions – Examples
(fn x : int ⇒ x + 1)
(fn x : int ⇒ x + 1) 7
(fn y : int ⇒ (fn x : int ⇒ x + y))
(fn y : int ⇒ (fn x : int ⇒ x + y)) 1
(fn x : int → int ⇒ (fn y : int ⇒ x (x y)))
(fn x : int → int ⇒ (fn y : int ⇒ x (x y))) (fn x : int ⇒ x + 1)
(fn x : int → int ⇒ (fn y : int ⇒ x (x y))) (fn z : int ⇒ z + 1) 7
104
Functions – Syntax
105
Variable Shadowing
106
Alpha conversion
107
Alpha conversion: free and bound variables
17
x+y
(fn x : int ⇒ x + 2)
(fn x : int ⇒ x + z)
if y then 2 + x else ((fn x : int ⇒ x + 2) z)
108
Alpha Conversion – Binding Examples
(fn x : int ⇒ x + 2)
(fn x : int ⇒ x + z)
(fn y : int ⇒ y + z)
(fn z : int ⇒ z + z)
109
Alpha Conversion – Convention
For example
110
Syntax Trees up to Alpha Conversion
[Figure: syntax trees of x + z, y + z and z + z]
111
Syntax Trees up to Alpha Conversion II
Add pointers
[Figure: the same trees with each bound-variable occurrence replaced by a pointer to its binder]
112
Syntax Trees up to Alpha Conversion III
[Figure: two α-equivalent trees, drawn with binder pointers]
113
Syntax Trees up to Alpha Conversion IV
Application and function type
[Figure: trees with binder pointers for the application (fn • : int ⇒ . . . ) 7 and for a term of function type]
114
De Bruijn indices
• these pointers are known as De Bruijn indices
• each occurrence of a bound variable is represented by the number
of fn-nodes you have to pass
(fn • : int ⇒ (fn • : int ⇒ v0 + 2)) ̸= (fn • : int ⇒ (fn • : int ⇒ v1 + 2))
[Figure: the corresponding trees, each a + node over a binder pointer and the constant 2]
115
Free Variables
fv(x) = {x}
fv(E1 op E2 ) = fv(E1 ) ∪ fv(E2 )
fv((fn x : T ⇒ E)) = fv(E) − {x}
116
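These equations translate directly into code. Below is a minimal Haskell sketch (not from the slides) of free-variable computation on a cut-down term language; the datatype Tm and the omission of type annotations are my simplifications.

import qualified Data.Set as S

-- terms of the function language (type annotations omitted for brevity)
data Tm = Var String | Num Int | Op Tm Tm | Fn String Tm | App Tm Tm
  deriving Show

fv :: Tm -> S.Set String
fv (Var x)     = S.singleton x
fv (Num _)     = S.empty
fv (Op e1 e2)  = fv e1 `S.union` fv e2
fv (Fn x e)    = S.delete x (fv e)          -- the binder x is removed
fv (App e1 e2) = fv e1 `S.union` fv e2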
Substitution – Examples
Examples
{3/x} (x ≥ x) = (3 ≥ 3)
117
Substitution
Definition
{E/z} x ≝ E if x = z, and x otherwise
{E/z} (fn x : T ⇒ E1 ) ≝ (fn x : T ⇒ ({E/z} E1 )) if x ̸= z and x ̸∈ fv(E) (∗)
{E/z} (E1 E2 ) ≝ ({E/z} E1 ) ({E/z} E2 )
...
118
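The side condition (∗) is what makes substitution capture-avoiding: when it fails, the binder must first be renamed. A minimal Haskell sketch (my own, building on Tm and fv from the previous sketch; the helper fresh is hypothetical):

-- capture-avoiding substitution {e/z}
subst :: Tm -> String -> Tm -> Tm
subst e z (Var x)     = if x == z then e else Var x
subst _ _ (Num n)     = Num n
subst e z (Op  e1 e2) = Op  (subst e z e1) (subst e z e2)
subst e z (App e1 e2) = App (subst e z e1) (subst e z e2)
subst e z (Fn x body)
  | x == z            = Fn x body                                  -- z is shadowed: stop
  | x `S.member` fv e = Fn x' (subst e z (subst (Var x') x body))  -- alpha-convert first
  | otherwise         = Fn x (subst e z body)
  where x' = fresh x (S.insert z (fv e `S.union` fv body))

-- a variable name based on x that avoids the given set
fresh :: String -> S.Set String -> String
fresh x used = head [ v | i <- [(0 :: Int) ..], let v = x ++ show i, v `S.notMember` used ]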
Substitution – Example
119
Simultaneous Substitution
120
Definition Substitution [for completeness]
121
Function Behaviour
122
Function Behaviour
E = (fn x : unit ⇒ (l := 1) ; x) (l := 2)
123
Choice 1: Call-by-Value
Idea: reduce left-hand-side of application to an fn-term;
then reduce argument to a value;
then replace all occurrences of the formal parameter in the fn-term by
that value.
E = (fn x : unit ⇒ (l := 1) ; x) (l := 2)
⟨E , {l 7→ 0}⟩
−→ ⟨(fn x : unit ⇒ (l := 1) ; x) skip , {l 7→ 2}⟩
−→ ⟨(l := 1) ; skip , {l 7→ 2}⟩
−→ ⟨skip ; skip , {l 7→ 1}⟩
−→ ⟨skip , {l 7→ 1}⟩
124
Call-by-Value – Semantics
Values
v ::= b | n | skip | (fn x : T ⇒ E)
SOS rules
all sos rules we used so far, plus the following
⟨E1 , s⟩ −→ ⟨E1′ , s′ ⟩
(app1)
⟨E1 E2 , s⟩ −→ ⟨E1′ E2 , s′ ⟩
⟨E2 , s⟩ −→ ⟨E2′ , s′ ⟩
(app2)
⟨v E2 , s⟩ −→ ⟨v E2′ , s′ ⟩
125
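Put together with substitution, the application rules give the following one-step function, a sketch of my own rather than the slides' definition. The store is omitted for brevity (it is threaded through unchanged, exactly as in (app1)/(app2)); the beta step shown is the standard CBV rule, and the operator case treats op as +.

-- CBV reduction for applications, using Tm, subst and fv from earlier sketches
isVal :: Tm -> Bool
isVal (Num _)  = True
isVal (Fn _ _) = True
isVal _        = False

stepCBV :: Tm -> Maybe Tm
stepCBV (Op (Num n1) (Num n2)) = Just (Num (n1 + n2))               -- (op+), op read as +
stepCBV (App (Fn x body) v)
  | isVal v            = Just (subst v x body)                      -- CBV beta: argument is a value
stepCBV (App e1 e2)
  | not (isVal e1)     = do { e1' <- stepCBV e1; Just (App e1' e2) }   -- (app1)
  | otherwise          = do { e2' <- stepCBV e2; Just (App e1 e2') }   -- (app2)
stepCBV (Op e1 e2)
  | not (isVal e1)     = do { e1' <- stepCBV e1; Just (Op e1' e2) }    -- (op1)
  | otherwise          = do { e2' <- stepCBV e2; Just (Op e1 e2') }    -- (op2)
stepCBV _              = Nothing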
Call-by-Value – Example I
126
Call-by-Value – Example II
127
Choice 2: Call-by-Name
Idea: reduce left-hand-side of application to an fn-term;
then replace all occurrences of the formal parameter in the fn-term by
that argument.
E = (fn x : unit ⇒ (l := 1) ; x) (l := 2)
⟨E , {l 7→ 0}⟩
−→ ⟨(l := 1) ; (l := 2) , {l 7→ 0}⟩
−→ ⟨skip ; (l := 2) , {l 7→ 1}⟩
−→ ⟨l := 2 , {l 7→ 1}⟩
−→ ⟨skip , {l 7→ 2}⟩
128
Call-by-Name – Semantics
SOS rules
⟨E1 , s⟩ −→ ⟨E1′ , s′ ⟩
(CBN-app)
⟨E1 E2 , s⟩ −→ ⟨E1′ E2 , s′ ⟩
129
Choice 3: Full Beta
130
Full Beta – Semantics
Values
v ::= b | n | skip | (fn x : T ⇒ E)
SOS rules
⟨E1 , s⟩ −→ ⟨E1′ , s′ ⟩
(beta-app1)
⟨E1 E2 , s⟩ −→ ⟨E1′ E2 , s′ ⟩
⟨E2 , s⟩ −→ ⟨E2′ , s′ ⟩
(beta-app2)
⟨E1 E2 , s⟩ −→ ⟨E1 E2′ , s′ ⟩
⟨E , s⟩ −→ ⟨E ′ , s′ ⟩
(beta-fn2)
⟨(fn x : T ⇒ E) , s⟩ −→ ⟨(fn x : T ⇒ E ′ ) , s′ ⟩
131
Full Beta – Example
(fn x : int ⇒ x + x) (2 + 2)
[Figure: the reduction graph, via (fn x : int ⇒ x + x) 4 and (2 + 2) + (2 + 2),
then 4 + (2 + 2) and (2 + 2) + 4,
all paths ending in 4 + 4]
132
Choice 4: Normal-Order Reduction
133
Section 6
134
Typing Functions - TypeEnvironment
135
Typing Functions
Γ, x : T ⊢ E : T ′
(fn)
Γ ⊢ (fn x : T ⇒ E) : T → T ′
Γ ⊢ E1 : T → T ′ Γ ⊢ E2 : T
(app)
Γ ⊢ E1 E2 : T ′
136
Typing Functions – Example I
( VAR ) ( INT )
x : int ⊢ x : int x : int ⊢ 2 : int
( OP+)
x : int ⊢ x + 2 : int
( FN ) ( INT )
{} ⊢ (fn x : int ⇒ x + 2) : int → int {} ⊢ 2 : int
( APP )
{} ⊢ (fn x : int ⇒ x + 2) 2 : int
137
Typing Functions – Example II
138
Properties Typing
139
Proving Type Preservation
Proof outline.
Choose
140
Proving Type Preservation – Auxiliary Lemma
Lemma (Substitution)
If E closed, Γ ⊢ E : T and Γ, x : T ⊢ E ′ : T ′ with x ̸∈ dom(Γ) then
Γ ⊢ {E/x} E ′ : T ′ .
141
Type Safety
142
Normalisation
Theorem (Normalisation)
In the sublanguage without while loops, if Γ ⊢ E : T and E closed then
there does not exist an infinite reduction sequence
Proof.
See B. Pierce, Types and Programming Languages, Chapter 12. ⊔
⊓
143
Section 7
Recursion
144
Scoping
Name Definitions
restrict the scope of variables
• x is a binder for E2
• can be seen as syntactic sugar:
145
Derived sos-rules and typing
Γ ⊢ E1 : T Γ, x : T ⊢ E2 : T ′
(let)
Γ ⊢ let val x : T = E1 in E2 end : T ′
⟨E1 , s⟩ −→ ⟨E1′ , s′ ⟩
(let1)
⟨let val x : T = E1 in E2 end , s⟩ −→ ⟨let val x : T = E1′ in E2 end , s′ ⟩
146
Recursion – An Attempt
Consider
r = (fn y : int ⇒ if y ≥ 1 then y + (r (y + −1)) else 0)
where r is the recursive call (variable occurring in itself).
What is the evaluation of r 3?
We could try
E ::= . . . | let val rec x : T = E in E ′ end
where x is a binder for both E and E ′ .
147
However . . .
148
Recursive Functions
149
Recursive Functions – Syntax and Typing
150
Recursive Functions – Semantics
151
Redundancies?
• Do we need E1 ; E2 ?
No: E1 ; E2 ≡ (fn y : unit ⇒ E2 ) E1
• Do we need while E1 do E2 ?
No:
152
Redundancies?
• Do we need recursion?
Yes! Previously, the normalisation theorem effectively showed that
while adds expressive power; now, recursion is even more powerful.
153
Side remarks I
• naive implementations (in particular substitutions) are inefficient
(more efficient implementations are shown in courses on compiler
construction)
• more concrete – closer to implementation or machine code – are
possible
• usually refinement to prove compiler to be correct
(e.g. CompCert or CakeML)
154
Side remarks I – CakeML
155
Side remarks II: Big-step Semantics
• we have seen a small-step semantics
⟨E , s⟩ −→ ⟨E ′ , s′ ⟩
⟨E , s⟩ ⇓ ⟨E ′ , s′ ⟩
For example
Section 8
Data
157
Recap and Missing Steps
158
Products – Syntax
T ::= . . . | T ∗ T
159
Products – Typing
Γ ⊢ E1 : T1 Γ ⊢ E2 : T2
(pair)
Γ ⊢ (E1 , E2 ) : T1 ∗ T2
Γ ⊢ E : T1 ∗ T2
(proj1)
Γ ⊢ fst E : T1
Γ ⊢ E : T1 ∗ T2
(proj2)
Γ ⊢ snd E : T2
160
Products – Semantics
Values
v ::= . . . | (v, v)
SOS rules
⟨E1 , s⟩ −→ ⟨E1′ , s′ ⟩
(pair1)
⟨(E1 , E2 ) , s⟩ −→ ⟨(E1′ , E2 ) , s′ ⟩
⟨E2 , s⟩ −→ ⟨E2′ , s′ ⟩
(pair2)
⟨(v, E2 ) , s⟩ −→ ⟨(v, E2′ ) , s′ ⟩
⟨E , s⟩ −→ ⟨E ′ , s′ ⟩ ⟨E , s⟩ −→ ⟨E ′ , s′ ⟩
(proj3) (proj4)
⟨fst E , s⟩ −→ ⟨fst E ′ , s′ ⟩ ⟨snd E , s⟩ −→ ⟨snd E ′ , s′ ⟩
161
Sums (Variants, Tagged Unions) – Syntax
T ::= . . . | T + T
162
Sums – Typing I
Γ ⊢ E : T1
(inl)
Γ ⊢ inl E : T1 + T2
Γ ⊢ E : T2
(inr)
Γ ⊢ inr E : T1 + T2
Γ ⊢ E : T1 + T2 Γ, x : T1 ⊢ E1 : T Γ, y : T2 ⊢ E2 : T
(case)
Γ ⊢ case E of inl x : T1 ⇒ E1 | inr y : T2 ⇒ E2 : T
163
Sums – Typing II
164
Sums – Semantics
Values
v ::= . . . | inl v : T | inr v : T
SOS rules
⟨E , s⟩ −→ ⟨E ′ , s′ ⟩ ⟨E , s⟩ −→ ⟨E ′ , s′ ⟩
(inl) (inr)
⟨inl E : T , s⟩ −→ ⟨inl E ′ : T , s′ ⟩ ⟨inr E : T , s⟩ −→ ⟨inr E ′ : T , s′ ⟩
⟨E , s⟩ −→ ⟨E ′ , s′ ⟩
(case1)
⟨case E of inl x : T1 ⇒ E1 | inr y : T2 ⇒ E2 , s⟩
−→ ⟨case E ′ of inl x : T1 ⇒ E1 | inr y : T2 ⇒ E2 , s′ ⟩
(case2) ⟨case inl v : T of inl x : T1 ⇒ E1 | inr y : T2 ⇒ E2 , s⟩
−→ ⟨{v/x} E1 , s⟩
165
Constructors and Destructors
T ∗ T — constructor: ( , ); destructors: fst, snd
166
Proofs as Programs
(var) Γ, x : T ⊢ x : T Γ, P ⊢ P
Γ, x : T ⊢ E : T ′ Γ, P ⊢ P ′
(fn)
Γ ⊢ (fn x : T ⇒ E) : T → T ′ Γ ⊢ P → P′
Γ ⊢ E1 : T → T ′ Γ ⊢ E2 : T Γ ⊢ P → P′ Γ⊢P
(app) (modus ponens)
Γ ⊢ E1 E2 : T ′ Γ⊢P ′
...
167
Proofs as Programs: The Curry-Howard
correspondence
(var) Γ, x:T ⊢ x : T Γ, P ⊢ P
Γ, x:T ⊢ E : T ′ Γ, P ⊢ P ′
(fn)
Γ ⊢ (fn x : T ⇒ E) : T → T ′ Γ ⊢ P → P′
Γ ⊢ E1 : T → T ′ Γ ⊢ E2 : T Γ ⊢ P → P′ Γ⊢P
(app) (modus ponens)
Γ ⊢ E1 E2 : T ′ Γ ⊢ P′
Γ ⊢ E1 : T1 Γ ⊢ E2 : T2 Γ ⊢ P1 Γ ⊢ P2
(pair)
Γ ⊢ (E1 , E2 ) : T1 ∗ T2 Γ ⊢ P1 ∧ P2
Γ ⊢ E : T1 ∗ T2 Γ ⊢ E : T1 ∗ T2 Γ ⊢ P1 ∧ P2 Γ ⊢ P1 ∧ P2
(proj1) (proj2)
Γ ⊢ fst E : T1 Γ ⊢ snd E : T2 Γ ⊢ P1 Γ ⊢ P2
Γ ⊢ E : T1 Γ ⊢ E : T2 Γ ⊢ P1 Γ ⊢ P2
(inl) (inr)
Γ ⊢ inl E : T1 + T2 Γ ⊢ inr E : T1 + T2 Γ ⊢ P1 ∨ P2 Γ ⊢ P1 ∨ P2
Γ ⊢ E : T1 + T2 Γ, x : T1 ⊢ E1 : T Γ, y : T2 ⊢ E2 : T Γ ⊢ P1 ∨ P2 Γ, P1 ⊢ P Γ, P2 ⊢ P
(case)
Γ ⊢ case E of inl x : T1 ⇒ E1 | inr y : T2 ⇒ E2 : T Γ⊢P
168
Curry-Howard correspondence (abstract)
169
Datatypes in Haskell
Datatypes in Haskell generalise both sums and products
data Pair = P Int Double
data Either = I Int | D Double
The expression
data Expr = IntVal Int
          | BoolVal Bool
          | PairVal Int Bool
is (roughly) like saying
170
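The continuation of this slide is not in the excerpt; presumably it reads Expr as the sum of products Int + Bool + (Int ∗ Bool). A sketch of that reading (my own, using the Prelude's Either and pairs rather than the two-constructor Either defined above):

-- Expr is isomorphic to a sum of products
type ExprRep = Either Int (Either Bool (Int, Bool))

toRep :: Expr -> ExprRep
toRep (IntVal n)    = Left n
toRep (BoolVal b)   = Right (Left b)
toRep (PairVal n b) = Right (Right (n, b))

fromRep :: ExprRep -> Expr
fromRep (Left n)               = IntVal n
fromRep (Right (Left b))       = BoolVal b
fromRep (Right (Right (n, b))) = PairVal n b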
More Datatypes - Records
A generalisation of products.
Labels lab ∈ LAB for a set LAB = {p, q, ...}
171
Records – Typing
Γ ⊢ E1 : T1 ... Γ ⊢ Ek : Tk
(record)
Γ ⊢ {lab1 = E1 , . . . , labk = Ek } :{lab1 : T1 , . . . , labk : Tk }
Γ ⊢ E :{lab1 : T1 , . . . , labk : Tk }
(recordproj)
Γ ⊢ #labi E : Ti
172
Records – Semantics
Values
v ::= . . . | {lab1 = v1 , . . . , labk = vk }
SOS rules
⟨Ei , s⟩ −→ ⟨Ei′ , s′ ⟩
(record1)
⟨{lab1 = v1 , . . . , labi−1 = vi−1 , labi = Ei , . . . , labk = Ek } , s⟩
−→ ⟨{lab1 = v1 , . . . , labi−1 = vi−1 , labi = Ei′ , . . . , labk = Ek } , s′ ⟩
⟨E , s⟩ −→ ⟨E ′ , s′ ⟩
(record3)
⟨#labi E , s⟩ −→ ⟨#labi E ′ , s′ ⟩
173
Mutable Store I
1. our approach
E ::= . . . | l := E | !l | x
174
Mutable Store II
void foo(x : int) {
  l = l + x
  ...
}
175
References
T ::= . . . | T ref
E ::= · · · | l := E | !l (previously)
E ::= · · · | E1 := E2 | !E | ref E | l (now)
176
References – Typing
Γ ⊢ E :T
(ref)
Γ ⊢ ref E : T ref
Γ ⊢ E1 : T ref Γ ⊢ E2 : T
(assign)
Γ ⊢ E1 := E2 : unit
Γ ⊢ E : T ref
(deref)
Γ ⊢ !E : T
Γ(l) = T ref
(loc)
Γ ⊢ l : T ref
177
References – Semantics I
Values
A location is a value v ::= . . . | l
Stores s were finite partial functions L ⇀ Z.
We now take them to be finite partial functions from L to all values.
SOS rules
⟨E , s⟩ −→ ⟨E ′ , s′ ⟩
(ref2)
⟨ref E , s⟩ −→ ⟨ref E ′ , s′ ⟩
178
References – Semantics II
(deref1) ⟨!l , s⟩ −→ ⟨v , s⟩ if l ∈ dom(s) and s(l) = v
⟨E , s⟩ −→ ⟨E ′ , s′ ⟩
(deref2)
⟨!E , s⟩ −→ ⟨!E ′ , s′ ⟩
⟨E , s⟩ −→ ⟨E ′ , s′ ⟩
(assign2)
⟨l := E , s⟩ −→ ⟨l := E ′ , s′ ⟩
⟨E , s⟩ −→ ⟨E ′ , s′ ⟩
(assign3)
⟨E := E2 , s⟩ −→ ⟨E ′ := E2 , s′ ⟩
179
Type Checking the Store
180
Type Checking – Example
Example
⟨E , {}⟩
−→∗ ⟨E1 , {l1 7→ (fn z : int ⇒ z)}⟩
−→∗ ⟨E2 , {l1 7→ (fn z : int ⇒ if z ≥ 1 then z + ((!l1 )(z + −1)) else 0)}⟩
−→∗ ⟨6 , . . . ⟩
181
Progress and Type Preservation
Theorem (Progress)
If E closed, Γ ⊢ E : T and Γ ⊢ s then either E is a value or there exist E ′
and s′ such that ⟨E , s⟩ −→ ⟨E ′ , s′ ⟩.
182
Type Safety
183
Section 9
Exceptions
184
Motivation
Trapped errors
Cause execution to halt immediately.
Examples: jumping to an illegal address, raising a top-level exception.
Innocuous?
Untrapped errors
May go unnoticed for a while and later cause arbitrary behaviour.
Examples: accessing data past the end of an array, security loopholes in
Java abstract machines.
Insidious!
E ::= . . . | error
(err) Γ ⊢ error : T
186
Errors – Semantics
SOS rules
187
Errors
188
Choice 2: Handling Exceptions
189
Handling Exceptions – Typing and Semantics
try E1 with E2 means ‘return result of evaluating E1 , unless it aborts, in
which case the handler E2 is evaluated’
Typing
Γ ⊢ E1 : T Γ ⊢ E2 : T
(try)
Γ ⊢ try E1 with E2 : T
SOS rules
⟨E1 , s⟩ −→ ⟨E1′ , s′ ⟩
(try3)
⟨try E1 with E2 , s⟩ −→ ⟨try E1′ with E2 , s′ ⟩
190
Choice 3: Exceptions with Values
191
Exceptions with Values – Typing
Typing
Γ ⊢ E : Tex
(try ex)
Γ ⊢ raise E : T
Γ ⊢ E1 : T Γ ⊢ E2 : Tex → T
(try v)
Γ ⊢ try E1 with E2 : T
192
Exceptions with Values – Semantics
SOS rules
⟨E , s⟩ −→ ⟨E ′ , s′ ⟩
(rai)
⟨raise E , s⟩ −→ ⟨raise E ′ , s′ ⟩
(rai2) ⟨raise (raise v) , s⟩ −→ ⟨raise v , s⟩
193
The Type Tex (I)
• Tex = nat: corresponds to errno in Unix OSs;
0 indicates success; other values report various exceptional
conditions.
(similar in C++).
• Tex = string: avoids looking up error codes; more descriptive; error
handling may now require parsing a string
• Tex could be of type record
194
The Type Tex (II)
195
Section 10
Subtyping
196
Motivation (I)
197
Polymorphism
198
Motivation (II)
Γ ⊢ E1 : T → T ′ Γ ⊢ E2 : T
(app)
Γ ⊢ E1 E2 : T ′
we cannot type
even though the function gets a ‘better’ argument, with more structure
199
Subsumption
better: any term of type {p : int, q : int} can be used wherever a term of
type {p : int} is expected.
T <: T ′
200
Example
201
The Subtype Relation <:
(s-refl) T <: T
T <: T ′ T ′ <: T ′′
(s-trans)
T <: T ′′
202
Subtyping – Records
(s-rcd1) {lab1 :T1 , . . . , labk :Tk , labk+1 :Tk+1 , .., labk+n :Tk+n }
<: {lab1 :T1 , . . . , labk :Tk }
π a permutation of 1, . . . , k
(s-rcd3)
{lab1 :T1 , . . . , labk :Tk } <: {labπ(1) : Tπ(1) , . . . , labπ(k) :Tπ(k) }
203
Subtyping – Functions (I)
204
Subtyping – Functions (II)
If f : T1 → T2 then
– f can use any argument which is a subtype of T1 ;
– the result of f can be regarded as any supertype of T2
205
Subtyping – Functions (III)
we have
206
Subtyping – Top and Bottom
207
Subtyping – Products and Sums
Products
Sums
Exercise
208
Subtyping – References (I)
No
209
Subtyping – References (II)
T <: T ′ T ′ <: T
(s-ref)
T ref <: T ′ ref
Example:
{a:int, b:bool} ref <: {b:bool, a:int} ref
210
Typing – Remarks
Semantics
no change required (we did not change the grammar for expressions)
Properties
Type preservation, progress and type safety hold
Implementation
Type inference is more complicated; good run-time is also tricky due to
re-ordering
211
Down Casts
The rule (sub) permits up-casting. What about down-casting?
E ::= . . . | (T ) E
Typing rule
Γ ⊢ E :T′
Γ ⊢ (T ) E : T
212
Section 11
(Imperative) Objects
Case Study
213
Motivation
214
(Simple) Objects
215
Reminder
Scope Restriction
• x is a binder for E2
• can be seen as syntactic sugar:
216
Objects – Example
A Counter Object
let val c : {get : unit → int, inc : unit → unit} =
let val x : int ref = ref 0 in
{get = (fn y : unit ⇒ !x),
inc = (fn y : unit ⇒ x := !x + 1)}
end
in
(#inc c)() ; (#get c)()
end
217
Objects – Example
Subtyping I
let val c : {get : unit → int, inc : unit → unit, reset : unit → unit} =
let val x : int ref = ref 0 in
{get = (fn y : unit ⇒ !x),
inc = (fn y : unit ⇒ x := !x + 1),
reset = (fn y : unit ⇒ x := 0)}
end
in
(#inc c)() ; (#get c)()
end
ResCounter = {get : unit → int, inc : unit → unit, reset : unit → unit}
218
Objects – Example
Subtyping II
219
Objects – Example
Object Generators
220
Simple Classes
221
Reusing Method Code
222
(Simple) Classes
223
IMP vs. Java
class Counter
{ protected int p;
  Counter() { this.p = 0; }
  int get() { return this.p; }
  void inc() { this.p++; }
};
224
(Simple) Classes
(fn ResCounterClass : CounterRep → ResCounter ⇒
(fn x : CounterRep ⇒
let val super : Counter = CounterClass x in
{get = #get super,
inc = #inc super,
reset = (fn y : unit ⇒ (#p x) := 0)}
end))
225
IMP vs. Java
class ResetCounter
  extends Counter
{ void reset() { this.p = 0; }
};
226
(Simple) Classes
Section 12
Implementing IMP
228
Motivation
229
Implementations of IMP I
• ML
https://www.cl.cam.ac.uk/teaching/2021/Semantics/L2/
P. Sewell
• C
“any compiler”
• Java
https://www.cl.cam.ac.uk/teaching/2021/Semantics/L1/l1.java
M. Parkinson
• Haskell
(several implementations available)
230
Implementations of IMP II
• Coq
https://softwarefoundations.cis.upenn.edu/lf-current/Imp.html
B. Pierce
• Isabelle
https://isabelle.in.tum.de (src/HOL/IMP)
G. Klein and T. Nipkow
231
Section 13
IMP in Isabelle/HOL
232
Motivation/Disclaimer
233
Isabelle/HOL – Introduction
234
Isabelle/HOL – Terms (Expressions)
• Functions
▶ application: f E
call of function f with parameter E
▶ abstraction: λx. E
function with parameter x (of some type) and result E ((fn x : T? ⇒ t))
▶ Convention (as always) f E1 E2 E3 ≡ ((f E1 ) E2 ) E3
235
Isabelle/HOL – Types I
• Basic syntax (Isabelle)
τ ::= (τ )
| bool | int | string | . . . base types
| 'a | 'b | . . . type variables
| τ ⇒τ functions
| τ ×τ pairs
| τ list lists
| τ set sets
| ... user-defined types
Convention: τ1 ⇒ τ2 ⇒ τ3 ≡ τ1 ⇒ (τ2 ⇒ τ3 )
• Terms must be well-typed; in particular
t :: τ1 ⇒ τ2 u :: τ1
t u :: τ2
236
Isabelle/HOL – Types II
Type inference
• automatic
Currying
• curried vs. tupled
f τ1 ⇒ τ2 ⇒ τ3 vs f τ1 × τ2 ⇒ τ3
f a1 where a1 :: τ1
237
Isabelle (Cheatsheet I)
Syntax: theory M yT h
imports T h1 , . . . , T hn
begin
(definitions, lemmas, theorems, proofs, . . . )∗
end
238
IMP – Syntax (recap)
Booleans b ∈ B = {true, false}
Integers (Values) n ∈ Z = {. . . , −1, 0, 1, . . . }
Locations l ∈ L = {l, l0 , l1 , l2 , . . . }
Operations op ::= + | ≥
Expressions
E ::= n | b | E op E |
l := E | !l |
if E then E else E |
skip | E ; E |
while E do E
239
IMP – Syntax (aexp and bexp)
Booleans b∈B
Integers (Values) n∈Z
Locations l ∈ L = {l, l0 , l1 , l2 , . . . }
Operations aop ::= +
Expressions
aexp ::= n | !l | aexp aop aexp
bexp ::= b | bexp ∧ bexp | aexp ≥ aexp
com ::= l ::= aexp |
IF bexp THEN com ELSE com |
SKIP | com ;; com |
WHILE bexp DO com
240
IMP – Syntax (Isabelle)
Booleans bool
Integers (Values) int
Locations string
Expressions
datatype aexp ::= N int | L loc | Plus aexp aexp
datatype bexp ::= B bool | Geq aexp aexp
datatype com ::= Assign loc aexp |
If bexp com com |
SKIP | Seq com com |
While bexp com
241
IMP – Syntax (Isabelle)
LINK: /src/HOL/IMP
242
Isabelle (Cheatsheet II)
243
Small-step semantics
244
IMP Semantics
(deref) ⟨!l , s⟩ −→ ⟨n , s⟩ if l ∈ dom(s) and s(l) = n
(assign1) ⟨l := n , s⟩ −→ ⟨skip , s + {l 7→ n}⟩ if l ∈ dom(s)
⟨E , s⟩ −→ ⟨E ′ , s′ ⟩
(assign2)
⟨l := E , s⟩ −→ ⟨l := E ′ , s′ ⟩
(seq1) ⟨skip; E2 , s⟩ −→ ⟨E2 , s⟩
⟨E1 , s⟩ −→ ⟨E1′ , s′ ⟩
(seq2)
⟨E1 ; E2 , s⟩ −→ ⟨E1′ ; E2 , s′ ⟩
(if1) ⟨if true then E2 else E3 , s⟩ −→ ⟨E2 , s⟩
(if2) ⟨if false then E2 else E3 , s⟩ −→ ⟨E3 , s⟩
⟨E1 , s⟩ −→ ⟨E1′ , s′ ⟩
(if3)
⟨if E1 then E2 else E3 , s⟩ −→ ⟨if E1′ then E2 else E3 , s′ ⟩
(while) ⟨while E1 do E2 , s⟩ −→ ⟨if E1 then (E2 ; while E1 do E2 ) else skip , s⟩
245
IMP Semantics
246
IMP Semantics (Isabelle)
247
IMP – Examples
248
Isabelle (Cheatsheet III)
249
Big-step semantics
(in Isabelle/HOL)
250
Another View: Big-step Semantics
• we have seen a small-step semantics
⟨E , s⟩ −→ ⟨E ′ , s′ ⟩
⟨E , s⟩ ⇓ ⟨E ′ , s′ ⟩
For example
251
Final State
⟨E , s⟩ =⇒ s′
252
Semantics
(Skip) ⟨SKIP , s⟩ =⇒ s
(Assign) ⟨l := a , s⟩ =⇒ s + {l 7→ aval a s}
⟨E1 , s⟩ =⇒ s′ ⟨E2 , s′ ⟩ =⇒ s′′
(Seq)
⟨E1 ; E2 , s⟩ =⇒ s′′
bval b s = true ⟨E1 , s⟩ =⇒ s′
(IfT)
⟨if b then E1 else E2 , s⟩ =⇒ s′
bval b s = false ⟨E2 , s⟩ =⇒ s′
(IfF)
⟨if b then E1 else E2 , s⟩ =⇒ s′
bval b s = false
(WhileF)
⟨while b do E , s⟩ =⇒ s
bval b s = true ⟨E , s⟩ =⇒ s′ ⟨while b do E , s′ ⟩ =⇒ s′′
(WhileT)
⟨while b do E , s⟩ =⇒ s′′
253
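The big-step rules read off directly as a recursive evaluator. The following Haskell sketch is mine (not the course's Isabelle development); it uses simplified aval/bval helpers, makes dereferencing total by defaulting to 0, and simply diverges on non-terminating loops, mirroring the absence of a derivation in that case.

import qualified Data.Map as M

type Loc   = String
type Store = M.Map Loc Int

data AExp = AN Int | AL Loc | APlus AExp AExp
data BExp = BB Bool | BGeq AExp AExp
data Com  = CSkip | CAssign Loc AExp | CSeq Com Com
          | CIf BExp Com Com | CWhile BExp Com

aval :: AExp -> Store -> Int
aval (AN n)        _ = n
aval (AL l)        s = M.findWithDefault 0 l s    -- total for simplicity
aval (APlus a1 a2) s = aval a1 s + aval a2 s

bval :: BExp -> Store -> Bool
bval (BB b)       _ = b
bval (BGeq a1 a2) s = aval a1 s >= aval a2 s

exec :: Com -> Store -> Store
exec CSkip          s = s                                          -- (Skip)
exec (CAssign l a)  s = M.insert l (aval a s) s                    -- (Assign)
exec (CSeq c1 c2)   s = exec c2 (exec c1 s)                        -- (Seq)
exec (CIf b c1 c2)  s = if bval b s then exec c1 s else exec c2 s  -- (IfT)/(IfF)
exec (CWhile b c)   s
  | bval b s          = exec (CWhile b c) (exec c s)               -- (WhileT)
  | otherwise         = s                                          -- (WhileF)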
IMP Semantics (Isabelle)
• inversion rules
• induction set up
• see Nipkow/Klein for more details and explanation
254
Are big and small-step semantics equivalent?
255
Isabelle (Cheatsheet IV)
256
From Big to Small
Theorem
If cs ⇒ s′ then cs −→∗ ⟨SKIP , s′ ⟩.
Proof by rule induction (on cs ⇒ s′ ).
257
From Small to Big
Theorem
If cs −→∗ ⟨SKIP , s′ ⟩ then cs ⇒ s′ .
Proof by rule induction (on cs −→∗ ⟨SKIP , s′ ⟩).
258
Equivalence
Corollary
cs −→∗ ⟨SKIP , s′ ⟩ if and only if cs ⇒ s′ .
259
But are they really equivalent?
• What about premature termination?
• What about (non) termination?
Lemma
1. final ⟨E , s⟩ if and only if E = SKIP.
2. ∃s. cs ⇒ s if and only if ∃cs′ . cs −→∗ cs′ ∧ final cs′ .
where final cs ≡ (¬∃cs′ . cs → cs′ )
Proof.
1. induction and rule inversion
2. (∃s. cs ⇒ s) ⇔ ∃s. cs −→∗ ⟨SKIP , s⟩ (by big = small)
⇔ ∃cs′ . cs −→∗ cs′ ∧ final cs′ (by final = SKIP)
⊔
⊓
260
Typing
(almost straight-forward)
LINK: /src/HOL/Types
inductive btyping :: ''tyenv ⇒ bexp ⇒ bool'' (infix ''⊢'' 50)
where
B_ty: ''Γ ⊢ Bc v'' |
Not_ty: ''Γ ⊢ b =⇒ Γ ⊢ Not b'' |
And_ty: ''Γ ⊢ b1 =⇒ Γ ⊢ b2 =⇒ Γ ⊢ And b1 b2'' |
Less_ty: ''Γ ⊢ a1 : τ =⇒ Γ ⊢ a2 : τ =⇒ Γ ⊢ Less a1 a2''
261
Section 14
Semantic Equivalence
262
Operational Semantics (Reminder)
• describe how to evaluate programs
• a valid program is interpreted as sequences of steps
• small-step semantics
▶ individual steps of a computation
▶ more rules (compared to big-step)
▶ allows to reason about non-terminating programs, concurrency, . . .
• big-step semantics
▶ overall results of the executions
‘divide-and-conquer manner’
▶ can be seen as relations
▶ fewer rules, simpler proofs
▶ no non-terminating behaviour
• allow non-determinism
263
Motivation CakeML
• compiler construction
• program optimisation
• refinement
• ...
264
Equivalence: Intuition I
l := !l + 2 ≃? l := !l + (1 + 1) ≃? l := !l + 1 ; l := !l + 1
265
Equivalence: Intuition II
l := 0 ; 4 ≃? l := 1 ; 3 + !l
C[l := 0 ; 4] = (l := 0 ; 4) +!l
̸≃
C[l := 1 ; 3 +!l] = (l := 1 ; 3 + !l) +!l
266
Equivalence: Intuition III
267
Exercise
let val x : int ref = ref 0 in (fn y : int ⇒ (x :=!x + y) ;!x) end
≃?
let val x : int ref = ref 0 in (fn y : int ⇒ (x :=!x − y) ; (0 −!x)) end
268
Exercise II
op := . . . | =
Γ ⊢ E1 : T ref Γ ⊢ E2 : T ref
(op =)
Γ ⊢ E1 = E2 : bool
(op=1) ⟨l = l′ , s⟩ −→ ⟨b , s⟩ if b = (l = l′ )
(op=2) ...
269
Exercise II
f ≃? g
for
and
270
Exercise II (cont’d)
f ≃? g — NO
Consider C[ ] = t with
⟨t f , s⟩ −→∗ ?
⟨t g , s⟩ −→∗ ?
271
A ‘good’ notion of semantic equivalence
We might
• understand what a program is
• prove that some particular expressions to be equivalent
(e.g. efficient algorithm vs. clear specification)
• prove the soundness of general laws for equational reasoning about
programs
• prove some compiler optimisations are sound (see CakeML or
CertiCos)
• understand the differences between languages
272
What does ‘good’ mean?
(∃s, s1 , s2 , v1 , v2 .
⟨E1 , s⟩ −→∗ ⟨v1 , s1 ⟩ ∧
⟨E2 , s⟩ −→∗ ⟨v2 , s2 ⟩ ∧
v1 ̸= v2 )
⇒ E1 ̸≃ E2
273
What does ‘good’ mean?
3. ≃ must be an equivalence relation, i.e.
reflexivity E≃E
symmetry E1 ≃ E2 ⇒ E2 ≃ E1
transitivity E1 ≃ E2 ∧ E2 ≃ E3 ⇒ E1 ≃ E3
274
Semantic Equivalence for (simple) Typed IMP
Definition
E1 ≃TΓ E2 iff for all stores s with dom(Γ) ⊆ dom(s) we have
Γ ⊢ E1 : T and Γ ⊢ E2 : T ,
and either
(i) ⟨E1 , s⟩ −→ω and ⟨E2 , s⟩ −→ω , or
(ii) for some v, s′ we have ⟨E1 , s⟩ −→∗ ⟨v , s′ ⟩ and ⟨E2 , s⟩ −→∗ ⟨v , s′ ⟩.
275
Justification
Part (ii) requires same value v and same store s′ . If a program generates
different stores, we can distinguish them using contexts:
• If T = unit then C[−] = − ; !l
• If T = bool then C[−] = if − then !l else !l
• If T = int then C[−] = (l1 := − ; !l)
276
Equivalence Relation
Theorem
The relation ≃TΓ is an equivalence relation.
Proof.
trivial ⊔
⊓
277
Congruence for (simple) Typed IMP
contexts are:
C[−] ::= − op E2 | E1 op − |
if − then E2 else E3 |
if E1 then − else E3 |
if E1 then E2 else − |
l := − |
− ; E2 | E1 ; − |
while − do E2 | while E1 do −
Definition
The relation ≃TΓ has the congruence property if, for all E1 and E2 ,
whenever E1 ≃TΓ E2 we have for all C and T ′ , if Γ ⊢ C[E1 ] : T ′ and
Γ ⊢ C[E2 ] : T ′ then
C[E1 ] ≃T′Γ C[E2 ]
278
Congruence Property
Theorem (Congruence for (simple) typed IMP)
The relation ≃TΓ has the congruence property.
Proof.
By case distinction, considering all contexts C. ⊔
⊓
For each context C (and arbitrary expression E and store s) consider the
possible reduction sequence
⟨E , s⟩ −→ ⟨Ê1 , s1 ⟩ −→ . . .
Case C = (l := )
Suppose E ≃TΓ E ′ , Γ ⊢ l := E : T ′ and Γ ⊢ l := E ′ : T ′ .
By examination of the typing rule, we have T = int and T ′ = unit.
To show (l := E) ≃T′Γ (l := E ′ ) we have to show that for all stores s if
dom(Γ) ⊆ dom(s) then
• Γ ⊢ l := E : T ′ , (obvious)
• Γ ⊢ l := E ′ : T ′ ,(obvious)
• and either
(i) ⟨l := E , s⟩ −→ω and ⟨l := E ′ , s⟩ −→ω
(ii) for some v, s′ we have ⟨l := E , s⟩ −→∗ ⟨v , s′ ⟩ and
⟨l := E ′ , s⟩ −→∗ ⟨v , s′ ⟩.
280
Proof of Congruence Property
Subcase ⟨l := E , s⟩ −→ω
That is
⟨l := E , s⟩ −→ ⟨E1 , s1 ⟩ −→ ⟨E2 , s2 ⟩ −→ . . .
All these must be instances of Rule (assign2), with
⟨E , s⟩ −→ ⟨Ê1 , s1 ⟩ −→ ⟨Ê2 , s2 ⟩ −→ . . .
281
Proof of Congruence Property
All these must be instances of Rule (assign2), except the last step which
is an instance of (assign1)
282
Proof of Congruence Property
283
Congruence Proofs
284
Back to Examples
• 1 + 1 ≃intΓ 2 for any Γ
• (l := 0 ; 4) ̸≃intΓ (l := 1 ; 3 + !l) for any Γ
285
General Laws
Conjecture
E1 ; (E2 ; E3 ) ≃TΓ (E1 ; E2 ) ; E3
for any Γ, T , E1 , E2 and E3 such that Γ ⊢ E1 : unit, Γ ⊢ E2 : unit and
Γ ⊢ E3 : T .
Conjecture
((if E1 then E2 else E3 ) ; E) ≃TΓ (if E1 then E2 ; E else E3 ; E)
for any Γ, T , E, E1 , E2 and E3 such that Γ ⊢ E1 : bool, Γ ⊢ E2 : unit,
Γ ⊢ E3 : unit, and Γ ⊢ E : T .
Conjecture
(E ; (if E1 then E2 else E3 )) ̸≃TΓ (if E1 then E ; E2 else E ; E3 )
286
General Laws
287
A Philosophical Question
What is a typed expression Γ ⊢ E : T ?
1. a list of tokens (after parsing) [IF, DEREF, LOC "l", GTEQ, ...]
2. an abstract syntax tree
3. the function taking store s to the reduction sequence
⟨E , s⟩ −→ ⟨E1 , s1 ⟩ −→ ⟨E2 , s2 ⟩ −→ . . .
288
Section 15
Denotational Semantics
289
Operational Semantics (Reminder)
• describe how to evaluate programs
• a valid program is interpreted as sequences of steps
• small-step semantics
▶ individual steps of a computation
▶ more rules (compared to big-step)
▶ allows to reason about non-terminating programs, concurrency, . . .
• big-step semantics
▶ overall results of the executions
‘divide-and-conquer manner’
▶ can be seen as relations
▶ fewer rules, simpler proofs
▶ no non-terminating behaviour
• allow non-determinism
290
Operational vs Denotational
⟨E , s⟩ −→ ⟨E ′ , s′ ⟩ and ⟨E , s⟩ ⇓ ⟨v , s′ ⟩
291
Big Picture
292
IMP – Syntax (aexp and bexp)
Booleans b∈B
Integers (Values) n∈Z
Locations l ∈ L = {l, l0 , l1 , l2 , . . . }
Operations aop ::= +
Expressions
aexp ::= n |!l | aexp aop aexp
bexp ::= b | bexp ∧ bexp | aexp ≥ aexp
com ::= l := aexp |
if bexp then com else com |
skip | com ; com |
while bexp do com
293
Semantic Domains
294
Partial Functions
295
Denotational Semantics for IMP
Arithmetic Expressions
296
Denotational Semantics for IMP
Boolean Expressions
297
Denotational Semantics for IMP
Arithmetic and Boolean Expressions in Function-Style
A[[n]](s) = n
A[[!l]](s) = s(l) if l ∈ dom(s)
A[[a1 + a2 ]](s) = A[[a1 ]](s) + A[[a2 ]](s)
B[[true]](s) = true
B[[false]](s) = false
B[[b1 ∧ b2 ]](s) = B[[b1 ]](s) ∧ B[[b2 ]](s)
B[[a1 ≥ a2 ]](s) = true if A[[a1 ]](s) ≥ A[[a2 ]](s), and false otherwise
298
Denotational Semantics for IMP
Commands
C[[c1 ; c2 ]] = {(s, s′′ ) | ∃s′ . (s, s′ ) ∈ C[[c1 ]] ∧ (s′ , s′′ ) ∈ C[[c2 ]]}
C[[if b then c1 else c2 ]] = {(s, s′ ) | (s, true) ∈ B[[b]] ∧ (s, s′ ) ∈ C[[c1 ]]} ∪
{(s, s′ ) | (s, false) ∈ B[[b]] ∧ (s, s′ ) ∈ C[[c2 ]]}
299
Denotational Semantics for IMP
Commands in Function-Style
C[[skip]](s) = s
301
Recursive Equations – Example
f (x) = 0 if x = 0, and f (x) = f (x − 1) + 2x − 1 otherwise
302
Recursive Equations – Example II
g(x) = g(x) + 1
303
Recursive Equations – Example III
h(x) = 4 · h(x/2)
304
Solving Recursive Equations
Build a solution by approximation (interpret functions as sets)
f0 = ∅
f1 (x) = 0 if x = 0, f0 (x − 1) + 2x − 1 otherwise; hence f1 = {(0, 0)}
f2 (x) = 0 if x = 0, f1 (x − 1) + 2x − 1 otherwise; hence f2 = {(0, 0), (1, 1)}
f3 (x) = 0 if x = 0, f2 (x − 1) + 2x − 1 otherwise; hence f3 = {(0, 0), (1, 1), (2, 4)}
305
Solving Recursive Equations
where
(F (f ))(x) = 0 if x = 0, and (F (f ))(x) = f (x − 1) + 2x − 1 otherwise
306
Fixed Point
Definition
Given a function F : A → A, a ∈ A is a fixed point of F if F (a) = a.
Notation: Write a = fix (F ) to indicate that a is a fixed point of F .
f = fix (F )
  = f0 ∪ f1 ∪ f2 ∪ . . .
  = ∅ ∪ F (∅) ∪ F (F (∅)) ∪ . . .
  = ⋃i≥0 F i (∅)
307
Denotational Semantics for while
308
Denotational Semantics – Example
C[[while !l ≥ 0 do m :=!l + !m ; l :=!l + (−1)]]
f0 = ∅
f1 (s) = s if !l < 0, undefined otherwise
f2 (s) = s if !l < 0; s + {l 7→ −1, m 7→ s(m)} if !l = 0; undefined otherwise
f3 (s) = s if !l < 0; s + {l 7→ −1} if !l = 0;
         s + {l 7→ −1, m 7→ 1 + s(m)} if !l = 1; undefined otherwise
f4 (s) = s if !l < 0; s + {l 7→ −1} if !l = 0;
         s + {l 7→ −1, m 7→ 1 + s(m)} if !l = 1;
         s + {l 7→ −1, m 7→ 3 + s(m)} if !l = 2; undefined otherwise
309
Fixed Points
310
Fixed Point Theory
Lemma
Every suprema-preserving function F is monotone increasing.
X ⊆ Y =⇒ F (X) ⊆ F (Y )
311
Kleene’s fixed point theorem
Theorem
Let F be a suprema-preserving function. The least fixed point of F exists
and is equal to ⋃i≥0 F i (∅)
312
C[[while b do c]]
C[[while b do c]](s)
= fix (F )(s)
= C[[c]]k (s) if there is k ≥ 0 such that B[[b]](C[[c]]k (s)) = false
  and B[[b]](C[[c]]i (s)) = true for all 0 ≤ i < k
= undefined if B[[b]](C[[c]]i (s)) = true for all i ≥ 0
This may be what you would have expected, but now it is grounded on
well-known mathematics
313
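The iterates F i (∅) can also be computed. The sketch below is mine (not from the slides): partial functions are represented as Store → Maybe Store, and denB, denC stand for assumed denotations of the loop guard and body (e.g. those produced by the big-step aval/bval/exec sketch earlier).

-- approximating C[[while b do c]] = fix(F) by the iterates F^i(∅)
type Den = Store -> Maybe Store

-- F unfolds the loop once around a given approximation f of the loop's denotation
fWhile :: (Store -> Bool) -> Den -> Den -> Den
fWhile denB denC f s
  | denB s    = denC s >>= f      -- run the body, then continue with the approximation
  | otherwise = Just s            -- guard false: the loop acts as the identity

-- the i-th approximation F^i(∅); the least fixed point is the union over all i
approx :: (Store -> Bool) -> Den -> Int -> Den
approx _    _    0 _ = Nothing                                -- the empty partial function
approx denB denC i s = fWhile denB denC (approx denB denC (i - 1)) s

For a loop that terminates on store s in at most k iterations, approx denB denC (k + 1) s already agrees with the least fixed point.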
Exercises
314
Section 16
315
Styles of semantics
Operational
Meanings for program phrases defined in terms of the steps of
computation they can take during program execution.
Denotational
Meanings for program phrases defined abstractly as elements of some
suitable mathematical structure.
Axiomatic
Meanings for program phrases defined indirectly via the axioms and
rules of some logic of program properties.
316
Styles of semantics
Operational
– how to evaluate programs (interpreter)
– close connection to implementations
Denotational
Meanings for program phrases defined abstractly as elements of some
suitable mathematical structure.
Axiomatic
Meanings for program phrases defined indirectly via the axioms and
rules of some logic of program properties.
317
Styles of semantics
Operational
– how to evaluate programs (interpreter)
– close connection to implementations
Denotational
– what programs calculate (compiler)
– simplifies equational reasoning (semantic equivalence)
Axiomatic
Meanings for program phrases defined indirectly via the axioms and
rules of some logic of program properties.
318
Styles of semantics
Operational
– how to evaluate programs (interpreter)
– close connection to implementations
Denotational
– what programs calculate (compiler)
– simplifies equational reasoning (semantic equivalence)
Axiomatic
– describes properties of programs
– allows reasoning about the correctness of programs
319
Assertions
Axiomatic semantics describe properties of programs. Hence it requires
• a language for expressing properties
• proof rules to establish the validity of properties w.r.t. programs
Examples
• value of l is greater than 0
• value of l is even
• value of l is prime
• eventually the value of l will be 0
• ...
320
Applications
• proving correctness
• documentation
• test generation
• symbolic execution
• bug finding
• malware detection
• ...
321
Assertion Languages
• (English)
• first-order logic (∀, ∃, ∧, ¬, =, R(x), . . . )
• temporal and modal logic (2, ⋄, ⊚, Until, . . . )
• special-purpose logics (Alloy, Z3, . . . )
322
Assertions as Comments
assertions are (or should be) used in code regularly
/* Precondition: 0 <= i < A.length */
/* Postcondition: returns A[i] */
public int get(int i) {
  return A[i];
}
323
Partial Correctness
{P } c {Q}
Meaning: if P holds before c, and c executes and terminates then Q
holds afterwards
324
Partial Correctness – Examples
• {l = 21} l := !l + !l {l = 42}
• {l = 0 ∧ m = i}
k := 0 ;
while !l ̸= !m
do
k := !k − 2 ;
l := !l + 1
{k = −i − i}
325
Partial Correctness – Examples
326
Partial Correctness – Examples
327
Total Correctness
[P ] c [Q]
Meaning: if P holds, then c will terminate and Q holds afterwards
328
Total Correctness – Example
• [l = 0 ∧ m = i∧ i ≥ 0]
k := 0 ;
while !l ̸= !m
do
k := !k − 2 ;
l := !l + 1
[k = −i − i]
329
Assertions
330
Assertions – Syntax
Booleans b∈B
Integers (Values) n∈Z
Locations l∈L = {l, l0 , l1 , l2 , . . . }
Logical variables i ∈ LVar = {i, i0 , i1 , i2 , . . . }
Operations aop ::= +
Expressions
aexpi ::= n | l | i | aexpi aop aexpi
assn ::= b | aexpi ≥ aexpi |
assn ∧ assn | assn ∨ assn |
assn ⇒ assn | ¬assn |
∀i. assn | ∃i. assn
331
Assertions – Satisfaction
when does a store s satisfy an assertion
• need interpretation for logical variables
I : LVar → Z
AI [[n]](s, I) = n
AI [[l]](s, I) = s(l), l ∈ dom(s)
AI [[i]](s, I) = I(i), i ∈ dom(I)
AI [[a1 + a2 ]](s, I) = AI [[a1 ]](s, I) + AI [[a2 ]](s, I)
332
Assertion Satisfaction
define satisfaction relation for assertions on a given state s
s |=I true
s |=I a1 ≥ a2 if AI [[a1 ]](s, I) ≥ AI [[a2 ]](s, I)
s |=I P1 ∧ P2 if s |=I P1 and s |=I P2
s |=I P1 ∨ P2 if s |=I P1 or s |=I P2
s |=I P1 ⇒ P2 if s ̸|=I P1 or s |=I P2
s |=I ¬P if s ̸|=I P
s |=I ∀i. P if ∀n ∈ Z. s |=I+{i7→n} P
s |=I ∃i. P if ∃n ∈ Z. s |=I+{i7→n} P
333
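The satisfaction relation is easy to turn into a checker, except for the quantifiers, which range over all of Z. The Haskell sketch below is mine: it checks quantifiers over a finite candidate range only (an explicit assumption), so it is an approximation useful for testing assertions, not a decision procedure.

import qualified Data.Map as M

type Loc    = String
type LVar   = String
type Store  = M.Map Loc Int
type Interp = M.Map LVar Int

data AExpI = IN Int | IL Loc | IV LVar | IPlus AExpI AExpI
data Assn  = ATrue | AGeq AExpI AExpI | AAnd Assn Assn | AOr Assn Assn
           | AImp Assn Assn | ANot Assn | AForall LVar Assn | AExists LVar Assn

avalI :: AExpI -> Store -> Interp -> Int
avalI (IN n)        _ _  = n
avalI (IL l)        s _  = M.findWithDefault 0 l s
avalI (IV i)        _ iv = M.findWithDefault 0 i iv
avalI (IPlus a1 a2) s iv = avalI a1 s iv + avalI a2 s iv

-- quantification over Z is not decidable; this sketch checks a finite range only
range :: [Int]
range = [-100 .. 100]

sat :: Store -> Interp -> Assn -> Bool
sat _ _  ATrue         = True
sat s iv (AGeq a1 a2)  = avalI a1 s iv >= avalI a2 s iv
sat s iv (AAnd p q)    = sat s iv p && sat s iv q
sat s iv (AOr p q)     = sat s iv p || sat s iv q
sat s iv (AImp p q)    = not (sat s iv p) || sat s iv q
sat s iv (ANot p)      = not (sat s iv p)
sat s iv (AForall i p) = all (\n -> sat s (M.insert i n iv) p) range
sat s iv (AExists i p) = any (\n -> sat s (M.insert i n iv) p) range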
Partial Correctness – Satisfiability
334
Partial Correctness – Validity
Assertion validity
An assertion P is valid (holds) (|= P ) if it is valid in any store under
interpretation.
|= P :⇐⇒ ∀s, I. s |=I P
335
Proving Specifications
336
Section 17
Axiomatic Semantics
337
Floyd-Hoare Logic
Judgement
⊢ {P } c {Q}
338
Floyd-Hoare Logic – Skip
(skip) ⊢ {P } skip {P }
339
Floyd-Hoare Logic – Assignment
(assign) ⊢ {P [a/l]} l := a {P }
Example
{7 = 7} l := 7 {l = 7}
340
Floyd-Hoare Logic – Incorrect Assignment
(wrong1) ⊢ {P } l := a {P [a/l]}
Example
{l = 0} l := 7 {7 = 0}
(wrong2) ⊢ {P } l := a {P [l/a]}
Example
{l = 0} l := 7 {l = 0}
341
Floyd-Hoare Logic – Sequence, If, While
⊢ {P ∧ b} c {P }
(while)
⊢ {P } while b do c {P ∧ ¬b}
342
Floyd-Hoare Logic – Consequence
343
Floyd-Hoare Logic – Consequence
|= P ⇒ P ′ ⊢ {P ′ } c {Q′ } |= Q′ ⇒ Q
(cons)
⊢ {P } c {Q}
344
Floyd-Hoare Logic – Summary
(skip) ⊢ {P } skip {P }
(assign) ⊢ {P [a/l]} l := a {P }
345
Floyd-Hoare Logic – Exercise
{l0 = n ∧ n > 0}
l1 := 1 ;
while !l0 > 0 do
l1 := !l1 · !l0 ;
l0 := !l0 − 1
{l1 = n!}
346
Soundness and Completeness
Soundness:
if a partial correctness statement can be derived (⊢) then it is valid (|=)
Completeness:
if the statement is valid (|=) then a derivation exists (⊢)
347
Soundness and Completeness
Theorem (Soundness)
If ⊢ {P } c {Q} then |= {P } c {Q}.
Proof.
Induction on the derivation of ⊢ {P } c {Q}. ⊔
⊓
348
Soundness and Completeness
Conjecture (Completeness)
If |= {P } c {Q} then ⊢ {P } c {Q}.
Can we derive |= P ⇒ P ′ ?
No, according to Gödel’s incompleteness theorem (1931)
349
Soundness and Completeness
350
Decorated Programs
351
(Informal) Rules for Decoration
skip
pre and post-condition should be the same
{P } (skip) ⊢ {P } skip {P }
skip
{P }
352
(Informal) Rules for Decoration
assignment
use the substitution from the rule
sequencing
{P } c1 {R} and {R} c2 {Q} should be (recursively) locally consistent
⊢ {P } c1 {R} ⊢ {R} c2 {Q}
{P } (seq)
⊢ {P } c1 ; c2 {Q}
c1 ;
{R}
c2
{Q}
353
(Informal) Rules for Decoration
if then
both branches are locally consistent; add condition to both
⊢ {P ∧ b} c1 {Q} ⊢ {P ∧ ¬b} c2 {Q}
{P } (if)
⊢ {P } if b then c1 else c2 {Q}
if b then
{P ∧ b}
c1
{Q}
else
{P ∧ ¬b}
c2
{Q}
{Q}
354
(Informal) Rules for Decoration
while
add/create loop invariant
⊢ {P ∧ b} c {P }
{P } (while)
⊢ {P } while b do c {P ∧ ¬b}
while b do
{P ∧ b}
c
{P }
{P ∧ ¬b}
355
(Informal) Rules for Decoration
consequence
always write a (valid) implication
|= P ⇒ P ′ ⊢ {P ′ } c {Q′ } |= Q′ ⇒ Q
{P } ⇒ (cons)
⊢ {P } c {Q}
{P ′ }
356
Floyd-Hoare Logic – Exercise
{l0 = n ∧ n > 0}
l1 := 1 ;
while !l0 > 0 do
l1 := !l1 · l0 ;
l0 := !l0 − 1
{l1 = n!}
357
Floyd-Hoare Logic – Exercise
{l0 = n ∧ n > 0} ⇒
{1 = 1 ∧ l0 = n ∧ n > 0}
l1 := 1 ;
{l1 = 1 ∧ l0 = n ∧ n > 0} ⇒
{l1 · l0 ! = n! ∧ l0 ≥ 0}
while !l0 > 0 do
{l1 · l0 ! = n! ∧ l0 > 0 ∧ l0 ≥ 0} ⇒
{l1 · l0 · (l0 − 1)! = n! ∧ (l0 − 1) ≥ 0}
l1 := !l1 · l0 ;
{l1 · (l0 − 1)! = n! ∧ (l0 − 1) ≥ 0}
l0 := !l0 − 1
{l1 · l0 ! = n! ∧ l0 ≥ 0}
{l1 · l0 ! = n! ∧ (l0 ≥ 0) ∧ ¬(l0 > 0)} ⇒
{l1 = n!}
358
Section 18
Weakest Preconditions
359
Generating Preconditions
{ ? } c {Q}
360
Weakest Liberal Preconditions
361
Weakest Preconditions
wlp(skip, Q) = Q
wlp(l := a, Q) = Q[a/l]
wlp((c1 ; c2 ), Q) = wlp(c1 , wlp(c2 , Q))
wlp(if b then c1 else c2 , Q) = (b =⇒ wlp(c1 , Q)) ∧
(¬b =⇒ wlp(c2 , Q))
wlp(while b do c, Q) = (b =⇒ wlp(c, wlp(while b do c, Q))) ∧
(¬b =⇒ Q)
= ⋀i Fi (Q)
where
F0 (Q) = true
Fi+1 (Q) = (¬b =⇒ Q) ∧ (b =⇒ wlp(c, Fi (Q)))
(Greatest fixed point)
362
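For the loop-free fragment, the wlp equations above can be executed directly if assertions are represented semantically, as predicates on stores. The sketch below is mine, reusing Com, aval and bval from the big-step sketch; the substitution Q[a/l] becomes function composition with the updated store, and the while case is left to the greatest-fixed-point construction above.

-- semantic weakest liberal preconditions for the loop-free fragment
type Pred = Store -> Bool

wlp :: Com -> Pred -> Pred
wlp CSkip         q = q                                               -- wlp(skip, Q) = Q
wlp (CAssign l a) q = \s -> q (M.insert l (aval a s) s)               -- Q[a/l], done semantically
wlp (CSeq c1 c2)  q = wlp c1 (wlp c2 q)
wlp (CIf b c1 c2) q = \s -> if bval b s then wlp c1 q s else wlp c2 q s
wlp (CWhile _ _)  _ = error "while needs the greatest fixed point sketched above"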
Properties of Weakest Preconditions
363
Soundness and Completeness
364
Total Correctness
Definition (Weakest Precondition)
P is a weakest precondition of c and Q (wp(c, Q)) if
all rules are the same, except the one for while. This requires a fresh
ghost variable that guarantees termination
365
Strongest Postcondition
{P } c { ? }
• wlp motivates backwards reasoning
• this seems unintuitive and unnatural
• however, often it is known what a program is supposed to do
• sometimes forward reasoning is useful, e.g. reverse engineering
366
Strongest Postcondition
sp(skip, P ) = P
sp(l := a, P ) = ∃v. (l = a[v/l] ∧ P [v/l])
sp((c1 ; c2 ), P ) = sp(c2 , sp(c1 , P ))
sp(if b then c1 else c2 , P ) = (sp(c1 , b ∧ P )) ∨ (sp(c2 , ¬b ∧ P ))
sp(while b do c, P ) = sp(while b do c, sp(c, P ∧ b)) ∨ (¬b ∧ P )
where
F0 (P ) = false
Fi+1 (P ) = (¬b ∧ P ) ∨ (sp(c, Fi (P ∧ b)))
367
Section 19
Concurrency
368
Concurrency and Distribution
369
Problems
aim: languages that can be used to model computations that execute in
parallel and on distributed architectures
problems
• state-space explosion
with n threads, each of which can be in 2 states, the system has 2n states
• state-spaces become complex
• computation becomes nondeterministic
• competing for access to resources may deadlock or suffer starvation
• partial failure (of some processes, of some machines in a network, of some
persistent storage devices)
• communication between different environments
• partial version change
• communication between administrative regions with partial trust (or, indeed,
no trust)
• protection against malicious attack
• ...
370
Problems
371
Process Calculi
372
IMP – Parallel Commands
we extend our while-language that is based on aexp, bexp and com
Syntax
com ::= . . . | com ∥ com
Semantics
⟨c0 , s⟩ −→ ⟨c′0 , s′ ⟩
(par1)
⟨c0 ∥ c1 , s⟩ −→ ⟨c′0 ∥ c1 , s′ ⟩
⟨c1 , s⟩ −→ ⟨c′1 , s′ ⟩
(par2 )
⟨c0 ∥ c1 , s⟩ −→ ⟨c0 ∥ c′1 , s′ ⟩
373
IMP – Parallel Commands
Typing
Γ ⊢ c : unit
(thread)
Γ ⊢ c : proc
Γ ⊢ c0 : proc Γ ⊢ c1 : proc
(par )
Γ ⊢ c0 ∥ c1 : proc
374
Parallel Composition: Design Choices
375
Asynchronous Execution
• semantics allow interleavings
⟨l := 1 ∥ l := 2 , {l 7→ 0}⟩ −→ ⟨skip ∥ l := 2 , {l 7→ 1}⟩ −→ ⟨skip ∥ skip , {l 7→ 2}⟩
⟨l := 1 ∥ l := 2 , {l 7→ 0}⟩ −→ ⟨l := 1 ∥ skip , {l 7→ 2}⟩ −→ ⟨skip ∥ skip , {l 7→ 1}⟩
376
Asynchronous Execution
• interleavings in ⟨(l := 1+!l) ∥ (l := 7+!l) , {l 7→ 0}⟩
[Figure: the full interleaving diagram of ⟨(l := 1+!l) ∥ (l := 7+!l) , {l 7→ 0}⟩; depending on how the reads and writes interleave, the final store is {l 7→ 1}, {l 7→ 7} or {l 7→ 8}]
377
Morals
• combinatorial explosion
• drawing state-space diagrams only works for really tiny examples
• almost certainly the programmer does not want all those 3
outcomes to be possible
• complicated/impossible to analyse without formal methods
378
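To make the combinatorial explosion concrete, here is a small Haskell sketch of my own: it enumerates all interleavings of two sequences of atomic steps, of which there are binomial(m+n, n).

interleavings :: [a] -> [a] -> [[a]]
interleavings []     ys     = [ys]
interleavings xs     []     = [xs]
interleavings (x:xs) (y:ys) =
  [ x : zs | zs <- interleavings xs (y : ys) ] ++   -- first thread moves
  [ y : zs | zs <- interleavings (x : xs) ys ]      -- second thread moves

-- e.g. length (interleavings [1..10] [11..20]) == 184756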
Parallel Commands – Nondeterminism
Semantics
⟨c0 , s⟩ −→ ⟨c′0 , s′ ⟩
(par1)
⟨c0 ∥ c1 , s⟩ −→ ⟨c′0 ∥ c1 , s′ ⟩
⟨c1 , s⟩ −→ ⟨c′1 , s′ ⟩
(par2 )
⟨c0 ∥ c1 , s⟩ −→ ⟨c0 ∥ c′1 , s′ ⟩
(+maybe rules for termination)
• study of nondeterminism
• ∥ is not a partial function from state to state; big-step semantics
needs adaptation
• can we achieve parallelism by nondeterministic interleavings
• communication via shared variable
379
Study of Parallelism (or Concurrency)
includes
Study of Nondeterminism
380
Dijkstra’s Guarded Command Language (GCL)
381
GCL – Syntax
• arithmetic expressions: aexp (as before)
• Boolean expressions: bexp (as before)
• Commands:
• Guarded Commands:
382
GCL – Semantics
• assume we have semantic rules for bexp and aexp (standard)
we skip the deref-operator from now on
• assume a new configuration fail
Guarded Commands
⟨b , s⟩ −→ ⟨true , s⟩ ⟨b , s⟩ −→ ⟨false , s⟩
(pos) (neg)
⟨b → c , s⟩ −→ ⟨c , s⟩ ⟨b → c , s⟩ −→ fail
⟨gc0 , s⟩ −→ ⟨c , s′ ⟩ ⟨gc1 , s⟩ −→ ⟨c , s′ ⟩
(par1) (par2)
⟨gc0 [] gc1 , s⟩ −→ ⟨c , s′ ⟩ ⟨gc0 [] gc1 , s⟩ −→ ⟨c , s′ ⟩
383
GCL – Semantics
Commands
⟨gc , s⟩ −→ ⟨c , s′ ⟩
(cond)
⟨if gc fi , s⟩ −→ ⟨c , s′ ⟩
⟨gc , s⟩ −→ fail
(loop1) †
⟨do gc od , s⟩ −→ ⟨⟨s⟩⟩
⟨gc , s⟩ −→ ⟨c , s′ ⟩
(loop2)
⟨do gc od , s⟩ −→ ⟨c ; do gc od , s′ ⟩
† new notation: ⟨⟨s⟩⟩ behaves like skip
384
Processes
do b1 → c1 [] · · · [] bn → cn od
385
GCL – Examples
• compute the maximum of x and y
if
x ≥ y → max := x
[]
y ≥ x → max := y
fi
• Euclid’s algorithm
do
x > y → x := x − y
[]
y > x → y := y − x
od
386
GCL and Floyd-Hoare logic
{x = m ∧ y = n ∧ m > 0 ∧ n > 0}
Euclid
{x = y = gcd(m, n)}
387
Proving Euclid’s Algorithm Correct
ℓ|m, n ⇒ ℓ| gcd(m, n)
388
Synchronised Communication
• communication by “handshake”
• possible exchange of value
(localised to process-process (CSP) or to a channel (CCS))
• abstracts from the protocol underlying coordination
• invented by Hoare (CSP) and Milner (CCS)
389
Extending GCL
390
Extending GCL – Semantics
transitions may carry labels when there is a possibility of interaction
⟨α?x , s⟩ −α?n→ ⟨⟨s + {x 7→ n}⟩⟩
if ⟨a , s⟩ −→ ⟨n , s⟩ then ⟨α!a , s⟩ −α!n→ ⟨⟨s⟩⟩
if ⟨c0 , s⟩ −λ→ ⟨c0′ , s′ ⟩ then ⟨c0 ∥ c1 , s⟩ −λ→ ⟨c0′ ∥ c1 , s′ ⟩ (+ symmetric)
if ⟨c0 , s⟩ −α?n→ ⟨c0′ , s′ ⟩ and ⟨c1 , s⟩ −α!n→ ⟨c1′ , s⟩ then ⟨c0 ∥ c1 , s⟩ −→ ⟨c0′ ∥ c1′ , s′ ⟩ (+ symmetric)
if ⟨c , s⟩ −λ→ ⟨c′ , s′ ⟩ and λ ̸∈ {α?n, α!n} then ⟨c\α , s⟩ −λ→ ⟨c′ \α , s′ ⟩
• forwarder:
do α?x → β!x od
• buffer of capacity 2:
do α?x → β!x od
∥ do β?x → γ!x od \β
392
External vs Internal Choice
the following two processes are not equivalent w.r.t. deadlock capabilities
393
Section 20
394
Towards an Abstract Mechanism for Concurrency
395
Actions and Communications
396
(Decorated) CCS – Syntax
Expressions:
arithmetic a and Boolean b
Processes:
p ::= nil nil process
| (τ → p) silent/internal action
| (α!a → p) output
| (α?x → p) input
| (b → p) Boolean guard
| p+p nondeterministic choice
| p∥p parallel composition
| p\L restriction (L a set of channel identifiers)
| p[f ] relabelling (f a function on channel identifiers)
| P (a1 , . . . , ak ) process identifier
397
(Decorated) CCS – Syntax
Process Definitions:
def
P (x1 , . . . , xk ) = p
398
Restriction and Relabelling – Examples
399
Operational semantics of CCS
Guarded processes
silent action: (τ → p) −τ→ p
output: if a −→ n then (α!a → p) −α!n→ p
input: (α?x → p) −α?n→ p[n/x]
Boolean: if b −→ true and p −λ→ p′ then (b → p) −λ→ p′
400
Operational semantics of CCS
Sum
if p0 −λ→ p0′ then p0 + p1 −λ→ p0′ ; if p1 −λ→ p1′ then p0 + p1 −λ→ p1′
Parallel composition
if p0 −λ→ p0′ then p0 ∥ p1 −λ→ p0′ ∥ p1 ; if p1 −λ→ p1′ then p0 ∥ p1 −λ→ p0 ∥ p1′
if p0 −α?n→ p0′ and p1 −α!n→ p1′ then p0 ∥ p1 −τ→ p0′ ∥ p1′
(and symmetrically with the roles of α?n and α!n swapped)
401
Operational semantics of CCS
Restriction
if p −λ→ p′ and (λ ∈ {α?n, α!n} implies α ̸∈ L) then p\L −λ→ p′ \L
Relabelling
if p −λ→ p′ then p[f ] −f (λ)→ p′ [f ]
where f is extended to labels as f (τ ) = τ , f (α?n) = f (α)?n and f (α!n) = f (α)!n
Identifiers
if p[a1 /x1 , . . . , an /xn ] −λ→ p′ and P (x1 , . . . , xn ) =def p then P (a1 , . . . , an ) −λ→ p′
Nil process
no rules
402
A Derivation
(((α!3 → nil + P ) ∥ τ → nil) ∥ α?x → nil)\{α} −τ→ ((nil ∥ τ → nil) ∥ nil)\{α}
403
More Examples
[Figure: transition diagram of an example process, ending with a β!3-transition to nil]
404
Linking Process
(some syntactic sugar)
Let
P =def in?x → out!x → P
Q =def in?y → out!y → Q
P ⌢ Q = (P [c/out] ∥ Q[c/in])\{c}
405
Euclid’s algorithm in CSS
def
E(x, y) = x = y → gcd!x → nil
+ x < y → E(x, y − x)
+ y < x → E(x − y, x)
def
Euclid = in?x → in?y → E(x, y)
406
Section 21
Pure CCS
407
Towards a more basic language
aim: removal of variables to reveal the symmetry of input and output
• transitions for value-passing carry labels τ , a?n, a!n
• the input prefix branches over all possible values:

    α?x → p −α?0→ p[0/x]    . . .    α?x → p −α?n→ p[n/x]    . . .
408
Pure CCS
• Actions: a, b, c, . . .
• Complementary actions: ā, b̄, c̄,. . .
• Internal action: τ
• Notational convention: complementation is an involution; the complement of ā is a
• Processes:
    p ::= λ.p            prefix (λ ranges over τ , a, ā for any action)
        | Σi∈I pi        sum (I is an index set)
        | p0 ∥ p1        parallel composition
        | p\L            restriction (L a set of actions)
        | p[f ]          relabelling (f a relabelling function on actions)
        | P              process identifier
• Process definitions:
    P =def p
409
Pure CCS – Semantics
Guarded processes (prefixing)
    λ.p −λ→ p

Sum
    pj −λ→ p′
    ─────────────────    j ∈ I
    Σi∈I pi −λ→ p′

Parallel composition
    p0 −λ→ p′0
    ──────────────────────
    p0 ∥ p1 −λ→ p′0 ∥ p1

    p1 −λ→ p′1
    ──────────────────────
    p0 ∥ p1 −λ→ p0 ∥ p′1

    p0 −a→ p′0    p1 −ā→ p′1
    ──────────────────────────
    p0 ∥ p1 −τ→ p′0 ∥ p′1
410
Pure CCS – Semantics
Restriction
    p −λ→ p′
    ─────────────────    λ ̸∈ L ∪ L̄, where L̄ = {ā | a ∈ L}
    p\L −λ→ p′\L

Relabelling
    p −λ→ p′
    ──────────────────────
    p[f ] −f (λ)→ p′ [f ]

    where f is a function such that f (τ ) = τ and f (ā) is the complement of f (a)

Identifiers
    p −λ→ p′
    ──────────────    P =def p
    P −λ→ p′
411
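The rules for pure CCS can be read as a function computing the finitely many outgoing transitions of a term. Below is a minimal sketch for the finite fragment only (binary sums, no process identifiers); the constructor names and representation are ad hoc, not the course's notation.

-- Minimal sketch of pure CCS one-step transitions (finite fragment only).
data Act  = Tau | In String | Out String deriving (Eq, Show)     -- τ, a, ā
data Proc = Nil
          | Act  :. Proc                                         -- prefix λ.p
          | Proc :+ Proc                                         -- sum
          | Proc :| Proc                                         -- parallel composition
          | Proc :\ [String]                                     -- restriction \L
          deriving Show

co :: Act -> Act                                                 -- complementation
co (In a)  = Out a
co (Out a) = In a
co Tau     = Tau

chan :: Act -> Maybe String
chan (In a)  = Just a
chan (Out a) = Just a
chan Tau     = Nothing

step :: Proc -> [(Act, Proc)]                                    -- all outgoing transitions
step Nil       = []
step (l :. p)  = [(l, p)]
step (p :+ q)  = step p ++ step q
step (p :| q)  =  [ (l,   p' :| q ) | (l, p') <- step p ]
               ++ [ (l,   p  :| q') | (l, q') <- step q ]
               ++ [ (Tau, p' :| q') | (l, p') <- step p, (m, q') <- step q
                                    , l /= Tau, m == co l ]      -- handshake a / ā
step (p :\ ls) = [ (l, p' :\ ls) | (l, p') <- step p
                                 , maybe True (`notElem` ls) (chan l) ]

-- ghci> step (((Out "a" :. Nil) :| (In "a" :. Nil)) :\ ["a"])
--       -- exactly one transition: a τ-step (the handshake) to (Nil :| Nil)\{a}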
From Value-passing to Pure CCS
translation from a value-passing CCS closed term p to a pure CCS term p̂

    p                          p̂
    nil                        nil
    (τ → p)                    τ.p̂
    (α!a → p)                  ᾱm.p̂                        where a evaluates to m
    (α?x → p)                  Σm∈int αm.(p[m/x])^
    (b → p)                    p̂     if b evaluates to true
                               nil   if b evaluates to false
    p0 + p1                    p̂0 + p̂1
    p0 ∥ p1                    p̂0 ∥ p̂1
    p\L                        p̂\{αm | α ∈ L ∧ m ∈ int}
    P (a1 , . . . , ak )       Pm1 ,...,mk                  where ai evaluates to mi

For every definition P (x1 , . . . , xk ) we have a collection of definitions Pm1 ,...,mk
indexed by m1 , . . . , mk ∈ int
412
Correspondence
Theorem
    p −λ→ p′   iff   p̂ −λ̂→ p̂′
413
Section 22
Semantic Equivalences
414
Labelled Transition Systems
415
Trace equivalence
416
Four Kinds of Trace Equivalence
417
A Lattice of Semantic Equivalence Relations
A relation ∼ ⊆ S × S on processes is an equivalence relation if it is
• reflexive: p ∼ p,
• symmetric: if p ∼ q then q ∼ p,
• and transitive: if p ∼ q and q ∼ r then p ∼ r.
Let [p]∼ be the equivalence class of p: the set of all processes that are
∼-equivalent to p.
[p]∼ := {q ∈ S | q ∼ p}.
An equivalence ∼ is finer than an equivalence ≈ when p ∼ q ⇒ p ≈ q.
419
Safety and Liveness Properties
A safety property says that something bad will never happen.
A liveness property says that something good will happen eventually.
420
Compositionality
If p ∼ q then C[p] ∼ C[q].
Here C[ ] is a context, made from operators of some language.
421
Congruence closure
422
Bisimulation equivalence
423
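Strong bisimilarity on a finite labelled transition system can be computed as a greatest fixpoint: start from the full relation and repeatedly discard pairs that violate the transfer condition. The sketch below is not part of the slides; the LTS representation and all names are ad hoc, and it is a naive check rather than the partition-refinement algorithm used in practice.

-- Naive strong bisimilarity check on a finite LTS (greatest fixpoint).
import Data.List (nub)

type State = Int
type Label = String
type LTS   = [(State, Label, State)]

succs :: LTS -> State -> Label -> [State]
succs lts s a = [ t | (s', a', t) <- lts, s' == s, a' == a ]

labels :: LTS -> [Label]
labels lts = nub [ a | (_, a, _) <- lts ]

-- keep (p,q) only if every move of p is matched by q and vice versa
refine :: LTS -> [(State, State)] -> [(State, State)]
refine lts rel = [ (p, q) | (p, q) <- rel
                          , and [ match p q a && match q p a | a <- labels lts ] ]
  where
    match p q a = all (\p' -> any (\q' -> (p', q') `elem` rel) (succs lts q a))
                      (succs lts p a)

bisimilar :: LTS -> [State] -> State -> State -> Bool
bisimilar lts states p q = (p, q) `elem` fix [ (x, y) | x <- states, y <- states ]
  where
    fix rel = let rel' = refine lts rel
              in  if rel' == rel then rel else fix rel'

-- a.(b + c) versus a.b + a.c: trace equivalent, but not bisimilar
-- ghci> let lts = [(0,"a",1),(1,"b",2),(1,"c",3),(4,"a",5),(4,"a",6),(5,"b",7),(6,"c",8)]
-- ghci> bisimilar lts [0..8] 0 4
-- False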
Weak bisimulation equivalence
424
Semantic Equivalences – Summary
425
Section 23
426
Motivation
• Owicki-Gries Logic/Method
▶ a.k.a. interference freedom
▶ Susan Owicki and PhD supervisor David Gries
▶ add a construct to the programming language for threads
▶ study the impact for Hoare triples
427
Floyd-Hoare Logic and Decorated Programs
Floyd-Hoare logic
• each of the individual processes has an assertion
▶ before its first statement (precondition)
▶ between every pair of its statements (pre-/postcondition), and
▶ after its last statement (postcondition)
• Hoare-triples can be checked (local correctness)
• Floyd-Hoare logic is compositional
428
Motivation
add pre- and postcondition for system, and a rule
429
Simple Example
{x == 0}
{x == 0 ∨ x == 2} {x == 0 ∨ x == 1}
x := x + 1 ∥ x := x + 2
{x == 1 ∨ x == 3} {x == 2 ∨ x == 3}
{x == 3}
430
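Since both assignments are taken to be atomic here, the conclusion {x == 3} can also be confirmed by brute force: enumerate the two possible interleavings and inspect every final state. A minimal sketch (ad hoc helper names; one list element per atomic step):

-- Brute-force check: all interleavings of two sequences of atomic updates.
interleavings :: [a] -> [a] -> [[a]]
interleavings [] ys = [ys]
interleavings xs [] = [xs]
interleavings (x:xs) (y:ys) =
  map (x:) (interleavings xs (y:ys)) ++ map (y:) (interleavings (x:xs) ys)

finalStates :: s -> [s -> s] -> [s -> s] -> [s]
finalStates s0 left right = [ foldl (flip ($)) s0 run | run <- interleavings left right ]

-- {x == 0}  x := x + 1 ∥ x := x + 2  {x == 3}
-- ghci> finalStates 0 [(+ 1)] [(+ 2)]
-- [3,3]
-- ghci> all (== 3) (finalStates 0 [(+ 1)] [(+ 2)])
-- True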
The Rule of Owicki Gries
431
Interference Freedom
it is a bit tricky
• interference freedom is a property of proofs, not Hoare triples
• identifying which parts of a proof need to be considered requires
some effort
432
Formalising Interference Freedom
433
Formalising Interference Freedom
(par)
    {P1 } c1 {Q1 }   · · ·   {Pn } cn {Qn }    interference freedom
    ────────────────────────────────────────────────────────────────
    {P1 ∧ · · · ∧ Pn } c1 ∥ · · · ∥ cn {Q1 ∧ · · · ∧ Qn }
434
Interference Freedom – Remark
435
Simple Example
{x == 0} {x == 0}
x := x + 1 ∥ x := x + 2
{x == 1} {x == 1}
{x == 1}
436
Soundness
Theorem
If {P } c {Q} is derivable using the proof rules seen so far, then {P } c {Q} is valid
437
Completeness
438
Incompleteness
Lemma
The following valid Hoare triple cannot be derived using the rules so far.
{true} x := x + 2 ∥ x := 0 {x == 0 ∨ x == 2}
Proof.
By contradiction. Suppose there were such a proof. Then there would be Q, R such that
{true} x := x + 2 {Q}
{true} x := 0 {R}
Q ∧ R =⇒ x == 0 ∨ x == 2
By (assign) {P [a/l]} l := a {P } , true =⇒ Q[x + 2/x] holds. Similarly, R[0/x] holds.
By (par), {R ∧ true} x := x + 2 {R} holds, meaning R ⇒ R[x + 2/x] is valid.
But then by induction, ∀x. (x ≥ 0 ∧ even(x)) =⇒ R is true. Since
Q ∧ R =⇒ x = 0 ∨ x = 2, it follows that
∀x. (x ≥ 0 ∧ even(x)) =⇒ (x == 0 ∨ x == 2) ,
which is a contradiction. □
439
Fixing the Problem
We showed
• R must hold for all even, positive x
• R must hold after execution of x := 0
• R must also hold both before and after execution of x := x + 2
440
Auxiliary Variables
variables that are put into a program just to reason about progress in
other processes
done := 0 ;
(
x, done := x + 2, 1
∥
x := 0
)
441
Decorated Programs with Auxiliary Variables
{true}
done := 0 ;
{done == 0}
(
{done == 0}
x, done := x + 2, 1
{true}
∥
{true}
x := 0
{(x == 0 ∨ x == 2) ∧ (done == 0 ⇒ x == 0)}
)
{x == 0 ∨ x == 2}
443
Problem
444
Peterson's Algorithm for Mutual Exclusion
the following four lines of (symmetric) code took 15 years to discover
(mid 1960s to early 1980s)
{¬a ∧ ¬b}
other code of A other code of B
a := true b := true
t := A t := B
await (¬b ∨ t == B) await (¬a ∨ t == A)
critical section A critical section B
a := false b := false
445
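For this finite-state formulation, mutual exclusion can be confirmed by exhaustively exploring every interleaving. The sketch below treats each line of the slide as one atomic step, models await as a blocking atomic test, and uses program counter 3 for "in the critical section"; the whole model and every name in it are ad hoc.

-- Exhaustive exploration of the Peterson model from the slide.
data Turn = A | B deriving (Eq, Show)
type St = (Int, Int, Bool, Bool, Turn)            -- (pcA, pcB, flag a, flag b, t)

stepA, stepB :: St -> [St]
stepA (0, pb, _, b, t) = [(1, pb, True, b, t)]                  -- a := true
stepA (1, pb, a, b, _) = [(2, pb, a, b, A)]                     -- t := A
stepA (2, pb, a, b, t) = [(3, pb, a, b, t) | not b || t == B]   -- await (¬b ∨ t == B)
stepA (3, pb, _, b, t) = [(4, pb, False, b, t)]                 -- leave critical section, a := false
stepA _                = []
stepB (pa, 0, a, _, t) = [(pa, 1, a, True, t)]                  -- b := true
stepB (pa, 1, a, b, _) = [(pa, 2, a, b, B)]                     -- t := B
stepB (pa, 2, a, b, t) = [(pa, 3, a, b, t) | not a || t == A]   -- await (¬a ∨ t == A)
stepB (pa, 3, a, _, t) = [(pa, 4, a, False, t)]                 -- leave critical section, b := false
stepB _                = []

reachable :: [St]
reachable = go inits inits
  where
    inits = [ (0, 0, False, False, t0) | t0 <- [A, B] ]         -- {¬a ∧ ¬b}, t arbitrary
    go []       seen = seen
    go (s:todo) seen = let new = [ s' | s' <- stepA s ++ stepB s, s' `notElem` seen ]
                       in  go (todo ++ new) (seen ++ new)

mutualExclusion :: Bool
mutualExclusion = and [ not (pa == 3 && pb == 3) | (pa, pb, _, _, _) <- reachable ]

-- ghci> mutualExclusion
-- True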
Notes on Peterson’s Algorithm
446
Yet Another Example
FindFirstPositive
i := 0 ; j := 1 ; x := |A| ; y := |A| ;
while i < min(x, y) do while j < min(x, y) do
if A[i] > 0 then if A[j] > 0 then
x := i ∥ y := j
else else
i := i + 2 j := j + 2
r := min(x, y)
447
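Read as a specification, the two threads jointly compute the least index of a positive entry of A (or |A| if there is none), with the even indices handled on the left and the odd ones on the right. A minimal sequential reference of that result (0-based indexing as on the slide; the name is ad hoc):

-- Sequential reference: least index r with A[r] > 0, or |A| if none exists.
findFirstPositive :: [Int] -> Int
findFirstPositive as = length (takeWhile (<= 0) as)

-- ghci> findFirstPositive [-3, 0, -1, 7, 2]      -- A[3] is the first positive entry
-- 3
-- ghci> findFirstPositive [-1, -2]               -- no positive entry: result |A|
-- 2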
i := 0 ; j := 1 ; x := |A| ; y := |A| ;
{P1 ∧ P2 }
{P1 } {P2 }
while i < min(x, y) do while j < min(x, y) do
{P1 ∧ i < x ∧ i < |A|} {P2 ∧ j < y ∧ j < |A|}
if A[i] > 0 then if A[j] > 0 then
{P1 ∧ i < x ∧ i < |A| ∧ A[i] > 0} {P2 ∧ j < y ∧ j < |A| ∧ A[j] > 0}
x := i y := j
{P1 } ∥ {P2 }
else else
{P1 ∧ i < x ∧ i < |A| ∧ A[i] ≤ 0} {P2 ∧ j < y ∧ j < |A| ∧ A[j] ≤ 0}
i := i + 2 j := j + 2
{P1 } {P2 }
{P1 } {P2 }
{P1 ∧ i ≥ min(x, y)} {P2 ∧ j ≥ min(x, y)}
{P1 ∧ P2 ∧ i ≥ min(x, y) ∧ j ≥ min(x, y)}
r := min(x, y)
{r ≤ |A| ∧ (∀k. 0 ≤ k < r ⇒ A[k] ≤ 0) ∧ (r < |A| ⇒ A[r] > 0)}
P1 = x ≤ |A| ∧ (∀k. 0 ≤ k < i ∧ k even ⇒ A[k] ≤ 0) ∧ i even ∧ (x < |A| ⇒ A[x] > 0)
P2 = y ≤ |A| ∧ (∀k. 0 ≤ k < j ∧ k odd ⇒ A[k] ≤ 0) ∧ j odd ∧ (y < |A| ⇒ A[y] > 0)
448
Section 24
Rely-Guarantee
449
Motivation
• Owicki-Gries is not compositional
• generalise it to make it compositional
{P } c ∥ E {Q}
(diagram: a run of c ∥ E from a state satisfying P to a state satisfying Q, with steps of c interleaved with steps of the environment E)
450
Motivation
P −∗→ ∘ =c⇒ ∘ −∗→ ∘ =c⇒ ∘ −∗→ Q

−∗→ : any state transition that can be done by any other thread, repeated zero or more times
451
Rely-Guarantee
{P, R} c {G, Q}
If
• the initial state satisfies P ,
• every state change by another thread satisfies the rely condition R, and
• c is executed and terminates,
then
• every final state satisfies Q, and
• every state change in c satisfies the guarantee condition G.
452
Rely-Guarantee – Parallel Rule
453
Rely-Guarantee – Consequence Rule
    R ⇒ R′    {P, R′ } c {G′ , Q}    G′ ⇒ G
    ─────────────────────────────────────────
    {P, R} c {G, Q}
454
From Floyd-Hoare to Rely-Guarantee
    {P } c {Q}
    ─────────────────    ???
    {P, R} c {G, Q}

    P −R→ P −R→ P =c⇒ Q −R→ Q −R→ Q −R→ Q
455
Back to Stores
456
Making Assertions Stable
Assume
R = (x ↦ n ⇝ x ↦ n − 1)
  = {(s, s′ ) | ∃n. s(x) = n ∧ s′ = s + {x ↦ n − 1}}
G = (x ↦ n ⇝ x ↦ n + 1)
  = {(s, s′ ) | ∃n. s(x) = n ∧ s′ = s + {x ↦ n + 1}}
{x == 2, R} x := x + 1 {G, x == 3}
457
Making Assertions Stable
Assume
R = (x ↦ n ⇝ x ↦ n − 1)
  = {(s, s′ ) | ∃n. s(x) = n ∧ s′ = s + {x ↦ n − 1}}
G = (x ↦ n ⇝ x ↦ n + 1)
  = {(s, s′ ) | ∃n. s(x) = n ∧ s′ = s + {x ↦ n + 1}}
{x ≤ 2, R} x := x + 1 {G, x ≤ 3}
458
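Whether an assertion is stable under a rely can be checked pointwise: every rely step from a state satisfying the assertion must again satisfy it. A minimal sketch over a single variable x and a bounded range of values, using the slide's rely "another thread may decrement x" (ad hoc names; a real proof would of course not enumerate states):

-- Pointwise stability check over a bounded range of values for x.
type Pred = Int -> Bool
type Rel  = Int -> Int -> Bool

rely :: Rel                                   -- x ↦ n ⇝ x ↦ n − 1 (another thread decrements x)
rely n n' = n' == n - 1

stableUnder :: [Int] -> Pred -> Rel -> Bool
stableUnder dom p r = and [ p n' | n <- dom, p n, n' <- dom, r n n' ]

-- ghci> stableUnder [-10 .. 10] (== 2) rely    -- {x == 2} is not stable under R
-- False
-- ghci> stableUnder [-10 .. 10] (<= 2) rely    -- {x ≤ 2} is stable under R
-- True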
FindFirstPositive
i := 0 ; j := 1 ; x := |A| ; y := |A| ;
{P1 ∧ P2 }
{P1 , G2 } {P2 , G1 }
while i < min(x, y) do while j < min(x, y) do
{P1 ∧ i < x ∧ i < |A|}
...
∥ {P2 ∧ j < y ∧ j < |A|}
...
{P1 } {P2 }
{G1 , P1 ∧ i ≥ min(x, y)} {G2 , P2 ∧ j ≥ min(x, y)}
P1 = x ≤ |A| ∧ (∀k. 0 ≤ k < i ∧ k even ⇒ A[k] ≤ 0) ∧ i even ∧ (x < |A| ⇒ A[x] > 0)
P2 = y ≤ |A| ∧ (∀k. 0 ≤ k < j ∧ k odd ⇒ A[k] ≤ 0) ∧ j odd ∧ (y < |A| ⇒ A[y] > 0)
G1 = {(s, s′ )|s′ (y) = s(y) ∧ s′ (j) = s(j) ∧ s′ (x) ≤ s(x)}
G2 = {(s, s′ )|s′ (x) = s(x) ∧ s′ (i) = s(i) ∧ s′ (y) ≤ s(y)}
459
Rely-Guarantee Abstraction
Forgets
• which thread performs the action
• in what order the actions are performed
• how many times the action is performed
Usually, this is fine. . .
460
Verify This
{x == 0}
{x == 0 ∨ x == 1} {x == 0 ∨ x == 1}
x := x + 1 ∥ x := x + 1
{x == 1 ∨ x == 2} {x == 1 ∨ x == 2}
{x == 2}
G1 , G2 = (x ↦ n ⇝ x ↦ n + 1)
461
Verify This
{x == 0}
{∃n ≥ 0. x ↦ n, G2 } {∃n ≥ 0. x ↦ n, G1 }
x := x + 1 ∥ x := x + 1
{G1 , ∃n ≥ 1. x ↦ n} {G2 , ∃n ≥ 1. x ↦ n}
{∃n ≥ 1. x ↦ n}
G1 , G2 = (x ↦ n ⇝ x ↦ n + 1)
462
From Floyd-Hoare to Rely-Guarantee (recap)
    {P } c {Q}
    ─────────────────    ???
    {P, R} c {G, Q}

P stable under R if and only if {P } R∗ {P }

    P −R→ P −R→ P =c⇒ Q −R→ Q −R→ Q −R→ Q
463
Section 25
Conclusion
464
Learning Outcome I
▶ IMP language
▶ operational semantics
▶ denotational semantics
▶ axiomatic semantics
▶ functions
(call-by-name, call-by-value)
▶ references
▶ extensions
(data structures, error handling, object-orientation,. . . )
465
Learning Outcome II
466
Learning Outcome III
▶ small-step vs big-step
▶ operational vs denotational vs axiomatic (vs algebraic)
467
Learning Outcome IV
468
Learning Outcome V
▶ types
▶ subtypes
▶ progress and preservation properties
▶ Curry-Howard correspondence
469
Learning Outcome VI
▶ Isabelle/HOL
▶ semantic equivalences
▶ decorated programs
▶ Floyd-Hoare logic, wlp
▶ Owicki-Gries, Rely-Guarantee
470
Learning Outcome VII
471
Learning Outcome VIII
8. Additional Outcomes
▶ structural induction
▶ substitution
▶ ...
472
We covered A LOT
473
The Message I
474
The Message II
475
Trend: Verified Software
• increasingly important
• “rough consensus and running code” (trial and error)
is not sufficient
• develop operational models of real-world languages/applications
476
Are We Done
• more ‘standard’ features
▶ dependent types
▶ continuations
▶ lazy evaluation
▶ side effects
• more applications
▶ optimisations
▶ code generation
477
More Features – Dependent Types
478
More Features – Dependent Types
479
More Features – Dependent Types
480
More Features – Dependent Types
Example: typing lists with lengths
• using and checking dependent types
481
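Haskell does not have full dependent types, but the flavour of this example, lists whose length is tracked in their type, can be approximated with GADTs and promoted naturals. A minimal sketch (requires the listed GHC extensions; all names are ad hoc, and languages with genuine dependent types go considerably further):

{-# LANGUAGE DataKinds, GADTs, KindSignatures, TypeFamilies #-}
-- Lists whose length is part of their type.
data Nat = Z | S Nat

data Vec (n :: Nat) a where
  VNil  :: Vec 'Z a
  VCons :: a -> Vec n a -> Vec ('S n) a

vhead :: Vec ('S n) a -> a          -- only defined on non-empty vectors:
vhead (VCons x _) = x               -- the type rules out the error case

type family Add (m :: Nat) (n :: Nat) :: Nat where
  Add 'Z     n = n
  Add ('S m) n = 'S (Add m n)

vappend :: Vec m a -> Vec n a -> Vec (Add m n) a   -- length m + n, checked by the compiler
vappend VNil         ys = ys
vappend (VCons x xs) ys = VCons x (vappend xs ys)

-- vhead VNil             -- rejected at compile time
-- vhead (VCons 1 VNil)   -- well typed, evaluates to 1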
More Features – Hardware Model
Fundamental Question
482
More Features – Hardware Model
483
More Features – Hardware Model
(diagram: all interleavings of ⟨(l := 1 + !l) ∥ (l := 7 + !l) , {l ↦ 0}⟩; depending on how the reads (r) and writes (w) of the two assignments interleave, the final configuration is ⟨skip ∥ skip , {l ↦ 1}⟩, ⟨skip ∥ skip , {l ↦ 7}⟩, or ⟨skip ∥ skip , {l ↦ 8}⟩)
484
More Features – Hardware Model
485
More Features – Hardware Model
New problem?
486
More Features – Hardware Model
But still a research question
• more fundamentally:
▶ it has been (and in significant ways still is) unclear how we can specify
that precisely
▶ if we can do that, we can build on top:
explanation, testing, emulation, static/dynamic analysis,
model-checking, proof-based verification,. . .
487
More Features – Broadcast
Motivation:
model communication
• network protocols
• communication protocols
• ...
488
Broadcast in CCS

    α.P −α→ P

    P −α→ P ′
    ──────────────
    P + Q −α→ P ′

    Q −α→ Q′
    ──────────────
    P + Q −α→ Q′

    P −η→ P ′
    ─────────────────
    P |Q −η→ P ′ |Q

    P −c→ P ′    Q −c̄→ Q′
    ──────────────────────
    P |Q −τ→ P ′ |Q′

    Q −η→ Q′
    ─────────────────
    P |Q −η→ P |Q′

    P −ℓ→ P ′
    ──────────────────────
    P [f ] −f (ℓ)→ P ′ [f ]

    P −ℓ→ P ′
    ─────────────────    (c ̸= ℓ ̸= c̄)
    P \c −ℓ→ P ′ \c

    P −ℓ→ P ′
    ─────────────    (A =def P )
    A −ℓ→ P ′
489
Broadcast in CCS
490
Case Study: AODV
Ad Hoc On-Demand Distance Vector Protocol
• routing protocol for wireless mesh networks
(wireless networks without wired backbone)
491
Case Study: AODV
Main Mechanism
• if route is needed
BROADCAST RREQ
• if node has information about a destination
UNICAST RREP
• if unicast fails or link break is detected
GROUPCAST RERR
492
Case Study: AODV
Formal Specification Language (Process Algebra)
493
Case Study: AODV
Specification
494
Case Study: AODV
Full specification of AODV (IETF Standard)
Specification details
• around 5 types and 30 functions
• around 120 lines of specification
(in contrast to 40 pages English prose)
Properties of AODV
route correctness ✓
loop freedom ✓ (for some interpretations)
route discovery ✗
packet delivery ✗
495
Final Oral Exam
GOOD LUCK
496
Feedback
497
The ‘Final’ Slide
• Q/A sessions
▶ Thursday, November 2 (11am-12pm),
Marie Reay room 5.02
▶ topics: all questions you prepare
▶ no questions, no session
• I hope you. . .
▶ had some fun (I had),
even despite the challenging times
▶ learnt something useful
498
COMP3610/6361 done – what’s next?
499
Logic Summer School
December 04 – December 15, 2021
Lectures include
• Fundamentals of Metalogic
(John Slaney, ANU)
• Defining and Reasoning About Programming Languages
(Fabian Muehlboeck, ANU)
• Propositions and Types, Proofs and Programs
(Ranald Clouston, ANU)
• Gödel’s Theorem Without Tears
(Dominik Kirst, Ben-Gurion University)
• Foundations for Type-Driven Probabilistic Modelling
(Ohad Kammar, U Edinburgh)
• ...
Registration is A$150
http://comp.anu.edu.au/lss
500
— THE END —
501
Section 27
Add-On
Program Algebras:
Floyd-Hoare Logic meets Regular Expressions
502
Motivation
503
Beyond Floyd-Hoare Logic
504
Trace Model – Intuition
a program can be interpreted as a set of program runs/traces
A ⊆ Σ × (Act × Σ)∗
505
Guarded Commands – Intuition
506
Properties
507
Regular expressions
508
Kleene Algebra (KA)
is the algebra of regular expressions
(traces/guarded commands without ‘states’)
Examples
• ab + ba
{ab, ba}
• (ab)∗ a = a(ba)∗
{a, aba, ababa, . . . }
509
Regular Sets – Intuition
510
Axioms of Kleene Algebra
A Kleene algebra is a structure (K, +, ·, 0, 1,∗ ) such that
• K is an idempotent semiring under +, ·, 0, 1
    (a + b) + c = a + (b + c)        (a · b) · c = a · (b · c)
    a + b = b + a                     a · 1 = 1 · a = a
    a + a = a                         a · 0 = 0 · a = 0
    a + 0 = a
    a · (b + c) = a · b + a · c
    (a + b) · c = a · c + b · c
• a∗ b = least x such that b + ax ≤ x
• ba∗ = least x such that b + xa ≤ x
x≤y ⇔x+y =y
multiplication symbol is omitted
511
Characterising Iteration
• complete semiring/quantales (suprema exist)
a∗ = Σn≥0 aⁿ
supremum with respect to ≤
• Horn axiomatisation
  ▶ a∗ b = least x such that b + ax ≤ x:
        1 + aa∗ ≤ a∗
        b + ax ≤ x ⇒ a∗ b ≤ x
  ▶ ba∗ = least x such that b + xa ≤ x:
        1 + a∗ a ≤ a∗
        b + xa ≤ x ⇒ ba∗ ≤ x
512
Models & Properties
513
Kleene Algebra with Tests (KAT)
A Kleene algebra with tests is a structure (K, B, +, ·,∗ , ¬, 0, 1), such that
• (K, +, ·,∗ , 0, 1) is a Kleene algebra
• (B, +, ·, ¬, 0, 1) is a Boolean algebra
• B⊆K
• a, b, c, . . . range over K
• p, q, r, . . . range over B
514
Kleene Algebra with Tests (KAT)
pq = p ∧ q        p + q = p ∨ q
515
Models
• Trace models
    K: sets of traces s0 c1 s1 c2 . . . sn−1 cn sn
    B: sets of traces of length 0
• Language-theoretic models
    K: sets of guarded strings α0 c1 α1 c2 . . . αn−1 cn αn
B: atoms of a finite free Boolean algebra
516
Modelling Programs
[Fischer & Ladner 79]
• a ; b = ab
• if p then a else c = pa + ¬pc
• while p do c = (pc)∗ ¬p
517
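These encodings can be written down directly as a small term algebra: a data type of KAT expressions together with the three derived program constructs. A syntax-only sketch (ad hoc constructor names; no decision procedure):

-- KAT terms plus the derived program constructs from this slide.
data KAT = Zero | One
         | Act  String                 -- basic action a, b, c, ...
         | Test String Bool            -- test p (True) or its negation ¬p (False)
         | KAT :+: KAT                 -- choice       +
         | KAT :.: KAT                 -- composition  ·
         | Star KAT                    -- iteration    *
         deriving Show

test, neg :: String -> KAT
test p = Test p True
neg  p = Test p False

seq' :: KAT -> KAT -> KAT              -- a ; b  =  ab
seq' = (:.:)

ite :: String -> KAT -> KAT -> KAT     -- if p then a else c  =  pa + ¬pc
ite p a c = (test p :.: a) :+: (neg p :.: c)

while :: String -> KAT -> KAT          -- while p do c  =  (pc)∗¬p
while p c = Star (test p :.: c) :.: neg p

-- ghci> while "x>0" (Act "x:=x-1")    -- builds the term (p·c)*·¬p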
Floyd-Hoare Logic vs KAT
Theorem
KAT subsumes propositional Floyd-Hoare logic (PHL)
(Floyd-Hoare logic without assignment rule)
518
Floyd-Hoare logic
    {p ∧ r} a {p}
    ─────────────────────────────
    {p} while r do a {¬r ∧ p}

    pra¬p = 0   =⇒   p(ra)∗ ¬r ¬(¬rp) = 0
519
Crucial Theorems
Theorem
These are all theorems of KAT
(proof is an exercise)
520
Advantages of Kleene Algebra
• unifying approach
• equational reasoning + Horn clauses
some decidability & automation
• but, missing out assignment rule of Floyd-Hoare logic
521
Other Applications of KA(T)
There are more applications
• automata and formal languages
▶ regular expressions
• relational algebra
• program logic and verification
▶ dynamic Logic
▶ program analysis
▶ optimisation
• design and analysis of algorithms
▶ shortest paths
▶ connectivity
• others
▶ hybrid systems
▶ ...
522
Rely-Guarantee Reasoning
Hoare triple
{p} c {q} ⇔ pc¬q = 0
• R ∥ (S · T ) = (R ∥ S) · (R ∥ T )
• R ∥ (S ∥ T ) = (R ∥ S) ∥ (R ∥ T )
523
Rely-Guarantee Reasoning
Hoare triple
{p} c {q} ⇔ pc¬q = 0
• R ∥ (S · T ) = (R ∥ S) · (R ∥ T )
• R ∥ (S ∥ T ) = (R ∥ S) ∥ (R ∥ T )
524