Lec 2 3 Uninformed Search P 1
MIU
2: Problem Solving
Dr. Diaa Salama
Goal
• The theory and technology of building agents that can plan ahead to solve problems.
• These problems are characterized by having many states.
• Example: a navigation problem, where there are many states and you need to make the right choice now and in all the following states, stringing together a sequence of actions that is guaranteed to reach the goal.
Not covered
• Navigation in fog is a different problem, where the environment is partially observable and the possible paths are unknown.
Problem
• Find a route from Arad to Bucharest
Problem definition
1. Initial state
2. Actions(state) -> {action1, action2, action3, …}
3. Result(state, action) -> new state
4. GoalTest(state) -> True / False
5. PathCost(…) -> cost
In this problem, the cost of a path is the sum of the costs of its individual steps, so we also need a step-cost function:
StepCost(state, action, new state) -> cost   # e.g. a number of kilometres or of minutes
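The five components can be sketched as a small Python class. The road fragment below uses city names and distances from the lecture's Romania example; the class and method names are illustrative, not from the slides.

```python
class RouteProblem:
    """Route finding: states are cities, actions are moves to neighbouring cities."""

    def __init__(self, roads, initial, goal):
        self.roads = roads        # {city: {neighbour: distance in km}}
        self.initial = initial    # 1. initial state
        self.goal = goal

    def actions(self, state):
        # 2. Actions(state) -> the moves available in this state
        return list(self.roads[state])

    def result(self, state, action):
        # 3. Result(state, action) -> new state (here the action names the next city)
        return action

    def goal_test(self, state):
        # 4. GoalTest(state) -> True / False
        return state == self.goal

    def step_cost(self, state, action, new_state):
        # StepCost(state, action, new state) -> cost of one step, in km
        return self.roads[state][new_state]


# A fragment of the Romania road map (distances in km).
roads = {
    "Arad": {"Zerind": 75, "Sibiu": 140, "Timisoara": 118},
    "Zerind": {"Arad": 75, "Oradea": 71},
    "Oradea": {"Zerind": 71, "Sibiu": 151},
    "Timisoara": {"Arad": 118},
    "Sibiu": {"Arad": 140, "Oradea": 151, "Fagaras": 99, "Rimnicu Vilcea": 80},
    "Fagaras": {"Sibiu": 99, "Bucharest": 211},
    "Rimnicu Vilcea": {"Sibiu": 80, "Pitesti": 97},
    "Pitesti": {"Rimnicu Vilcea": 97, "Bucharest": 101},
    "Bucharest": {"Fagaras": 211, "Pitesti": 101},
}
problem = RouteProblem(roads, "Arad", "Bucharest")
```

PathCost(…) then follows by summing step_cost over the steps of a path.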
Tree Search
1. TreeSearch(problem)
2.
3.    frontier = {initial}
4.
5.    loop:
6.       if frontier is empty: return Fail
7.
8.       path = frontier.remove()            // choose frontier
9.       s = path.end
10.
11.      if s is a goal: return path         // test goal
12.
13.      for a in actions(s):                // expand the path
14.         path2 = path + result(s, a)      // new path ending at the new state
15.         frontier.add(path2)
Breadth First Search
• Shortest-path-first search: always expand the shallowest frontier path (fewest steps) first.
[Figure: an example tree (root A; children B, C; leaves D, E, F, G) and the Romania road map annotated with BFS expansion depths.]
Enhanced Solution
To avoid repeated paths, we replace tree search with graph search:
1. GraphSearch(problem)
2.
3.    frontier = {initial}; explored = {}
4.
5.    loop:
6.       if frontier is empty: return Fail
7.
8.       path = frontier.remove()            // choose frontier
9.       s = path.end; explored.add(s)
10.
11.      if s is a goal: return path         // test goal
12.
13.      for a in actions(s):                // expand the path
14.         if result(s, a) ∉ explored and no frontier path ends in result(s, a):
15.            path2 = path + result(s, a)   // add the new state to the end of the path
16.            frontier.add(path2)
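A minimal Python sketch of the graph-search loop above, with a FIFO frontier so that it behaves as breadth-first search. Paths are plain lists of cities, and the explored set is the addition over tree search; the road fragment mirrors the lecture's Romania map, and all names are illustrative.

```python
from collections import deque

def graph_search(roads, initial, goal):
    """GraphSearch with a FIFO frontier (= breadth-first search)."""
    frontier = deque([[initial]])            # frontier = {initial}: a queue of paths
    explored = set()
    while frontier:                          # if frontier is empty -> Fail
        path = frontier.popleft()            # choose frontier (FIFO: shallowest first)
        s = path[-1]                         # s = path.end
        explored.add(s)
        if s == goal:                        # test goal after choosing
            return path
        for nxt in roads[s]:                 # expand the path
            # skip states already explored or already at the end of a frontier path
            if nxt not in explored and all(p[-1] != nxt for p in frontier):
                frontier.append(path + [nxt])
    return None                              # Fail

roads = {
    "Arad": {"Zerind": 75, "Sibiu": 140, "Timisoara": 118},
    "Zerind": {"Arad": 75, "Oradea": 71},
    "Oradea": {"Zerind": 71, "Sibiu": 151},
    "Timisoara": {"Arad": 118},
    "Sibiu": {"Arad": 140, "Oradea": 151, "Fagaras": 99, "Rimnicu Vilcea": 80},
    "Fagaras": {"Sibiu": 99, "Bucharest": 211},
    "Rimnicu Vilcea": {"Sibiu": 80, "Pitesti": 97},
    "Pitesti": {"Rimnicu Vilcea": 97, "Bucharest": 101},
    "Bucharest": {"Fagaras": 211, "Pitesti": 101},
}
```

On this fragment the FIFO frontier finds the fewest-step route Arad → Sibiu → Fagaras → Bucharest, not the cheapest one.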
Breadth First Search cont.
• Shortest-path-first search
[Figure: the Romania road map annotated with BFS expansion depths.]
BFS and Brute-force
1. GraphSearch(problem)
2.
3.    frontier = {initial}; explored = {}
4.
5.    loop:
6.       if frontier is empty: return Fail
7.
8.       path = frontier.remove()            // choose frontier
9.       s = path.end; explored.add(s)
10.
11.      if s is a goal: return path         // test goal
12.
13.      for a in actions(s):                // expand the path
14.         if result(s, a) ∉ explored and no frontier path ends in result(s, a):
15.            path2 = path + result(s, a)   // add the new state to the end of the path
16.            frontier.add(path2)
• Why do we test the goal (line 11) after choosing a path from the frontier, rather than right after expanding?
• Does breadth-first search require this order?
• It depends on what “shortest” means. If it means the path length (number of steps), we can change this order and terminate as soon as the goal is generated during expansion. However, if it means the distance (path cost), this order becomes important.
• So the answer here is no: BFS does not need this order, and we can test the goal directly after expanding.
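That BFS variant can be sketched as follows; testing the goal at generation time is safe for BFS because the first generated path to any state already has the fewest steps. Names are illustrative.

```python
from collections import deque

def bfs_early_goal(roads, initial, goal):
    """Breadth-first search that tests the goal when a state is generated."""
    if initial == goal:
        return [initial]
    frontier = deque([[initial]])
    explored = {initial}
    while frontier:
        path = frontier.popleft()
        for nxt in roads[path[-1]]:          # expand the path
            if nxt not in explored:
                if nxt == goal:              # early goal test: valid for BFS
                    return path + [nxt]
                explored.add(nxt)
                frontier.append(path + [nxt])
    return None

roads = {
    "Arad": {"Zerind": 75, "Sibiu": 140, "Timisoara": 118},
    "Zerind": {"Arad": 75, "Oradea": 71},
    "Oradea": {"Zerind": 71, "Sibiu": 151},
    "Timisoara": {"Arad": 118},
    "Sibiu": {"Arad": 140, "Oradea": 151, "Fagaras": 99, "Rimnicu Vilcea": 80},
    "Fagaras": {"Sibiu": 99, "Bucharest": 211},
    "Rimnicu Vilcea": {"Sibiu": 80, "Pitesti": 97},
    "Pitesti": {"Rimnicu Vilcea": 97, "Bucharest": 101},
    "Bucharest": {"Fagaras": 211, "Pitesti": 101},
}
```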
Uniform-Cost Search
• Cheapest-first search: always expand the frontier path with the lowest total cost.
[Figure: the Romania road map annotated with cumulative path costs from Arad, e.g. Zerind 75, Timisoara 118, Sibiu 140, Rimnicu Vilcea 220, Fagaras 239, Pitesti 317, Bucharest 418.]
UCS and Brute-force
1. GraphSearch(problem)
2.
3.    frontier = {initial}; explored = {}
4.
5.    loop:
6.       if frontier is empty: return Fail
7.
8.       path = frontier.remove()            // choose frontier
9.       s = path.end; explored.add(s)
10.
11.      if s is a goal: return path         // test goal
12.
13.      for a in actions(s):                // expand the path
14.         if result(s, a) ∉ explored and no frontier path ends in result(s, a):
15.            path2 = path + result(s, a)   // add the new state to the end of the path
16.            frontier.add(path2)
• Again: why do we test the goal (line 11) after choosing a path from the frontier, rather than after expanding?
• Does uniform-cost search require this order?
• Yes, this order is important here: when the goal is first generated during expansion, it may have been reached via a more expensive route; only when the goal path is chosen from the frontier, i.e. it is the cheapest path remaining, is it guaranteed to be optimal.
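Uniform-cost search only changes which path `remove()` picks: a priority queue ordered by path cost replaces the FIFO queue. A Python sketch with illustrative names, keeping the late goal test exactly as discussed:

```python
import heapq

def uniform_cost_search(roads, initial, goal):
    """Cheapest-first search: the frontier is ordered by total path cost."""
    frontier = [(0, [initial])]              # heap of (cost so far, path)
    explored = set()
    while frontier:
        cost, path = heapq.heappop(frontier) # choose the cheapest frontier path
        s = path[-1]
        if s == goal:                        # goal test AFTER choosing: required,
            return cost, path                # otherwise a dearer route could win
        if s in explored:
            continue                         # a cheaper path to s was already expanded
        explored.add(s)
        for nxt, d in roads[s].items():      # expand the path
            if nxt not in explored:
                heapq.heappush(frontier, (cost + d, path + [nxt]))
    return None

roads = {
    "Arad": {"Zerind": 75, "Sibiu": 140, "Timisoara": 118},
    "Zerind": {"Arad": 75, "Oradea": 71},
    "Oradea": {"Zerind": 71, "Sibiu": 151},
    "Timisoara": {"Arad": 118},
    "Sibiu": {"Arad": 140, "Oradea": 151, "Fagaras": 99, "Rimnicu Vilcea": 80},
    "Fagaras": {"Sibiu": 99, "Bucharest": 211},
    "Rimnicu Vilcea": {"Sibiu": 80, "Pitesti": 97},
    "Pitesti": {"Rimnicu Vilcea": 97, "Bucharest": 101},
    "Bucharest": {"Fagaras": 211, "Pitesti": 101},
}
```

On this fragment it returns the 418 km route Arad → Sibiu → Rimnicu Vilcea → Pitesti → Bucharest rather than the 450 km route through Fagaras, even though the latter has fewer steps.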
Depth First Search
• Longest-path-first search: always expand the most recently added (deepest) frontier path.
[Figure: the Romania road map annotated with DFS expansion order.]
Optimality
• Which algorithm is optimal with respect to the
frontiers Selection Criteria ?
• BFS is optimal when optimality means the fewest steps, uniform-cost search is optimal with respect to total path cost, and depth-first search is not optimal.
[Figure: a small weighted tree (edge costs 5, 2; 3, 2, 4, 2) illustrating the difference.]
Memory
• Depth-first search is not optimal, so why do we still use it?
▫ Because it saves memory on large trees: the breadth-first (or uniform-cost) frontier can grow exponentially with depth, while depth-first search only needs to store the current path, which is O(n) nodes at depth n.
[Figure: three search trees of depth n, comparing frontier sizes level by level.]
Completeness
• Does depth-first search guarantee finding a solution? (Completeness)
▫ No, because that depends on the problem space: if the problem has an infinite space, depth-first search can follow a wrong path forever and never find the goal.
[Figure: depth-first search descending into an infinite space.]
Referring back to Uniform-Cost Search
• In uniform-cost search, the search contour with respect to cost (distance) looks as follows:
[Figure: contours expanding evenly around the start S until they reach the goal G.]
The Idea
• If we have some additional information about the distance between S and G, we can direct the search toward the goal.
[Figure: the search biased from S toward G along a path S → x → G, where g is the known cost from S to x and h is the estimated distance from x to G.]
• Benefits:
▫ Minimizing g keeps the path cheap.
▫ Minimizing h keeps the search focused on finding the goal.
A* search
• Best estimated total path cost first
[Figure: the Romania road map annotated with the heuristic h at each city.]
A* search
• Best estimated total path cost first
• f = g + h
[Figure: the Romania road map annotated with f = g + h at each expanded city, e.g. Zerind 75 + 374 = 449, Sibiu 140 + 253 = 393, Rimnicu Vilcea 220 + 193 = 413, Timisoara 118 + 329 = 447; Bucharest is reached with f = 418 + 0 = 418 via Pitesti, versus 450 + 0 = 450 via Fagaras.]
A* analysis
• Does A* always find the lowest-cost path?
▫ Yes
▫ No, it depends on the problem
▫ No, it depends on h ✓
• It depends on h: A* is guaranteed to find the lowest-cost path only if h is admissible (optimistic), i.e. h never overestimates the true remaining distance to the goal.
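A* is uniform-cost search with the priority changed from g to f = g + h. A sketch using straight-line-distance estimates to Bucharest; the h values below are the standard ones for this example and may differ slightly from the slide's figure, and all names are illustrative.

```python
import heapq

# Straight-line distance to Bucharest (admissible: it never overestimates).
H = {"Arad": 366, "Zerind": 374, "Oradea": 380, "Timisoara": 329, "Sibiu": 253,
     "Fagaras": 176, "Rimnicu Vilcea": 193, "Pitesti": 100, "Bucharest": 0}

def a_star(roads, initial, goal, h):
    """Best estimated total cost first: expand the lowest f = g + h."""
    frontier = [(h[initial], 0, [initial])]  # heap of (f, g, path)
    explored = set()
    while frontier:
        f, g, path = heapq.heappop(frontier) # choose the frontier path with lowest f
        s = path[-1]
        if s == goal:
            return g, path
        if s in explored:
            continue
        explored.add(s)
        for nxt, d in roads[s].items():      # expand the path
            if nxt not in explored:
                heapq.heappush(frontier, (g + d + h[nxt], g + d, path + [nxt]))
    return None

roads = {
    "Arad": {"Zerind": 75, "Sibiu": 140, "Timisoara": 118},
    "Zerind": {"Arad": 75, "Oradea": 71},
    "Oradea": {"Zerind": 71, "Sibiu": 151},
    "Timisoara": {"Arad": 118},
    "Sibiu": {"Arad": 140, "Oradea": 151, "Fagaras": 99, "Rimnicu Vilcea": 80},
    "Fagaras": {"Sibiu": 99, "Bucharest": 211},
    "Rimnicu Vilcea": {"Sibiu": 80, "Pitesti": 97},
    "Pitesti": {"Rimnicu Vilcea": 97, "Bucharest": 101},
    "Bucharest": {"Fagaras": 211, "Pitesti": 101},
}
```

Because this h is admissible, the search still finds the optimal 418 km route via Pitesti, but it expands far fewer paths than uniform-cost search does.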
Vacuum cleaner state space
• Goal formulation: intuitively, we want all the dirt cleaned up. Formally,
the goal is { state 7, state 8 }.
• Problem formulation: we already know what the set of all possible states
is. The actions/operators are "move left", "move right", and "vacuum".
Why do we need search techniques?
Sliding blocks puzzle
h1 = number of misplaced blocks
h2 = sum of the Manhattan distances of the blocks from their goal positions
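Both heuristics are easy to compute if a board is represented as a flat tuple with 0 for the blank; the helper names below are illustrative.

```python
def h1(state, goal):
    """Number of misplaced blocks (the blank is not counted)."""
    return sum(1 for s, g in zip(state, goal) if s != 0 and s != g)

def h2(state, goal, width=3):
    """Sum of Manhattan distances of each block from its goal square."""
    total = 0
    for i, tile in enumerate(state):
        if tile == 0:
            continue
        j = goal.index(tile)                 # where this tile should be
        total += abs(i // width - j // width) + abs(i % width - j % width)
    return total

goal = (1, 2, 3, 4, 5, 6, 7, 8, 0)
start = (7, 2, 4, 5, 0, 6, 8, 3, 1)
```

Note that h2 dominates h1 (h2 ≥ h1 on every board), since a misplaced block is always at least one move from its goal square; both are admissible.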
Search works when the environment is:
• Fully observable
• Discrete
• Deterministic
• Static
Implementation
A path S → x → G is implemented as a chain of nodes, each holding a state, the action that reached it, the total path cost so far, and a pointer to its parent:

State:   S     X     G
Action:  null  S→X   X→G
Cost:    0     170   230
Parent:  null  S     X
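The table suggests a linked implementation: each node carries a state, the action and cumulative cost that reached it, and a back-pointer to its parent, and the path is recovered by following parents to the root. A sketch with illustrative names (the X→G step cost of 60 follows from the cumulative costs 230 − 170):

```python
class Node:
    """One node of a search path, linked back to its parent."""

    def __init__(self, state, action=None, path_cost=0, parent=None):
        self.state = state
        self.action = action          # action that produced this node (None at the root)
        self.path_cost = path_cost    # total cost from the initial state
        self.parent = parent          # back-pointer; None at the root

    def child(self, state, action, step_cost):
        """Extend the path by one step."""
        return Node(state, action, self.path_cost + step_cost, self)

    def path(self):
        """Follow parent pointers back to recover the sequence of states."""
        node, states = self, []
        while node is not None:
            states.append(node.state)
            node = node.parent
        return list(reversed(states))


# The three nodes from the table: S --(170)--> X --(60)--> G
n1 = Node("S")                        # state S, action null, cost 0, parent null
n2 = n1.child("X", "S->X", 170)       # cumulative cost 170
n3 = n2.child("G", "X->G", 60)        # cumulative cost 230
```

This shared-parent structure lets many frontier paths reuse common prefixes instead of copying whole lists, which is what makes large frontiers affordable.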