Embedded Intelligent Systems: Autonomous Navigation (05.04.2021)
A 3-axis IMU and an encoder are connected to the Raspberry Pi via a USB connector, using asynchronous serial communication (8 data bits, 1 stop bit, no parity).
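A minimal sketch of opening such an 8N1 serial link on the Raspberry Pi with pyserial; the device path /dev/ttyUSB0 and the baud rate of 115200 are assumptions, since neither is given here.

```python
import serial  # pyserial

# Frame format from above: 8 data bits, 1 stop bit, no parity (8N1).
# Device path and baud rate are assumptions.
port = serial.Serial(
    "/dev/ttyUSB0",                 # hypothetical USB-serial device on the Raspberry Pi
    baudrate=115200,                # assumed; not stated in the text
    bytesize=serial.EIGHTBITS,
    parity=serial.PARITY_NONE,
    stopbits=serial.STOPBITS_ONE,
    timeout=1.0,
)

frame = port.readline()             # read one asynchronous frame from the IMU/encoder board
print(frame)
port.close()
```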
LDS-01 LIDAR mapping: measure, then update and correct the LIDAR map. Black cells mark obstacles; their weight is high, e.g. 100,000.
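A minimal sketch of such a map update, assuming a fixed-size occupancy grid; the grid size, cell resolution and the polar-to-cell conversion are assumptions, only the obstacle weight of 100,000 is taken from above.

```python
import numpy as np

OBSTACLE_WEIGHT = 100_000           # high weight for black (obstacle) cells, as above
GRID_SIZE = 200                     # assumed: 200 x 200 cells
RESOLUTION = 0.05                   # assumed: 5 cm per cell

def update_map(grid, robot_xy, scan):
    """Mark cells hit by LDS-01 range readings as obstacles.

    scan: iterable of (angle_rad, distance_m) pairs in the map frame;
    this is a sketch without ray tracing or free-space updates.
    """
    x0, y0 = robot_xy
    for angle, dist in scan:
        cx = int((x0 + dist * np.cos(angle)) / RESOLUTION)   # hit point -> grid column
        cy = int((y0 + dist * np.sin(angle)) / RESOLUTION)   # hit point -> grid row
        if 0 <= cx < GRID_SIZE and 0 <= cy < GRID_SIZE:
            grid[cy, cx] = OBSTACLE_WEIGHT
    return grid

grid = update_map(np.zeros((GRID_SIZE, GRID_SIZE)), (1.0, 1.0), [(0.0, 2.0), (np.pi / 2, 1.5)])
```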
Path Planning: A* Algorithm
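A minimal A* sketch over such a weighted grid map; the 4-connected neighbourhood, the Manhattan heuristic and a step cost of 1 plus the cell weight are assumptions, and cells carrying the obstacle weight are treated as impassable.

```python
import heapq

def a_star(grid, start, goal, obstacle_weight=100_000):
    """A* over a 2D grid of cell weights; start and goal are (row, col) tuples."""
    def h(cell):                                     # Manhattan-distance heuristic
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    open_set = [(h(start), start)]                   # priority queue of (f, cell)
    g_cost, parent, closed = {start: 0}, {start: None}, set()

    while open_set:
        _, cell = heapq.heappop(open_set)
        if cell in closed:
            continue
        closed.add(cell)
        if cell == goal:                             # reconstruct path back to the start
            path = []
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] < obstacle_weight:
                ng = g_cost[cell] + 1 + grid[nr][nc]  # step cost plus cell weight
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    parent[(nr, nc)] = cell
                    heapq.heappush(open_set, (ng + h((nr, nc)), (nr, nc)))
    return None                                      # no path found
```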
1-Axis Model
The robot moves from p(k) to p(k+1), where
• D(k): distance travelled between k and k+1
• θ(k): orientation at k
• p(k) = (x(k), y(k), θ(k))
• θ(k) + Δθ(k): orientation at k+1, belonging to the pose p(k+1)
The pose is therefore updated with the midpoint orientation:
x(k+1) = x(k) + D(k) cos(θ(k) + Δθ(k)/2)
y(k+1) = y(k) + D(k) sin(θ(k) + Δθ(k)/2)
θ(k+1) = θ(k) + Δθ(k)
The midpoint angle θ(k) + Δθ(k)/2 is valid if the orientation changes gradually from θ(k) to θ(k) + Δθ(k) within T, where T is the time interval the robot needs to move from k to k+1.
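A minimal sketch of this pose update, using the midpoint orientation θ(k) + Δθ(k)/2 from above; the numeric values in the example call are purely illustrative.

```python
import math

def predict_pose(x, y, theta, D, dtheta):
    """One-axis odometry update from pose p(k) to p(k+1).

    D is the distance travelled between k and k+1, dtheta the orientation
    change; the midpoint angle theta + dtheta/2 is valid when the orientation
    changes gradually within the interval T.
    """
    mid = theta + dtheta / 2.0
    return (x + D * math.cos(mid),
            y + D * math.sin(mid),
            theta + dtheta)

# Example: drive 0.5 m while turning 10 degrees (illustrative values).
print(predict_pose(0.0, 0.0, 0.0, 0.5, math.radians(10.0)))
```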
[Figure: navigation from the start position (bedroom) to the final position (table in the kitchen) along the planned trajectory; the robot carries a laser scanner. Real state: true pose at k and true pose p*(k+1) at k+1. Predicted state p(k+1): estimated pose due to input u(k); the input u(k) for calculating the predicted state is measured by the encoders. Belief state: p(k).]
EKF-SLAM
• State = robot pose and landmark positions
  p(k) = (x(k), y(k), θ(k), xL1(k), yL1(k), …)
  p(k+1) = (x(k+1), y(k+1), θ(k+1), xL1(k+1), yL1(k+1), …)
• Input
  u(k) = (D(k) cos(θ(k) + Δθ(k)/2), D(k) sin(θ(k) + Δθ(k)/2), Δθ(k))
  D(k) and Δθ(k) are measured by the encoders (odometry); a sketch of this computation follows below.
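A minimal sketch of deriving D(k) and Δθ(k) from the wheel encoder ticks of a differential-drive base and forming u(k); the drive geometry and the wheel radius, wheel base and ticks-per-revolution values are assumptions, not stated above.

```python
import math

def encoder_to_odometry(ticks_left, ticks_right,
                        ticks_per_rev=4096, wheel_radius=0.033, wheel_base=0.16):
    """Convert encoder tick counts into D(k) and dtheta(k) (assumed parameters)."""
    d_left = 2 * math.pi * wheel_radius * ticks_left / ticks_per_rev
    d_right = 2 * math.pi * wheel_radius * ticks_right / ticks_per_rev
    D = (d_right + d_left) / 2.0                 # distance D(k) of the axle centre
    dtheta = (d_right - d_left) / wheel_base     # orientation change dtheta(k)
    return D, dtheta

def input_vector(theta, D, dtheta):
    """u(k) as defined above, using the midpoint orientation."""
    mid = theta + dtheta / 2.0
    return (D * math.cos(mid), D * math.sin(mid), dtheta)

D, dtheta = encoder_to_odometry(ticks_left=980, ticks_right=1020)
print(input_vector(theta=0.0, D=D, dtheta=dtheta))
```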
[Figure: EKF-SLAM loop. The prediction step k → k+1 computes the predicted state p̂(k+1); the update step corrects it with the landmark measurements zm, yielding the updated pose p(k+1) and landmark positions (e.g. xL2, yL2). The resulting map is used for path planning from the start position to the final position.]
Input vector u(k) (3 × 1) and matrix G ((3 + 2n) × 3, shown for n = 2 landmarks):

$$u(k) = \begin{pmatrix} D(k)\cos\!\big(\theta(k)+\Delta\theta(k)/2\big) \\ D(k)\sin\!\big(\theta(k)+\Delta\theta(k)/2\big) \\ \Delta\theta(k) \end{pmatrix}, \qquad
G = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}$$

Measurement matrix H (2n × (3 + 2n)) and predicted measurement H p(k) = z(k) (2n × 1):

$$H = \begin{pmatrix} 1 & 0 & 0 & -1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & -1 & 0 & 0 \\ 1 & 0 & 0 & 0 & 0 & -1 & 0 \\ 0 & 1 & 0 & 0 & 0 & 0 & -1 \end{pmatrix}, \qquad
H\,p(k) = z(k) = \begin{pmatrix} x - x_{L1} \\ y - y_{L1} \\ x - x_{L2} \\ y - y_{L2} \end{pmatrix}$$

Gain matrix K ((3 + 2n) × 2n) and correction K v(k+1) ((3 + 2n) × 1):

$$K = \begin{pmatrix} k_1 & 0 & k_2 & 0 \\ 0 & k_1 & 0 & k_2 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}, \qquad
K\,v(k+1) = \begin{pmatrix} k_1\,\Delta x_{L1} + k_2\,\Delta x_{L2} \\ k_1\,\Delta y_{L1} + k_2\,\Delta y_{L2} \\ 0 \\ 0 \\ 0 \\ 0 \\ 0 \end{pmatrix}$$
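A minimal numpy sketch of one EKF-SLAM prediction and update step with the matrices above for n = 2 landmarks; the gain values k1 and k2 and all numeric poses and measurements are placeholders, and a full EKF would compute K from the state and measurement covariances instead of using fixed gains.

```python
import numpy as np

n = 2                                            # number of landmarks, as in the matrices above

def predict(p, D, dtheta):
    """Prediction p_hat(k+1) = p(k) + G u(k); G maps u(k) onto the pose part of the state."""
    u = np.array([D * np.cos(p[2] + dtheta / 2.0),
                  D * np.sin(p[2] + dtheta / 2.0),
                  dtheta])
    G = np.vstack([np.eye(3), np.zeros((2 * n, 3))])          # (3 + 2n) x 3
    return p + G @ u

def update(p_hat, z_m, k1=0.5, k2=0.5):
    """Update with landmark measurements z_m; k1, k2 are placeholder gains."""
    H = np.array([[1, 0, 0, -1,  0,  0,  0],
                  [0, 1, 0,  0, -1,  0,  0],
                  [1, 0, 0,  0,  0, -1,  0],
                  [0, 1, 0,  0,  0,  0, -1]], dtype=float)    # 2n x (3 + 2n)
    v = z_m - H @ p_hat                                       # innovation v(k+1)
    K = np.zeros((3 + 2 * n, 2 * n))                          # (3 + 2n) x 2n
    K[0, 0], K[0, 2] = k1, k2                                 # only x and y are corrected,
    K[1, 1], K[1, 3] = k1, k2                                 # matching the K structure above
    return p_hat + K @ v

# State p(k) = (x, y, theta, xL1, yL1, xL2, yL2); illustrative values only.
p = np.array([1.0, 2.0, 0.1, 4.0, 1.0, 2.0, 5.0])
p_hat = predict(p, D=0.5, dtheta=0.05)
p_new = update(p_hat, z_m=np.array([-3.0, 1.0, -1.0, -3.0]))
print(p_new)
```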