Visual SLAM

This document provides an introduction to visual simultaneous localization and mapping (SLAM). Visual SLAM uses camera sensors to estimate a robot's pose within an environment while simultaneously building and optimizing a consistent map of that environment. Key topics include feature-based and dense metric map representations, constructing a map from the relative pose and 3D structure recovered from image pairs, and tracking the camera pose within a known 3D map using point correspondences between map features and camera frames. The document also notes that Kalman filtering is an important component of visual SLAM, used for state estimation.


Introduction to Visual SLAM
Contents
1. Introduction to Visual SLAM
2. Kalman Filter
3. Implementation (programming)
What is SLAM?
• SLAM stands for Simultaneous Localization and Mapping
Simultaneously:
 estimation of the state of a robot using on-board sensors
 construction of a map of the environment that the sensors are perceiving

Example demos:
 Monocular ORB SLAM
 ORB-SLAM for large environments
 Lidar SLAM
What is SLAM?

• SLAM stands for Simultaneous Localization and Mapping
Simultaneously:
 Mapping: continuously expanding and optimizing a consistent map while exploring the environment
 Localization: estimating the robot's pose within that map

Jing Dong, "GTSAM 4.0 Tutorial", License CC BY-NC-SA 3.0


What is visual SLAM?

• SLAM stands for Simultaneous Localization and Mapping; visual SLAM uses cameras as the on-board sensors
Simultaneously:
 Mapping: continuously expanding and optimizing a consistent map while exploring the environment
 Localization: estimating the camera pose within that map (tracking the map in image frames)
Visual SLAM example
What is the map?

A model of the environment that lets us
• limit the localization error by recognizing previously visited areas
• (support other tasks, such as obstacle avoidance and path planning)
Examples of map representations
• Feature-based metric maps
• Dense metric maps
How do we build a map?
Relative pose and 3D from two views
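
The relative pose between two camera views and an initial set of 3D points can be recovered from matched image features via the essential matrix, followed by triangulation. Below is a minimal sketch using OpenCV; the function and variable names (two_view_reconstruction, pts1, pts2, K) are illustrative assumptions, not code from the slides.

import numpy as np
import cv2

def two_view_reconstruction(pts1, pts2, K):
    # pts1, pts2: Nx2 float arrays of matched pixel coordinates; K: 3x3 intrinsics.
    # Estimate the essential matrix from the matches; RANSAC rejects outliers.
    E, inlier_mask = cv2.findEssentialMat(pts1, pts2, K,
                                          method=cv2.RANSAC,
                                          prob=0.999, threshold=1.0)
    # Decompose E into the relative rotation R and unit-scale translation t.
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
    # Triangulate the matches to obtain an initial set of 3D map points.
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])  # first camera at the origin
    P2 = K @ np.hstack([R, t])                          # second camera pose
    pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
    points3d = (pts4d[:3] / pts4d[3]).T                 # homogeneous -> Euclidean
    return R, t, points3d

Note that with a single (monocular) camera the translation, and therefore the map, is recovered only up to an unknown scale factor.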
How do we track a map?
Pose from known 3D map
Pose from point correspondences
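
Given an existing 3D map, the camera pose for a new frame can be estimated from 2D-3D point correspondences (the Perspective-n-Point problem). Below is a minimal sketch with OpenCV, assuming correspondences between map points (points3d, Nx3) and current-frame keypoints (points2d, Nx2) with known intrinsics K; the names are illustrative.

import numpy as np
import cv2

def track_pose(points3d, points2d, K):
    # Solve Perspective-n-Point with RANSAC to reject outlier correspondences.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        points3d.astype(np.float64), points2d.astype(np.float64),
        K, distCoeffs=None)
    if not ok:
        return None                 # tracking failed for this frame
    R, _ = cv2.Rodrigues(rvec)      # rotation vector -> 3x3 rotation matrix
    return R, tvec, inliers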
Multi-view mapping
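
Multi-view mapping refines all camera poses and 3D points jointly by minimizing reprojection error over many views (bundle adjustment). The sketch below is one possible formulation using SciPy's least_squares, under the assumption of pinhole cameras with known intrinsics; the parameterization and names are illustrative, not taken from the slides.

import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def reprojection_residuals(params, n_views, n_points, K, observations):
    # params stacks 6 pose values per view (rotation vector + translation)
    # followed by 3 coordinates per point; observations is a list of
    # (view_idx, point_idx, u, v) image measurements.
    poses = params[:n_views * 6].reshape(n_views, 6)
    points = params[n_views * 6:].reshape(n_points, 3)
    residuals = []
    for view_idx, point_idx, u, v in observations:
        rvec, tvec = poses[view_idx, :3], poses[view_idx, 3:]
        R = Rotation.from_rotvec(rvec).as_matrix()
        p_cam = R @ points[point_idx] + tvec      # map point in the camera frame
        proj = K @ p_cam                          # project with the pinhole model
        residuals.append(proj[0] / proj[2] - u)
        residuals.append(proj[1] / proj[2] - v)
    return np.asarray(residuals)

# x0 stacks the initial pose and point estimates; least_squares refines them jointly:
# result = least_squares(reprojection_residuals, x0,
#                        args=(n_views, n_points, K, observations))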
Components of SLAM
Kalman Filter
We will follow this source:

https://www.bzarg.com/p/how-a-kalman-filter-works-in-pictures/#mjx-eqn-gainformula
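
For reference, the core prediction and update equations developed in that article, in standard discrete Kalman filter notation (F_k: state-transition model, B_k: control model, H_k: measurement model, Q_k and R_k: process and measurement noise covariances):

\hat{x}_k = F_k \hat{x}_{k-1} + B_k u_k                        % predicted state
P_k = F_k P_{k-1} F_k^\top + Q_k                               % predicted covariance
K' = P_k H_k^\top \left(H_k P_k H_k^\top + R_k\right)^{-1}     % Kalman gain
\hat{x}'_k = \hat{x}_k + K' \left(z_k - H_k \hat{x}_k\right)   % updated state
P'_k = P_k - K' H_k P_k                                        % updated covariance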
Part 2: Implementation of Kalman Filter
Final Kalman Filter Algorithm
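
A minimal NumPy sketch of this algorithm, implementing the predict/update equations above. The class and the short constant-velocity example at the end are illustrative assumptions, not code from the slides.

import numpy as np

class KalmanFilter:
    def __init__(self, F, B, Q, H, R, x0, P0):
        self.F, self.B, self.Q = F, B, Q   # motion model and process noise
        self.H, self.R = H, R              # measurement model and measurement noise
        self.x, self.P = x0, P0            # state estimate and its covariance

    def predict(self, u):
        # Project the state and covariance forward through the motion model.
        self.x = self.F @ self.x + self.B @ u
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x, self.P

    def update(self, z):
        # Kalman gain: how much to trust the measurement versus the prediction.
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        # Correct the prediction with the measurement residual.
        self.x = self.x + K @ (z - self.H @ self.x)
        self.P = self.P - K @ self.H @ self.P
        return self.x, self.P

# Example: 1D constant-velocity model, state = [position, velocity].
dt = 0.1
kf = KalmanFilter(F=np.array([[1.0, dt], [0.0, 1.0]]),
                  B=np.array([[0.5 * dt ** 2], [dt]]),
                  Q=np.eye(2) * 1e-3,
                  H=np.array([[1.0, 0.0]]),
                  R=np.array([[0.05]]),
                  x0=np.zeros(2), P0=np.eye(2))
kf.predict(u=np.array([0.0]))   # acceleration command
kf.update(z=np.array([0.2]))    # noisy position measurement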
