Final SLAM Report (Capstone Project)
Simultaneous Localization and Mapping
SAIF ALSAQQA
Contents
Table of Figures
Abstract
1. Introduction
2. Literature Review
3. Theoretical Background
4. Methodology
   Section 1: SLAM using MATLAB
   Section 2: Active SLAM Using TurtleBot3 (Burger)
   Section 3: SLAM Using ROSMASTER X1 Robot
5. Virtual Machine
6. Used Robot
7. Simulation & Results
   Section 1: SLAM using MATLAB environment
   Section 2: Active SLAM using ROS Environment (TurtleBot)
   Section 3: SLAM using ROS Environment (ROSMASTER X1 Robot)
8. Project Timeline
9. Conclusion and Future Work
10. References
Table of Figures
Figure 1. Communication with MATLAB
Figure 2. Communication with TurtleBot3
Figure 3. Communication with ROSMASTER X1
Figure 4. VMware Interface
Figure 5. RViz Interface
Figure 6. Gazebo Interface
Figure 7. ROSMASTER X1 Robot
Figure 8. Monte Carlo Localization Results Using MATLAB
Figure 9. SLAM with Loop Closure Results Using MATLAB
Figure 10. GitHub.com Interface
Figure 11. Package/Tool Installation
Figure 12. Check Package Installation
Figure 13. Open Gazebo Interface Using Linux Commands
Figure 14. Open RViz to Visualize the Map
Figure 15. Building the Map Using Keyboard
Figure 16. RQT Graph of Map Built Using Keyboard
Figure 17. Building the Map Using Navigation Tool
Figure 18. RQT Graph of Map Built Using Navigation Tool
Figure 19. Building the Map Using Active SLAM Algorithm
Figure 20. Map Using Active SLAM
Figure 21. RQT Graph of Map Built Using Active SLAM Algorithm
Figure 22. ROSMASTER X1 LiDAR Data Visualization
Figure 23. LiDAR Data on the Console
Figure 24. RQT Graph of Map Construction of ROSMASTER X1
Figure 25. Gmapping Structure
Figure 26. ROSMASTER X1 Map Using gmapping Algorithm (High Speed)
Figure 27. ROSMASTER X1 Map Using gmapping Algorithm (Low Speed)
Figure 28. RQT Graph for gmapping Algorithm
Figure 29. ROSMASTER X1 Map Using Hector Algorithm (HTU Building)
Figure 30. ROSMASTER X1 Map Using Hector Algorithm (Home)
Figure 31. RQT Graph for Hector Algorithm
Figure 32. Project Timeline
Abstract
Simultaneous Localization and Mapping (SLAM) is crucial for enabling
autonomous robots to navigate and understand unknown environments. This
study explores the application of SLAM through a comprehensive three-stage
simulation and evaluation process, addressing the challenges and potential of
SLAM in real-world applications. The first stage utilizes MATLAB's SLAM-Toolbox to
establish a foundational understanding and simulate a mobile robot in a Gazebo
environment via the Robot Operating System (ROS). The second stage moves the
simulation to a Linux virtual machine running Gazebo with a mobile robot,
focusing on evaluating different SLAM packages. The final stage
involves real-world testing with the ROSMASTER X1 robot. Results demonstrate the
effectiveness of the ROSMASTER X1 in replicating conditions necessary for
autonomous navigation. However, the study also highlights the need for more
powerful computational resources to scale SLAM applications to full-scale
autonomous vehicles. Future work should prioritize algorithm optimization and
hardware advancements to enhance scalability and performance, ensuring
SLAM's broader applicability in autonomous systems.
1. Introduction
Over the past few years, unprecedented progress in robotics has made it possible
to create intelligent, self-governing systems that can interact with and navigate
their environment. Simultaneous Localization and Mapping (SLAM), a fundamental technique
that enables robots to autonomously develop maps of their environments while
simultaneously determining their positions within these maps, is a crucial element
in realizing such capabilities. This technology has enormous potential for a wide
range of uses, including autonomous cars and robotic exploration.
This investigation outlines a thorough plan to study and apply SLAM techniques
on a small-scale robotic platform in preparation for their eventual use on larger
vehicles. Moreover, it focuses especially on how these techniques will eventually
be integrated with automotive systems. In this work, a detailed simulation of SLAM
algorithms will be conducted using different scenarios.
This research project aims to accomplish two main goals: first, to implement and
assess SLAM techniques on a small-scale robotic platform, creating a solid basis
for further applications; and second, to lay the groundwork for applying SLAM
technologies to larger vehicles, with a focus on potential integration with
automotive systems.
This project is divided into three basic stages. In the first stage, the simulation
is conducted in MATLAB using the SLAM Toolbox to understand the main concepts of
SLAM; in this stage the TurtleBot3 robot is also simulated in a Gazebo environment
through the Robot Operating System (ROS). In the second stage, the simulation is
conducted on a Linux virtual machine using the Gazebo environment and the
TurtleBot3 Burger, relying only on the virtual machine, Gazebo, and several SLAM
packages. In the last stage, SLAM is implemented on the ROSMASTER X1, a physical
robot, so the final evaluation takes place in the real world.
2. Literature Review
Arbnor Pajazit et al. [5] investigated and described a ROS-based control system for
the TurtleBot robot for mapping and navigation in indoor environments.
3. Theoretical Background
Localization:
- Odometry: Uses data from motion sensors such as wheel encoders or IMUs
to estimate the robot's change in position over time (a minimal sketch of such
an update is shown after this list).
- Dead Reckoning: Similar to odometry; it integrates sensor data over time, so
estimation errors accumulate.
- Beacon-based Localization: Utilizes known positions of beacons (e.g., GPS,
RFID) to triangulate the robot's position.
- Vision-based Localization (Visual localization): Employs cameras and
image processing techniques to identify landmarks and features within the
environment.
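
To make the odometry idea concrete, here is a minimal sketch (not from the original report) of a dead-reckoning pose update for a differential-drive robot such as the TurtleBot3; the function name and wheel-displacement inputs are illustrative assumptions:

import math

def odometry_update(x, y, theta, d_left, d_right, wheel_base):
    # Dead-reckoning update: integrate wheel-encoder displacements into the pose.
    # Encoder noise accumulates in (x, y, theta) over time, which is exactly
    # the drift problem described above.
    d_center = (d_left + d_right) / 2.0        # distance travelled by the robot center
    d_theta = (d_right - d_left) / wheel_base  # change in heading
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta) % (2.0 * math.pi)
    return x, y, theta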
Mapping:
SLAM can be framed as a state estimation problem. The state vector typically
encompasses the robot's pose and the map features (landmarks). Sensor
measurements establish a relationship between the robot's state and the
observations. The core challenge lies in estimating both the robot's pose and the
map features given these inherently noisy sensor measurements. Moreover, SLAM
draws upon various mathematical tools to achieve this estimation.
SLAM Algorithms:
Robot localization involves determining the robot's pose, which is a vector defined
in various reference frames. Measurements from sensors like cameras and
rangefinders are converted from their local frames to the robot's frame for
accurate localization. The absolute pose of the robot and landmarks is
determined relative to a global reference frame. Multiple sensors, data
association, and correction techniques are employed to achieve precise
localization. Key methods include:
Miscellaneous Algorithms
• Kalman and Extended Kalman Filters: Used for sensor fusion in linear and
non-linear systems, respectively (a one-dimensional sketch follows this list).
• Particle Filter: A non-parametric state estimation technique using multiple
particle hypotheses.
• Feature Extraction Algorithms: SIFT, SURF, ORB, and BRIEF for visual SLAM.
• Deep Learning Algorithms: For object detection, tracking, and recognition
to identify obstacles and predict collisions.
• Collision Detection: Preemptive motion analysis to avoid dynamic
obstacles.
• Motion Prediction: Uses object classification to anticipate and prevent
collisions.
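
As a simple illustration of the filtering idea (a sketch, not the report's implementation), the following one-dimensional Kalman filter shows the predict/update cycle that the EKF generalizes to non-linear, multi-dimensional SLAM states; all variable names are assumptions:

def kalman_1d(x, p, u, z, q, r):
    # x, p : current state estimate and its variance
    # u    : control input (e.g., commanded displacement)
    # z    : sensor measurement of the state
    # q, r : process and measurement noise variances
    x = x + u            # predict: propagate the state with the motion model
    p = p + q            # predict: motion adds uncertainty
    k = p / (p + r)      # Kalman gain: how much to trust the measurement
    x = x + k * (z - x)  # update: correct the state with the measurement
    p = (1.0 - k) * p    # update: the measurement reduces uncertainty
    return x, p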
Active SLAM
- Light Detection and Ranging (LiDAR): uses laser light to measure distances
and create a 3D representation of the environment.
- Camera: gathers visual data so that the robot can identify features and
landmarks in images.
- Inertial Measurement Unit (IMU): provides the angular velocity,
acceleration, and orientation of the robot.
- Radio Detection and Ranging (RADAR): measures an object's distance
using radio waves.
4. Methodology
As mentioned in the introduction, this work is divided into three main stages, or
sections, to implement the SLAM concept.
This stage aims to understand the working principle of active SLAM using virtual
machines before applying it in the real world. The simulation is conducted with
the TurtleBot robot, which is simulated in the Gazebo environment and visualized
using RViz. The schematic below shows the whole structure of implementing active
SLAM in simulation using ROS, Gazebo, and RViz, without MATLAB. As the schematic
shows, the user interface communicates with the TurtleBot3 (Burger) over IP: only
one operating system is required as the user interface, and it logs into the robot
through an IP address obtained from the TurtleBot itself, so the TurtleBot
effectively acts as a hotspot in the Gazebo environment and the robot can be
accessed through that IP address. ROS is used to create the nodes, topics,
subscribers, publishers, and messages that carry the sensor data from the IMU and
LiDAR. RViz then plots a 2D view of the robot's surroundings based on the data
coming from ROS, which combines the messages from the LiDAR and IMU (a minimal
publisher/subscriber sketch follows this paragraph).
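
As a minimal sketch of that publisher/subscriber pattern (not code from the report), the following rospy node subscribes to the LiDAR scan topic and republishes the nearest obstacle distance; the node and output topic names are illustrative assumptions, while /scan is the usual TurtleBot3 LiDAR topic:

#!/usr/bin/env python
import rospy
from sensor_msgs.msg import LaserScan
from std_msgs.msg import Float32

def scan_callback(scan):
    # Keep only valid range readings and publish the nearest obstacle distance.
    valid = [r for r in scan.ranges if scan.range_min < r < scan.range_max]
    if valid:
        pub.publish(min(valid))

if __name__ == '__main__':
    rospy.init_node('nearest_obstacle')
    pub = rospy.Publisher('/nearest_obstacle', Float32, queue_size=1)
    rospy.Subscriber('/scan', LaserScan, scan_callback)
    rospy.spin()  # hand control to ROS; the callback fires as messages arrive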
5. Virtual Machine
In this report, the focus was to implement the simulation on a virtual machine,
with Linux installed as its main operating system. A virtual machine creates a
separate, isolated environment on your physical computer; software such as
VirtualBox or VMware can be used to set one up. The first step of the simulation
was therefore to install VMware together with a virtual machine image that ships
with ROS Noetic. The use of virtual machines has many advantages: isolation
ensures your main operating system is not affected by any changes or
installations made for ROS, and snapshots can be taken and reverted to if
something goes wrong.
Figure 4 shows the user interface of the VMware Workstation, which comes
packaged with ROS Noetic and Gazebo 11.
After installing VMware, we started to explore the main ROS tools, since ROS's
core functionality is augmented by a variety of tools that allow developers to
visualize and record real-time data. These tools are provided in packages like
any other algorithm.
6. Used Robot
The ROSMASTER X1 ROS robot uses the Astra Pro depth camera to enhance its
environmental perception and navigation capabilities. The Astra Pro utilizes
structured light technology, where an infrared (IR) laser projector emits a known
pattern onto the environment. The IR camera then captures the deformation of
this pattern caused by the surfaces and objects it hits. By analyzing these
deformations, the camera calculates the depth information of the scene,
creating a 3D map. The Astra Pro is equipped with a high-resolution RGB camera
and an IR depth sensor, allowing it to capture both color and depth data
simultaneously. This depth information is crucial for tasks such as obstacle
detection, object recognition, and 3D mapping, enabling the robot to navigate
and interact with its environment more effectively and autonomously.
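
To illustrate how a single depth reading becomes a 3D point in such a map, here is a generic pinhole-camera sketch (not vendor code); the intrinsic parameters fx, fy, cx, cy would come from the camera's calibration:

def depth_pixel_to_point(u, v, depth, fx, fy, cx, cy):
    # Back-project one depth-image pixel (u, v) into a 3D point in the
    # camera frame; depth is the measured distance along the optical axis.
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return x, y, depth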
7. Simulation & Results
The simulation has been divided into three main sections: the first uses
TurtleBot, MATLAB, a virtual machine, ROS, and Gazebo; the second uses TurtleBot,
RViz, a virtual machine, ROS, and Gazebo; and the third uses the ROSMASTER X1
robot, RViz, a virtual machine, ROS, and Gazebo.
Section 1: SLAM using MATLAB environment
In the first simulation, the TurtleBot robot was used together with Monte Carlo
Localization (MCL), a particle filter-based algorithm that estimates a robot's
pose within a known map, leveraging motion and sensor data.
It begins with an initial belief about the robot's pose, represented by particles
distributed according to this belief. As the robot moves, particles propagate
according to its motion model. When new sensor readings are received, each
particle is weighted by the likelihood of those readings given its pose. The
algorithm then resamples the particles in proportion to these weights, favoring
the more likely hypotheses. This process iterates, with the particles converging
to the robot's true pose.
Adaptive Monte Carlo Localization (AMCL), a variant of MCL, dynamically adjusts
the number of particles based on the Kullback-Leibler (KL) divergence to keep the
particle set a close approximation of the true robot state distribution. In
practical implementation, the algorithm
updates with laser scan and odometry data, computes the robot's pose, updates
the estimated pose and covariance, and drives the robot to the next pose. Initially,
particles are uniformly distributed, but after several updates (1, 8, and 60), they
converge to areas with higher likelihood, ultimately aligning closely with the robot's
true pose and the map outlines.
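
The core MCL loop described above can be sketched in a few lines (an illustrative outline, not the MATLAB toolbox implementation; move and sense_likelihood stand in for the motion and sensor models):

import random

def mcl_step(particles, move, sense_likelihood):
    # 1. Propagate every particle hypothesis through the motion model.
    particles = [move(p) for p in particles]
    # 2. Weight each particle by how well it explains the latest measurement.
    weights = [sense_likelihood(p) for p in particles]
    # 3. Resample in proportion to the weights, so low-weight particles
    #    die out and the cloud converges toward the true pose.
    return random.choices(particles, weights=weights, k=len(particles))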
Section 2: Active SLAM using ROS Environment (TurtleBot)
This simulation has been conducted in a ROS environment along with the Gazebo
simulator. It is an implementation of an active SLAM algorithm that autonomously
navigates to new goal locations and explores the area to form a map of the
environment.
To start the simulation, some prerequisite tools and packages need to be installed
from the ROS Wiki website. These prerequisites are turtlebot3_simulations,
turtlebot3_msgs, turtlebot3, gazebo_ros, and slam_gmapping. Please refer to the
ROS Wiki for further instructions on setting up ROS to use the Gazebo TurtleBot3
simulator for navigation. One possible way to install them is shown below.
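
On ROS Noetic, these prerequisites can typically be installed from the package repositories with commands like the following; the exact package names should be confirmed against the ROS Wiki, as some packages may instead need to be built from source:

sudo apt install ros-noetic-turtlebot3 ros-noetic-turtlebot3-msgs ros-noetic-turtlebot3-simulations
sudo apt install ros-noetic-gazebo-ros-pkgs ros-noetic-slam-gmapping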
After installing the prerequisite tools and packages, the following steps complete
the simulation setup. Note: the standalone lines below are commands to be entered
in the terminal.
- First, open the virtual machine and then open a terminal.
- Second, export the ROS IP address and the robot type using the following two
commands. Note: enter the IP address that is displayed on your own virtual
machine.

export ROS_IP=192.168.140.128
export TURTLEBOT3_MODEL=burger

- Third, check that the IP address and the robot type have been exported
successfully using the following commands:

echo $ROS_IP
echo $TURTLEBOT3_MODEL
For example, to install the active SLAM package: go to your terminal, open the
src folder of your catkin workspace, and clone the package there using the command
taken from its GitHub page (Figure 10). Then check that the package is installed
by listing the contents of the src folder. A sketch of this workflow is shown
below.
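
A hedged sketch of that workflow, with the repository URL left as a placeholder since the report takes it from the package's GitHub page:

cd ~/catkin_ws/src
git clone <active_slam repository URL from GitHub>
cd ~/catkin_ws && catkin_make   # build the workspace so ROS can find the new package
ls ~/catkin_ws/src              # the cloned package folder should now be listed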
- The final step is to open three terminals, enter the catkin_ws directory in
each of them, and then run the following three commands in order:

1- roslaunch turtlebot3_gazebo turtlebot3_world.launch
2- roslaunch active_slam active_slam.launch
3- python ./src/active_slam/scripts/global_planning.py
After following the steps discussed earlier, the simulation was conducted to
build the map and localize the robot's position within it. Three different
simulations were conducted, all sharing the same first two steps: running the
command that opens the Gazebo environment and then the command that launches
the active SLAM package. After running these commands, RViz will open and
present the map and sensor data in the left-hand toolbar.
First Simulation: Mapping and localization using the keyboard
In this simulation, the TurtleBot3 robot simultaneously builds the map and
localizes its position within that map. The robot is moved from one point to
another using the keyboard; note that this is not active SLAM, because in active
SLAM the robot must autonomously navigate to new goal locations and explore the
area to form a map of the environment.
After running the first and second commands, the third command is run:

rosrun teleop_twist_keyboard teleop_twist_keyboard.py

This command lets the user control the robot with the keyboard. As shown in the
figure below, the robot successfully built its map using the keyboard as a
movement controller.
The following picture shows the rqt_graph of all active nodes and topics when
using the keyboard as the controller.
Second Simulation: Mapping and localization using the navigation tool
In this simulation, the TurtleBot3 robot simultaneously builds the map and
localizes its position within that map. The robot is moved from one point to
another using the 2D navigation goal tool available in RViz; again, this is not
active SLAM.
The following picture shows the rqt_graph of all active nodes and topics when
using only the 2D navigation goal tool, without the keyboard.
Third Simulation: Mapping and localization using the active SLAM algorithm
In this simulation, the TurtleBot3 robot simultaneously builds the map and
localizes its position within that map. The robot autonomously moves from one
point to another, incorporating a decision-making process to improve map quality
and localization. The robot actively explores the environment, choosing actions
that are expected to reduce uncertainty in its map and localization estimates.
The algorithm sends the robot a randomly generated goal location, the robot
tries to reach it, and once the destination is reached the process repeats
periodically until all unknown space within the environment has been explored
(a sketch of this goal-sending loop is given below).
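
A minimal sketch of this goal-sending loop, assuming the standard move_base action interface; the actual global_planning.py script may select goals differently (e.g., based on map frontiers), and the 4 m x 4 m sampling area is an illustrative assumption:

#!/usr/bin/env python
import random
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node('random_goal_sender')
client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
client.wait_for_server()

while not rospy.is_shutdown():
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'
    goal.target_pose.header.stamp = rospy.Time.now()
    # Draw a random goal inside a 4 m x 4 m area around the map origin.
    goal.target_pose.pose.position.x = random.uniform(-2.0, 2.0)
    goal.target_pose.pose.position.y = random.uniform(-2.0, 2.0)
    goal.target_pose.pose.orientation.w = 1.0
    client.send_goal(goal)
    client.wait_for_result()  # block until the robot reaches (or abandons) the goal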
The following picture shows the rqt_graph of all active nodes and topics when
using the active SLAM algorithm.
Figure 21. RQT Graph of Map Built Using Active SLAM Algorithm
Section 3: SLAM using ROS Environment (ROSMASTER X1 Robot)
After the robot has been assembled, the first step is to connect the robot to the
virtual machine by connecting your Wi-Fi to the robot's hotspot. The robot
hotspot has the following name and password:
Then open your virtual machine and log into the robot's tools and packages using
the following command:

ssh -X pi@192.168.1.11

You will then be asked for the robot password mentioned earlier. Once you have
logged into the robot's (Raspberry Pi) environment, you must set the robot type,
camera type, and LiDAR type, because the system cannot automatically identify
the product version: there are many versions of the ROSMASTER robots.
Case 1: If you want to use the application control, you can directly select the
corresponding product model in the app to operate and control the car.
Case 2: If you want to control the robot or edit the code, you must open the
.bashrc file and edit the camera, LiDAR, and robot type there.
If you are not able to modify the .bashrc file, you can set the camera, LiDAR,
and robot type manually using the following commands:

export RPLIDAR_TYPE=a1
export CAMERA_TYPE=astrapro
export ROBOT_TYPE=X1

Then, update/refresh the environment:

source ~/.bashrc
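
For example, one way to make these settings persistent across sessions (matching Case 2) is to append the export lines to the end of ~/.bashrc; this is a suggested approach rather than a step from the robot's manual:

echo "export RPLIDAR_TYPE=a1" >> ~/.bashrc
echo "export CAMERA_TYPE=astrapro" >> ~/.bashrc
echo "export ROBOT_TYPE=X1" >> ~/.bashrc
source ~/.bashrc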
The ROSMASTER X1 robot has a LiDAR of type A1 and a package called rplidar_ros.
Make sure to use this package, not yd_lidar; otherwise, no results will be
presented, because you would be driving a different LiDAR type.
The first step is to remap the USB serial port, which can be done in two ways:
Method 1: Adding the permission directly; this only lasts for the current
session. Check the permission of the rp_lidar serial port using the following
command, then grant read/write access to the listed device (e.g., with chmod):

ls -l /dev | grep ttyUSB

Method 2: Installing the USB port remapping rule from the rplidar_ros package
path, which persists across sessions:

./scripts/create_udev_rules.sh

Re-plug the LiDAR USB interface and use the following command to verify the
remapping:

ls -l /dev | grep ttyUSB

Then run the rplidar node and view it in RViz using the following command, which
lets you visualize the LiDAR readings:

roslaunch rplidar_ros view_rplidar.launch
- Run the rplidar node and view the data with the test application using the
following command:

roslaunch rplidar_ros rplidar.launch

- To see the LiDAR results/data on the console, run the following command in
the terminal:

rosrun rplidar_ros rplidarNodeClient

- If you want to see a graph that displays all active nodes and topics, you can
use either of the following commands:

rosrun rqt_graph rqt_graph

or

rqt_graph
Gmapping structure:
When we lowered the robot's speed from 0.5 m/s down to 0.14 m/s, we got a better
result with gmapping (Figure 27). A sketch of one way to cap the speed follows.
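
If the robot's driver does not expose a speed parameter, one hedged option is a small relay node that clamps the commanded velocity before it reaches the base; /cmd_vel_raw is a hypothetical input topic that the teleop or navigation node would need to be remapped onto:

#!/usr/bin/env python
import rospy
from geometry_msgs.msg import Twist

MAX_LINEAR = 0.14  # m/s, the speed that gave the cleaner gmapping map

def relay(cmd):
    # Clamp the forward speed before forwarding the command to the base.
    cmd.linear.x = max(-MAX_LINEAR, min(MAX_LINEAR, cmd.linear.x))
    pub.publish(cmd)

rospy.init_node('speed_limiter')
pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
rospy.Subscriber('/cmd_vel_raw', Twist, relay)
rospy.spin()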
The following map is for the industrial robotics lab in the HTU building.
8. Project Timeline
The project timeline covers our whole journey through the Capstone project across
the two semesters. As shown below, the tasks were carried out according to the
weekly meetings with our supervisor, Dr. Tarek Tutunji. The periods show the
duration of our work on each task, along with when we started and when we
finished.
[Figure 32: Gantt chart of the project timeline, weeks 1 to 35 (November through
June), showing planned versus actual start, duration, and percent complete for
each activity; project end 13/6/2024. For example, the first activity,
identifying the topic, was planned for week 1 and completed (100%).]
9. Conclusion and Future Work
Implications for Future Work: Future research should focus on optimizing SLAM
algorithms to enhance processing speed and reduce computational complexity.
The development of advanced sensor fusion techniques, particularly integrating
LiDAR with computer vision, is essential for improving SLAM accuracy and
robustness. Furthermore, efficient handling of data from 3D LiDARs demands the
creation of faster SLAM algorithms capable of managing increased data volumes
and complexity.
10. References
[1] "Hector SLAM 2D Mapping for Simultaneous Localization and Mapping (SLAM),"
Journal of Physics: Conference Series, vol. 1529, 042032,
doi: 10.1088/1742-6596/1529/4/042032.
[4] Z. An, L. Hao, Y. Liu, and L. Dai, "Development of Mobile Robot SLAM Based
on ROS," International Journal of Mechanical Engineering and Robotics Research,
vol. 5, no. 1, 2016, doi: 10.18178/ijmerr.5.1.47-51.