Third Eye for Visually Challenged
PROJECT REPORT
Bachelor of Technology
in
Electronics and Communication Engineering
DASARI KAIVALYA GRANDHI SANJAY MALLELA VISHNU TEJA
[201FA05008] [201FA05015] [211LA05022]
May 2024
ACKNOWLEDGEMENT
The satisfaction that accompanies the successful completion of any task would be
incomplete without the mention of people who made it possible and whose constant guidance
and encouragement crown all the effort with success.
We are greatly indebted to Dr. G. S. R. Satyanarayana, our revered guide and Assistant
Professor in the Department of Electronics and Communication Engineering, VFSTR (Deemed
to be University), Vadlamudi, Guntur, for his valuable guidance in the preparation of this
dissertation. He has been a source of great inspiration and encouragement to us. He has been
kind enough to devote considerable amount of his valuable time in guiding us at every stage.
This is our debut, but we are sure that we can do many more such studies, purely because of
the lasting inspiration and guidance given by our respectable guide.
We would like to specially thank Dr. T. Pitchaiah, Head of the Department, ECE for
his valuable suggestions.
We would like to specially thank, Dr. N. Usha Rani, Dean, School of ECE for her help
and support during the project work.
We thank our project coordinators Dr. Satyajeet Sahoo, Dr. Arka Bhattacharyya, Mr.
M. Vamshi Krishna, Mr. Abhishek Kumar for continuous support and suggestions in
scheduling the reviews and report verifications. Also, thanks to supporting staff of ECE
department for their technical support for timely completion of the project.
Finally, we would like to thank our parents and friends for the moral support throughout
the project work.
D. Kaivalya (201FA05008)
G. Sanjay (201FA05015)
M. Vishnu Teja (211LA05022)
ABSTRACT
Millions of humans rely on corrective lenses, but for the blind, current assistive
technologies offer limited support. This project proposes a low-cost, wearable device to
empower them with improved navigation. This unobtrusive device, mountable on canes,
glasses, clothing, or gloves, utilizes two key sensors: LiDAR and ultrasonic. The acquired
sensor data is transmitted via Bluetooth to a user-friendly smartphone app. This app translates
complex readings into clear information, offering audio alerts about object distances or
integrating with other apps for image recognition or navigation assistance. This newfound
awareness of surroundings empowers blind users to navigate with greater confidence and
freedom. The focus on affordability, achieved through readily available components and
efficient design, makes the device accessible to a wider range of users. This has the potential
to be a significant advancement, granting increased independence and a sense of security to
those with visual impairments.
CONTENTS
CHAPTER 1
1.1 Introduction
1.2 Motivation
1.3 Objective
1.3.1 Broad Objective
1.3.2 Specific Objective
1.4 Organization of thesis
CHAPTER 2
2. Literature survey
CHAPTER 3
3.1 Proposed System
3.2 Components
CHAPTER 4
4. Flow chart
CHAPTER 5
5. Design Implementation
CHAPTER 6
6. Results and Discussions
CHAPTER 7
7. Conclusions and Future Scope
CHAPTER 8
8. References
LIST OF ABBREVIATIONS
GND Ground
HC-05 Host Controller
IR Infrared
ML Machine Learning
CHAPTER 1
1.1 Introduction
Imagine a world where navigating unfamiliar environments becomes achievable and
independent movement a reality for the visually challenged. This report presents the
development of a "Third Eye," an assistive device built with accessibility in mind. This system
leverages the power of Arduino UNO, a user-friendly microcontroller, to process data from
multiple sensors and provide crucial spatial awareness.
The core obstacle detection relies on an ultrasonic sensor, which emits sound waves and
measures their echo to determine the distance to nearby objects. For enhanced precision and
detailed object recognition, a LiDAR sensor can be integrated, creating a more comprehensive
picture of the user's surroundings.
Real-time feedback is paramount for this assistive device. The Third Eye utilizes a
Bluetooth module to wirelessly transmit sensor data to a custom smartphone application. This
application, designed as a serial monitor, translates the data into a user-friendly format.
Through a combination of audible alerts or vibrations, the user receives immediate notifications
about potential obstacles, empowering them to navigate with greater confidence.
This report will explore the hardware components chosen, delve into the software
development process for the Arduino platform, and detail the design of the smartphone
application. Additionally, it will discuss the implemented testing procedures used to evaluate
the Third Eye's effectiveness and its potential to revolutionize independent navigation for
visually challenged individuals.
1.2 Motivation
The inability to see presents a significant barrier to navigating the world freely. Individuals
with visual impairments often rely on assistance or limited tools for movement, hindering their
independence and mobility. This project, the development of a "Third Eye," aims to bridge this
gap and empower individuals with visual challenges.
The motivation for this project stems from a desire to create a more inclusive and
accessible world. By utilizing readily available technologies like Arduino UNO, ultrasonic
sensors, and LiDAR, the Third Eye offers a cost-effective and adaptable solution. This system
goes beyond basic obstacle detection, potentially incorporating LiDAR for detailed object
recognition, creating a richer understanding of the environment.
Furthermore, the open-source nature of the Arduino platform allows for future
customization and development by the user community, fostering innovation and ensuring the
Third Eye can adapt to individual needs and environments.
The Third Eye goes beyond basic obstacle detection. Its modular design allows for future
expansion with additional sensors, like cameras for object identification. This flexibility
positions it as a platform for continuous improvement, promoting a future where assistive
technology seamlessly integrates with daily life.
Developing the Third Eye is not just about creating a technological solution; it's about
fostering independence, building confidence, and promoting a more inclusive world for those
who experience visual impairment. By offering a cost-effective and adaptable system, the Third
Eye has the potential to significantly improve the lives of many.
1.3 Objective
1.3.1 Broad Objective
The Third Eye aims to empower visually challenged individuals with a low-cost,
microcontroller-based assistive device combining ultrasonic and LiDAR sensors for obstacle detection
and distance estimation. Real-time data is wirelessly transmitted to a smartphone app,
providing immediate feedback through audible alerts. This modular, cost-effective solution
seeks to enhance independent navigation and can be expanded with additional sensors like
cameras for advanced object identification.
1.4 Organization of Thesis
Chapter 2 presents the literature survey, describing the existing solutions that were studied
along with their pros and cons.
Chapter 3 describes the components of the system and the proposed design, along with its
block diagram.
Chapter 5 describes the design implementation, the interfaces, and the working of the prototype.
CHAPTER 2
2. Literature survey
[1] Challenges: Existing navigation solutions (GPS, guide dogs) have limitations for the blind.
Infrared (IR) Sensors as a Solution: IR offers promise due to low power consumption, object
detection capabilities, and compact size for wearable devices.
Existing Research: Studies like "A Smart Infrared Microcontroller-Based Blind Guidance
System" (2013) explore IR for basic obstacle detection.
Limitations: Current IR solutions have limited range/detail and can be affected by the
environment.
Future Directions: Research could focus on sensor fusion with ultrasonics, advanced signal
processing for better object recognition, and user interface design for clear data interpretation.
[2] Smart sticks are a growing assistive technology for the visually impaired. Common sensors
include ultrasonic (affordable, good range) and LiDAR (precise object recognition, but
expensive). Existing solutions like WeWALK and UltraCane use these sensors for obstacle
detection with vibration or voice feedback. Limitations remain, and a key direction for
improvement is sensor fusion: combining sensors for a wider range and richer information.
[3] Visually impaired individuals face challenges navigating unfamiliar environments.
Traditional tools like canes offer limited information, while guide dogs are expensive and
require training. Existing electronic aids include:
Ultrasonic ETAs: Use ultrasonic sensors for obstacle detection with audio/vibration alerts.
LiDAR Navigation Systems: Offer more precise object recognition but can be costly and
complex.
Smartphone Navigation Apps: Provide audio guidance and location information but rely on
pre-existing data.
Their limitations include: limited range (ultrasonic), lack of object recognition (basic ETAs),
cost and complexity (LiDAR), and reliance on external infrastructure (smartphone apps).
This project, the Third Eye, aims to address these issues by:
Combining ultrasonic and LiDAR sensors for better range, affordability, and recognition.
Using a real-time feedback smartphone app for customization and potential mapping
integration. Utilizing a modular design for future expansion with additional sensors.
The Third Eye has the potential to be a more comprehensive and adaptable assistive device.
[4] Visually impaired individuals rely on alternative methods like echolocation (requires
extensive training) to navigate. This paper explores assistive technologies using LiDAR for
spatial sensing. Existing Solutions:
ETAs (Electronic Travel Aids): Improved range but limited object recognition.
LiDAR for Spatial Sensing: LiDAR offers advantages over traditional methods:
Performance Analysis in Literature: Existing studies on LiDAR for spatial sensing often
evaluate: Accuracy of obstacle detection, Object recognition capabilities, Usability, and user
experience.
[5] Current assistive technology often offers limited obstacle detection for the visually
impaired. While short-range LiDAR or ultrasound provide some navigation aid, complex
environments demand more.
Research in autonomous vehicles highlights the potential of sensor fusion (LiDAR with
cameras/radar) for a more comprehensive picture of surroundings. The INSPEX project
exemplifies this with their "smart white cane" integrating long-range LiDAR for a more
detailed obstacle detection system. However, advancements are needed:
Wearability: Sensor miniaturization and reduced power consumption are crucial for
lightweight, portable solutions.
User Interface Design: Translating LiDAR data into clear and actionable user feedback requires
robust processing algorithms and user-centered interface design.
By addressing these challenges, long-range LiDAR has the potential to transform obstacle
detection for the visually impaired. Future research focused on miniaturization, light
adaptability, and user-centered design will pave the way for next-generation assistive
technology, empowering greater independence and navigation freedom.
[6] Traditional white canes offer limited functionality. Research explores advancements:
Ultrasonic Sensors: Common for short-range obstacle detection (up to 5 meters) but lack detail
for complex situations.
Laser Canes: Offered longer range but were bulky and impractical.
Sensor Fusion: Combining sensors (LiDAR, cameras) creates a more comprehensive
understanding of surroundings.
Existing smart canes integrate features like:
GPS/GSM: For location tracking and emergency assistance.
Multi-Sensor Integration: Projects like INSPEX utilize long-range LiDAR for detailed obstacle
detection.
The Nav-Cane can contribute by focusing on features not addressed by the systems above.
[7] Traditional canes offer limited obstacle detection for the blind. Research explores ultrasonic
sensors, like those discussed above, for navigation assistance. These emit sound waves to
measure object distance. Commercially available ultrasonic blind sticks exist, but with
limitations: short range, potential for inaccurate readings, and lack of object recognition.
Future advancements are needed:
Multi-sensor integration (e.g., LiDAR) for a more comprehensive environment picture.
Advanced signal processing for improved accuracy. Object recognition features using machine
learning for specific feedback.
By addressing these limitations, ultrasonic technology can be a valuable tool in developing
more effective assistive devices for the visually impaired.
CHAPTER 3
3.1 Proposed System
The proposed system is an advanced obstacle detection and notification system designed to
assist visually impaired and blind individuals. It integrates multiple sensors, a microcontroller,
and wireless communication using Bluetooth standard to provide real-time obstacle detection
information via a mobile phone.
Components and Functionality:
Sensors:
LiDAR: Utilized for long-range obstacle detection, the LiDAR sensor accurately measures the
distance to objects using laser pulses. It is ideal for detecting obstacles that are farther away,
providing precise distance measurements.
Maxbotix Ultrasonic Sensor (MB1200 XL-MaxSonar-EZ0): This medium-range sensor emits
ultrasonic waves and measures the echo return time to calculate distances. It offers reliable
detection of obstacles at moderate distances, enhancing the system's ability to sense the
environment effectively.
HC-SR04 Ultrasonic Sensor: Employed for short-range detection, the HC-SR04 sensor
operates on similar principles as the Maxbotix sensor but is optimized for closer objects. It
ensures the system can detect nearby obstacles accurately.
Microcontroller: The microcontroller is the central processing unit of the system, receiving
input from all three sensors. It processes the sensor data to determine the presence and distance
of obstacles. The microcontroller is also responsible for managing the input from the switch
and controlling the Bluetooth module.
Switch: The switch allows the user to activate or deactivate the system. When turned on, the
microcontroller begins processing the input from the sensors to detect obstacles.
Bluetooth Module: The Bluetooth module, based on the IEEE 802.15.1-2005 standard,
enables wireless communication between the microcontroller and a mobile phone. After
processing the sensor data, the microcontroller sends relevant obstacle information to the
Bluetooth module, which then transmits it to the connected mobile phone.
Mobile Phone: The mobile phone receives the data from the Bluetooth module and provides
auditory or tactile feedback to the user about detected obstacles. This real-time notification
helps visually impaired individuals navigate their surroundings more safely and effectively.
System Operation:
When the user activates the system via the switch, the microcontroller starts collecting data
from the LiDAR, Maxbotix, and HC-SR04 sensors. The processed information about obstacle
distance and presence is then transmitted via Bluetooth to the user's mobile phone, which
provides immediate feedback. This integrated approach ensures comprehensive obstacle
detection across different ranges, enhancing the mobility and safety of visually impaired and
blind individuals.
3.2 Components
3.2.1 HC-SR04 Ultrasonic Sensor
Technical Specifications:
Operating Voltage: 5V DC
Working Principle: The HC-SR04 ultrasonic sensor operates based on the principle of
echolocation, similar to how bats navigate. It consists of two main components: a transmitter
and a receiver.
1. Triggering: The sensor is triggered by sending a 10-microsecond HIGH pulse to the Trig pin.
2. Emitting Ultrasonic Pulse: Upon receiving the trigger signal, the sensor's transmitter emits
an 8-cycle burst of ultrasonic sound waves at a frequency of 40 kHz.
3. Echo Reception: These sound waves travel through the air and, upon encountering an object,
get reflected towards the sensor. The sensor's receiver detects the reflected waves (echo).
4. Distance Calculation: The time interval between the transmission of the sound waves and
the reception of the echo is measured. The distance to the object is calculated using the formula:
Distance = (Time × Speed of Sound) / 2
where the speed of sound in air is approximately 343 meters per second.
Pin Configuration
1. VCC: Power supply pin, connected to a 5V DC source.
2. Trig: Trigger pin, connected to a microcontroller's digital output pin to initiate measurement.
3. Echo: Echo pin, outputs a pulse whose width corresponds to the measured distance, connected to a microcontroller's digital input pin.
4. GND: Ground pin, connected to the ground of the power supply.
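The timing sequence above can be exercised with a short Arduino sketch. The sketch below is a minimal illustration, assuming the Trig and Echo pins are wired to digital pins 9 and 10 (the actual prototype wiring may differ):

```cpp
// HC-SR04 distance measurement: 10 µs trigger pulse, then the echo pulse width
// is converted to centimetres using distance = (time × speed of sound) / 2.
const int trigPin = 9;   // assumed wiring
const int echoPin = 10;  // assumed wiring

void setup() {
  Serial.begin(9600);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
}

void loop() {
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);               // 10 µs HIGH pulse starts a measurement
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  long duration = pulseIn(echoPin, HIGH);    // echo pulse width in microseconds
  float distanceCm = duration * 0.0343 / 2;  // 343 m/s = 0.0343 cm/µs, halved for the round trip

  Serial.print("Distance: ");
  Serial.print(distanceCm);
  Serial.println(" cm");
  delay(100);
}
```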
3.2.2 Maxbotix MB1200 XL-MaxSonar-EZ0 Ultrasonic Sensor
Technical Specifications:
Resolution: 1cm
Output Options: Analog Voltage Output, Pulse Width Output, Serial Output
Dimensions: 22mm x 22mm x 22mm
Working Principle: The MB1200 XL-MaxSonar-EZ0 uses ultrasonic waves to measure the
distance to an object. The sensor emits a high-frequency sound pulse and then listens for the
echo that reflects back from the target. The time taken for the echo to return is used to calculate
the distance to the object based on the speed of sound.
Triggering :The sensor can be triggered by supplying power or via a command from a
microcontroller.
Echo Reception: The sensor listens for the echo of the pulse that bounces back from the target
object.
Distance Calculation: The time between the emission of the pulse and the reception of the echo
is measured. The distance is calculated using the formula:
Distance = (Time × Speed of Sound) / 2
The sensor provides this distance information through analog, pulse width, or serial output.
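Reading the sensor through its analog output is straightforward on an Arduino. The sketch below is a minimal illustration, assuming the AN pin is wired to A0 as in Section 5.1.4; for the XL-MaxSonar-EZ series the analog output is commonly quoted as (Vcc/1024) per cm, so with a 5 V supply each 10-bit ADC count corresponds to roughly 1 cm:

```cpp
// MB1200 XL-MaxSonar-EZ0 read via its analog (AN) output.
const int sonarPin = A0;  // assumed wiring, as in Section 5.1.4

void setup() {
  Serial.begin(9600);
}

void loop() {
  int distanceCm = analogRead(sonarPin);  // roughly 1 ADC count per centimetre at 5 V
  Serial.print("MB1200 distance: ");
  Serial.print(distanceCm);
  Serial.println(" cm");
  delay(100);  // the sensor updates its reading about 10 times per second
}
```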
Pin Configuration
Applications:
Robotics: For obstacle detection and avoidance in robotic systems.
Drone Altitude Holding: Maintaining stable altitude by measuring the distance to the ground.
Advantages:
Multiple Output Options: Flexibility in interfacing with different systems via analog, PWM,
and serial outputs.
Limitations:
Surface Dependency: Best performance with flat and hard surfaces; accuracy may decrease
with soft or irregular surfaces.
Beam Angle: The 20-degree beam angle might not be suitable for applications requiring wider
coverage.
The Maxbotix MB1200 XL-MaxSonar-EZ0 ultrasonic sensor is a reliable and versatile tool for
distance measurement applications. Its long-range capability, multiple output options, and high
resolution make it suitable for a wide array of applications, from robotics to industrial
automation. While environmental conditions and surface types can affect its performance, its
overall advantages make it a valuable sensor for projects requiring precise and accurate
distance measurements.
FIG 3.2.2 Maxbotix MB1200 XL-MaxSonar-EZ0
3.2.3 TF MINI Micro LiDAR
The TF MINI Micro LiDAR is a compact and cost-effective laser distance sensor that offers
high accuracy and reliability for distance measurement. It is designed for applications requiring
precise and continuous distance detection in a small form factor.
Technical Specifications:
Operating Voltage: 5V DC
Operating Current: ≤ 120mA
Measurement Range: 30cm to 12m
Resolution: 1cm
Accuracy: ±6cm (30cm to 6m); ±1% (6m to 12m)
Frame Rate: 100 Hz
Wavelength: 850 nm
Field of View: 2.3°
Output Options: UART, I2C
Dimensions: 42mm x 15mm x 16mm
Weight: 10 grams
Working Principle: The TF MINI Micro LiDAR uses time-of-flight (ToF) technology to
measure distance. The sensor emits a near-infrared laser pulse and calculates the time taken for
the pulse to reflect back from the target object. This time difference is used to determine the
distance to the object.
Emitting Laser Pulse: The sensor emits a laser pulse at 850 nm wavelength.
Reception of Reflected Pulse: The sensor detects the reflected laser pulse that bounces back
from the target object.
Distance Calculation: The time taken for the laser pulse to travel to the object and back is
measured. The distance is calculated using the speed of light and the formula:
Distance = (Speed of Light × Time) / 2
The sensor provides this distance information through UART or I2C output.
Pin Configuration:
1. VCC: Power supply pin, connected to a 5V DC source.
2. GND: Ground pin, connected to the ground of the power supply.
3. TX: UART transmission pin for distance data output.
4. RX: UART reception pin for configuration and commands.
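On the Arduino side, the UART output can be read and decoded with a few lines of code. The sketch below is a minimal illustration, assuming the sensor's TX and RX lines are wired to digital pins 2 and 3 through SoftwareSerial and that the module sends its default 9-byte data frame (header 0x59 0x59 followed by the distance low and high bytes); the prototype in Chapter 5 instead uses the hardware serial pins:

```cpp
#include <SoftwareSerial.h>

// TF Mini frame: 0x59 0x59 | Dist_L | Dist_H | Strength_L | Strength_H | reserved | quality | checksum
// Note: SoftwareSerial can be marginal at 115200 baud on an UNO; hardware serial is more robust.
SoftwareSerial tfmini(2, 3);  // RX, TX (assumed wiring)

void setup() {
  Serial.begin(9600);
  tfmini.begin(115200);       // TF Mini default baud rate
}

void loop() {
  if (tfmini.available() >= 9 && tfmini.read() == 0x59 && tfmini.peek() == 0x59) {
    tfmini.read();                      // consume the second header byte
    uint8_t frame[7];
    tfmini.readBytes(frame, 7);         // distance, strength, reserved, quality, checksum
    uint16_t distanceCm = frame[0] | (frame[1] << 8);
    Serial.print("LiDAR distance: ");
    Serial.print(distanceCm);
    Serial.println(" cm");
  }
}
```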
Applications: The TF MINI Micro LiDAR is versatile and suitable for various applications,
including:
Robotics: For obstacle detection and navigation in robotic systems.
Drones: Altitude holding and terrain following in UAVs.
Automated Guided Vehicles (AGVs): Navigation and obstacle avoidance.
Distance Measurement: Precise measurement in industrial automation.
Security Systems: Intrusion detection and monitoring.
Smart Parking: Vehicle detection and parking assistance.
Advantages:
Compact Size: Small and lightweight, making it easy to integrate into space-constrained
applications.
High Accuracy: Provides accurate distance measurements with a high resolution of 1cm.
Multiple Interface Options: Supports both UART and I2C communication for flexible
integration.
High Frame Rate: Capable of 100 Hz frame rate for real-time distance measurement.
Low Power Consumption: Energy-efficient design suitable for battery-powered applications.
Limitations: Limited Range: Effective measurement range is up to 12 meters, which may be
insufficient for some long-range applications.
Environmental Sensitivity: Performance can be affected by environmental factors such as
ambient light and weather conditions.
Narrow Field of View: The 2.3° field of view might limit its use in applications requiring wider
area coverage.
The TF MINI Micro LiDAR is an efficient and reliable sensor for short to medium-range
distance measurement applications. Its compact size, high accuracy, and multiple interface
options make it ideal for integration into various systems, from robotics to drones and industrial
automation. While its range and field of view limitations should be considered, its overall
advantages provide significant value for precise distance sensing in a small package.
3.2.4 HC-05 Bluetooth Module
The HC-05 Bluetooth module is a popular, versatile, and cost-effective solution for wireless
communication between devices. It operates using the Bluetooth Serial Port Protocol (SPP)
and is designed to establish a transparent wireless serial connection. This module is widely
used in various applications including robotics, home automation, and wireless data transfer
systems.
Technical Specifications:
Dimensions: 34mm x 15mm x 2.2mm
Working Principle: The HC-05 Bluetooth module operates by creating a wireless serial
communication link between two devices. It can be configured either as a master or slave
device, allowing it to initiate a connection or wait for another device to connect to it.
Powering Up: Connect the VCC pin to a 3.3V to 5V power supply and the GND pin to the
ground.
Initialization: The module initializes and is ready to be paired with another Bluetooth device.
It can enter AT command mode for configuration.
Pairing: In slave mode, the module waits for a pairing request from a master device. In master
mode, it searches for available slave devices and initiates pairing.
Data Transmission: Once paired, data sent from one device is wirelessly transmitted and
received by the other, effectively acting as a serial link.
Pin Configuration
1. EN: Enable/disable pin (active HIGH). When pulled high, the module is enabled.
2. VCC: Power supply pin, connected to 3.3V to 5V DC.
3. GND: Ground pin, connected to the ground of the power supply.
4. TXD: Transmit pin, sends serial data to the connected device.
5. RXD: Receive pin, receives serial data from the connected device.
6. STATE: Connection status pin, goes high when connected.
7. KEY: Used to enter AT command mode for configuration (connected to 3.3V or left floating).
Applications: The HC-05 Bluetooth module is versatile and can be used in numerous
applications, such as:
Wireless Data Transmission: Transmitting data wirelessly between microcontrollers or other
devices.
Robotics: Enabling remote control of robots via Bluetooth.
Home Automation: Controlling home appliances wirelessly.
Health Monitoring Systems: Transmitting data from medical sensors to smartphones or
computers.
Advantages:
Compact Size: Small form factor suitable for integration into various projects.
Reliable: Stable wireless communication with good range and data rate.
Limitations:
Range: Limited to around 10 meters, which may be insufficient for some applications.
Interference: Can be affected by interference from other 2.4GHz devices.
Power Consumption: Consumes more power compared to other low-energy Bluetooth
modules.
Configuration Complexity: Requires AT commands for configuration, which may be
challenging for beginners.
The HC-05 Bluetooth module is a powerful and flexible tool for implementing wireless serial
communication in a wide range of applications. Its ease of use, affordability, and reliable
performance make it an excellent choice for hobbyists and professionals alike. Despite some
limitations, such as range and power consumption, it remains a popular module for projects
involving wireless data transmission and control.
The HC-05 Bluetooth module has two operating modes: Command and Data. The HC-05 is in
Data mode by default, and you can switch to Command mode by pressing the onboard push
button. The module follows the IEEE 802.15.1-2005 Bluetooth standard.
In Data mode, the HC-05 can send and receive data from other Bluetooth devices. The default
baud rate in Data mode is 9600.
In Command mode, you can use AT commands to change the HC-05's settings and parameters,
such as the baud rate, module name, and whether it's a master or slave device. You can use a
serial to TTL converter and a PC running terminal software to change the system parameters,
and these changes will remain even after you turn off the power. The default baud rate in
Command mode is 38400.
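In Data mode the module behaves as a transparent serial link, so streaming readings to the phone only requires writing to a serial port. The sketch below is a minimal illustration, assuming the HC-05 TXD and RXD lines are wired to digital pins 4 and 5 (with a voltage divider on RXD) and that a serial terminal app on the paired phone displays the received text:

```cpp
#include <SoftwareSerial.h>

// Stream a distance reading to the phone over the HC-05 in Data mode (default 9600 baud).
SoftwareSerial bt(4, 5);  // RX, TX (assumed wiring)

void setup() {
  Serial.begin(9600);     // USB serial for debugging
  bt.begin(9600);         // HC-05 default Data-mode baud rate
}

void loop() {
  int distanceCm = analogRead(A0);  // e.g. MB1200 analog output, roughly 1 count per cm
  bt.print("Distance: ");
  bt.print(distanceCm);
  bt.println(" cm");                // appears in the phone's serial terminal app
  delay(500);
}
```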
3.2.5 Arduino UNO
The Arduino UNO is one of the most popular and widely used microcontroller boards in the
Arduino family. It is based on the ATmega328P microcontroller and is designed for beginners
and hobbyists, as well as for more advanced users who need a flexible and powerful platform
for their projects. The Arduino UNO is known for its ease of use, robustness, and extensive
community support.
Technical Specifications:
Microcontroller: ATmega328P
Operating Voltage: 5V
Input Voltage (recommended): 7-12V
Input Voltage (limits): 6-20V
Digital I/O Pins: 14 (of which 6 provide PWM output)
Analog Input Pins: 6
DC Current per I/O Pin: 20 mA
DC Current for 3.3V Pin: 50 mA
Flash Memory: 32 KB (ATmega328P) of which 0.5 KB used by bootloader
SRAM: 2 KB (ATmega328P)
EEPROM: 1 KB (ATmega328P)
Clock Speed: 16 MHz
LED_BUILTIN: 13
Length: 68.6 mm
Width: 53.4 mm
Weight: 25 g
Key Components:
Microcontroller: The ATmega328P is the brain of the board, handling all processing tasks.
Power Supply: The board can be powered via a USB connection or an external power supply
(barrel jack or Vin pin).
Digital I/O Pins: These pins can be configured as inputs or outputs for interfacing with other
components.
Analog Input Pins: Used to read analog signals from sensors.
PWM Pins: Six digital I/O pins can generate PWM signals for controlling devices like motors
and LEDs.
USB Interface: The USB connection is used for programming the board and for serial
communication with a computer.
Reset Button: Resets the microcontroller.
Power LED Indicator: Indicates that the board is powered on.
Built-in LED: Connected to digital pin 13, useful for basic tests and debugging.
Pin Configuration:
Digital Pins (0-13): Can be used for general-purpose I/O, with pins 3, 5, 6, 9, 10, and 11
providing PWM output.
Analog Pins (A0-A5): Can be used for reading analog sensors, each providing 10-bit resolution.
Power Pins:
Vin: Input voltage to the Arduino when using an external power source.
5V: Regulated 5V from the regulator on the board.
3.3V: A 3.3V supply generated by the onboard regulator.
GND: Ground pins.
Special Pins:
AREF: Reference voltage for the analog inputs.
Reset: Used to reset the microcontroller.
The Arduino UNO is a powerful, flexible, and user-friendly microcontroller board that is ideal
for a wide range of applications. Its ease of use, robust design, and extensive community
support make it an excellent choice for both beginners and experienced users. Despite its
limitations, the Arduino UNO remains one of the most popular platforms for electronic
prototyping and education.
CHAPTER 4
4. Flow chart
FIG 4: Flowchart of the system operation (Start → transmit sensor data to the mobile phone → if transmission is not completed, delay by 1 s and repeat → Stop)
CHAPTER 5
5.1 Design Implementation
5.1.1 Interface of Arduino with HC-05
The image shows an Arduino board connected to an HC-05 Bluetooth module. To ensure proper
functionality, here is a brief overview of the connections needed between the Arduino and the
HC-05 module:
Connections:
5.1.2 Interface of Arduino with TF-Mini micro LiDAR
The image shows an Arduino board connected to a TF-Mini micro LiDAR. To ensure proper
functionality, here is a brief overview of the connections needed between the Arduino and the
LiDAR module:
Connections:
LiDAR TX to Arduino RX
LiDAR RX to Arduino TX
5.1.3 Interface of Arduino with HC-SR04 ultrasonic sensor
The image shows an Arduino board connected to an HC-SR04. To ensure proper functionality,
here is a brief overview of the connections needed between the Arduino and the HC-SR04
module:
Connections:
5.1.4 Interface of Arduino with MB1200 ultrasonic sensor
The image shows an Arduino board connected to an MB1200 Ultrasonic sensor. To ensure
proper functionality, here is a brief overview of the connections needed between the Arduino
and the MB1200 ultrasonic sensor module:
Connections:
MB1200 AN to Arduino A0
5.1.5 Interface of all components
[Figure: Arduino UNO interfaced with the ultrasonic sensors, the LiDAR, and the HC-05 Bluetooth module]
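To show how the individual interfaces above come together, the sketch below is an illustrative end-to-end example (all pin assignments are assumptions, not the exact prototype wiring): the switch enables sensing, the three sensors are polled in turn, and the readings are sent to the phone through the HC-05 module.

```cpp
#include <SoftwareSerial.h>

// Assumed wiring: HC-SR04 Trig/Echo on D9/D10, MB1200 AN on A0, TF Mini on D2/D3,
// HC-05 on D4/D5, and an on/off switch to GND on D7 with the internal pull-up.
SoftwareSerial lidar(2, 3);   // TF Mini: TX -> D2, RX -> D3
SoftwareSerial bt(4, 5);      // HC-05: TXD -> D4, RXD -> D5 (via level divider)
const int switchPin = 7, trigPin = 9, echoPin = 10, sonarPin = A0;

long readHcsr04Cm() {
  digitalWrite(trigPin, LOW);  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH); delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  return pulseIn(echoPin, HIGH) * 0.0343 / 2;    // short-range distance in cm
}

int readTfminiCm() {
  lidar.listen();                                // only one SoftwareSerial can receive at a time
  unsigned long start = millis();
  while (millis() - start < 100) {               // wait up to 100 ms for a fresh frame
    if (lidar.available() >= 9 && lidar.read() == 0x59 && lidar.peek() == 0x59) {
      lidar.read();                              // consume second header byte
      uint8_t f[7];
      lidar.readBytes(f, 7);
      return f[0] | (f[1] << 8);                 // long-range distance in cm
    }
  }
  return -1;                                     // no valid frame received
}

void setup() {
  pinMode(switchPin, INPUT_PULLUP);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  lidar.begin(115200);                           // TF Mini default baud rate
  bt.begin(9600);                                // HC-05 default Data-mode baud rate
}

void loop() {
  if (digitalRead(switchPin) == HIGH) {          // switch open: system idle
    delay(200);
    return;
  }
  long shortCm  = readHcsr04Cm();
  int  mediumCm = analogRead(sonarPin);          // MB1200: roughly 1 ADC count per cm at 5 V
  int  longCm   = readTfminiCm();
  bt.print("SR04:");    bt.print(shortCm);
  bt.print(" MB1200:"); bt.print(mediumCm);
  bt.print(" LiDAR:");  bt.println(longCm);      // one text line per cycle for the phone app
  delay(1000);                                   // matches the 1 s delay in the flow chart
}
```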
5.1.6 Hardware setup
The developed prototype is a wearable device, as shown in the figure. It is designed so that a
blind person can carry it easily, and it is portable.
CHAPTER 6
6.1 Results and Discussions
[Figure: Distance (cm) vs. samples with no obstacle, head down, for the forehead (FH), chest (CH), and leg (LG) sensors]
FH: With the forehead sensor facing the ground, the distance varies only slightly; when a
pothole or a step appears in the path, the distance increases or decreases noticeably.
CH: With no obstacle near the chest sensor, the reported distance is very high.
LG: The leg sensor captures the walking motion as small variations in the graph; the occasional
spikes are due to path roughness.
[Figure: Distance (cm) vs. samples with no obstacle, head up, for the forehead (FH), chest (CH), and leg (LG) sensors]
CH: The chest sensor reports no obstacle over its long range; a distant object appears only as a
small change in the distance value.
LG: The leg sensor captures the walking motion as small variations in the graph; the occasional
spikes are due to path roughness.
[Figure: Distance (cm) vs. samples with an obstacle, head down, for the forehead (FH), chest (CH), and leg (LG) sensors]
FH: The forehead sensor shows a clear obstacle; as the obstacle moves, the measured distance
varies in the graph.
CH: The chest sensor behaves similarly: before the obstacle the distance is somewhat high, and
when the obstacle arrives the graph shows a clear distance to the target.
LG: The leg sensor mainly measures the distance towards the ground, which varies strongly
while walking; a high reading indicates no obstacle, while a low reading indicates an obstacle
ahead. Overall, the leg sensor detects straight-ahead targets with very low accuracy.
[Figure: Distance (cm) vs. samples while stepping down with head up, for the forehead (FH), chest (CH), and leg (LG) sensors]
CH: While stepping down, the chest sensor measures the distance to the free space ahead, so
the descending stair is detected.
LG: The leg sensor readings alternate between two values, but this is not very effective for
detecting downward stairs.
[Figure: Distance (cm) vs. samples while stepping down with head down, for the forehead (FH) and chest (CH) sensors]
CH: While stepping down, the chest sensor measures the distance to the free space ahead, so
the descending stair is detected.
LG: The leg sensor readings alternate between two values, but this is not very effective for
detecting downward stairs.
[Figure: Distance (cm) vs. samples while stepping up with head up, for the forehead (FH), chest (CH), and leg (LG) sensors]
CH: While climbing the stairs, the chest sensor reading varies, but only slightly; when it drops
to a very low value, the end wall is sensed.
LG: The leg sensor alternates between two distance values, so this pattern can be classified as
climbing upstairs.
[Figure: Distance (cm) vs. samples while stepping up with head down, for the forehead (FH), chest (CH), and leg (LG) sensors]
FH: While climbing the stairs with the head down, the forehead sensor gives a sharp distance
value to the steps, so upward stairs can be detected.
CH: While climbing, the chest sensor shows no detection until an end wall or obstacle appears.
LG: The leg sensor alternates between two distance values, so this pattern can be classified as
climbing upstairs.
[Figure: Distance (cm) vs. samples while walking with no obstacle, for the forehead (FH), chest (CH), and leg (LG) sensors]
CH: The chest sensor reports no detection for a long time until an obstacle appears; the spike
seen here is caused by the user's hand.
LG: The leg sensor picks up the leg's motion in its distance measurements.
[Figure: Distance (cm) vs. samples while walking with an obstacle, for the forehead (FH), chest (CH), and leg (LG) sensors]
FH: The forehead sensor shows clear obstacle detection while the person is walking; when
there is no obstacle, the distance reads zero.
CH: The chest sensor also shows clear obstacle detection while walking, although the values
fluctuate due to body movement; when there is no obstacle, the distance reads zero.
LG: The leg sensor reads zero when there is no obstacle and only changes once an obstacle is
detected.
6.2 Power Analysis
The three batteries connected in series can power the 1.17 W device for approximately 24.67
hours, so the designed system needs recharging roughly once every 24 hours. The chosen
batteries are rated for about 1000 charge cycles, so the lifetime of the batteries is approximately
3 years.
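As a check on these figures, a back-of-the-envelope calculation is given below, assuming three 3.7 V, 2600 mAh lithium-ion cells in series (the cell capacity is an assumption chosen to reproduce the stated runtime):
Energy = 3 × 3.7 V × 2.6 Ah ≈ 28.9 Wh
Runtime = 28.9 Wh / 1.17 W ≈ 24.7 h
Lifetime ≈ 1000 cycles / (about 365 recharges per year) ≈ 2.7 years, i.e. roughly 3 years.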
CHAPTER 7
7. Conclusions and Future Scope
Conclusion:
The designed prototype combines LiDAR and ultrasonic sensors, interfaced with a
microcontroller, to gather distance measurements. The processed data is then transmitted via
Bluetooth to a mobile phone. This configuration aims to provide a flexible and accurate
distance measurement system, leveraging the strengths of each sensor for different scenarios.
The LiDAR offers high precision with a narrow field of view for long-range measurements,
while the ultrasonic sensors are cost-effective solutions for shorter ranges. The Bluetooth
module ensures wireless data communication, enabling real-time monitoring and analysis on a
mobile device.
Incorporating machine learning (ML) algorithms can significantly enhance the capabilities and
applications of this project. The future scope could include:
1. Sensor Fusion and Data Accuracy: ML algorithms can be used to fuse data from the LiDAR,
and Ultrasonic Sensor. By training models to learn the characteristics and accuracy of each
sensor in various environments, the system can provide more accurate and reliable distance
measurements, mitigating the limitations of individual sensors.
2. Anomaly Detection: ML can help in identifying anomalies or outliers in the sensor data. This
is particularly useful in detecting malfunctioning sensors or unusual environmental conditions
that might affect measurements.
By integrating ML algorithms, the proposed system can evolve into an intelligent, adaptable,
and highly reliable distance measurement solution with wide-ranging applications in robotics,
autonomous vehicles, and industrial automation.
CHAPTER 8
8. References
[1] Marzec, P., & Kos, A. (2019, June). Low energy precise navigation system for the blind
with infrared sensors. In 2019 MIXDES - 26th International Conference "Mixed Design of
Integrated Circuits and Systems" (pp. 394-397). IEEE.
[2] Nada, A. A., Fakhr, M. A., & Seddik, A. F. (2015, July). Assistive infrared sensor based
smart stick for blind people. In 2015 science and information conference (SAI) (pp. 1149-
1154). IEEE.
[3] Patil, K., Jawadwala, Q., & Shu, F. C. (2018). Design and construction of electronic aid for
visually impaired people. IEEE Transactions on Human-Machine Systems, 48(2), 172-182.
[4] Ton, C., Omar, A., Szedenko, V., Tran, V. H., Aftab, A., Perla, F., ... & Yang, Y. (2018).
LIDAR assist spatial sensing for the visually impaired and performance analysis. IEEE
Transactions on Neural Systems and Rehabilitation Engineering, 26(9), 1727-1734.
[5] Ton, C., Omar, A., Szedenko, V., Tran, V. H., Aftab, A., Perla, F., ... & Yang, Y. (2018).
LIDAR assist spatial sensing for the visually impaired and performance analysis. IEEE
Transactions on Neural Systems and Rehabilitation Engineering, 26(9), 1727-1734.
[6] Ghafoor, M. J., Junaid, M., & Ali, A. (2019, August). Nav-cane a smart cane for visually
impaired people. In 2019 International Symposium on Recent Advances in Electrical
Engineering (RAEE) (Vol. 4, pp. 1-4). IEEE.
[7] Sen, A., Sen, K., & Das, J. (2018, October). Ultrasonic blind stick for completely blind
people to avoid any kind of obstacles. In 2018 IEEE SENSORS (pp. 1-4). IEEE.