
THIRD EYE FOR VISUALLY CHALLENGED

PROJECT REPORT

Submitted in fulfilment of the requirements for


the award of the degree of

Bachelor of Technology
in
Electronics and Communication Engineering
DASARI KAIVALYA [201FA05008]
GRANDHI SANJAY [201FA05015]
MALLELA VISHNU TEJA [211LA05022]

Under the Esteemed Guidance of


Dr. G. S. R. Satyanarayana
Assistant Professor
Department of ECE

(ACCREDITED BY NAAC WITH ‘A+’ GRADE)

DEPARTMENT OF ELECTRONICS AND COMMUNICATION ENGINEERING


(ACCREDITED by NBA)

VIGNAN’S FOUNDATION FOR SCIENCE, TECHNOLOGY AND RESEARCH


(Deemed to be University)

Vadlamudi, Guntur, Andhra Pradesh, India – 522213

May 2024
ACKNOWLEDGEMENT

The satisfaction that accompanies the successful completion of any task would be
incomplete without the mention of people who made it possible and whose constant guidance
and encouragement crown all the effort with success.
We are greatly indebted to Dr. G. S. R. Satyanarayana, our revered guide and Assistant
Professor in the Department of Electronics and Communication Engineering, VFSTR (Deemed
to be University), Vadlamudi, Guntur, for his valuable guidance in the preparation of this
dissertation. He has been a source of great inspiration and encouragement to us. He has been
kind enough to devote a considerable amount of his valuable time to guiding us at every stage.
This is our debut, but we are sure that we can take up many more such studies, purely because
of the lasting inspiration and guidance given by our respected guide.
We would like to specially thank Dr. T. Pitchaiah, Head of the Department of ECE, for
his valuable suggestions.

We would also like to thank Dr. N. Usha Rani, Dean, School of ECE, for her help
and support during the project work.

We thank our project coordinators, Dr. Satyajeet Sahoo, Dr. Arka Bhattacharyya, Mr.
M. Vamshi Krishna, and Mr. Abhishek Kumar, for their continuous support and suggestions in
scheduling the reviews and report verifications. We also thank the supporting staff of the ECE
department for their technical support towards the timely completion of the project.

We wish to express our gratitude to Dr. P. Nagabhushan, Vice-Chancellor, VFSTR
(Deemed to be University), for providing us the opportunity to gain valuable exposure and to
carry out the project.

Finally, we would like to thank our parents and friends for their moral support throughout
the project work.

Name of the candidates

D. Kaivalya (201FA05008)
G. Sanjay (201FA05015)
M. Vishnu Teja (211LA05022)
ABSTRACT
Millions of humans rely on corrective lenses, but for the blind, current assistive
technologies offer limited support. This project proposes a low-cost, wearable device to
empower them with improved navigation. This unobtrusive device, mountable on canes,
glasses, clothing, or gloves, utilizes two key sensors: LiDAR and ultrasonic. The acquired
sensor data is transmitted via Bluetooth to a user-friendly smartphone app. This app translates
complex readings into clear information, offering audio alerts about object distances or
integrating with other apps for image recognition or navigation assistance. This newfound
awareness of surroundings empowers blind users to navigate with greater confidence and
freedom. The focus on affordability, achieved through readily available components and
efficient design, makes the device accessible to a wider range of users. This has the potential
to be a significant advancement, granting increased independence and a sense of security to
those with visual impairments.
CONTENTS
PAGE NO

CHAPTER 1…………………………………………………………………...01

1.1 Introduction…………………………………………………………………02
1.2 Motivation…………………………………………………………………..03
1.3 Objective……………………………………………………………………04
1.3.1 Broad Objective………………………………………………….04
1.3.2 Specific Objective……………………………………………….04
1.4 Organization of thesis……………………………………………………....04

CHAPTER 2…………………………………………………………………..05

2. Literature survey…………………………………………………….….…..06-09

CHAPTER 3……………………………………………………………………10

3.1 Proposed system……………………………………………………...….11-12

3.2 Components……….……………………………………………………..13-23

3.2.1 HC-SR04 Ultrasonic sensor………………………………...…. 13-14

3.2.2 Maxbotix Ultrasonic sensor(MB1200 XL-MaxSonar-EZ0)…..14-17

3.2.3 TF-Mini Micro LiDAR……………………………….……..…17-19

3.2.4 HC-05 Bluetooth module………………………………………19-22

3.2.5 Arduino UNO…………………………………………………..22-23

CHAPTER 4…………………………………………………………………..24

4. Flow chart…………………………………………………………………...25
CHAPTER 5……………………………………………………………………....26

5.1 Design Implementation

5.1.1 Interface of Arduino with HC-05……………………………………..27

5.1.2 Interface of Arduino with LiDAR………………………………..…27-28

5.1.3 Interface of Arduino with HC-SR04………………..…………….…..28

5.1.4 Interface of Arduino with MB1200…………………………………..29

5.1.5 Interface of all components……………………………………….…..30

5.1.6 Hardware setup……………………………………………………….31

CHAPTER 6……………………………………………………………………….32

6.1 Results and Discussions………………………………………………………33-41

6.2 Power Analysis………………………………………………………………….42

CHAPTER 7……………………………………………………………………….43

7. Conclusion and Future Scope……………………………………………………44

CHAPTER 8………………………………………………………………………45

8. References……………………………………………………………………….46
TABLE OF FIGURES
FIG NO NAME OF THE FIGURE PAGE NO

3.1 Proposed system 11

3.2.1 HC-SR04 Ultrasonic sensor 14

3.2.2 (MB1200 XL-MaxSonar-EZ0) 17

3.2.3 TF-Mini Micro LiDAR 19

3.2.4 HC-05 Bluetooth module 21

3.2.5 Arduino UNO 23

5.1.1 Arduino with HC-05 27

5.1.2 Arduino with LiDAR 27

5.1.3 Arduino with HC-SR04 28

5.1.4 Arduino with MB1200 29

5.1.5 Interface of components 30

5.1.6 Hardware Setup 31

6.1 No obstacle Head down 33

6.2 No obstacle Head up 34

6.3 Obstacle Head up 35

6.4 Step Down Head up 36

6.5 Step Down Head Down 37

6.6 Step Up Head Up 38

6.7 Steps Up Head down 39

6.8 No obstacle walking 40

6.9 Obstacle while walking 41


LIST OF ACRONYMS
CPU Central Processing Unit

ETA Electronic Travel Aid

EEPROM Electrically Erasable Programmable Read-Only Memory

GND Ground

GFSK Gaussian Frequency Shift Keying

GPS Global Positioning System

GSM Global System for Mobile Communications

HC-SR04 Ultrasonic ranging sensor module

HC-05 Bluetooth serial communication module

IR Infrared

I2C Inter-Integrated Circuit

LiDAR Light Detection And Ranging

ML Machine Learning

PWM Pulse Width Modulation

SRAM Static Random Access Memory

TTL Transistor-Transistor Logic

Uno One in Italian

UAV Unmanned Aerial Vehicles

VCC Voltage at the Common Collector

UART Universal Asynchronous Receiver/Transmitter

CHAPTER 1

1.1 Introduction
Imagine a world where navigating unfamiliar environments becomes achievable and
independent movement a reality for the visually challenged. This report presents the
development of a "Third Eye," an assistive device built with accessibility in mind. This system
leverages the power of Arduino UNO, a user-friendly microcontroller, to process data from
multiple sensors and provide crucial spatial awareness.

The core obstacle detection relies on an ultrasonic sensor, which emits sound waves and
measures their echo to determine the distance to nearby objects. For enhanced precision and
detailed object recognition, a LiDAR sensor can be integrated, creating a more comprehensive
picture of the user's surroundings.

Real-time feedback is paramount for this assistive device. The Third Eye utilizes a
Bluetooth module to wirelessly transmit sensor data to a custom smartphone application. This
application, designed as a serial monitor, translates the data into a user-friendly format.
Through a combination of audible alerts or vibrations, the user receives immediate notifications
about potential obstacles, empowering them to navigate with greater confidence.

This report will explore the hardware components chosen, delve into the software
development process for the Arduino platform, and detail the design of the smartphone
application. Additionally, it will discuss the implemented testing procedures used to evaluate
the Third Eye's effectiveness and its potential to revolutionize independent navigation for
visually challenged individuals.

1.2 Motivation

The inability to see presents a significant barrier to navigating the world freely. Individuals
with visual impairments often rely on assistance or limited tools for movement, hindering their
independence and mobility. This project, the development of a "Third Eye," aims to bridge this
gap and empower individuals with visual challenges.

The motivation for this project stems from a desire to create a more inclusive and
accessible world. By utilizing readily available technologies like Arduino UNO, ultrasonic
sensors, and LiDAR, the Third Eye offers a cost-effective and adaptable solution. This system
goes beyond basic obstacle detection, potentially incorporating LiDAR for detailed object
recognition, creating a richer understanding of the environment.

Furthermore, the open-source nature of the Arduino platform allows for future
customization and development by the user community, fostering innovation and ensuring the
Third Eye can adapt to individual needs and environments.

Real-time feedback is crucial. The Bluetooth module and a dedicated smartphone


application ensure clear communication. The application, designed as a serial monitor,
translates sensor data into user-friendly alerts, empowering individuals with the information
needed to navigate with confidence. This project seeks to break down the walls of limitation
and provide a sense of freedom through improved spatial awareness.

The Third Eye goes beyond basic obstacle detection. Its modular design allows for future
expansion with additional sensors, like cameras for object identification. This flexibility
positions it as a platform for continuous improvement, promoting a future where assistive
technology seamlessly integrates with daily life.

Developing the Third Eye is not just about creating a technological solution; it's about
fostering independence, building confidence, and promoting a more inclusive world for those
who experience visual impairment. By offering a cost-effective and adaptable system, the Third
Eye has the potential to significantly improve the lives of many.

1.3 Objective
1.3.1 Broad Objective
The Third Eye aims to empower visually challenged individuals with a low-cost, CPU-based
assistive device combining ultrasonic and potential LiDAR sensors for obstacle detection
and distance estimation. Real-time data is wirelessly transmitted to a smartphone app,
providing immediate feedback through audible alerts. This modular, cost-effective solution
seeks to enhance independent navigation and can be expanded with additional sensors like
cameras for advanced object identification.

1.3.2 Specific Objective


The Third Eye enhances navigation for the visually challenged with reliable obstacle detection
(1-5 meters) using ultrasonic sensors, and potential LiDAR for precise object recognition.
Seamless Bluetooth communication (10-meter range) provides real-time feedback via
customizable audio or vibration alerts on a smartphone app.

1.4 Organization of Thesis


Chapter 1 introduces “THIRD EYE FOR VISUALLY CHALLENGED”, covering the
introduction, the motivation, and the broad and specific objectives.

Chapter 2 reviews the literature that has been studied and mentions the different existing
solutions along with their pros and cons.

Chapter 3 describes the components and their descriptions, and presents the proposed system
along with its block diagram.

Chapter 4 describes the operational flowchart of the prototype.

Chapter 5 describes the design implementation, the interfaces, and the working of the prototype.

Chapter 6 presents the results and discussions.

Chapter 7 concludes the work and gives the future scope.

CHAPTER 2
2. Literature survey
[1] Challenges: Existing navigation solutions (GPS, guide dogs) have limitations for the blind.

Infrared (IR) Sensors as a Solution: IR offers promise due to low power consumption, object
detection capabilities, and compact size for wearable devices.

Existing Research: Studies like "A Smart Infrared Microcontroller-Based Blind Guidance
System" (2013) explore IR for basic obstacle detection.

Limitations: Current IR solutions have limited range/detail and can be affected by the
environment.

Future Directions: Research could focus on sensor fusion with ultrasonics, advanced signal
processing for better object recognition, and user interface design for clear data interpretation.

Conclusion: IR sensors hold promise for low-energy navigation systems. By addressing


limitations, IR has the potential to become a valuable tool for the blind.

[2] Smart sticks are a growing assistive technology for the visually impaired. Common sensors
include ultrasonic (affordable, good range) and LiDAR (precise object recognition, but
expensive). Existing solutions like WeWALK and UltraCane use these sensors for obstacle
detection with vibration or voice feedback. Limitations include:

Sensor Range: Ultrasonic/infrared might have a shorter range than LiDAR.

Basic Detection: Some solutions only provide basic obstacle detection.

Cost: LiDAR integration can be expensive.

Future directions include:

Sensor Fusion: Combining sensors for wider range and richer information.

Advanced Feedback: Developing more sophisticated feedback mechanisms.

Cost-Effective Design: Making LiDAR integration more affordable. By addressing these


limitations, future smart sticks can empower visually impaired individuals with even greater
independence.

[3] Visually impaired individuals face challenges navigating unfamiliar environments.
Traditional tools like canes offer limited information, while guide dogs are expensive and
require training. Existing electronic aids include:

Ultrasonic ETAs: Use ultrasonic sensors for obstacle detection with audio/vibration alerts.

LiDAR Navigation Systems: Offer more precise object recognition but can be costly and
complex.

Smartphone Navigation Apps: Provide audio guidance and location information but rely on
pre-existing data.

These solutions have limitations:

Limited range (ultrasonic), Lack of object recognition (basic ETAs), Cost and complexity
(LiDAR), Reliance on external infrastructure (smartphone apps).

This project, the Third Eye, aims to address these issues by:

Combining ultrasonic and LiDAR sensors for better range, affordability, and recognition.

Using a real-time feedback smartphone app for customization and potential mapping
integration. Utilizing a modular design for future expansion with additional sensors.

The Third Eye has the potential to be a more comprehensive and adaptable assistive device.

[4] Visually impaired individuals rely on alternative methods like echolocation (requires
extensive training) to navigate. This paper explores assistive technologies using LiDAR for
spatial sensing. Existing Solutions:

Canes: Limited information beyond contact.

ETAs (Electronic Travel Aids): Improved range but limited object recognition.

Camera-based systems: Computationally expensive, struggle in low light.

LiDAR for Spatial Sensing: LiDAR offers advantages over traditional methods:

Improved obstacle detection: Greater distance and accuracy.

Object recognition: Potential to classify objects (doorways, furniture).

Real-time environment mapping: Crucial for navigation planning.

Performance Analysis in Literature: Existing studies on LiDAR for spatial sensing often
evaluate: Accuracy of obstacle detection, Object recognition capabilities, Usability, and user
experience.

This paper contributes by presenting a LiDAR-based system's development and performance


analysis for visually impaired individuals.

[5] Current assistive technology often offers limited obstacle detection for the visually
impaired. While short-range LiDAR or ultrasound provide some navigation aid, complex
environments demand more.

Research in autonomous vehicles highlights the potential of sensor fusion (LiDAR with
cameras/radar) for a more comprehensive picture of surroundings. The INSPEX project
exemplifies this with their "smart white cane" integrating long-range LiDAR for a more
detailed obstacle detection system. However, advancements are needed:

Lighting Variation: LiDAR performance can be limited in bright or low-light conditions.

Wearability: Sensor miniaturization and reduced power consumption are crucial for
lightweight, portable solutions.

User Interface Design: Translating LiDAR data into clear and actionable user feedback requires
robust processing algorithms and user-centered interface design.

By addressing these challenges, long-range LiDAR has the potential to transform obstacle
detection for the visually impaired. Future research focused on miniaturization, light
adaptability, and user-centered design will pave the way for next-generation assistive
technology, empowering greater independence and navigation freedom.

[6] Traditional white canes offer limited functionality. Research explores advancements:
Ultrasonic Sensors: Common for short-range obstacle detection (up to 5 meters) but lack detail
for complex situations.
Laser Canes: Offered longer range but were bulky and impractical.
Sensor Fusion: Combining sensors (LiDAR, cameras) creates a more comprehensive
understanding of surroundings.
Existing smart canes integrate features like:
GPS/GSM: For location tracking and emergency assistance.
Multi-Sensor Integration: Projects like INSPEX utilize long-range LiDAR for detailed obstacle
detection.
The Nav-Cane can contribute by focusing on (mention unique features).

[7] Traditional canes offer limited obstacle detection for the blind. Research explores ultrasonic
sensors, like those in [previous citation], for navigation assistance. These emit sound waves to
measure object distance. Commercially available ultrasonic blind sticks exist, but with
limitations: short range, potential for inaccurate readings, and lack of object recognition.
Future advancements are needed:
Multi-sensor integration (e.g., LiDAR) for a more comprehensive environment picture.
Advanced signal processing for improved accuracy. Object recognition features using machine
learning for specific feedback.
By addressing these limitations, ultrasonic technology can be a valuable tool in developing
more effective assistive devices for the visually impaired.

CHAPTER 3
3.1 Proposed System

FIG 3.1 Proposed system (block diagram: the LiDAR, the Maxbotix ultrasonic sensor, the
HC-SR04, and a switch feed the microcontroller, which communicates through a Bluetooth
module with a mobile phone)

The proposed system is an advanced obstacle detection and notification system designed to
assist visually impaired and blind individuals. It integrates multiple sensors, a microcontroller,
and wireless communication using Bluetooth standard to provide real-time obstacle detection
information via a mobile phone.
Components and Functionality:

Sensors:
LiDAR: Utilized for long-range obstacle detection, the LiDAR sensor accurately measures the
distance to objects using laser pulses. It is ideal for detecting obstacles that are farther away,
providing precise distance measurements.
Maxbotix Ultrasonic Sensor (MB1200 XL-MaxSonar-EZ0): This medium-range sensor emits
ultrasonic waves and measures the echo return time to calculate distances. It offers reliable
detection of obstacles at moderate distances, enhancing the system's ability to sense the
environment effectively.
HC-SR04 Ultrasonic Sensor: Employed for short-range detection, the HC-SR04 sensor
operates on the same principle as the Maxbotix sensor but is optimized for closer objects. It
ensures the system can detect nearby obstacles accurately.
Microcontroller: The microcontroller is the central processing unit of the system, receiving
input from all three sensors. It processes the sensor data to determine the presence and distance
of obstacles. The microcontroller is also responsible for managing the input from the switch
and controlling the Bluetooth module.
Switch: The switch allows the user to activate or deactivate the system. When turned on, the
microcontroller begins processing the input from the sensors to detect obstacles.
Bluetooth Module: The Bluetooth module, which is based on the IEEE 802.15.1-2005 standard,
enables wireless communication between the microcontroller and a mobile phone. After
processing the sensor data, the microcontroller sends relevant obstacle information to the
Bluetooth module, which then transmits it to the connected mobile phone.
Mobile Phone: The mobile phone receives the data from the Bluetooth module and provides
auditory or tactile feedback to the user about detected obstacles. This real-time notification
helps visually impaired individuals navigate their surroundings more safely and effectively.

System Operation:
When the user activates the system via the switch, the microcontroller starts collecting data
from the LiDAR, Maxbotix, and HC-SR04 sensors. The processed information about obstacle
distance and presence is then transmitted via Bluetooth to the user's mobile phone, which
provides immediate feedback. This integrated approach ensures comprehensive obstacle
detection across different ranges, enhancing the mobility and safety of visually impaired and
blind individuals.

3.2 Components

3.2.1 HC-SR04 Ultrasonic sensor


The HC-SR04 is a widely used ultrasonic sensor designed for measuring distances accurately
and affordably. It employs ultrasonic waves to detect the distance between the sensor and an
object, making it suitable for various applications, including obstacle detection, robotics, and
range finding.

Technical Specifications:

Operating Voltage: 5V DC

Operating Current: 15mA

Measurement Range: 2cm to 400cm

Measurement Accuracy: ±3mm

Operating Frequency: 40 kHz

Output: Digital pulse

Interface: 4-pin (VCC, Trig, Echo, GND)

Dimensions: 45mm x 20mm x 15mm

Working Principle: The HC-SR04 ultrasonic sensor operates based on the principle of
echolocation, similar to how bats navigate. It consists of two main components: a transmitter
and a receiver.

1. Triggering: The sensor is triggered by sending a 10-microsecond HIGH pulse to the Trig pin.

2. Emitting Ultrasonic Pulse: Upon receiving the trigger signal, the sensor's transmitter emits
an 8-cycle burst of ultrasonic sound waves at a frequency of 40 kHz.

3. Echo Reception: These sound waves travel through the air and, upon encountering an object,
get reflected towards the sensor. The sensor's receiver detects the reflected waves (echo).

4. Distance Calculation: The time interval between the transmission of the sound waves and
the reception of the echo is measured. The distance to the object is calculated using the formula:

Distance = (time × speed of sound) / 2

The speed of sound in air is approximately 343 meters per second.
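As an illustration of steps 1-4, the following is a minimal Arduino sketch for the HC-SR04. It assumes the Trig/Echo wiring on pins 9 and 10 used later in Section 5.1.3; the timeout and the print interval are arbitrary choices for this example rather than values from the report.

// Minimal HC-SR04 sketch (a sketch under the stated assumptions, not the project firmware)
const int trigPin = 9;    // HC-SR04 Trig
const int echoPin = 10;   // HC-SR04 Echo

void setup() {
  Serial.begin(9600);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
}

void loop() {
  // Step 1: 10-microsecond HIGH pulse on Trig
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  // Steps 2-3: the sensor emits the 40 kHz burst; Echo stays HIGH for the
  // round-trip time of the echo (a 30 ms timeout, roughly 5 m, is assumed here)
  unsigned long duration = pulseIn(echoPin, HIGH, 30000UL);

  // Step 4: Distance = (time x speed of sound) / 2, with 0.0343 cm per microsecond
  float distanceCm = duration * 0.0343 / 2.0;
  Serial.println(distanceCm);
  delay(100);
}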

Pin Configuration

1. VCC: Power supply pin, connected to a 5V DC source.

2. Trig: Trigger pin, connected to a microcontroller's digital output pin to initiate measurement.

3. Echo: Echo pin, connected to a microcontroller's digital input pin; it stays HIGH for a duration proportional to the measured distance.

4. GND: Ground pin, connected to the ground of the power supply.

FIG 3.2.1 HC-SR04 Ultrasonic sensor

3.2.2 Maxbotix Ultrasonic sensor (MB1200 XL-MaxSonar-EZ0)

The Maxbotix MB1200 XL-MaxSonar-EZ0 is a high-performance ultrasonic distance sensor


designed for a wide range of applications requiring accurate and reliable distance
measurements. It is particularly noted for its versatility, ease of use, and robust performance in
various environments.

Technical Specifications:

Operating Voltage: 2.5V to 5.5V DC

Operating Current: 2.1mA

Measurement Range: 20cm to 765cm (7.65 meters)

Resolution: 1cm

Beam Angle: 20 degrees

Output Options: Analog Voltage Output, Pulse Width Output, Serial Output

Operating Frequency: 42 kHz

Communication: TTL Serial, Analog, PWM

Dimensions: 22mm x 22mm x 22mm

Weight: 4.3 grams

Working Principle: The MB1200 XL-MaxSonar-EZ0 uses ultrasonic waves to measure the
distance to an object. The sensor emits a high-frequency sound pulse and then listens for the
echo that reflects back from the target. The time taken for the echo to return is used to calculate
the distance to the object based on the speed of sound.

Triggering: The sensor can be triggered by supplying power or via a command from a
microcontroller.

Emitting Ultrasonic Pulse: The sensor emits a 42 kHz ultrasonic pulse.

Echo Reception: The sensor listens for the echo of the pulse that bounces back from the target
object.

Distance Calculation: The time between the emission of the pulse and the reception of the echo
is measured. The distance is calculated using the formula:

Distance = (time × speed of sound) / 2

The sensor provides this distance information through analog, pulse width, or serial output.
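A minimal sketch for reading the MB1200 through its analog output is shown below. It assumes the AN-to-A0 wiring described later in Section 5.1.4 and the commonly quoted XL-MaxSonar analog scaling of about Vcc/1024 per cm; the exact conversion should be checked against the datasheet.

// Minimal MB1200 analog-output sketch (illustrative, under the stated assumptions)
const int anPin = A0;   // MB1200 AN output

void setup() {
  Serial.begin(9600);
}

void loop() {
  // With a 5 V supply the AN output scales at roughly Vcc/1024 per cm, so the
  // UNO's 10-bit ADC (5 V reference) reads approximately 1 count per centimetre.
  int distanceCm = analogRead(anPin);
  Serial.print("MB1200 distance: ");
  Serial.print(distanceCm);
  Serial.println(" cm");
  delay(100);   // the sensor refreshes its reading about 10 times per second
}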

Pin Configuration

1. V+: Power supply pin, connected to 2.5V to 5.5V DC source.

2. GND: Ground pin, connected to the ground of the power supply.

3. AN: Analog voltage output.

4. PW: Pulse width output.

5. TX: Serial output pin.

6. RX: Serial input pin.

7. BW: Bandwidth selection pin.

Applications:

The MB1200 XL-MaxSonar-EZ0 is suitable for a variety of applications including:

Robotics: For obstacle detection and avoidance in robotic systems.

Distance Measurement: Accurate distance measurement for industrial automation.

Level Sensing: Liquid level measurement in tanks and containers.

Security Systems: Intrusion detection by monitoring distance changes.

Drone Altitude Holding: Maintaining stable altitude by measuring the distance to the ground.

Automated Parking Systems: Assisting vehicles in parking by detecting nearby objects.

Advantages:

Long Range: Capable of measuring distances up to 7.65 meters.

Multiple Output Options: Flexibility in interfacing with different systems via analog, PWM,
and serial outputs.

High Resolution: Provides high resolution of 1cm for precise measurements.

Low Power Consumption: Energy-efficient, making it suitable for battery-powered


applications.

Easy Integration: Simple to integrate with various microcontroller platforms.

Limitations:

Environmental Sensitivity: Performance can be affected by environmental conditions such as


temperature and humidity.

Surface Dependency: Best performance with flat and hard surfaces; accuracy may decrease
with soft or irregular surfaces.

Beam Angle: The 20-degree beam angle might not be suitable for applications requiring wider
coverage.

The Maxbotix MB1200 XL-MaxSonar-EZ0 ultrasonic sensor is a reliable and versatile tool for
distance measurement applications. Its long-range capability, multiple output options, and high
resolution make it suitable for a wide array of applications, from robotics to industrial
automation. While environmental conditions and surface types can affect its performance, its
overall advantages make it a valuable sensor for projects requiring precise and accurate
distance measurements.

FIG 3.2.2 Maxbotix MB1200 XL-MaxSonar-EZ0

3.2.3 TF-Mini Micro LiDAR

The TF MINI Micro LiDAR is a compact and cost-effective laser distance sensor that offers
high accuracy and reliability for distance measurement. It is designed for applications requiring
precise and continuous distance detection in a small form factor.
Technical Specifications:

Operating Voltage: 5V DC
Operating Current: ≤ 120mA
Measurement Range: 30cm to 12m
Resolution: 1cm
Accuracy: ±6cm (30cm to 6m); ±1% (6m to 12m)
Frame Rate: 100 Hz
Wavelength: 850 nm
Field of View: 2.3°
Output Options: UART, I2C
Dimensions: 42mm x 15mm x 16mm
Weight: 10 grams
Working Principle: The TF MINI Micro LiDAR uses time-of-flight (ToF) technology to
measure distance. The sensor emits a near-infrared laser pulse and calculates the time taken for
the pulse to reflect back from the target object. This time difference is used to determine the
distance to the object.
Emitting Laser Pulse: The sensor emits a laser pulse at 850 nm wavelength.
Reception of Reflected Pulse: The sensor detects the reflected laser pulse that bounces back
from the target object.

Distance Calculation: The time taken for the laser pulse to travel to the object and back is
measured. The distance is calculated using the speed of light and the formula:

Distance = (time × speed of light) / 2

The sensor provides this distance information through UART or I2C output.
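For illustration, a minimal sketch that parses the TF-Mini's 9-byte UART frame (header 0x59 0x59, distance low/high bytes, signal strength, reserved bytes, checksum) is given below. The SoftwareSerial pins 2 and 3 are an assumption made for this example and are not the wiring documented in Chapter 5; on boards with a spare hardware UART, that port should be preferred at 115200 baud.

#include <SoftwareSerial.h>

// Minimal TF-Mini frame parser (assumed pins; checksum is not verified in this sketch)
SoftwareSerial tfmini(2, 3);   // RX, TX

void setup() {
  Serial.begin(9600);
  tfmini.begin(115200);        // TF-Mini factory-default baud rate
}

void loop() {
  // Frame: 0x59 0x59, Dist_L, Dist_H, Strength_L, Strength_H, reserved, reserved, checksum
  if (tfmini.available() >= 9 && tfmini.read() == 0x59 && tfmini.read() == 0x59) {
    uint8_t b[7];
    for (int i = 0; i < 7; i++) b[i] = tfmini.read();
    int distanceCm = b[0] | (b[1] << 8);   // Dist_L + 256 * Dist_H
    Serial.print("LiDAR distance: ");
    Serial.print(distanceCm);
    Serial.println(" cm");
  }
}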
Pin Configuration:
1. VCC: Power supply pin, connected to a 5V DC source.
2. GND: Ground pin, connected to the ground of the power supply.
3. TX: UART transmission pin for distance data output.
4. RX: UART reception pin for configuration and commands.
Applications: The TF MINI Micro LiDAR is versatile and suitable for various applications,
including:
Robotics: For obstacle detection and navigation in robotic systems.
Drones: Altitude holding and terrain following in UAVs.
Automated Guided Vehicles (AGVs): Navigation and obstacle avoidance.
Distance Measurement: Precise measurement in industrial automation.
Security Systems: Intrusion detection and monitoring.
Smart Parking: Vehicle detection and parking assistance.
Advantages:

Compact Size: Small and lightweight, making it easy to integrate into space-constrained
applications.
High Accuracy: Provides accurate distance measurements with a high resolution of 1cm.
Multiple Interface Options: Supports both UART and I2C communication for flexible
integration.
High Frame Rate: Capable of 100 Hz frame rate for real-time distance measurement.
Low Power Consumption: Energy-efficient design suitable for battery-powered applications.
Limitations: Limited Range: Effective measurement range is up to 12 meters, which may be
insufficient for some long-range applications.
Environmental Sensitivity: Performance can be affected by environmental factors such as
ambient light and weather conditions.

Narrow Field of View: The 2.3° field of view might limit its use in applications requiring wider
area coverage.
The TF MINI Micro LiDAR is an efficient and reliable sensor for short to medium-range
distance measurement applications. Its compact size, high accuracy, and multiple interface
options make it ideal for integration into various systems, from robotics to drones and industrial
automation. While its range and field of view limitations should be considered, its overall
advantages provide significant value for precise distance sensing in a small package.

FIG 3.2.3 TF-Mini Micro LiDAR

3.2.4 HC-05 Bluetooth module (IEEE-802.15.1-2005)

The HC-05 Bluetooth module is a popular, versatile, and cost-effective solution for wireless
communication between devices. It operates using the Bluetooth Serial Port Protocol (SPP)
and is designed to establish a transparent wireless serial connection. This module is widely
used in various applications including robotics, home automation, and wireless data transfer
systems.
Technical Specifications:

Operating Voltage: 3.3V to 5V DC


Operating Current: 30mA (average), 40mA (peak)
Bluetooth Protocol: Bluetooth V2.0+EDR (Enhanced Data Rate)
Frequency: 2.4GHz ISM band
Modulation: GFSK (Gaussian Frequency Shift Keying)
Sensitivity: -80dBm
Range: Up to 10 meters (30 feet)
Data Transfer Rate: Up to 3Mbps (asynchronous), 2.1Mbps (synchronous)
UART Interface: 3.3V logic level, 9600 baud rate (default)

Dimensions: 34mm x 15mm x 2.2mm
Working Principle: The HC-05 Bluetooth module operates by creating a wireless serial
communication link between two devices. It can be configured either as a master or slave
device, allowing it to initiate a connection or wait for another device to connect to it.
Powering Up: Connect the VCC pin to a 3.3V to 5V power supply and the GND pin to the
ground.
Initialization: The module initializes and is ready to be paired with another Bluetooth device.
It can enter AT command mode for configuration.
Pairing: In slave mode, the module waits for a pairing request from a master device. In master
mode, it searches for available slave devices and initiates pairing.
Data Transmission: Once paired, data sent from one device is wirelessly transmitted and
received by the other, effectively acting as a serial link.
Pin Configuration
1. EN: Enable/disable pin (active HIGH). When pulled high, the module is enabled.
2. VCC: Power supply pin, connected to 3.3V to 5V DC.
3. GND: Ground pin, connected to the ground of the power supply.
4. TXD: Transmit pin, sends serial data to the connected device.
5. RXD: Receive pin, receives serial data from the connected device.
6. STATE: Connection status pin, goes high when connected.
7. KEY: Used to enter AT command mode for configuration (connected to 3.3V or left floating).
Applications: The HC-05 Bluetooth module is versatile and can be used in numerous
applications, such as:
Wireless Data Transmission: Transmitting data wirelessly between microcontrollers or other
devices.
Robotics: Enabling remote control of robots via Bluetooth.
Home Automation: Controlling home appliances wirelessly.
Health Monitoring Systems: Transmitting data from medical sensors to smartphones or
computers.

Bluetooth Beacons: Used in positioning systems and asset tracking.


Advantages:

Ease of Use: Simple to interface with microcontrollers and other devices.


Cost-Effective: Affordable solution for adding Bluetooth capability.
Versatile: Can operate in both master and slave modes.

Compact Size: Small form factor suitable for integration into various projects.
Reliable: Stable wireless communication with good range and data rate.
Limitations:

Range: Limited to around 10 meters, which may be insufficient for some applications.
Interference: Can be affected by interference from other 2.4GHz devices.
Power Consumption: Consumes more power compared to other low-energy Bluetooth
modules.
Configuration Complexity: Requires AT commands for configuration, which may be
challenging for beginners.
The HC-05 Bluetooth module is a powerful and flexible tool for implementing wireless serial
communication in a wide range of applications. Its ease of use, affordability, and reliable
performance make it an excellent choice for hobbyists and professionals alike. Despite some
limitations, such as range and power consumption, it remains a popular module for projects
involving wireless data transmission and control.

FIG 3.2.4 HC-05 Bluetooth module

The HC-05 Bluetooth module has two operating modes: Command and Data. The HC-05 is in
Data mode by default, and you can switch to Command mode by pressing the onboard push
button. The module follows the IEEE 802.15.1-2005 Bluetooth standard.

In Data mode, the HC-05 can send and receive data from other Bluetooth devices. The default
baud rate in Data mode is 9600.
In Command mode, you can use AT commands to change the HC-05's settings and parameters,
such as the baud rate, module name, and whether it's a master or slave device. You can use a
serial to TTL converter and a PC running terminal software to change the system parameters,

and these changes will remain even after you turn off the power. The default baud rate in
Command mode is 38400.
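As a sketch of this configuration step, the bridge below relays characters between the PC serial monitor and the HC-05 so that AT commands (for example AT, AT+NAME?, AT+UART?) can be issued in Command mode. Wiring the module to SoftwareSerial pins 10/11 for this one-off setup is an assumption of the example; at run time the project uses the hardware UART as described in Section 5.1.1.

#include <SoftwareSerial.h>

// AT-command bridge for HC-05 configuration (assumed setup-time wiring)
SoftwareSerial bt(10, 11);   // RX, TX

void setup() {
  Serial.begin(9600);        // PC serial monitor (set line ending to "Both NL & CR")
  bt.begin(38400);           // HC-05 default baud rate in Command mode
  Serial.println("HC-05 AT bridge ready; try AT, AT+NAME?, AT+UART?");
}

void loop() {
  if (bt.available())     Serial.write(bt.read());   // module -> PC
  if (Serial.available()) bt.write(Serial.read());   // PC -> module
}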

3.2.5 Arduino UNO

The Arduino UNO is one of the most popular and widely used microcontroller boards in the
Arduino family. It is based on the ATmega328P microcontroller and is designed for beginners
and hobbyists, as well as for more advanced users who need a flexible and powerful platform
for their projects. The Arduino UNO is known for its ease of use, robustness, and extensive
community support.
Technical Specifications:
Microcontroller: ATmega328P
Operating Voltage: 5V
Input Voltage (recommended): 7-12V
Input Voltage (limits): 6-20V
Digital I/O Pins: 14 (of which 6 provide PWM output)
Analog Input Pins: 6
DC Current per I/O Pin: 20 mA
DC Current for 3.3V Pin: 50 mA
Flash Memory: 32 KB (ATmega328P) of which 0.5 KB used by bootloader
SRAM: 2 KB (ATmega328P)
EEPROM: 1 KB (ATmega328P)
Clock Speed: 16 MHz
LED_BUILTIN: 13
Length: 68.6 mm
Width: 53.4 mm
Weight: 25 g
Key Components:

Microcontroller: The ATmega328P is the brain of the board, handling all processing tasks.
Power Supply: The board can be powered via a USB connection or an external power supply
(barrel jack or Vin pin).
Digital I/O Pins: These pins can be configured as inputs or outputs for interfacing with other
components.
Analog Input Pins: Used to read analog signals from sensors.
PWM Pins: Six digital I/O pins can generate PWM signals for controlling devices like motors

and LEDs.
USB Interface: The USB connection is used for programming the board and for serial
communication with a computer.
Reset Button: Resets the microcontroller.
Power LED Indicator: Indicates that the board is powered on.
Built-in LED: Connected to digital pin 13, useful for basic tests and debugging.
Pin Configuration:

Digital Pins (0-13): Can be used for general-purpose I/O, with pins 3, 5, 6, 9, 10, and 11
providing PWM output.
Analog Pins (A0-A5): Can be used for reading analog sensors, each providing 10-bit resolution.
Power Pins:
Vin: Input voltage to the Arduino when using an external power source.
5V: Regulated 5V output from the regulator on the board.
3.3V: A 3.3V supply generated by the onboard regulator.
GND: Ground pins.
Special Pins:
AREF: Reference voltage for the analog inputs.
Reset: Used to reset the microcontroller.
The Arduino UNO is a powerful, flexible, and user-friendly microcontroller board that is ideal
for a wide range of applications. Its ease of use, robust design, and extensive community
support make it an excellent choice for both beginners and experienced users. Despite its
limitations, the Arduino UNO remains one of the most popular platforms for electronic
prototyping and education.

FIG 3.2.5 Arduino UNO
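The short sketch below illustrates the UNO features listed above (10-bit analog input, PWM output on a PWM-capable pin, and the built-in LED on pin 13); the pin choices and the threshold are arbitrary examples for illustration rather than part of the Third Eye design.

// Minimal UNO feature demo (illustrative only)
const int analogIn = A0;   // any 0-5 V analog source
const int pwmOut   = 9;    // one of the PWM-capable pins (3, 5, 6, 9, 10, 11)

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);   // LED_BUILTIN is digital pin 13 on the UNO
  pinMode(pwmOut, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(analogIn);                      // 0..1023 (10-bit resolution)
  analogWrite(pwmOut, raw / 4);                        // scale to the 8-bit PWM range 0..255
  digitalWrite(LED_BUILTIN, raw > 512 ? HIGH : LOW);   // simple threshold indicator
  Serial.println(raw);
  delay(100);
}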

CHAPTER 4

4. Flow chart

The flowchart proceeds as follows: start; initialize all sensors; acquire data from the sensors;
check whether the transmission of data to the mobile phone has completed, and if not, delay by
1 s and acquire again; once the transmission has completed, render the distance information
through voice/vibration; then stop.

FIG 4: Flowchart
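A compact sketch of this flow is given below. It reuses the per-sensor code from Chapter 3 and assumes the Chapter 5 wiring (HC-05 on the hardware UART, HC-SR04 on pins 9/10, MB1200 AN on A0) plus an assumed SoftwareSerial port on pins 2/3 for the TF-Mini; the mapping of the sensors to the FH/CH/LG body positions is likewise an assumption made only for this example.

#include <SoftwareSerial.h>

// Illustrative main loop matching the flow chart (a sketch under the stated assumptions)
SoftwareSerial tfmini(2, 3);                 // TF-Mini on assumed pins 2/3
const int trigPin = 9, echoPin = 10, mbPin = A0;

long readHCSR04() {                          // short range
  digitalWrite(trigPin, LOW);  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH); delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  return pulseIn(echoPin, HIGH, 30000UL) * 0.0343 / 2.0;   // cm
}

long readMB1200() {                          // medium range
  return analogRead(mbPin);                  // ~1 cm per ADC count at 5 V
}

long readTFMini() {                          // long range
  if (tfmini.available() >= 9 && tfmini.read() == 0x59 && tfmini.read() == 0x59) {
    uint8_t b[7];
    for (int i = 0; i < 7; i++) b[i] = tfmini.read();
    return b[0] | (b[1] << 8);               // cm
  }
  return -1;                                 // no complete frame yet
}

void setup() {
  Serial.begin(9600);                        // HC-05 in Data mode on the hardware UART
  tfmini.begin(115200);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
}

void loop() {
  // Acquire data from the sensors and transmit one line per cycle; the phone
  // app renders it as voice/vibration feedback.
  Serial.print("FH:");  Serial.print(readTFMini());
  Serial.print(" CH:"); Serial.print(readMB1200());
  Serial.print(" LG:"); Serial.println(readHCSR04());
  delay(1000);                               // 1 s delay, as in the flow chart
}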

CHAPTER 5
5.1 Design Implementation
5.1.1 Interface of Arduino with HC-05

FIG 5.1.1 Arduino with HC-05

The image shows an Arduino board connected to an HC-05 Bluetooth module. To ensure proper
functionality, here is a brief overview of the connections needed between the Arduino and the
HC-05 module:
Connections:

HC-05 VCC to Arduino 5V


HC-05 GND to Arduino GND
HC-05 TXD to Arduino RX (Digital Pin 0)
HC-05 RXD to Arduino TX (Digital Pin 1)

5.1.2 Interface of Arduino with LiDAR

FIG 5.1.2 Arduino with LiDAR

The image shows an Arduino board connected to a TF-Mini micro LiDAR. To ensure proper
functionality, here is a brief overview of the connections needed between the Arduino and the
LiDAR module:
Connections:

LiDAR RX to Arduino RX

LiDAR TX to Arduino TX

LiDAR VCC to Arduino VCC

LiDAR GND to Arduino GND

5.1.3 Interface of Arduino with HC-SR04 Ultrasonic sensor

FIG 5.1.3 Arduino with HC-SR04

The image shows an Arduino board connected to an HC-SR04. To ensure proper functionality,
here is a brief overview of the connections needed between the Arduino and the HC-SR04
module:
Connections:

HC-SR04 Trigger pin to Arduino pin 9

HC-SR04 Echo pin to Arduino pin 10

HC-SR04 VCC to Arduino VCC

HC-SR04 GND to Arduino GND

5.1.4 Interface of Arduino with MB1200 ultrasonic sensor

FIG 5.1.4 Arduino with MB1200 ultrasonic sensor

The image shows an Arduino board connected to an MB1200 Ultrasonic sensor. To ensure
proper functionality, here is a brief overview of the connections needed between the Arduino
and the MB1200 ultrasonic sensor module:
Connections:

MB1200 VCC to Arduino VCC

MB1200 GND to Arduino GND

MB1200 AN to Arduino A0

5.1.5 Interface of all components

FIG 5.1.5 Interface of all components (rechargeable batteries, Arduino board, Bluetooth module,
ultrasonic sensors, and LiDAR)

5.1.6 Hardware setup

FIG 5.1.6 Developed prototype

The developed prototype is a wearable device, as shown in the figure. It is designed so that a
blind person can carry it easily, and it is portable.

CHAPTER 6
6.1. Results and Discussions

FIG 6.1.1 No obstacle, head down (panels: FH, CH, and LG; distance in cm vs. sample index)

FH: Forehead Sensor

CH: Chest Sensor

LG: Leg Sensor

FH: Here the forehead sensor faces the ground, so the measured distance varies only slightly;
when a pothole or a step appears, the distance increases or decreases accordingly.

CH: With no obstacle in front of the chest sensor, the reported distance is very large.

LG: The leg sensor clearly shows the walking movement as small variations in the graph; the
occasional peaks are due to the roughness of the path.

FIG 6.1.2 No obstacle, head up (panels: FH, CH, and LG; distance in cm vs. sample index)


FH: When the head faces straight ahead with no obstacle, the distance is beyond the range of
the sensor, so the sensor returns -1 by default.

CH: The chest sensor sees no obstacle over a long range; a distant object appears only as a
small change in the distance value.

LG: The leg sensor again shows the walking movement as small variations in the graph, with
occasional peaks due to the roughness of the path.

FIG 6.1.3 Obstacle, head up (panels: FH, CH, and LG; distance in cm vs. sample index)

FH: The forehead sensor shows the obstacle clearly; as the obstacle moves, the distance in the
graph varies accordingly.

CH: The chest sensor behaves similarly: before the obstacle the distance is relatively high, and
once the obstacle arrives the graph shows a clear distance to the target.

LG: The leg sensor mainly measures the distance towards the ground, which varies strongly
while walking; a high reading indicates no obstacle, while a low reading indicates an obstacle
ahead. Overall, the leg sensor has very low accuracy for targets straight ahead.

FIG 6.1.4 Steps down, head up (panels: FH, CH, and LG; distance in cm vs. sample index)


FH: While stepping down the stairs with the head up, no nearby obstacle is detected; the
distance measured is that of the ceiling above the staircase.

CH: While stepping down, the chest sensor detects the free space ahead, i.e. the distance down
the descending staircase.

LG: The leg sensor reading alternates between two values, but it is not very effective on the
way down the stairs.

FIG 6.1.5 Steps down, head down (panels: FH, CH, and LG; distance in cm vs. sample index)


FH: While stepping down with the head down, a clear-cut distance to the steps is detected.

CH: As before, while stepping down the chest sensor detects the free space ahead, i.e. the
distance down the descending staircase.

LG: The leg sensor reading again alternates between two values, but it is not very effective on
the way down the stairs.

FIG 6.1.6 Steps up, head up (panels: FH, CH, and LG; distance in cm vs. sample index)


FH: While climbing the stairs, the distance to the top of the staircase decreases; when it
becomes very small, we take it as the end of the steps.

CH: While climbing, the chest sensor reading varies only slightly; when it drops to a very low
value, we can sense a wall ahead.

LG: The leg sensor distance alternates between two values, so this pattern can be classified as
climbing the stairs.

FIG 6.1.7 Steps up, head down (panels: FH, CH, and LG; distance in cm vs. sample index)

FH: While climbing the stairs with the head down, we get a sharp distance value from the
forehead to the steps, so an upward staircase can be detected.

CH: While climbing, the chest sensor reports no nearby distance unless a wall or an obstacle is
detected.

LG: The leg sensor distance again alternates between two values, so this pattern can be
classified as climbing the stairs.

FIG 6.1.8 No obstacle while walking (panels: FH, CH, and LG; distance in cm vs. sample index)


FH: Initially no obstacle is seen; some time later a target is detected and the distance decreases
as the user walks towards it.

CH: The chest sensor detects nothing for a long time until an obstacle appears; the detection
spike here is caused by the user's hand.

LG: The leg sensor captures the leg movement in its distance measurement.

FIG 6.1.9 Obstacle while walking (panels: FH, CH, and LG; distance in cm vs. sample index)

FH: The forehead sensor clearly detects obstacles while the person is walking; when there is no
obstacle, the reported distance is zero (0).

CH: The chest sensor also clearly detects obstacles while walking, although the values fluctuate
because of the body movement; with no obstacle the reported distance is zero (0).

LG: The leg sensor likewise reports zero when there is no obstacle, until an obstacle is
detected.

6.2 Power Analysis

1. Calculate the total voltage of the batteries in series:

• Each battery is 3.7 volts.


• Batteries in series add their voltages: 3.7 V + 3.7 V + 3.7 V = 11.1 V.

2. Determine the current drawn by the device:

• The power P of the device is 1.17 watts.


• Using the formula P = V × I, where V is voltage and I is current, we can solve for the current I:

I = P / V = 1.17 W / 11.1 V ≈ 0.1054 A, or 105.4 mA.

3. Calculate the capacity of the batteries in terms of time:

• The capacity of each battery is 2600mAh


• Since the batteries are in series, the voltage adds up, but the capacity (in mAh) remains
the same at 2600mAh.
• To convert this capacity into hours of operation at a current of 105.4mA, we use the
formula

Time = battery capacity / current = 2600 mAh / 105.4 mA ≈ 24.67 hours

So, the three batteries connected in series can power the 1.17-watt device for approximately
24.67 hours; the designed system therefore needs to be recharged roughly once every 24 hours.
The chosen batteries are rated for about 1000 charge cycles, so the lifetime of the battery pack
is approximately 3 years.

CHAPTER 7
7. Conclusions and Future Scope
Conclusion:

The designed prototype utilizes a combination of LiDAR and ultrasonic sensors, interfaced
with a microcontroller, to gather distance measurements. The processed data is then transmitted
via Bluetooth to a mobile phone. This configuration aims to provide a flexible and accurate
distance measurement system, leveraging the strengths of each sensor for different scenarios.
The LiDAR offers high precision with a narrow field of view for long-range measurements,
while the ultrasonic sensors are cost-effective solutions for shorter ranges. The Bluetooth module
ensures wireless data communication, enabling real-time monitoring and analysis on a mobile
device.

Future Scope Using ML Algorithms:

Incorporating machine learning (ML) algorithms can significantly enhance the capabilities and
applications of this project. The future scope could include:

1. Sensor Fusion and Data Accuracy: ML algorithms can be used to fuse data from the LiDAR,
and Ultrasonic Sensor. By training models to learn the characteristics and accuracy of each
sensor in various environments, the system can provide more accurate and reliable distance
measurements, mitigating the limitations of individual sensors.

2. Anomaly Detection: ML can help in identifying anomalies or outliers in the sensor data. This
is particularly useful in detecting malfunctioning sensors or unusual environmental conditions
that might affect measurements.

3. Adaptive Algorithms: ML models can adapt to new environments and conditions by


continuously learning from new data. This adaptability can improve the system’s performance
in diverse and changing conditions, making it more robust and versatile.

By integrating ML algorithms, the proposed system can evolve into an intelligent, adaptable,
and highly reliable distance measurement solution with wide-ranging applications in robotics,
autonomous vehicles, and industrial automation.

CHAPTER 8
8. References
[1] Marzec, P., & Kos, A. (2019, June). Low energy precise navigation system for the blind
with infrared sensors. In 2019 MIXDES - 26th International Conference "Mixed Design of
Integrated Circuits and Systems" (pp. 394-397). IEEE.

[2] Nada, A. A., Fakhr, M. A., & Seddik, A. F. (2015, July). Assistive infrared sensor based
smart stick for blind people. In 2015 Science and Information Conference (SAI) (pp. 1149-
1154). IEEE.

[3] Patil, K., Jawadwala, Q., & Shu, F. C. (2018). Design and construction of electronic aid for
visually impaired people. IEEE Transactions on Human-Machine Systems, 48(2), 172-182.

[4] Ton, C., Omar, A., Szedenko, V., Tran, V. H., Aftab, A., Perla, F., ... & Yang, Y. (2018).
LIDAR assist spatial sensing for the visually impaired and performance analysis. IEEE
Transactions on Neural Systems and Rehabilitation Engineering, 26(9), 1727-1734.

[5] Ton, C., Omar, A., Szedenko, V., Tran, V. H., Aftab, A., Perla, F., ... & Yang, Y. (2018).
LIDAR assist spatial sensing for the visually impaired and performance analysis. IEEE
Transactions on Neural Systems and Rehabilitation Engineering, 26(9), 1727-1734.

[6] Ghafoor, M. J., Junaid, M., & Ali, A. (2019, August). Nav-cane a smart cane for visually
impaired people. In 2019 International Symposium on Recent Advances in Electrical
Engineering (RAEE) (Vol. 4, pp. 1-4). IEEE.
[7] Sen, A., Sen, K., & Das, J. (2018, October). Ultrasonic blind stick for completely blind
people to avoid any kind of obstacles. In 2018 IEEE SENSORS (pp. 1-4). IEEE.
