VISVESVARAYA TECHNOLOGICAL UNIVERSITY
“Jnana Sangama”, BELAGAVI – 590 014, KARNATAKA, INDIA
Project Report
ON
“MULTI PURPOSE ROBOT”
BACHELOR OF ENGINEERING IN
ELECTRONICS AND COMMUNICATION ENGINEERING
SUBMITTED BY:
CERTIFICATE
This is to certify that the project entitled "MULTI PURPOSE ROBOT" has been carried out
under our supervision and guidance in partial fulfillment of the requirements for the award
of the degree of Bachelor of Engineering (8th semester) in Electronics and Communication
Engineering of Visvesvaraya Technological University, Belagavi, during the year 2024-25.
This project report has been approved as it satisfies the academic requirements prescribed
for the Bachelor of Engineering degree.
We are grateful to our chairman Dr. Paul Mathulla for having provided us with
excellent facilities in the college during our course of study.
We are indebted to the president Dr. Alice Abraham and to the principal Dr. A. N.
Khaleel Ahmed of I.C.E.A.S for providing us with the resources needed to take up this
project work.
We are grateful to our Head of the Department Mr. Santosh K for his kind support,
guidance and motivation during the course of the project work.
We are also deeply indebted to our project guide Ms. Disha BG, Asst. Professor, ECE,
for her guidance and constant encouragement during the course of the project and for her
co-operation in the successful completion of this report.
Guidance and deadlines play a very important role in the successful completion of a
project on time. We convey our regards to her for having constantly monitored the
progress of the project and for setting precise deadlines.
SRINIVAS NAIK N
[USN:1IC21EC006]
VINOD KUMAR S
[USN:1IC21EC008]
NIKHIL M
[USN: 1IC21EC009]
L. KISHORE KUMAR
[USN:1IC22EC408]
CONTENTS
ABSTRACT………………………………………………………………...1
1. Chapter 1
INTRODUCTION…………………………………………………..2
1.4 Objectives………………………………………………………4
2. Chapter 2
LITERATURE SURVEY…………………………………………..5
2.1 Multipurpose Adaptable Robot………………………………..5
3. Chapter 3
3.2 Proposed System……………………………………………...11
3.3 Problem Statement………………………………………........12
3.4 Objectives…………………………………………………….12
4. Chapter 4
IMPLEMENTATION…………………………………………….14
4.1 Block Diagram……………………………………………….14
4.2 Hardware Requirements……………………………….......16
4.2.1 Power Supply…………………………………......17
4.2.2 Arduino…………………………………………….18
4.2.3 DC Motors…………………………………………19
4.2.4 ILN2003AN/IC-Pump Driver……………………..20
4.2.5 L298N/Motor Driver………………………………22
4.2.6 Pump……………………………………………….24
4.2.7 PCA9685 16-Channel Servo Motor Driver Module.26
4.2.8 HW201 Infrared (IR) Sensor Module……………...28
4.2.9 MQ2 Gas Sensor………………………………….29
4.2.10 KY036 Metal Touch Sensor……………………...31
4.2.11 Passive infrared sensor (PIR sensor)……………..33
4.2.12 Flame Sensor……………………………………..35
4.2.13 Robotic Arm……………………………………...37
4.2.14 Solar Panel………………………………………..39
4.2.15 Buzzer…………………………………………….41
4.2.16 Camera……………………………………………43
4.3 Software Components………………………………………...44
4.3.1 Arduino IDE………………………………………..44
4.3.2 Embedded C………………………………………..47
5. Chapter 5………………………………………………………………...49
RESULT ANALYSIS………………………………………….…49
5.1 WORKING OF THE MODEL……………………………49
6. Chapter 6
CONCLUSION AND FUTURE SCOPE…………………………..55
6.1 CONCLUSION……………………………………………...55
6.2 FUTURE SCOPE……………………………………………56
REFERENCES…………………………………………………………...57
LIST OF FIGURES
4.1 Block Diagram………………………………………………………....14
4.3 Arduino model……………………………………………...……….…18
4.5 ILN2003AN/IC-Pump Driver……………………………….…………20
4.6 L298N/Motor Driver……………….…………………………………...22
4.7 Pump…………………………………………………………………..24
4.16 Buzzer………………………………………………………………..41
4.17 Camera…………………………………………………………….....43
5.3 Capturing the visuals using the installed camera on the robot……….50
5.4 Obstacle Sensor in the Robot Sensing the Obstacle in Front of it ..…52
5.7 Gas Sensor in the Robot Sensing Harmful Gases Present in the
Surroundings………………………………………………………………53
5.8 Some of the Footage Captured by the Camera, Controlled through
the Application…………………………………………………………...54
LIST OF ABBREVIATIONS
1. AI - Artificial Intelligence
2. FL - Fuzzy Logic
4. GA - Genetic Algorithm
9. DR - Dimensionality Reduction
13. IR - Infrared
14. PIR - Passive Infrared
16. AC - Alternating Current
17. DC - Direct Current
18. IC - Integrated Circuit
20. TTL - Transistor-Transistor Logic
21. CMOS - Complementary Metal Oxide Semiconductor
24. EMF - Electromotive Force
26. SDA - Serial Data
27. SCL - Serial Clock
28. OE - Output Enable
30. DOF - Degree of Freedom
31. PV - Photovoltaic
39. IoT - Internet of Things
ABSTRACT
This project explores the development of a multipurpose robot equipped with solar panel
recharging capabilities, advanced sensor technology, a robotic arm, and integrated audio
and video features. Designed for versatility and efficiency, the robot autonomously
navigates diverse environments while performing critical tasks such as safety monitoring,
environmental assessment, property maintenance, and material handling.
The solar power system ensures sustainable operation by harnessing renewable energy,
complemented by rechargeable batteries for uninterrupted functionality. The metal sensors
enable the robot to detect metallic objects, enhancing its utility in search and recovery
operations. Fire sensors provide real-time alerts to potential fire hazards, making it an
invaluable asset for safety management. Obstacle sensors facilitate smooth navigation,
allowing the robot to avoid obstacles and traverse complex terrains effectively. The
integrated robotic arm extends the robot’s capabilities, enabling it to manipulate objects,
perform repairs, and assist in various tasks with precision.
Additionally, the robot features audio and video capabilities, allowing for real-time
surveillance and communication. This enables users to monitor environments visually and
audibly, enhancing situational awareness and response strategies. With an integrated user
interface for remote control and monitoring, this solar-powered, sensor-rich robot stands as
a robust solution for both domestic and industrial applications, promoting safety, efficiency,
and environmental sustainability. A mobile app with Bluetooth connectivity allows users
to control the robot remotely, facilitating real-time monitoring and customization of its
functions. This innovative design combines renewable energy, advanced sensing
capabilities, and user-friendly operation, making it a valuable solution for a wide range of
applications while enhancing safety and operational efficiency.
Chapter 1
INTRODUCTION
Today, robotics is no longer limited to laboratory experiments; robots have found their way
into our homes. Several design ideas have been explored and are presented here in an attempt
to maximize the user's awareness of the robot's interaction with its environment. In this
system, advanced monitoring of the home is carried out through a Bluetooth controller
application, with a Bluetooth module interfacing the Android mobile phone with the system.
A gas sensor is used to detect gas leakage, and the robot can detect thieves or unwanted
people who enter the home when the owner is not present, helping to monitor the home and
capture images of unwanted activities taking place in the owner's absence. The advantages
of this system are that it can pick and place objects, alert the owner through a buzzer when a
gas leak is detected, detect motion with a PIR sensor, provide surveillance through a camera,
and detect obstacles with an IR sensor.
The system is built around an Arduino UNO microcontroller. The main purpose of the
Arduino UNO is to interface with the sensors so that gas leakage and flames can be monitored
and reported to the owner through the buzzer, while detected motion is indicated by the red
and green lights provided. Besides these monitoring operations, the system can pick up
material lying here and there and place it in the proper location, and it keeps watch over the
home in a security mode, for example when gas is detected. The system is operated over
Bluetooth from an Android mobile phone: an application on the phone communicates with
the system and carries out the function the user wants, such as forward, reverse, left, right,
cleaning, grip, leave and stop.
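As an illustration of this command flow, the sketch below shows how single-character commands received from the Bluetooth module could be mapped to the drive motions. Only the drive commands are shown; the command letters (F, B, L, R), the motor-driver pin numbers, and the use of an HC-05/HC-06 style module at 9600 baud are assumptions for this sketch, not values taken from the report, and would have to match the controller application actually used.

// Minimal sketch (illustrative): map Bluetooth characters to robot motions.
// Pin numbers and command letters are assumed, not taken from the report.
const int IN1 = 7, IN2 = 8, IN3 = 12, IN4 = 13;   // L298N direction inputs

void setup() {
  Serial.begin(9600);                              // typical HC-05/HC-06 baud rate
  pinMode(IN1, OUTPUT); pinMode(IN2, OUTPUT);
  pinMode(IN3, OUTPUT); pinMode(IN4, OUTPUT);
}

void drive(int a1, int a2, int b1, int b2) {
  digitalWrite(IN1, a1); digitalWrite(IN2, a2);
  digitalWrite(IN3, b1); digitalWrite(IN4, b2);
}

void loop() {
  if (Serial.available()) {
    char cmd = Serial.read();
    switch (cmd) {
      case 'F': drive(HIGH, LOW, HIGH, LOW); break;   // forward
      case 'B': drive(LOW, HIGH, LOW, HIGH); break;   // reverse
      case 'L': drive(LOW, HIGH, HIGH, LOW); break;   // left
      case 'R': drive(HIGH, LOW, LOW, HIGH); break;   // right
      default:  drive(LOW, LOW, LOW, LOW);   break;   // stop on any other character
    }
  }
}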
As technology continues to advance, the demand for automation in various sectors has
never been higher. Industries such as manufacturing, logistics, healthcare, and agriculture
are seeking to enhance their operations by leveraging robotic systems that can perform tasks
more efficiently and safely. However, most current robots are designed with a very narrow
focus—each robot is typically built to carry out only one specific task, such as assembling
parts on a production line, cleaning floors, or performing medical procedures. This
specialized design makes it challenging to integrate robots across different areas or
applications, creating inefficiencies and limiting their potential.
One of the key limitations of these traditional robots is their inability to adapt to new
environments or contexts. For example, a robot built for assembling components in a
factory setting is unlikely to be useful in an agricultural setting where tasks vary from
harvesting crops to inspecting soil quality. Similarly, a cleaning robot in a household would
struggle to perform more complex tasks like assisting in elderly care or conducting
maintenance in an industrial facility.
This lack of versatility not only hampers the ability of robots to perform a broad range of
functions but also means industries must invest in multiple, task-specific robots, each
requiring separate maintenance, programming, and integration into their existing systems.
This results in unnecessary complexity, increased costs, and missed opportunities for
broader automation.
1.4 Objectives
• Design and build a robotic system that can perform a variety of tasks across multiple
domains, including household chores, industrial processes, healthcare assistance,
and agricultural operations.
• Equip the multi-purpose robot with the ability to autonomously perceive, interpret,
and react to its environment without human intervention.
• Create a robot with a modular design that allows for easy adaptation and
reconfiguration for different tasks and environments.
• Design the multi-purpose robot to operate safely around humans and other living
beings, particularly in environments where it may collaborate directly with people
(e.g., healthcare, homes, workplaces).
• Optimize the robot’s ability to complete tasks quickly, efficiently, and with a high
degree of accuracy.
• Ensure that the multi-purpose robot can be easily integrated into existing
infrastructures, workflows, and technologies without requiring significant changes
or investment.
• Develop a system that is cost-effective and scalable to meet the needs of different
industries, from small-scale households to large-scale industrial applications.
Chapter 2
LITERATURE SURVEY
Multi-purpose robot technology is becoming increasingly prominent in various industries,
offering the potential to streamline operations and reduce the need for extensive human
labor. These robots are designed to perform a wide range of tasks, from basic automation
to complex problem-solving, by integrating advanced technologies like artificial
intelligence (AI), machine learning, robotics, and data mining. The applications of multi-
purpose robots are diverse, spanning sectors such as e-commerce, analytics, customer
support, education, entertainment, finance, healthcare, human resources, marketing, news,
personal productivity, shopping, travel, and utilities.
A robot that can perform multiple functions would be a better solution. This type of
multifunctional robot can execute numerous tasks without any modification of the underlying
hardware or technology. Autonomous robot vehicles can move and operate intelligently
without any assistance. As their potential uses rise, the recent surge of interest in this area
will grow even more. So, the objective of this paper is to build a robot that can perform tasks
under manual operation as well as autonomously.[1]
With robotic technologies developing with every passing day, robotic systems have been
widely used in many applications such as factory automation, dangerous environments,
hospitals, surgery, etc. One of the most important issues in the design and development of
intelligent mobile systems is the navigation problem. This consists of the ability of a mobile
robot to plan and execute collision-free movements within its environment. To achieve that,
the mobile robot must be capable of sensing its environment, interpreting the sensed
information to refine the knowledge of its position and the environment's structure,
planning a route from an initial to a goal position with obstacle avoidance, and controlling
the mobile robot's turning angle and linear velocity to reach the target. Some artificial
intelligence techniques, such as Fuzzy Logic (FL), Artificial Neural Networks (ANNs),
Genetic Algorithm (GA), and/or a combination of a few of them, are used to implement
intelligent robots. ANN modeling in software has proven to be effective in many cases, e.g.
function approximation, pattern recognition, image processing, etc.
However, since these software models run on a sequential machine, they lose touch with
the parallel nature of biological neural networks. In general, software instructions executed
sequentially cannot take advantage of the inherent parallelism of ANNs architectures.
Hardware implementations of neural networks promise higher speed operation since they
can exploit this massive parallelism. Different hardware implementations of neural networks
have been reported. The best choices for neural network implementations that achieve both
high speed and rapid prototyping appear to be programmable hardware approaches
like field programmable gate arrays (FPGAs) and field programmable analog arrays
(FPAAs). Compared to digital hardware, FPAAs have the advantage of interacting directly
with the real world because they receive, process, and transmit signals totally in the analog
domain (without the need to do A/D, D/A conversions) and are suitable for real time
applications. A mobile robot equipped with visual capabilities poses a great challenge for
robotic systems, since it requires the solution of complex image-processing problems in
order to understand the situation. One of the visual capabilities that can be added to the
robot is person identification; it is a very important function for robots which work with
people in the real world. Face recognition is one of the most common methods of
identification since it is a non-contact method and the person under examination can thus
be unaware that recognition is being carried out. In face recognition, the captured face
image data often lies in a high-dimensional space. In practice, these image spaces often lead
to low recognition accuracy and expensive computational cost. Dimensionality reduction
(DR) provides a means to solve this problem by projecting the face images into a lower
dimensional feature space in which the semantic structure of the image space becomes
clear.
There are several techniques available, but Principal Component Analysis (PCA) and Linear
Discriminant Analysis (LDA) are the most well-known techniques in the dimensionality
reduction and feature extraction field. PCA is a linear transformation which is used for
feature extraction. It is a powerful technique for extracting global structures from high-
dimensional data sets and has been widely employed to reduce dimensionality and extract
abstract features of faces for face recognition. LDA is, like PCA, a linear transformation,
but it is a supervised method. It is usually applied to extract features or reduce the
dimensionality of the image before the classification stage. After subspace features are computed, many methods
are used to classify the face images. Recently, more complicated classifiers, such as support
vector machines (SVMs), have been applied to further enhance the classification
performance of the PCA and LDA subspace features.[2]
In today's contemporary world, technology has the huge advantage of reducing the labor and
cost of work. Robotization plays a massive role in technology; it changes people's day-to-day
lives by helping them do things better and more easily. Robots are used to minimize the
workload and time consumption and to increase production and efficiency.
Here, the pick-and-place robot is used to pick up materials and place them in the
correct position and order. For industrial purposes, where the most complicated work or
task is to handle the material or product, this pick-and-place robot makes it easier and more
efficient. The arm is designed to reach different positions and locations to reach the object
and also perform different tasks. Through a combination of vision technology and
sensors, the object is identified and the task is performed, with the sensor attached to the
robot for detecting the object. The movement of the robotic arm is controlled by forward and
backward, right and left motions. The remote controller is used at the transmitting end.
Here, four motors are used to run the robot. Pick-and-place robotic arms are most
commonly used in manufacturing assembly, packaging, bin-picking, and inspection, but
they can also be used for a variety of other jobs and applications. Robotic pick and place
systems can be programmed to function at a faster speed than manual processes allow.
According to the production requirements, many pick and place robots can be programmed
to reach the maximum output of the product.[3]
The world of robotics is one of the most exciting areas and has experienced steady
advancement and development. Robotics is interdisciplinary and has become an ever larger
part of our lives; robots are found everywhere, and over the long term their presence will
only increase. Most importantly, mobile robots are helping people to fight disasters and
accidents such as fire. In recent years, firefighting workers have faced a great deal of danger.
Firefighting robots can be used to protect the firefighting workforce from the threat of
ignition and the inhalation of poisonous gases and hazardous materials, and these robots
help preserve the lives of workers in the field of firefighting. A robot is an electromechanical
tool used in science or industry to replace human labour or to carry out the functions
assigned to it.
Technologies are changing our lives in every possible way, and people are trying to invent
more and more new things every day. Even in surveillance, technology makes our lives
easier, and robots can be used as a substitute for humans. When it comes to security, the
surveillance robot plays a vital role: it can reach places that humans cannot. From national
security to household security, we find surveillance robots. Here we consider a firefighting
surveillance robot with collision avoidance, which can help people while a fire is spreading,
help save lives, and help limit financial losses. A mobile robot is a system capable of moving
its body from one place to another within its environment. Mobile robots come in two forms:
tethered and autonomous. Surveillance means monitoring any situation, behavior, or activity.
These days, most systems use mobile robots with a camera for surveillance. The camera
mounted on the robot can be moved to various areas, and these sorts of robots are more
adaptable than fixed cameras; the most commonly used surveillance robots are wheeled
robots. These days, fire accidents are likely, and now and then it becomes extremely
difficult for a firefighter to protect people's lives.
The sun has delivered sunlight and heat (with chemical effects) to the earth continuously
over millions of years and will continue to do so for millions of years to come. The
tremendous energy offered by the sun is thousands of times higher than the total energy
consumed by the world at the present time. The solar panel is the equipment that converts
the sun's light and heat into electrical energy; it is a renewable energy resource.
Unfortunately, solar panel energy is uncertain and unstable, and how to use this energy
source is an engineering topic. This paper introduces a method to obtain stable energy from
a solar panel energy system. The sunlight changes from time to time. For example, suppose
we buy a solar panel that operates from 8 o'clock in the morning till 8 o'clock in the evening.
The rated voltage from the panel is 186 V (and the current is about 13 A), but it varies from
186 V - 20 % to 186 V + 20 %, i.e. from 148.8 V to 223.2 V. In order to feed this energy
into a grid (400 V / 50 Hz / 3-phase), we have to design our power electronic circuits to
deal with it.[5]
Chapter 3
The multi-purpose robot system integrates various sensors and actuators to perform a wide
range of tasks. At its core, the system uses an Arduino microcontroller for processing and
control. The robotic arm, powered by servo motors and a 16-channel servo motor driver,
allows for precise movement and manipulation of objects. The system is equipped with a
flame sensor for detecting fire, an infrared (IR) sensor for obstacle detection and distance
measurement, and a metal touch sensor to identify metal objects. A PIR sensor is included
for motion detection, enabling the robot to respond to human presence. Gas detection is
handled by a gas sensor, which ensures safety by detecting harmful gases in the
environment. The robot's movement is controlled by DC motors, driven by a motor driver,
allowing for smooth navigation. A buzzer alerts users to specific events or conditions, and
Bluetooth connectivity allows for wireless control via a Bluetooth controller, enabling
remote operation. The integration of these components creates a versatile robot capable
of performing various tasks in dynamic environments.
A buzzer serves as an alert mechanism, notifying users of critical events such as gas detection or fire.
Bluetooth connectivity is integrated into the system for remote control, allowing users to
operate the robot wirelessly via a Bluetooth controller. This proposed system is designed
to provide a highly adaptable and efficient solution for tasks such as fire detection, gas leak
monitoring, object manipulation, and environmental interaction, making it suitable for both
industrial and home applications.
The problem addressed by this system is the lack of an integrated, multi-functional robot
capable of autonomously performing various tasks in dynamic environments while
ensuring safety and user interaction. In many industrial, commercial, and home settings,
there is a growing need for robots that can detect environmental hazards (such as gas leaks
or fire), interact with physical objects, and navigate without human intervention. Existing
robots are often limited to specific tasks or lack the necessary sensor integration to handle
multiple functions effectively. Furthermore, current solutions require manual control or
complex setups, limiting their versatility and ease of use. The proposed system aims to
overcome these limitations by integrating multiple sensors (flame, IR, metal touch, PIR,
gas sensor) and actuators (servo motors, DC motors) into a single platform, with wireless
control via Bluetooth. This will enable the robot to autonomously detect and respond to
various environmental conditions, perform tasks such as object manipulation, and ensure
safety in hazardous environments, offering a versatile, user-friendly solution.
3.4 Objectives
These objectives aim to deliver a highly functional, adaptable, and safe robotic system that
can autonomously perform various tasks while providing remote control capabilities for
greater flexibility.
Chapter 4
IMPLEMENTATION
The metal sensors enable the robot to detect metallic objects, enhancing its utility in search and recovery operations. Fire
sensors provide real-time alerts to potential fire hazards, making it an invaluable asset for
safety management. Obstacle sensors facilitate smooth navigation, allowing the robot to
avoid obstacles and traverse complex terrains effectively. The integrated robotic arm
extends the robot’s capabilities, enabling it to manipulate objects, perform repairs, and
assist in various tasks with precision.
Additionally, the robot features audio and video capabilities, allowing for real-time
surveillance and communication. This enables users to monitor environments visually and
audibly, enhancing situational awareness and response strategies. With an integrated user
interface for remote control and monitoring, this solar-powered, sensor-rich robot stands as
a robust solution for both domestic and industrial applications, promoting safety, efficiency,
and environmental sustainability. A mobile app with Bluetooth connectivity allows users
to control the robot remotely, facilitating real-time monitoring and customization of its
functions. This innovative design combines renewable energy, advanced sensing
capabilities, and user-friendly operation, making it a valuable solution for a wide range of
applications while enhancing safety and operational efficiency.
• Power Supply.
• Arduino.
• DC Motor Driver.
• ILN2003AN/IC-Pump Driver.
• L298N/Motor Driver.
• Pump.
• Flame Sensor
• Robotic Arm
• Solar Panel
• Buzzer
• Camera
4.2.1 Power Supply
The power supply which we are using consists of a transformer, which steps the alternating
mains voltage down to the required level. After the voltage is reduced, a full-wave bridge
rectifier converts the AC to DC; the rectified DC is then fed to the filter capacitors, then to
the required voltage regulator, and finally to the load. The output of this power supply is
taken from the regulator and given to the components and the microcontroller. The function
of a linear voltage regulator is to convert a varying DC voltage to a constant, often specific,
lower DC voltage. In addition, regulators often provide a current-limiting function to protect
the power supply and load from overcurrent (excessive, potentially destructive current).
A constant output voltage is required in many power supply applications, but the voltage
provided by many energy sources will vary with changes in load impedance. Furthermore,
when an unregulated DC power supply is the energy source, its output voltage will also
vary with changing input voltage. To circumvent this, some power supplies use a linear
voltage regulator to maintain the output voltage at a steady value, independent of
fluctuations in input voltage and load impedance. Linear regulators can also reduce the
magnitude of ripple and noise on the output voltage.[6]
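To make the regulator's role concrete, the short calculation below estimates the unregulated DC voltage and ripple available to a 5 V linear regulator. The 9 V RMS secondary, 1000 uF filter capacitor and 200 mA load current are assumed example figures, not measured values from this project, and are only meant to show how the numbers fit together.

// Illustrative ripple estimate for the unregulated supply feeding a 5 V regulator.
// The 9 V RMS secondary, 1000 uF capacitor and 200 mA load are assumed figures.
void setup() {
  Serial.begin(9600);
  const float Vrms    = 9.0;                  // transformer secondary (assumed)
  const float Vpeak   = Vrms * 1.414 - 1.4;   // minus two diode drops of the bridge (~11.3 V)
  const float I       = 0.2;                  // load current in amperes (assumed)
  const float f       = 50.0;                 // mains frequency in Hz
  const float C       = 1000e-6;              // filter capacitor in farads (assumed)
  const float Vripple = I / (2.0 * f * C);    // peak-to-peak ripple after full-wave rectification (~2 V)
  Serial.print("Unregulated peak voltage: "); Serial.println(Vpeak);
  Serial.print("Ripple (Vpp): ");             Serial.println(Vripple);
  // The minimum input (about 9.3 V here) stays above 5 V plus the regulator dropout,
  // so a 7805-type linear regulator can hold the output steady at 5 V.
}
void loop() {}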
4.2.2. Arduino
The Arduino Uno is an open-source microcontroller board based on the Microchip
ATmega328P microcontroller and developed by Arduino.cc. The board is equipped with
sets of digital and analog input/output (I/O) pins that may be interfaced to various expansion
boards (shields) and other circuits. The board has 14 digital I/O pins (six capable of PWM
output), 6 analog I/O pins, and is programmable with the Arduino IDE (Integrated
Development Environment), via a type B USB cable. It can be powered by the USB cable
or by an external 9-volt battery, though it accepts voltages between 7 and 20 volts.
It is similar to the Arduino Nano and Leonardo. The hardware reference design is distributed under
a Creative Commons Attribution Share-Alike 2.5 license and is available on the Arduino
website. Layout and production files for some versions of the hardware are also available.
The word "uno" means "one" in Italian and was chosen to mark the initial release of
Arduino Software. The Uno board is the first in a series of USB-based Arduino boards; it
and version 1.0 of the Arduino IDE were the reference versions of Arduino, which have
now evolved to newer releases.[7]
4.2.3. DC Motor
Let us start by looking at the design of a simple two-pole DC electric motor. Electric
motors are everywhere: almost every mechanical movement that can be seen around us is
caused by an AC (alternating current) or DC (direct current) electric motor.
An electric motor is all about magnets and magnetism: a motor uses magnets to create
motion. Anyone who has ever played with magnets knows the fundamental law of all
magnets: opposite poles attract and like poles repel. So if we have two bar magnets with
their ends marked "north" and "south", then the north end of one magnet will attract the
south end of the other.
On the other hand, the north end of one magnet will repel the north end of the other (and
similarly, south will repel south). Inside an electric motor, these attracting and repelling
forces create rotational motion. To understand how an electric motor works, the key is
to understand how the electromagnet works. Fig 4.2.3.1 shows the two magnets in
the motor: the armature (or rotor) is an electromagnet, while the field magnet is a
permanent magnet.[8]
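As an illustration of how the robot's drive motors can be commanded from the Arduino, the fragment below drives one DC motor through the L298N module listed in Section 4.2.5. The pin assignments (ENA on PWM pin 9, IN1/IN2 on pins 7 and 8) and the duty-cycle values are assumptions for this sketch rather than the wiring used in the project.

// Illustrative speed and direction control of one DC motor via an L298N.
// ENA must be a PWM-capable pin; pin numbers are assumed for this sketch.
const int ENA = 9, IN1 = 7, IN2 = 8;

void setup() {
  pinMode(ENA, OUTPUT);
  pinMode(IN1, OUTPUT);
  pinMode(IN2, OUTPUT);
}

void loop() {
  digitalWrite(IN1, HIGH);      // set rotation direction
  digitalWrite(IN2, LOW);
  analogWrite(ENA, 180);        // ~70 % duty cycle sets the speed
  delay(2000);
  digitalWrite(IN1, LOW);       // reverse the direction
  digitalWrite(IN2, HIGH);
  analogWrite(ENA, 120);        // run slower in reverse
  delay(2000);
}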
4.2.4 ILN2003AN/IC-Pump Driver
The ILN2003AN is a specific type of IC (Integrated Circuit) designed to control and drive
pumps, motors, or other inductive loads. This component belongs to the broader category
of motor driver ICs and is typically used in applications involving stepper motors, DC
motors, and similar devices, particularly when the load requires precise control over current
and voltage.
The ILN2003AN is a low-side driver IC typically used to drive inductive loads like pumps,
motors, or solenoids. It is designed to handle moderate power and can control switching
signals sent to inductive devices. This IC is often used in automation, robotics, or other
systems that require precise control of an inductive load.
4.2.4.4 Applications:
The ILN2003AN is used in various applications that require driving inductive loads, such
as:
• Pump Drivers: Ideal for controlling water, air, or liquid pumps in various systems,
including HVAC, medical devices, and industrial automation.
• Motor Drivers: Common in applications where DC motors or stepper motors need
to be driven in a controlled manner.
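Because the driver simply buffers a logic-level input into a higher-current low-side output, switching the pump from the Arduino reduces to driving one of the driver's input pins. In the sketch below, the pump channel's input on digital pin 5 and the 3-second run time are assumed values used only to show the pattern; in the full system the pump would be triggered by the flame- or gas-detection logic.

// Illustrative pump switching through one channel of the pump driver IC.
// Pin 5 and the 3-second run duration are assumed values.
const int PUMP_IN = 5;          // driver input pin wired to the pump channel

void setup() {
  pinMode(PUMP_IN, OUTPUT);
  digitalWrite(PUMP_IN, LOW);   // pump off at start
}

void runPump(unsigned long ms) {
  digitalWrite(PUMP_IN, HIGH);  // driver pulls the pump's low side to ground
  delay(ms);
  digitalWrite(PUMP_IN, LOW);
}

void loop() {
  // In the full system this would be called when the flame sensor triggers.
  runPump(3000);
  delay(10000);
}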
4.2.6 Pump
The pump used is a micro DC 3-6 V submersible pump of the type sold for fountains,
gardens and small water-circulation DIY projects. This is a low-cost, small-size submersible
pump motor which can be operated from a 3 ~ 6 V power supply. It can move up to 120 liters
per hour with a very low current consumption of 220 mA. Just connect a tube to the motor
outlet, submerge it in water and power it. Make sure that the water level is always higher
than the motor: dry running may damage the motor due to heating, and it will also produce
noise.
4.2.6.1 Specifications
• Operating Voltage: 3 ~ 6V
4.2.8 HW201 Infrared (IR) Sensor Module
The HW201 Infrared (IR) Sensor Module is a simple and commonly used module designed
to detect infrared signals, typically used for proximity detection or obstacle avoidance in
various embedded system applications like robots, automated systems, and security
devices. The module detects the presence of infrared light, making it an essential
component for systems that require basic IR sensing functionality.
This module is often used for applications such as line-following robots, obstacle detection
systems, and basic IR remote control systems.
The HW201 IR Sensor is a basic infrared light sensor module that typically includes both
an IR emitter (LED) and an IR receiver (photodiode or phototransistor). The emitter sends
out infrared light, which is then reflected by objects in the environment. The receiver
detects the reflected infrared light, and based on the amount of reflected light, the module
outputs a signal that can be used for detection or triggering actions.
This module is designed for short-range detection, making it ideal for basic proximity
sensors, line-following robots, and systems that need to detect objects or surfaces.
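A minimal reading of this module from the Arduino looks like the sketch below. Many boards of this type pull their digital output LOW when reflected IR is detected, but the output polarity and the pin number (2) used here are assumptions that should be checked against the actual module.

// Illustrative obstacle check with the IR module's digital output (pin 2 assumed).
const int IR_OUT = 2;

void setup() {
  Serial.begin(9600);
  pinMode(IR_OUT, INPUT);
}

void loop() {
  if (digitalRead(IR_OUT) == LOW) {          // LOW = reflection detected on many modules
    Serial.println("Obstacle ahead - stop or steer away");
    // here the drive motors would be stopped via the motor driver
  }
  delay(100);
}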
4.2.9 MQ2 Gas Sensor
The MQ-2 is a popular gas sensor module used to detect various gases such as methane
(CH4), propane (C3H8), butane (C4H10), smoke, and other combustible gases. It is
commonly used in gas leakage detection systems, air quality monitoring, and fire alarm
systems. The sensor is based on a semiconductor that changes resistance when exposed to
different gases, making it ideal for applications requiring the detection of toxic or
flammable gases in the air.
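Since the MQ-2 provides an analog voltage that rises with gas concentration, a simple threshold comparison is enough for a leak alarm. The analog pin (A0), buzzer pin (6) and threshold value (400) below are assumed, uncalibrated values used only to sketch the idea; a real installation would need sensor warm-up and calibration.

// Illustrative gas-leak alarm with the MQ-2 analog output.
// A0, pin 6 and the threshold of 400 are assumed, uncalibrated values.
const int MQ2_PIN       = A0;
const int BUZZER        = 6;
const int GAS_THRESHOLD = 400;                               // raw ADC value, 0-1023

void setup() {
  Serial.begin(9600);
  pinMode(BUZZER, OUTPUT);
}

void loop() {
  int level = analogRead(MQ2_PIN);
  Serial.println(level);                                     // log the raw reading
  digitalWrite(BUZZER, level > GAS_THRESHOLD ? HIGH : LOW);  // sound the alarm on a leak
  delay(200);
}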
4.2.10 KY036 Metal Touch Sensor
The KY-036 Metal Touch Sensor is a capacitive touch sensor module commonly used
in various embedded systems, automation projects, and interactive designs. The
module is primarily used to detect the presence or touch of a human or an object on a
metal surface. It uses the principle of capacitive sensing, where the sensor can detect
changes in capacitance when a person touches the metal plate. This module is suitable for
applications where you need to create touch-sensitive switches, touchless interaction
systems, or as part of safety systems to detect touch or human presence.
The KY-036 touch sensor module contains a metal plate and associated electronics
that detect the presence of a nearby finger or hand. When a person touches the metal
surface, the system detects the change in capacitance caused by the human body’s
conductivity and generates a digital or analog signal to trigger actions in
microcontroller-based systems.
The module is commonly used with Arduino or similar platforms, and it can be used
in a variety of projects that require a touch-sensitive or proximity sensor, such as
touch-sensitive switches, human presence detection, and interactive devices.
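Reading the KY-036 from the Arduino follows the same digital/analog pattern as the other KY-series modules. In the sketch below the digital output on pin 3, the analog output on A1, and the active-HIGH polarity are all assumptions to be verified against the actual board.

// Illustrative read of the KY-036 touch sensor (pins and polarity assumed).
const int TOUCH_DO = 3;    // digital output: assumed to go HIGH on touch
const int TOUCH_AO = A1;   // analog output: raw signal level

void setup() {
  Serial.begin(9600);
  pinMode(TOUCH_DO, INPUT);
}

void loop() {
  if (digitalRead(TOUCH_DO) == HIGH) {
    Serial.print("Touch detected, analog level: ");
    Serial.println(analogRead(TOUCH_AO));
  }
  delay(100);
}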
4.2.11 Passive Infrared Sensor (PIR Sensor)
The PIR sensor (Passive Infrared Sensor) is a type of motion sensor that detects infrared
radiation (heat) emitted by warm objects such as human bodies. PIR sensors are widely
used in motion detection applications such as security systems, lighting control, and
automation. They are also an essential component in devices like automatic doors, alarm
systems, and energy-saving systems.
The PIR sensor can detect movement in an environment by sensing the infrared radiation
emitted by warm bodies. Once movement is detected, the sensor outputs a signal that can
trigger various actions such as turning on lights or triggering alarms.
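Because the PIR module simply raises its output HIGH while motion is seen, the Arduino only has to watch one digital pin. The pin number (4) and the red/green indicator LEDs mentioned in the introduction (assumed here on pins 10 and 11) are illustrative choices, as is the 30-second warm-up delay typical of common PIR modules.

// Illustrative motion indication with a PIR sensor (pin numbers assumed).
const int PIR_PIN   = 4;
const int RED_LED   = 10;   // lit while motion is detected
const int GREEN_LED = 11;   // lit when the area is clear

void setup() {
  pinMode(PIR_PIN, INPUT);
  pinMode(RED_LED, OUTPUT);
  pinMode(GREEN_LED, OUTPUT);
  delay(30000);              // allow the PIR element to warm up and settle
}

void loop() {
  int motion = digitalRead(PIR_PIN);       // HIGH while motion is detected
  digitalWrite(RED_LED, motion);
  digitalWrite(GREEN_LED, motion ? LOW : HIGH);
  delay(100);
}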
4.2.12 Flame Sensor
A Flame Sensor is a device designed to detect the presence of a flame, fire, or heat sources
in its proximity. It is widely used in fire detection systems, safety alarms, and various other
applications requiring detection of flame-related events. The Flame Sensor Module for
Arduino typically uses an infrared (IR) sensor or a photoresistor to detect specific
wavelengths of light emitted by flames, usually in the infrared or visible light spectrum.
The Flame Sensor module provides an easy way to detect fire, making it suitable for
applications like fire safety systems, fire alarms, and robotic fire detection.
Once the sensor detects a flame, it sends a signal to the connected microcontroller (like an
Arduino) through its output pin, which can be used to trigger an event, such as sounding an
alarm or turning on a cooling system.
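Tying the flame sensor to the buzzer follows the same pattern as the other sensors. In the sketch below the module's digital output is assumed on pin A2 (used as a digital input) and assumed to go LOW when a flame is in view; both the pin and the polarity must be verified against the specific board.

// Illustrative flame alarm: flame sensor digital output on A2, buzzer on pin 6
// (pins and output polarity are assumed).
const int FLAME_DO = A2;
const int BUZZER   = 6;

void setup() {
  pinMode(FLAME_DO, INPUT);
  pinMode(BUZZER, OUTPUT);
}

void loop() {
  if (digitalRead(FLAME_DO) == LOW) {   // many flame modules go LOW on detection
    digitalWrite(BUZZER, HIGH);         // raise the alarm
    // the pump driver of Section 4.2.4 could also be switched on here
  } else {
    digitalWrite(BUZZER, LOW);
  }
  delay(100);
}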
4.2.13 Robotic Arm
Robotic arms are widely used in many fields, including healthcare, research, and even for educational purposes. They are often equipped with
motors, actuators, sensors, and controllers to perform tasks such as lifting, moving,
assembling, or interacting with objects in a controlled environment.
In this guide, we will cover the components, working principles, and applications of robotic
arms, focusing on the Arduino-based robotic arm, which is a popular choice for both
hobbyists and professionals working on automation and robotics projects.
• Servo Motors: The most commonly used actuators in robotic arms, as they provide
precise control of the arm's movements. Each joint in the arm is typically driven by
a separate servo motor.
• Stepper Motors: Used in some robotic arms for more accurate control of rotation
and positioning.
• DC Motors: In some designs, DC motors are used for less precise but more robust
movements, especially for applications requiring more power.
b. Joints:
• The arm's movement is divided into multiple joints (typically 3 to 6), each providing
a specific degree of freedom (DOF). These joints can move in different ways:
o Rotational Joints: Allow the arm to rotate around a fixed axis.
o Linear Joints: Allow linear motion (like a slide or extension).
o Combination of Both: Some joints allow both rotational and linear
movement.
c. End Effector:
• The end effector is the "tool" or "hand" of the robotic arm. This could be a gripper,
vacuum, or specialized tool (e.g., a welding torch, soldering iron, or 3D printer
nozzle).
• The type of end effector used depends on the task the robotic arm is designed to
perform.
d. Sensors:
• Position Sensors: Feedback from the arm's joints to know the exact position of each
servo motor.
• Force Sensors: Measure the force applied by the arm on objects (important for
delicate tasks like assembly or picking).
• Infrared (IR) Sensors or Ultrasonic Sensors: Used for collision avoidance and object
detection.
• Cameras: In some advanced robotic arms, cameras or vision systems are used to
guide movements and recognize objects for manipulation.
e. Microcontroller/Controller:
• A microcontroller (e.g., Arduino, Raspberry Pi, ESP32) is used to control the
movements of the robotic arm. It receives signals from input devices (e.g., joystick,
keyboard, or sensors) and sends control signals to the motors.
• Common controllers include Arduino boards for simple projects or Raspberry Pi for
more advanced processing and vision-based tasks.
f. Power Supply:
• A stable DC power supply is required to power the motors and microcontroller. The
power needs will vary depending on the size and complexity of the robotic arm,
with larger arms requiring more powerful supplies.
g. Communication Modules:
• Wireless Communication: Robotic arms can be controlled remotely using wireless
communication technologies like Bluetooth, Wi-Fi, or RF modules.
• Wired Communication: For more precision and reliability, robotic arms can also be
controlled via USB or serial communication. [18]
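Since the arm's servos are driven through the PCA9685 module of Section 4.2.7, one way to command a joint from the Arduino is through the Adafruit PWM servo driver library over I2C (SDA/SCL). The channel number, the 150-600 pulse-count range and the angles used below are assumptions that must be matched to the actual servos and linkage limits.

// Illustrative control of one arm joint through the PCA9685 (default I2C address 0x40).
// Channel 0 and the pulse range 150-600 (out of 4096 ticks) are assumed values.
#include <Wire.h>
#include <Adafruit_PWMServoDriver.h>

Adafruit_PWMServoDriver pwm = Adafruit_PWMServoDriver();

const int BASE_CHANNEL = 0;
const int SERVO_MIN = 150;              // pulse count for roughly 0 degrees
const int SERVO_MAX = 600;              // pulse count for roughly 180 degrees

void setAngle(int channel, int angle) {
  int pulse = map(angle, 0, 180, SERVO_MIN, SERVO_MAX);
  pwm.setPWM(channel, 0, pulse);        // pulse starts at tick 0 and ends at 'pulse'
}

void setup() {
  pwm.begin();
  pwm.setPWMFreq(50);                   // standard 50 Hz servo refresh rate
}

void loop() {
  setAngle(BASE_CHANNEL, 30);           // swing the base joint one way
  delay(1000);
  setAngle(BASE_CHANNEL, 150);          // and back
  delay(1000);
}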
4.2.14 Solar Panel
A solar panel (also known as a photovoltaic (PV) panel) is a device that converts sunlight
into electricity using the photovoltaic effect. Solar panels are an essential component of
solar power systems, used to harness renewable energy from the sun. They have wide
applications, ranging from residential and commercial power generation to powering small
electronic devices.
In this guide, we will cover the components, working principles, and applications of
solar panels, focusing on how they work, how to integrate them with systems like Arduino
for energy monitoring or control, and how to set up a simple solar power system.
The electrical output from a solar panel can be used directly or stored in batteries for later
use. Solar panels are often integrated into larger systems, which can include solar charge
controllers, batteries, and inverters to make the energy usable for devices or to feed it into
the grid.
d. Back Sheet:
• The back sheet, typically made of a durable, insulating material, helps protect the
solar cells from moisture, dust, and other contaminants. It also prevents electrical
short circuits by isolating the cells.
e. Junction Box:
• A junction box is attached to the back of the solar panel and houses the electrical
components (diodes and wiring) that connect the panel to external devices, like a
charge controller or battery.
f. Diodes:
• Bypass diodes are used to prevent power loss in case one of the solar cells in the
panel becomes shaded or malfunctioning. They allow current to bypass the affected
cell and maintain the flow of electricity. [19]
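For the Arduino-based energy-monitoring role mentioned above, the panel or battery voltage can be read on an analog pin through a resistive divider. The divider values (30 kΩ / 7.5 kΩ) and the A3 pin in this sketch are assumed, chosen only so that a panel of up to about 25 V stays within the Arduino's 5 V input range.

// Illustrative panel-voltage monitor: panel -> 30k/7.5k divider -> A3 (values assumed).
const int   PANEL_PIN = A3;
const float DIVIDER   = (30000.0 + 7500.0) / 7500.0;    // scales the reading back up (x5)

void setup() {
  Serial.begin(9600);
}

void loop() {
  float vAdc   = analogRead(PANEL_PIN) * 5.0 / 1023.0;  // voltage seen at the pin
  float vPanel = vAdc * DIVIDER;                        // estimated panel voltage
  Serial.print("Panel voltage: ");
  Serial.println(vPanel);
  delay(1000);
}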
4.2.15 Buzzer
A buzzer is an electrical device that produces a sound (usually a beeping or buzzing noise)
in response to an electrical signal. It is commonly used in electronic circuits and devices to
alert, warn, or indicate a specific event or action, such as an alarm, notification, or error
signal.
Buzzers are simple components that are widely used in projects with Arduino, Raspberry
Pi, or other microcontrollers. They come in two main types: active buzzers and passive
buzzers. Each type has its own characteristics and use cases.
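The difference between the two types shows up directly in code: an active buzzer only needs a DC level, while a passive buzzer needs a driven frequency, for which the Arduino tone() function can be used. The pin number (6) below is an assumption.

// Illustrative buzzer use (pin 6 assumed).
const int BUZZER = 6;

void setup() {
  pinMode(BUZZER, OUTPUT);
}

void loop() {
  digitalWrite(BUZZER, HIGH);   // active buzzer: a steady HIGH is enough to beep
  delay(500);
  digitalWrite(BUZZER, LOW);
  delay(500);
  tone(BUZZER, 1000, 250);      // passive buzzer: generate a 1 kHz tone for 250 ms
  delay(1000);
}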
4.2.16 Camera
Fig 4.17: Camera
A camera is a device that captures light and records images, either as still photographs or
moving images (videos). It does this through a combination of optics (lenses), sensors, and
electronics that work together to convert light into a digital or physical image.
4.3 Software Components
4.3.1 Arduino IDE
The Arduino IDE compiles a user sketch into an executable cyclic executive program with
the GNU toolchain, also included with the IDE distribution. The Arduino IDE employs the
program avrdude to convert the executable code into a text file in hexadecimal encoding
that is loaded into the Arduino board by a loader program in the board's firmware. By
default, avrdude is used as the uploading tool to flash the user code onto official Arduino
boards. With the rising popularity of Arduino as a software platform, other vendors started
to implement custom open-source compilers and tools (cores) that can build and upload
sketches to other microcontrollers that are not supported by Arduino's official line of
microcontrollers.
In October 2019 the Arduino organization began providing early access to a new Arduino
Pro IDE with debugging and other advanced features. Arduino is an open-source hardware
and software company, project and user community that designs and manufactures single-
board microcontrollers and microcontroller kits for building digital devices. Its hardware
products are licensed under a CC-BY-SA license, while the software is licensed under the
GNU Lesser General Public License (LGPL) or the GNU General Public License (GPL),
permitting the manufacture of Arduino boards and software distribution by anyone.
Arduino boards are available commercially from the official website or through
authorized distributors. Arduino board designs use a variety of microprocessors and
controllers. The boards are equipped with sets of digital and analog input/output (I/O) pins
that may be interfaced to various expansion boards ('shields') or breadboards (for
prototyping) and other circuits. The boards feature serial communications interfaces,
including Universal Serial Bus (USB) on some models, which are also used for loading
programs from personal computers.
The microcontrollers can be programmed using the C and C++ programming languages,
using a standard API which is also known as the "Arduino language". In addition to using
traditional compiler toolchains, the Arduino project provides an integrated development
environment (IDE), and a command line tool (Arduino-cli) developed in Go. The Arduino
project began in 2005 as a tool for students at the Interaction Design Institute Ivrea in Ivrea,
Italy, aiming to provide a low-cost and easy way for novices and professionals to create
devices that interact with their environment using sensors and actuators. Common examples
of such devices intended for beginner hobbyists include simple robots, thermostats and
motion detectors.
The name Arduino comes from a bar in Ivrea, Italy, where some of the founders of the
project used to meet. The bar was named after Arduin of Ivrea, who was the margrave of
the March of Ivrea and King of Italy from 1002 to 1014. A program for Arduino hardware
may be written in any programming language with compilers that produce binary machine
code for the target processor. Atmel provides a development environment for their 8-bit
AVR and 32-bit ARM Cortex-M based microcontrollers: AVR Studio (older) and Atmel
Studio (newer).
4.3.1.1 IDE
On October 18, 2019, Arduino Pro IDE (alpha preview) was released. The system still uses
Arduino CLI (Command Line Interface), but improvements include a more professional
development environment, autocompletion support, and Git integration. The application
frontend is based on Eclipse Theia, an open-source IDE.
4.3.1.3 Sketch
A sketch is a program written with the Arduino IDE. Sketches are saved on the
development computer as text files with the file extension .ino. Arduino Software (IDE)
pre-1.0 saved sketches with the extension .pde.
• setup(): This function is called once when a sketch starts after power-up or reset.
It is used to initialize variables, input and output pin modes, and other libraries
needed in the sketch. It is analogous to the function main().
• loop(): After the setup() function exits (ends), the loop() function is executed
repeatedly in the main program. It controls the board until the board is powered
off or is reset. It is analogous to an infinite loop such as while(1). [22]
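Putting the two functions together, a complete minimal sketch has the following shape; the built-in LED blink used as the body here is only a placeholder.

// Minimal complete sketch: setup() runs once, loop() runs forever.
void setup() {
  pinMode(LED_BUILTIN, OUTPUT);      // one-time initialization
}

void loop() {
  digitalWrite(LED_BUILTIN, HIGH);   // repeated forever until reset or power-off
  delay(1000);
  digitalWrite(LED_BUILTIN, LOW);
  delay(1000);
}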
4.3.2. Embedded C
A precise and stable characteristic feature is that no or not all functions of embedded
software are initiated/controlled via a human interface, but through machine interfaces
instead. Manufacturers build embedded software into the electronics of cars, telephones,
modems, robots, appliances, toys, security systems, pacemakers, televisions and set-top
boxes, and digital watches, for example. This software can be very simple, such as
lighting controls running on an 8-bit microcontroller with a few kilobytes of memory with
the suitable level of processing complexity determined with a Probably Approximately
Correct Computation framework (a methodology based on randomized algorithms) or can
become very sophisticated in applications such as airplanes, missiles, and process control
systems.[23]
Chapter 5
RESULT ANALYSIS
Fig 5.3: Capturing the visuals using the installed camera on the robot
Fig 5.4: Obstacle Sensor in the Robot Sensing the Obstacle in Front of it
Fig 5.7: Gas Sensor in the Robot Sensing Harmful Gases Present in the
Surroundings
Fig 5.8: Some of the Footage Captured by the Camera, Controlled through
the Application
Chapter 6
CONCLUSION AND FUTURE SCOPE
6.1 Conclusion
With these advancements, the robot is well-positioned to evolve into a highly autonomous
system capable of navigating its environment, performing remote operations via Bluetooth,
and even functioning in security and surveillance roles by capturing real-time video feed.
The camera also opens possibilities for incorporating machine learning algorithms for
image processing, improving object detection, and enabling the robot to adapt its behavior
based on visual input.
The future scope of this robot is rich with possibilities, including enhanced sensing, AI-
driven decision-making, advanced robotic manipulation, and seamless integration into
smart environments. By leveraging additional sensors, cloud connectivity, and machine
learning, the robot can evolve into a more intelligent and autonomous assistant, capable of
performing tasks ranging from household assistance to industrial automation and healthcare
support.
Ultimately, this multi-purpose robot provides a solid foundation for future innovations,
offering great potential to adapt to a wide variety of real-world applications while providing
a scalable and flexible platform for further research and development in robotics.
6.2 Future Scope
The multi-purpose robot, equipped with advanced sensors like IR, PIR, flame, and a
camera, as well as a robotic arm and various control components, has significant potential
for further development. As technology continues to advance, there are numerous
opportunities to enhance the robot’s capabilities, enabling it to perform more complex tasks
autonomously, interact more intuitively with humans, and be applied across various
industries. Below are the key areas where this robot could evolve:
7. Internet of Things (IoT) and Cloud Integration (Remote control, data analysis)
REFERENCES
[1] Fang Lin Luo, Senior Member, IEEE, "Design of Solar-Panel Energy System," Nanyang
Technological University, Singapore.
[3] Thair Ali Salh and Mustafa Zuhaer Nayef, "Intelligent Surveillance Robot," Department
of Computer Technology Engineering, Technical College-Mosul, Mosul, Iraq.
[4] C. Mohan Raj, Ramya M, W. Rajanbabu, Manjuladevi, Tanuja M, and Hari Dharshini S,
"Material Handling Using Pick and Place Robot," Department of Electrical and Electronics
Engineering, Sri Eshwar College of Engineering, Coimbatore, India.
[5] Mehedi Hasan, Ritu Parna Das, Rafiul Islam Sonet, and Joel Ahmed, "Design and
Development of a Fire Fighting Mobile Surveillance Robot with Autonomous Collision
Avoidance," Dept. of Electrical and Electronic Engineering, American International
University - Bangladesh, Dhaka 1229, Bangladesh.
[6] https://en.wikipedia.org/wiki/Power_supply_unit_(computer)
[7] https://docs.arduino.cc/hardware/uno-rev3/
[8] https://www.geeksforgeeks.org/dc-motor/
[9] https://www.digchip.com/datasheets/parts/datasheet/966/ILN2003AN.php
[10] https://components101.com/modules/l293n-motor-driver-module
[11] https://en.wikipedia.org/wiki/Pump
[12] https://learn.adafruit.com/16-channel-pwm-servo-driver/overview
[13] https://www.circuits-diy.com/hw201-infrared-ir-sensor-module/
[14] https://www.elprocus.com/an-introduction-to-mq2-gas-sensor/
[15]https://www.electroniclinic.com/ky-036-metal-touch-sensor-module-arduino-circuit-diagram-
code/
[16] https://www.electronicwings.com/arduino/pir-sensor-interfacing-with-arduino-uno
[17] https://www.elprocus.com/flame-sensor-working-and-its-applications/
[18] https://en.wikipedia.org/wiki/Robotic_arm
[20] https://www.elprocus.com/buzzer-working-applications/
[21] https://en.wikipedia.org/wiki/Camera
[22] https://www.arduino.cc/en/software
[23] https://www.geeksforgeeks.org/embedded-c/