
International Journal of Trend in Scientific Research and Development (IJTSRD)

Volume 6 Issue 7, November-December 2022 Available Online: www.ijtsrd.com e-ISSN: 2456 – 6470

A Custom-designed 3D Printed 4-DoF Mobile Robotic Arm for Remote Exploration
Naman Tanwar, Sanjay Kumar Singh
School of Electronics Engineering, Vellore Institute of Technology, Vellore, Tamil Nadu, India

ABSTRACT
Exploration of unreachable or hazardous places is key to scientifically enriching our understanding of the universe, and remotely performing operations at such locations opens up not only exploration but also the possibility of providing medical aid where direct human intervention is not possible. This project aims to serve this goal of remote exploration and operation. A custom-designed, 3D-printed 4-DoF robotic arm is used for operations in remote terrain, mounted on a movable, rigid, custom-designed aluminium chassis with multiple distance sensors and a camera for remote exploration and navigation.

KEYWORDS: Robotic Arm, Mobile vehicle, Remotely Operated, ESP32, Raspberry Pi, Servo Motor, Blynk, Cloud service

How to cite this paper: Naman Tanwar | Sanjay Kumar Singh, "A Custom-designed 3D Printed 4-DoF Mobile Robotic Arm for Remote Exploration", published in International Journal of Trend in Scientific Research and Development (IJTSRD), ISSN: 2456-6470, Volume-6 | Issue-7, December 2022, pp. 857-862, Paper ID IJTSRD52430, URL: www.ijtsrd.com/papers/ijtsrd52430.pdf

Copyright © 2022 by author(s) and International Journal of Trend in Scientific Research and Development Journal. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (CC BY 4.0) (http://creativecommons.org/licenses/by/4.0)

I. INTRODUCTION
There are many places where human intervention is not possible simply because it might cost a human life. The need to intervene in such places may be to explore those parts of the world, to fix critical faults where human presence is very dangerous, to save another human life by providing medical aid, or for military reasons such as monitoring terrorist operations.

Hence, to serve these purposes, this paper presents a project that aims to build a remotely controlled robot that can explore, navigate, operate, and be manually controlled in such remote places.

The project is a 3D-printed 4-DoF robotic arm mounted on a custom-designed, aluminium-based movable chassis carrying different sensors and a camera. Ultrasonic sensors provide the available distance in all four directions, and a camera is used to visually inspect the target on which the robot will operate. The whole construction can be controlled remotely via the internet; the only requirement is that the operating location has mobile internet coverage. This constrains where the robot can operate, but different modes of connectivity can be used according to the location where the robot will work.

II. Design and Implementation
A. Methodology
The working of the project can be summarised as follows:
- Two ultrasonic sensors are used to measure the distance to any obstacle in any of the four directions.
- The front ultrasonic sensor measures the distance to the obstacle in front. This distance can then be used by the robotic arm to operate on that obstacle.
- The rear ultrasonic sensor is mounted on a servo motor, which allows the distance to an obstacle to be measured in three directions using only one sensor.
- The wheels and the robotic arm are controlled by an ESP32 microcontroller.
- A live camera feed is set up using a Raspberry Pi and can be accessed remotely from any location, making the robot well suited for operations in remote locations.

- The robot can be controlled remotely using the Blynk IoT platform.
- Temporarily, a 12 V SMPS is used instead of a LiPo battery to power the robot.

B. Mechanical Design and Implementation

Fig 1 Mechanical Design and Implementation

The complete mechanical structure was designed in Fusion 360, keeping in mind real-world parameters such as the servo motor size, motor diameter, wheel size, circuit board dimensions, power supply dimensions, the centre of movement, etc. About 14 individual parts were designed and then 3D printed. The robotic arm parts, sensor mount, and PCB mount were printed with PLA filament.

Fig 2 Robot Chassis

The robot chassis was made from 2020 aluminium extrusion profile. The profile was cut with an angle grinder according to the 3D design and then joined together with lead-based corner brackets. The mounts for the camera were also made from this aluminium profile.

The camera is mounted horizontally with respect to the ground to get a perfectly aligned 2D image of the ground, which is later useful for performing operations on ground objects with the robotic arm.

The orange and white acrylic sheets (as seen in Fig. 1) were also designed in Fusion 360 and then laser cut through an online service (www.robu.in) to mount the front sensor and the DC geared motors [3].

Two 100 rpm geared DC motors mounted on the orange acrylic sheets serve as the main motors that move the entire robot around. Two wheels are attached to the geared motors to provide mobility; the motors with wheels are mounted horizontally, centred with respect to the complete chassis (including the width of the power supply).

Fig 3 Robot's bottom view: 4 castor wheels, the PCB, the 3D printed PCB holder plates, and the 2 main 100 mm wheels can be observed.

The wheel diameter is 100 mm, which lifts the whole chassis by 22 mm; hence, castor wheels of about 22 mm were used at the front and back to provide stability to the robot.
C. Circuit Diagram, Components used, and Circuit:

Fig. 4. Circuit Diagram

The circuit has three parts:
- Power supply
- ESP32 and connected devices
- Raspberry Pi and camera

1. Power Supply
A 12 V, 30 A rated SMPS is used to supply power to the entire system. Since this power supply is temporary, it can be replaced with a suitable LiPo battery (preferably 3S, 35C, 3500 mAh) for mobility [7]. The voltage is then stepped down to 5 V using buck converters so that it can be used to power the other components.

Each servo (Tower Pro MG995) can draw 1.5-2 A under stall conditions, so all six servos together can draw up to 12 A. Hence, a separate buck converter (rated 20 A) is used to power the servos alone, while the remaining buck converter (rated 10 A) powers the rest of the circuit.

The ESP32 is used as the microcontroller because it has built-in WiFi, which is used to connect to the internet. A JioFi hotspot provides the connection; since it is based on mobile internet, the hotspot itself is mounted on the robot and therefore does not restrict the robot's mobility.

2. ESP32 and Connected Devices
Six Tower Pro MG995 servos are connected to the digital pins of the ESP32 microcontroller (D2, D4, D5, D15, D18, D19). An L298N motor driver controls the two 200 RPM geared DC motors: its inputs are connected to D12, D13, D14, and D27, while the enable pins are connected to D23 and D22. Two HC-SR04 ultrasonic sensors are used to measure the distance in all four directions; digital pins D25, D26, D32, and D33 are used for their connections.
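To make the wiring above concrete, the following is a minimal Arduino-IDE sketch (not the authors' code, which is available through reference [1]) showing how the L298N inputs/enables and one HC-SR04 sensor could be driven from the ESP32. The GPIO numbers are assumed to correspond to the D-labels listed above, and the per-sensor pin grouping is an assumption.

```cpp
// Minimal illustration of the wiring described above (assumed pin mapping,
// not the authors' original code; see reference [1] for that).
const int IN1 = 12, IN2 = 13, IN3 = 14, IN4 = 27;  // L298N motor inputs
const int ENA = 23, ENB = 22;                      // L298N enable pins
const int TRIG_FRONT = 25, ECHO_FRONT = 26;        // assumed front HC-SR04 pins

void setup() {
  int outputs[] = {IN1, IN2, IN3, IN4, ENA, ENB, TRIG_FRONT};
  for (int p : outputs) pinMode(p, OUTPUT);
  pinMode(ECHO_FRONT, INPUT);
  digitalWrite(ENA, HIGH);   // full speed; PWM on ENA/ENB would allow speed control
  digitalWrite(ENB, HIGH);
  Serial.begin(115200);
}

// Drive both wheels forward; changing the IN patterns reverses or brakes.
void forwardDrive() {
  digitalWrite(IN1, HIGH); digitalWrite(IN2, LOW);
  digitalWrite(IN3, HIGH); digitalWrite(IN4, LOW);
}

void brake() {
  digitalWrite(IN1, LOW); digitalWrite(IN2, LOW);
  digitalWrite(IN3, LOW); digitalWrite(IN4, LOW);
}

// Read one HC-SR04: 10 us trigger pulse, echo width in us / 58 ~= distance in cm.
long frontDistanceCm() {
  digitalWrite(TRIG_FRONT, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_FRONT, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_FRONT, LOW);
  long echoUs = pulseIn(ECHO_FRONT, HIGH, 30000);  // 30 ms timeout
  return echoUs / 58;
}

void loop() {
  Serial.println(frontDistanceCm());
  forwardDrive();
  delay(500);
  brake();
  delay(500);
}
```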
3. Raspberry Pi and Camera
A Raspberry Pi camera module is connected to a Raspberry Pi 3B+, which is used to view the camera preview remotely. The previously mentioned 5 V, 10 A buck converter directly powers the Raspberry Pi through its GPIO pins.

Fig 5 Circuit

The circuit is built on a PCB (Veroboard). Male header pins are used to connect the servo motors, ultrasonic sensors, motor driver inputs, and the power input for the Raspberry Pi; female headers are used to mount the ESP32 on the PCB. Two buck converters are mounted with screws in the top-left of the PCB, and the L298N motor driver is mounted with screws at the bottom.

D. Raspberry Pi Camera Setup
- A web interface is set up for the Raspberry Pi camera through an open-source GitHub project, but it can only be viewed locally [2].
- Then, using the remote.it platform, the local webpage is streamed to the internet and can be accessed through a specific URL [4].
- This URL is then used in the Blynk IoT platform to watch the video stream through the mobile app [5].
E. Servo Calibration
The MG995 servos used for the arm were not calibrated to move to the exact requested angle. For an input of 0 to 147 degrees (written from the Arduino IDE), the servo physically moved from 0 to 180 degrees, so the servos were calibrated accordingly in software, as sketched below. The fully open and fully closed points for the claw servo were found to be 180 and 140 degrees, respectively.
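A minimal sketch of the calibration just described, assuming the ESP32Servo Arduino library and an arbitrarily chosen servo pin; the 0-147 write range and the claw limits are the values reported in this section, while everything else is illustrative.

```cpp
// Illustrative servo-calibration sketch (assumes the ESP32Servo library;
// the pin number and function name are placeholders, not the authors' code).
#include <ESP32Servo.h>

Servo baseServo;
const int BASE_SERVO_PIN = 2;          // assumed: one of D2/D4/D5/D15/D18/D19

const int CLAW_OPEN   = 180;           // claw fully open (from calibration)
const int CLAW_CLOSED = 140;           // claw fully closed (from calibration)

// Writing 0-147 made the MG995 sweep a physical 0-180 degrees, so the
// desired physical angle is re-mapped before being written to the servo.
int calibratedWriteValue(int physicalDeg) {
  return map(physicalDeg, 0, 180, 0, 147);
}

void setup() {
  baseServo.attach(BASE_SERVO_PIN);
  baseServo.write(calibratedWriteValue(90));   // move to a true 90 degrees
}

void loop() {}
```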

F. Movement Equations

Fig 6 2D constrained sketch in Fusion 360

A 2D constrained line sketch of the arm was set up in Fusion 360. For different values of the horizontal distance, the corresponding joint-angle values were measured and then entered into Microsoft Excel to fit polynomial equations relating the distance to each angle.
Fig 7 Movement Equations
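As an illustration of how such fitted equations could be evaluated on the ESP32, the snippet below assumes a quadratic fit; the coefficient values are placeholders, not the polynomials actually derived in Fig 7.

```cpp
// Illustrative evaluation of a fitted movement equation (placeholder
// coefficients -- the real polynomials are the ones shown in Fig 7).
float calcShoulderTheta(float horizontalDistCm) {
  // Example quadratic fit from Excel: theta = a*d^2 + b*d + c
  const float a = -0.02f, b = 1.8f, c = 35.0f;   // hypothetical values
  return a * horizontalDistCm * horizontalDistCm
       + b * horizontalDistCm
       + c;
}
```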
G. Mobile App using Blynk

Fig 8 Mobile App using Blynk IoT platform

The Blynk IoT platform provides the IoT functionality for the robot. It also provides an app builder with which we can build our own app using its ready-made widgets.

Features of our mobile application:
- Display for the camera output
- Horizontal and vertical sliders to control the arm position
- Sliders to control the wrist angle and the claw
- Joystick to move the robot
- Buttons to perform programmed operations such as Automatic Turn, Moving Arm to Object Location, and a Home button to return the arm to its home position
- LIFT button to lift the arm up
- Terminal window for debugging

H. ESP32 Code Flow Chart and Explanation
The code for the ESP32 microcontroller is written in C++ in the Arduino IDE, with additional libraries installed for the Blynk platform and servo control. Since the code has more than 500 lines, only an explanation and a flowchart are presented in this paper; the actual code can be viewed using the GitHub link provided in the references [1].

Fig 9 Basic Flow Chart of the Operation

Initially, the ESP32 connects to the WiFi hotspot using the credentials provided in the code itself. The device then waits for any virtual pin to be triggered through the application created with the Blynk platform. Vx denotes a virtual pin, where 'x' is the pin number. The various functions involved in the operation are listed in TABLE I.
TABLE I. Various Functions Triggered by the Virtual Pins and their Description

Virtual Pin / Data Stream | Description of BLYNK_WRITE(Vx) | Additional Functions Called During Operation
V0 | Slider control for the base (horizontal slider) | BaseServoOperate() – custom function to rotate the base servo
V2 | Terminal window | GetSensorData() – gets the data from both ultrasonic sensors; FrontDist() – gets the front sensor distance data
V3 | Joystick control | forward(), backward(), right(), left(), Brake() – these functions control the six pins of the motor driver, which in turn controls the two motors
V4 | Vertical slider | calcTheta(vertValue) – calculates theta from the vertical slider value according to the formulae derived earlier
V5 | Slider to rotate the wrist | WristServoOperate() – operates the wrist servo based on the slider value
V6 | Slider to operate the claw | ClawServoOperate() – operates the claw servo based on the slider value
V7 | Home button | Home() – moves the servos to the home positions defined in the code
V8 | AutoTurn button | autoTurn() – first calculates the distance in all four directions and then decides to rotate in the preferred direction
V9 | ArmToObject button | FrontDist() – obtains the front distance value from the sensor; calcTheta() – calculates the theta values based on the front sensor and then operates the servos; BaseServoOperate() – operates the base servo
V10 | Lift button | ArmServoOperate() – rotates the servos based on the two angle values
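The following is a minimal sketch (standing in for the authors' 500+ line program in [1]) of how the virtual pins in TABLE I map onto BLYNK_WRITE handlers in the Arduino IDE. The credentials, template identifiers, and the commented-out helper calls are placeholders, not values from the original project.

```cpp
// Minimal Blynk handler sketch for an ESP32 (illustrative; credentials,
// template IDs and helper functions are placeholders).
#define BLYNK_TEMPLATE_ID   "TMPLxxxxxxxx"      // placeholder
#define BLYNK_TEMPLATE_NAME "MobileRoboticArm"  // placeholder
#define BLYNK_AUTH_TOKEN    "YourAuthToken"     // placeholder

#include <WiFi.h>
#include <BlynkSimpleEsp32.h>

char ssid[] = "JioFi-hotspot";   // placeholder WiFi credentials
char pass[] = "password";

WidgetTerminal terminal(V2);     // debug terminal widget on V2

BLYNK_WRITE(V0) {                // horizontal slider -> base servo
  int sliderValue = param.asInt();
  // BaseServoOperate(sliderValue);        // custom function from TABLE I
}

BLYNK_WRITE(V2) {                // terminal command, e.g. "sensor"
  String cmd = param.asStr();
  if (cmd == "sensor") {
    // GetSensorData(); FrontDist();       // read all four directions
    terminal.println("front/left/right/rear distances ...");
    terminal.flush();
  }
}

BLYNK_WRITE(V3) {                // joystick -> wheel motors
  int x = param[0].asInt();
  int y = param[1].asInt();
  // forward()/backward()/left()/right()/Brake() chosen from x and y
}

void setup() {
  Blynk.begin(BLYNK_AUTH_TOKEN, ssid, pass);
}

void loop() {
  Blynk.run();                   // polls the cloud and fires the handlers
}
```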

III. Working Demonstration

A. Automatic Arm Movement Towards Object

Fig 10 Automatic Arm Movement and Object Lifting

When the ArmToObject button is pressed, the front sensor measures the distance of the object from the robot, which is then used to move the robotic arm. The object can then be lifted simply by pressing the LIFT button in the mobile application.

B. Sensor Readings Through the Debug Window

Fig 11 Robot State and Sensor Readings

The robot is programmed such that if "sensor" is entered in the debug window, the robot measures the distance in all four directions and displays the readings in the debug window itself.

C. Automatic Turn

Fig 12 Robot initial state and state after the turn operation

When the Automatic Turn button is pressed, the robot first measures the distances in all four directions and then turns right or left based on the amount of free distance available.

It was found that when both motors spin in the same rotational direction (clockwise or anticlockwise) at the same base speed (about 100 rpm at ~12 V), the robot takes almost exactly 600 ms to make a perpendicular rotation. Hence, to turn left or right, the motors are driven clockwise or anticlockwise, respectively, for exactly 600 ms, as sketched below.
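A minimal sketch of how this timed turn could be coded, reusing the assumed L298N pin mapping from the circuit section; the wiring-dependent input pattern is an assumption, and the 600 ms constant is the value measured above. This is not the authors' autoTurn() implementation.

```cpp
// Illustrative 90-degree pivot based on the measured 600 ms timing
// (assumed L298N wiring; not the authors' autoTurn() implementation).
const int IN1 = 12, IN2 = 13, IN3 = 14, IN4 = 27;
const unsigned long QUARTER_TURN_MS = 600;   // measured at ~100 rpm, ~12 V

void pivot(bool turnRight) {
  // Driving the wheels in opposite travel directions (both motors in the
  // same rotational sense, as noted above) spins the chassis in place.
  digitalWrite(IN1, turnRight ? HIGH : LOW);
  digitalWrite(IN2, turnRight ? LOW  : HIGH);
  digitalWrite(IN3, turnRight ? LOW  : HIGH);
  digitalWrite(IN4, turnRight ? HIGH : LOW);
  delay(QUARTER_TURN_MS);
  digitalWrite(IN1, LOW); digitalWrite(IN2, LOW);   // stop both motors
  digitalWrite(IN3, LOW); digitalWrite(IN4, LOW);
}
```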
The entire robot's operation can be viewed in the demonstration videos available through the shared Google Drive link [6].

IV. Summary
In this paper, a practical model of the mobile robotic arm was presented. Initially, the complete robot was designed in Fusion 360, from which some parts were 3D printed, including the parts for the arm and structural support parts such as the PCB holder and power supply holder. The design gave an idea of how the real-world robot would look; it also helped in constructing the aluminium 2020 profile-based chassis and provided the layout for laser cutting the three acrylic sheets that were then used to mount the ultrasonic sensor and the motors.

The circuit was then designed and built on a PCB (Veroboard), keeping in mind the power requirements of every component. Temporarily, a 12 V SMPS was used, which can easily be replaced with a suitable LiPo battery in the future.

The movement equations were then practically derived, the different moving parts were calibrated, and finally, suitable code for the ESP32 was written in C++ using the Arduino IDE, with functions for setting the arm location using sliders, Auto Arm to Object, Arm to Home, the Lift operation, the Automatic Robot Turn operation, joystick movement, and getting data through the terminal. Finally, the working of the complete project was demonstrated.

V. Future Improvements
- Using a LiPo battery instead of the SMPS supply
- Using better servos (rather than stepper motors) which are more precise, have zero gear dead-zone, and can handle more weight (this may require a major design change and the implementation of a gear system)
- Coding an automatic area-mapping algorithm to perform a given task in a remote area without human intervention
- Using LIDAR instead of ultrasonic sensors to map the area
- Using image processing to identify the object along with its 3D dimensions

VI. References and Useful Links
[1] GitHub link to view this entire project: https://github.com/naman-tanwar/Mobile-Robotic-Arm
[2] GitHub link for the local Raspberry Pi camera setup: https://github.com/silvanmelchior/RPi_Cam_Web_Interface
[3] Online laser cutting service used: https://robu.in/product/online-laser-cutting-service
[4] Blynk IoT platform website: https://blynk.io/
[5] Remote.it website: https://remote.it/
[6] Google Drive link for videos related to the robot's operation: https://drive.google.com/drive/folders/1_x-DkT6cmrkyrrUYNz0MS6p55thI3KZy?usp=sharing
[7] Battery calculator: https://www.batteriesinaflash.com/c-rating-calculator
[8] Russell Barnes, "Power your Raspberry Pi: expert advice for a supply". Available at: https://magpi.raspberrypi.com/articles/power-supply
[9] Pavel Bayborodin, "How to get started with Blynk", posted on June 15, 2021. Available at: https://blynk.io/blog/how-to-get-started-with-blynk
[10] Official Fusion 360 design guide. Available at: https://help.autodesk.com/view/fusion360/ENU/courses/

