
MSA
October University for Modern Science & Arts
Faculty of Engineering
Department of Mechatronics System Engineering

Multitasking Voice-Controlled Robot

A Graduation Project
Submitted in Partial Fulfillment of B.Sc. Degree
Requirements in Mechatronics System Engineering
(Part II)

Prepared By

Student 1 Name: Ali Mohamed Ali Ahmed Barakat    ID1: 203679

Student 2 Name: Mohamed Ahmed Abdelhay Aly    ID2: 195499

Supervised By

Assoc. Prof. Mohamed Salah Eldin Shibat Elhamed

Spring 2024
Abstract
This study illustrates the concept of using voice recognition technology to manage
and command robots, with the aim of handling workplace tasks with ease and creativity.
The design combines several techniques, such as obstacle avoidance, tracking to a
specific point via a line follower, and carrying objects and placing them in the desired
spot via a robotic arm, with potential for further capabilities.

With the mechanical and circuit designs assembled, the microcontrollers are
programmed to work together and are made familiar with phrases in our language, so the
robot can recognize and analyze utterances and respond to programmed commands
immediately. Furthermore, improving the accuracy of the voice recognition elements is
preferable, to increase efficiency and reduce potential errors.

As a result, we built a robot that recognizes a wide variety of utterances; with the
right materials and design it carries tools for several field jobs, is programmed to
perform the instructions well, and serves as a prototype to inspire real field upgrades.

Keywords: Mobile Robot, Multitasking, AI, Voice Recognition, Robotics, Line
Follower, Robotic Arm, Voice Commands.

Acknowledgement
Praise be to Allah, the Most Gracious. Thanks to our professors and supervisors, we
were able to complete this project with their extensive knowledge and professional
experience in a variety of sectors. This endeavor would not have been possible without
their help and guidance.

Special thanks to Dr. Mohamed Shibat for his support, patience, thoughtfulness, and
for guiding us through the whole project. We also want to thank our families and
friends for supporting us and for tolerating our tempers when things got harsh. We've
come a long way and we still have a way to go, so this is to thank them and ask them to
keep supporting us for the rest of the road.

Thanks to our university, MSA, and all the faculty members for their constant support
and follow-up; they work continually to improve the courses and programs.

Finally, we owe a debt of gratitude to the engineering team and our colleagues for their
invaluable assistance.

Mohamed Ahmed Abdelhay

Ali Mohamed Ali

Table of Contents

Abstract...............................................................................................................................i
Acknowledgement..............................................................................................................i
Table of Contents.............................................................................................................iii
List of Figures...................................................................................................................iv
List of Tables....................................................................................................................vi
List of Abbreviations......................................................................................................viii
Activity Work Contribution.............................................................................................ix
Report Writing Contribution..........................................................................................viii
Graduation Project II – Time Plan.....................................................................................x
Chapter I Introduction...............................................................................1
1.1.1 Background of the Study.....................................................................................1
1.1.2 Problem Definition...............................................................................................2
1.1.3 Project Aim and Objectives.................................................................................2
Chapter II Literature Background............................................................3
2.1 Preface.....................................................................................................................3
2.2 Manipulator Design................................................................................................3
2.3 Motor Selection.......................................................................................................5
2.4 Sensor Integration...................................................................................................5
2.5 Microcontroller and Motor Driver..........................................................................6
2.6 Control algorithms..................................................................................................7
Chapter III Mechanical Design................................................................11
3.1 Preface...................................................................................................................11
3.2 Material Selection.................................................................................................11
3.3 Design and Specifications.....................................................................................12
3.4 Mecanum Wheels Modelling................................................................................13
3.5 Robotic Arm Modelling........................................................................................16
Chapter IV Electric Circuit Design.........................................................24
4.1 Motors...................................................................................................................24
4.2 Sensor Fusion........................................................................................................27
4.3 Modules.................................................................................................................29
4.4 Wiring Diagram....................................................34
Chapter V Control Algorithms................................................................36
5.1 Preface...................................................................................................................36
5.2 Speech Recognition...............................................37
5.3 Line Following......................................................................................................38
Chapter VI Implementation Process.....................................39
6.1 Preface...................................................................................................................39
6.2 Manufacturing process..........................................................................................39
6.3 Assembly...............................................................................................................40

Chapter IX Conclusions and Future Work............................43
9.1 Conclusion............................................................43
9.2 Future Work..........................................................43
References..................................................................................................48
Appendices.................................................................................................52
Appendix A: Plywood Mechanical Properties .............................................................52
Appendix B: Datasheets of electronic components ....................................54

List of Figures

Figure 1. 1: Concept of voice-recognition.........................................................................1


Figure 2. 1: Lynxmotion Lynx 6..........................................................................3
Figure 2. 2: Coordinate frame of the robotic arm..............................................................4
Figure 2. 3: Architecture of sensors and actuators............................................................6
Figure 2. 4: Scheme for motor drivers ..............................................................................7
Figure 2. 5: Suggested software system............................................................................8
Figure 2. 6: Proposed architecture of navigation system...................................................9
Figure 2. 7: Multilayer perceptron ANN..........................................................10
Figure 2. 8: Amazon warehouses mobile robot using QR code......................................10
Figure 3. 1: Grades of Finnish Plywood..........................................................................11
Figure 3. 2: 3D model of robot........................................................................................13
Figure 3. 3: Stress analysis of the top part.......................................................................14
Figure 3. 4: Analysis of the chassis base.........................................................................14
Figure 3. 5: Analysis of robotic arm base........................................................................14
Figure 3. 6: Analysis of link 1.........................................................................................15
Figure 3. 7: Analysis of link 2.........................................................................................15
Figure 3. 8: Analysis of link 1.........................................................................................15
Figure 3. 9: Model of a mecanum wheel with graphically marked angle.......................16
Figure 3.10: Chassis and wheels 3D model.....................................................................16
Figure 3.11: Analytical model of WMR with mecanum wheels.....................................17
Figure 3.12: 3D model of arm design..............................................................................18
Figure 3.13: Arm base drawing in SolidWorks...............................................19
Figure 3.14: Waist drawing in SolidWorks......................................................19
Figure 3.15: Link drawing in SolidWorks.......................................................20
Figure 3.16: Gripper drawing in SolidWorks..................................................20
Figure 3.17: Representation model of robotic arm in MATLAB software ....................21
Figure 3.18: Representation model of robotic arm in ROS Moveit................................23
Figure 3.19: The GUI program calculates forward and inverse kinematics variables....24
Figure 4. 1: DC geared motor..........................................................................................25
Figure 4. 2: MG995 servo motor.....................................................................................25
Figure 4. 3: Tower Pro SG90 servo motor......................................................................25
Figure 4. 4: HC-SR04 Ultrasonic Sensor........................................................................26
Figure 4. 5: IR sensor......................................................................................................27
Figure 4. 6: Arduino compatible line tracker sensor.......................................................28
Figure 4. 7: Arduino Mega2560......................................................................................29
Figure 4. 8: L298N Motor Driver Module......................................................................30
Figure 4. 9: Raspberry Pi 4 microcontroller....................................................................31
Figure 4. 10: External USB 7.1 audio card......................................................................32
Figure 4. 11: K35 Wireless microphone..........................................................................32
Figure 4. 12: Logic level converter.................................................................................33
Figure 4. 13: Logic level converter scheme....................................................................33
Figure 4. 14: Wiring diagram..........................................................................................34
Figure 4. 15: Powering Raspberry pi...............................................................................34
Figure 5. 1: STT model...................................................................................................37
Figure 5. 2: Line following technique.............................................................................38
Figure 6. 1: Laser cut process..........................................................................................39
Figure 6. 2: Press-fit chassis with bolts...........................................................................40
Figure 6. 3: Fixing components with bolts......................................................................41
Figure 6. 4: DC motor mounts and wheels .....................................................................41
Figure 6. 5: Assembling robotic arm...............................................................................42

List of Tables

Table 2. 1: Specifications of mechanical design.............................................................13


Table 2. 2: DH parameters of kinematics model.............................................................19
Table 8. 1: Detailed cost analysis....................................................................44

List of Abbreviations
AI Artificial Intelligence
DOF Degrees of Freedom
PWM Pulse Width Modulation
I/O Input and Output
CAD Computer Aided Design
IR Infrared
JARVIS Just A Rather Very Intelligent System
GPIO General Purpose Input/Output
ANN Artificial Neural Network
ASTM American Society for Testing and Materials
WMR Wheeled Mobile Robot
DH Denavit-Hartenberg
ROS Robot Operating System
GUI Graphical User Interface
SLAM Simultaneous Localization and Mapping
RNN Recurrent Neural Network
CTC Connectionist Temporal Classification
GRU Gated Recurrent Unit
IMU Inertial Measurement Unit
LiDAR Light Detection and Ranging

Activity Work Contribution


Part II
Project title: Multitasking voice-controlled robot.
Advisor name: Assoc. Prof. Mohamed Salah Eldin Shibat Elhamed

Week   Meetings             Work Done                              Students
01     Dr. Mohamed Shibat   Proposal                               Mohamed Ahmed Abdelhay, Ali Mohamed Ali
02     Dr. Mohamed Shibat   Literature Review Part 1               Mohamed Ahmed Abdelhay, Ali Mohamed Ali
03     Dr. Mohamed Shibat   Literature Review Part 2               Mohamed Ahmed Abdelhay, Ali Mohamed Ali
04     Dr. Mohamed Shibat   Mechanical Design                      Mohamed Ahmed Abdelhay, Ali Mohamed Ali
05     Dr. Mohamed Shibat   Stress Analysis of Mechanical Design   Mohamed Ahmed Abdelhay, Ali Mohamed Ali
06     Dr. Mohamed Shibat   Wheeled Mobile Car Modelling           Mohamed Ahmed Abdelhay, Ali Mohamed Ali
07-08  Midterms             Robotic Arm Modelling                  Mohamed Ahmed Abdelhay, Ali Mohamed Ali
09     Dr. Mohamed Shibat   Electric Circuit Part 1                Mohamed Ahmed Abdelhay, Ali Mohamed Ali
10     Dr. Mohamed Shibat   Electric Circuit Part 2                Mohamed Ahmed Abdelhay, Ali Mohamed Ali
11     Dr. Mohamed Shibat   Control Algorithm Background           Mohamed Ahmed Abdelhay, Ali Mohamed Ali
12-14  Dr. Mohamed Shibat   Control Algorithm Implementation       Mohamed Ahmed Abdelhay, Ali Mohamed Ali
14-18  Finals               Simulation                             Mohamed Ahmed Abdelhay, Ali Mohamed Ali

Report Writing Contribution


Part II
Project title: Multitasking voice-controlled robot.
Advisor name: Assoc. Prof. Mohamed Salah Eldin Shibat Elhamed
Chapter I Introduction Student Name
1.1 Background of the Study Mohamed Abdelhay
1.2 Problem Definition Ali Mohamed
1.3 Aim and Objectives Ali Mohamed
Chapter II Literature Background
2.1 Preface Ali Mohamed
2.2 Manipulator Design Ali Mohamed
2.3 Motor Selection Ali Mohamed
2.4 Sensors Integration Ali Mohamed
2.5 Microcontroller and Motor Driver Ali Mohamed
2.6 Control Algorithms Mohamed Abdelhay
Chapter III Mechanical Design
3.1 Material Mohamed Abdelhay
3.2 Design and specifications Mohamed Abdelhay
3.3 Stress analysis Mohamed Abdelhay
3.4 Mecanum wheels modelling Mohamed Abdelhay
3.5 Robotic arm modelling Ali Mohamed
3.6 Extinguishing System Mohamed Abdelhay
Chapter IV Electric Circuit Design
4.1 Preface Ali Mohamed
4.2 Motors Ali Mohamed
4.3 Sensors Ali Mohamed
4.4 Modules Ali Mohamed

Chapter V Control Algorithm
5.1 Preface Mohamed Abdelhay
5.2 Voice-recognition Mohamed Abdelhay
Chapter VI Implementation Process
6.1 Preface Mohamed Abdelhay
6.2 Manufacturing Process Mohamed Abdelhay
6.3 Assembly Mohamed Abdelhay
Chapter VII Control Algorithm
7.1 Preface Mohamed Abdelhay
7.2 Analysis Mohamed Abdelhay
Chapter VIII Testing and Results
8.3 Obstacle Avoidance Mohamed Abdelhay
8.4 Navigation and Path Planning Mohamed Abdelhay
Chapter IX Conclusions and Future Plan
9.1 Conclusions Ali Mohamed
9.2 Future work plan Ali Mohamed

Graduation Project II – Time Plan

[Gantt chart spanning weeks 1–4 of Oct., Nov., Dec., and Jan., covering the tasks:
Introduction; Literature Review; Mechanical Design; Electric Circuits; Control
Algorithm; Results and Discussion; Conclusion and Future Work; Book/Report Writing.]

Chapter I
Introduction
1.1 Background of the study:

Workers in many fields often need both low- and high-level assistance, whether to
contribute to the task at hand or to wrap up other tasks and thereby spare time for
other things. Hence, for decades mobile robots have been commonly utilized on
construction sites, in warehouses, and in industrial plants, not to mention in material
handling applications, which are gaining popularity.

That is why it was necessary to develop precise methods of controlling and directing
robotic devices in multiple ways. This is where voice-controlled robots come into the
light: voice is humans' most common mode of communication, and almost every
communication is carried out using speech signals.

Fig. 1.1: Concept of voice-recognition (bishoph.org)

Electrical signals can be generated from sounds and speech; voice recognition
techniques convert these voice signals into text or into instructions for a device or
computer. Such a system can be used to control the robot and to acknowledge speech:
speech-controlled robots comprehend and carry out the required actions using hundreds
of speech instructions.
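For instance, once an utterance has been transcribed to text, mapping it to a robot action can be as simple as keyword matching. The sketch below is purely illustrative — the command names and phrase lists are hypothetical, not the project's actual vocabulary:

```python
# Hypothetical sketch: mapping a transcribed phrase to a robot command.
# The command names and keyword lists below are illustrative only.

COMMANDS = {
    "forward": ["go forward", "move forward", "ahead"],
    "stop": ["stop", "halt", "stand still"],
    "grab": ["pick up", "grab", "take the object"],
    "follow_line": ["follow the line", "line follow"],
}

def parse_command(transcript):
    """Return the first command whose keyword appears in the transcript."""
    text = transcript.lower()
    for command, phrases in COMMANDS.items():
        if any(phrase in text for phrase in phrases):
            return command
    return None  # unrecognized utterance
```

A real system would replace the substring match with the speech engine's own grammar or a trained classifier, but the lookup structure stays the same.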
1.2 Problem definition:

Managing a work field's tasks and objects with versatility and speed is needed to
organize the environment for better production, using the simplest means of control to
get the job done.

1.3 Aims and objectives:

The aim is to design, implement, and test a versatile mobile robot capable of
navigating and performing several tasks by recognizing human spoken commands, so
that the job gets done smoothly and with more creativity. This can be achieved through
the following:

 To examine previous work done on the problem under consideration.
 To summarize key findings from the reviewed work.
 To determine the desired tasks and design the mobile robot's mechanical system
to serve those requirements.
 To design the circuitry of sensors and actuators that work together toward the
targeted goal, then connect them in the real build.
 To develop an algorithm or AI model that implements voice recognition to
receive commands and perform the different functions programmed for it.
 To carry out extensive field tests and gather feedback in order to enhance the
accuracy of the performance.
Chapter II
Literature Background
2.1. Preface
The references include details about the mechanical design of the robot, such as the
chassis and motor mounting brackets, all created using SolidWorks software. In the
field of robotics, mechanical design is a well-established area of research. Robot chassis
design, in particular, is a critical aspect as it directly affects the robot's structural
integrity and overall performance. Computer-aided design (CAD) tools like SolidWorks
are commonly employed for precision in mechanical design.

2.2 Manipulator design


For manipulator mechanical design, the design of robotic manipulators, particularly for
voice-controlled applications, has been an area of considerable interest and innovation
in recent years. Kabir et al. (2011) provided valuable insights into the development of
a voice-controlled robotic arm and its manipulator design, exploring some key themes
and trends in manipulator design for such systems.
Fig. 2.1: Lynxmotion Lynx 6 (Kabir et al., 2011)

According to Y. Choi et al. (2008), the gripper has force-torque sensors, which
improves its ability to precisely grasp and manipulate objects, and the manipulator is
mounted on a vertical lift to extend its vertical reach. Additionally, it locates and
engages with objects in its surroundings using its sensors and cameras. It advances in
the direction of a predetermined 3D point, locates surrounding objects with the laser
range finder, and then releases its gripper to manipulate it. Should it be unable to
successfully grasp an object the first time, it will try up to four times before giving up
(Y. Choi et al., 2008).
Deciding how many degrees of freedom (DOF) are needed for a given application is a
critical part of manipulator design. The authors utilize a five-degree-of-freedom small
robotic arm. Finding a DOF that balances precision and flexibility has become popular
in manipulator design.
The choice of actuators is another aspect of the manipulator's design. Because of their
high torque-to-weight ratio and precise control via pulse-width modulation (PWM),
servo motors are widely used. The servos in the reference are HS-422 and HS81. These
motors are frequently used by manipulators because of their small size and PWM
control compatibility.
Inverse kinematics is often used in manipulator design for voice-controlled systems to
determine the joint angles needed to achieve a given end-effector position and
orientation. Kabir et al. (2011) calculate the joint angles using an inverse kinematics
method based on desired locations.
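The cited arms have five DOF, but the geometric idea is easiest to see in a planar two-link case. The following is a generic textbook formulation, not the authors' code:

```python
import math

def ik_2link(x, y, l1, l2):
    """Geometric inverse kinematics for a planar 2-link arm with link
    lengths l1, l2. Returns (theta1, theta2) in radians for one elbow
    configuration, or None if (x, y) is out of reach."""
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle.
    c2 = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        return None  # target outside the workspace
    theta2 = math.acos(c2)
    # Shoulder angle: direction to the target minus the wrist offset.
    k1 = l1 + l2 * math.cos(theta2)
    k2 = l2 * math.sin(theta2)
    theta1 = math.atan2(y, x) - math.atan2(k2, k1)
    return theta1, theta2
```

Extending this to 5 DOF adds the base rotation and wrist angles on top of the same law-of-cosines core, which is exactly what the DH-based GUI tools discussed below automate.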
When it comes to comprehending how robotic arms move, kinematics analysis is
essential because it sheds light on the connections between joints, links, and the location
and orientation of the end-effector. The paper by Tahseen F. Abaas et al. (2020) that
describes the kinematics analysis of a 5-DOF robotic arm employing forward and
inverse kinematics is the main subject of this review. A MATLAB GUI is used
to illustrate how they integrated the geometric approach and the DH method for both
forward and inverse kinematics analysis.

Fig. 2.2. The coordinate frame of the robotic arm (Tahseen F. Abaas et al. (2020))
2.3 Motor selection for robotics
Saleh Ahmed et al. (2021) highlight the fundamental significance of appropriate motor
sizing in robot design. Various studies emphasize the need to calculate the required
motor torque and speed to achieve the desired robot performance. It is recognized that
insufficient motor sizing can result in diminished efficiency and stability, and
increased costs in robot platforms.

DC motors are commonly utilized due to their controllability and suitability for
different applications in robotics. The authors discussed the selection of a particular
DC motor (GM25-370CA) based on the calculated motor torque and rotational speed
requirements. Researchers often consider factors such as motor type, efficiency, power
output, and compatibility with the robot's intended tasks.

Furthermore, as outlined by Ali Abed et al. (2015), the two servo motors used within
the robot's chassis are essential for controlling its movement. The L298 motor driver, a
dual H-bridge driver, plays a crucial part in regulating the speed and direction of these
motors. Research on motor drivers in robotic applications underscores their
significance in motor control, providing a reliable means of translating electrical
signals into mechanical motion (User's Guide, 2012); in our case, however, we would
prefer a microcontroller board with the motor driver embedded on it.
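The torque and speed calculation described above can be approximated as follows. The friction coefficient and geometry values are illustrative assumptions, not figures from the cited paper:

```python
import math

def required_wheel_torque(mass_kg, accel_ms2, wheel_radius_m,
                          n_wheels=4, friction_coeff=0.015, g=9.81):
    """Rough per-wheel torque (N*m): force to accelerate the robot plus
    rolling resistance, shared across the driven wheels. The rolling
    resistance coefficient is an assumed typical value."""
    force = mass_kg * (accel_ms2 + friction_coeff * g)
    return force * wheel_radius_m / n_wheels

def wheel_rpm(speed_ms, wheel_radius_m):
    """Wheel rotational speed (rpm) needed for a given linear speed."""
    return 60.0 * speed_ms / (2.0 * math.pi * wheel_radius_m)
```

Comparing these numbers (with a safety margin) against a candidate motor's stall torque and rated speed is the sizing check the cited studies recommend.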

2.4 Sensor integration


A key component of the system's operation is the integration of sensors. To aid in
navigation and obstacle avoidance, it integrates tactile, proximity, and infrared optical
path sensors. In order to make sure the robot precisely follows a predetermined path; the
optical track sensing device depends on the absorption and reflection of infrared
radiation. At the same time, proximity sensors give the system intelligence by allowing
for maneuverability in unstructured environments and providing early warning of
obstacles. Together, these sensors improve the system's capacity for efficient navigation
and interaction with its environment (Ms. Damodar Patil et al., 2014).
Fig. 2.3. Architecture of sensors and actuators (Daniel Jeswin et al., 2018).
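As a simple illustration of how such readings might be fused into a drive decision — the thresholds and action names below are hypothetical, not values from the cited papers:

```python
def navigation_action(distance_cm, line_left, line_right,
                      stop_cm=15.0, slow_cm=40.0):
    """Combine an ultrasonic range reading with two IR line sensors into
    a drive action. Thresholds are illustrative assumptions."""
    if distance_cm < stop_cm:
        return "avoid_obstacle"   # too close: stop and maneuver around
    if line_left and not line_right:
        return "turn_left"        # line drifting left under the robot
    if line_right and not line_left:
        return "turn_right"
    if distance_cm < slow_cm:
        return "forward_slow"     # obstacle ahead but not yet critical
    return "forward"
```

The priority ordering (obstacle first, line correction second) reflects the early-warning role the proximity sensors play in the cited architecture.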

2.5 Microcontroller and motor driver

As highlighted by Sarada et al. (2023), the Arduino ATmega328/P microcontroller
serves as the central processing unit of the voice-controlled robot for its flexibility
and ease of use in robotics applications. Researchers often leverage Arduino for its
extensive I/O capabilities, programmability, and compatibility with various sensors and
modules.
Additionally, the integration of hardware components in robotics is a fundamental
aspect of system design. The reference highlights the use of the Raspberry Pi 3 as the
core hardware component for controlling the robotic arm. The Raspberry Pi's small size,
cost-effectiveness, and processing capabilities have made it a popular choice in the field
of robotics (Daniel et al., 2018). While the authors focus on this specific
implementation, there are numerous other examples of Raspberry Pi-based robotic
systems, such as object recognition and tracking, showcasing the platform's versatility.
Fig 2.4. Scheme for motor drivers (block diagram: power supply, RF receiver,
microcontroller, Raspberry Pi, wheel motor driver with wheel motors, and arm driver
with gripper motors)

Efficient control algorithms are vital for ensuring precise and accurate robotic arm
movements. It was also mentioned that the controller module issues low-level
instructions to control the arm's servo motors, and various control strategies are
employed in robotic systems (Daniel et al., 2018).

Raspberry pi was also used by Altayeb & Al-Ghraiah (2022). The authors explained
that employing Raspberry Pi in a voice-controlled robot presents a multitude of
advantages, revolutionizing the way robots interact with their environments. The
compact yet powerful nature of Raspberry Pi allows for a seamless integration of
sophisticated voice recognition and processing capabilities. Its low-cost, open-source
architecture makes it an accessible platform for developers and hobbyists alike,
fostering innovation and customization in the realm of robotics. Raspberry Pi's GPIO
(General Purpose Input/Output) pins enable the connection of various sensors and
actuators, enhancing the robot's responsiveness to voice commands.

2.6 Control algorithms


According to S. Maksymova et al. (2017), speech recognition plays a key role in
controlling robots and is regarded as a useful tool for boosting human-robot interaction
and the usability of robotic applications. The authors also demonstrated how users can
give vague instructions more naturally when they use fuzzy commands. Fuzzy
commands and other natural language interaction with robots have drawn attention
because they facilitate easier and more natural human-robot communication.
Additionally, research and industry have widely used Microsoft Speech Engine (ASR)
technology to develop voice-controlled applications (S. Maksymova et al., 2017).

Fig. 2.5. Suggested overall software system (Ali Abed et al., 2015)

As cited in the above reference, Kai-Wen Lo (2010) conducted a study in which the
author assesses the design and algorithms of a hybrid navigation system for
autonomous robots. For dependable navigation and obstacle avoidance, the system
combines model-based and behaviour-based navigation techniques. The modular design
of the system, the particular algorithms used, and their practical implementations in
real-world scenarios are all covered in the discussion.
To efficiently navigate complex settings, hybrid navigation systems combine the
advantages of behaviour-based and model-based techniques. Model-based navigation
uses environmental models to plan routes, but behaviour-based navigation responds to
changes in the environment instantly.
Fig. 2.6. Proposed architecture of navigation system (Kai-Wen Lo, 2010)

Using ANNs is another strategy that we will most likely use in our project. Artificial
Neural Networks (ANNs) are a supervised machine learning algorithm frequently used
to recognize patterns in signals and images. In their paper, Rendyansyah et al. (2022)
described how ANNs are made up of an input layer, a hidden layer, and an output
layer. During training, the network is adjusted over repeated comparisons with the
training data until the machine learning outcomes match the intended output.

Fig. 2.7. Multilayer perceptron ANN topology (Rendyansyah et al., 2022)
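A minimal NumPy sketch of the forward pass through such a multilayer perceptron follows. The layer sizes and random weights are illustrative; a real network would be trained first, which is the weight-adjustment loop described above:

```python
import numpy as np

def mlp_forward(x, w1, b1, w2, b2):
    """One forward pass of a single-hidden-layer perceptron:
    sigmoid hidden layer followed by a sigmoid output layer."""
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    hidden = sigmoid(x @ w1 + b1)
    return sigmoid(hidden @ w2 + b2)

# Untrained example: 3 inputs -> 4 hidden units -> 2 outputs.
rng = np.random.default_rng(0)
w1, b1 = rng.normal(size=(3, 4)), np.zeros(4)
w2, b2 = rng.normal(size=(4, 2)), np.zeros(2)
y = mlp_forward(np.array([0.5, -0.2, 0.8]), w1, b1, w2, b2)
```

For voice-command classification, the inputs would be audio features and each output unit would score one command class.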


Two techniques, QR-code localization and radio-frequency identification (RFID), have been used in projects concerning indoor robot mapping and localization. In Amazon warehouses, the QR-code localization method is used to transport goods from trucks and laborers to stock, where they are stored. The robot maps and localizes itself using a QR-code scanner and stickers placed on the ground: each code encodes the robot's location, allowing it to determine where to travel next. The RFID approach uses the same technique, except that the robot is outfitted with an RFID scanner to read data from RFID tags positioned on the ground.

Fig. 2.8. Amazon warehouses mobile robot using QR code. (businessinsider.in)


Chapter III
Mechanical Design
3.1 Preface
This chapter gives a complete description of the mechanical design, starting with material selection and the criteria for choosing a suitable material, followed by the design and specifications of the 3D model given the tasks the robot is to carry out. The dimensions of the chassis are also described below, along with a clear description of the modelling of a mecanum-wheeled mobile robot, how the wheels work, and their means of control. A description of the extinguishing system used, its specifications, and how it works is provided as well, together with the modelling of the robotic arm and the forward and inverse kinematics calculations that facilitate programming and using the arm.

3.2 Material Selection


As previously discussed, the robot needs to operate in an industrial environment, so the chassis must withstand those conditions. 6 mm-thick Finnish plywood sheets of grade B(I) are used as the chassis material due to their characteristics. Standard Finnish plywood is classified according to the grades of its face veneers, which comply with Standard EN 635; these grade categories are based on the recommendations of the International Organization for Standardization (ISO 2426).

Fig. 3.1 Grades of Finnish Plywood (Build4less, 2020)


In grade B(I), pin knots are permitted; other knots and holes are allowed up to 6 mm in diameter, provided their cumulative diameter does not exceed 12 mm per m². Closed splits and checks are permitted up to an individual length of 100 mm, with only one per metre of width. Other defects are strictly limited.

In addition to strength, modulus of elasticity, and shear modulus, the density and section properties are needed as input values in the design process. These properties have been determined for Finnish plywood by VTT (Technical Research Centre of Finland) in cooperation with the plywood producers.

Table 3.1 Specifications of birch plywood with 6 mm thickness

                        CHARACTERISTIC STRENGTH                      MEAN MODULUS OF ELASTICITY
GRADE                   Bending     Compression   Tension            Bending       Tension and Compression
B(I), 6 MM THICK        29 N/mm²    22.8 N/mm²    42.2 N/mm²         4763 N/mm²    9844 N/mm²

3.3 Design and Specifications

The chassis is designed to withstand the weight of the object carried by the robotic arm as well as the weight of the components. The chassis consists of two main parts, an upper part and a lower part. The base part holds the electrical components as well as the motors from the bottom. The top part of the chassis is supported on the bottom part by side walls fastened together with a press-fit technique in addition to screws and nuts, and it is also supported by a column fastened to the base under the centre of the robotic arm base for extra strength. The figure shows the 3D model of the robot. The dimensions of the 3D model are 500x300x60 mm, excluding the length of the robotic arm when fully extended, which is described in detail in the robotic arm section.

Fig. 3.2 3D model of robot

The parts of the chassis are cut and machined from the 6 mm plywood sheets to give the design of each part provided in the appendices. The following table summarizes the specifications of the chassis.

Table 3.2 Specifications of mechanical design

PARAMETER    VALUE                 UNIT
MATERIAL     Birch B(I) plywood    -
DENSITY      680                   kg/m³
MASS         445                   g
VOLUME       654411.7              mm³

3.4 Stress analysis

In the analysis stage, ASTM A240 material properties were used with an applied force of 500 N, which is roughly twice the actual force the robot will be exposed to. The body is subjected to different forces with different orientations: the weight of the body and components acting vertically downwards on the chassis, and the resolution of the force of the lifted object along the axes of the robotic arm. Each part is analyzed separately.
Fig. 3.3. Stress analysis of the top part

Fig. 3.4. Analysis of the chassis base

Fig. 3.5. Analysis of robotic arm base


Fig. 3.6. Analysis of link 1

Fig. 3.7. Analysis of link 2

Fig. 3.8. Analysis of link 3

3.5 Mecanum wheels modelling

Wheeled mobile robots (WMR) increasingly use this new type of wheel, the so-called omnidirectional mecanum wheel. Each wheel consists of a hub and a number of rollers installed on the circumference of the hub. The number of rollers varies depending on the design; all are free to rotate around their own axes, which make an angle α = 45° with the axis of rotation of the wheel itself.
Fig. 3.9. Model of a mecanum wheel with graphically marked angle α
(Hendzel & Rykała, 2017)

The wheeled mobile robot consists of a chassis, four mecanum wheels and four
independently controlled direct-current motors. The mecanum wheels are in turn
rigidly placed on the gearboxes fixed to the shafts of the motors. The 3D model of
the mentioned WMR is presented in the figure below.

Fig. 3.10. Chassis and wheels 3D model

3.5.1 Kinematics of mecanum wheels.


During movement of the WMR, the wheels rotate with angular velocity φ̇ᵢ, where i denotes the number of the wheel (i = 1, 2, 3, 4). The radii of the wheels (R) and rollers (r) are constant values, equal for each of the four wheels. Further assumptions regarding the WMR are presented graphically in Fig. 3.11. It has also been assumed that the angle of rotation of the WMR platform around the characteristic point S is the angle β. The width of the platform is 2s_y, and the distances between point S and the midpoints of the front and back axles (points A1, A2) are in both cases equal to s_x. Furthermore, we assume that all the mecanum wheels move on a level base without skidding. A WMR with mecanum wheels can be described in a coordinate system x_p y_p z_p (Fig. 3.11) attached to the mass centre of the WMR platform, i.e., the characteristic point S.

Fig. 3.11. Analytical model of WMR with mecanum wheels (Hendzel & Rykała, 2017)

The kinematics of the WMR in the case under consideration can also be presented in the form of the following equations:

v_Sxp − v_Syp − β̇(s_x + s_y) − φ̇₁(R + r) = 0
v_Sxp + v_Syp + β̇(s_x + s_y) − φ̇₂(R + r) = 0
v_Sxp + v_Syp − β̇(s_x + s_y) − φ̇₃(R + r) = 0
v_Sxp − v_Syp + β̇(s_x + s_y) − φ̇₄(R + r) = 0

where: v_Sxp is the projection of the velocity of point S on the x_p axis, and v_Syp is the projection of the velocity of point S on the y_p axis.


3.6 Robotic arm modelling

As previously mentioned, the robotic arm is designed as a 5-DoF arm with a two-finger gripper as the end effector, able to grab small to medium objects and to provide flexibility and good end-effector orientation. Each link has a mounting place for a servo motor connected to the main controller, and the arm is 3D printed from durable plastic, which serves the purpose.

Fig. 3.12. 3D model of arm design


The dimensions shown in the next figures are important for the modelling that follows; note that all dimensions are in mm. The base carries the rest of the arm and is fixed on top of the car chassis so that the arm can extend to the targeted object; it also has a slot for the waist servo motor, which controls and rotates it through 360 degrees.
Fig. 3.13. Arm base drawing in SolidWorks

The waist carries the first arm link; it is installed on the base with a 1 mm clearance so it can rotate freely. Note that it also has a slot, behind the vertical hole, for the servo that is attached to the arm to control it.

Fig. 3.14. Waist drawing in SolidWorks

Two identical arm links are used to cover a good gripping distance. Each would be able to rotate 180 degrees freely if not for the collisions that can occur due to the arm's position on the car.
Fig. 3.15. Link drawing in SolidWorks

Now for the end effector, the gripper. Two small gears mesh with each other, one of them driven by a small servo attached to the gripper base. Each gear holds a gripping finger, allowing the fingers to open and close through 90 degrees.

Fig. 3.16. Gripper drawing in SolidWorks

Furthermore, kinematics analysis is an essential concept in robotics. Forward and inverse kinematics serve distinct purposes in understanding and controlling the movement of robotic systems, and both are needed for designing, programming, and controlling robots so that they can perform a wide range of tasks in diverse applications.

The next section is for reviewing kinematics analysis modelling, using both standard
equations and MATLAB software to illustrate it more.

The Denavit-Hartenberg (DH) parameters method is one of the most widely used methods in forward kinematics; it represents the relationship of the joint coordinates between two links, as shown in the next table.
Table 3.3. DH parameters of kinematics

LINK    Øi (degree)    di (mm)    ai (mm)    αi (degree)
1       q1             96         0          90
2       q2             0          120        0
3       q3             0          90         0
4       q4             0          25         0
5       q5             36         0          90

Where: ai: link length, αi: link twist, di: link offset, Øi: joint angle.

Fig. 3.17. Representation model of robotic arm in MATLAB software using Robotics
Toolbox

By using transformation matrices, these parameters may be utilized to characterize the relative location and orientation of the links.

For the Denavit-Hartenberg parameters of the 5-DOF manipulator, where Ci = cos(qi), Si = sin(qi), Cαi = cos(αi), Sαi = sin(αi):

Ti = [ Ci    −Cαi·Si    Sαi·Si    ai·Ci
       Si     Cαi·Ci   −Sαi·Ci    ai·Si
       0      Sαi       Cαi       di
       0      0         0         1 ]

T1 = [ C1   0  −S1   0          T2 = [ C2  −S2   0   a2·C2
       S1   0   C1   0                 S2   C2   0   a2·S2
       0   −1   0    d1                0    0    1   0
       0    0   0    1 ]               0    0    0   1 ]

T3 = [ C3  −S3   0   a3·C3      T4 = [ C4   0   S4   0
       S3   C3   0   a3·S3             S4   0  −C4   0
       0    0    1   0                 0    1   0    0
       0    0    0   1 ]               0    0   0    1 ]

T5 = [ C5  −S5   0   0
       S5   C5   0   0
       0    0    1   d5
       0    0    0   1 ]

From all the previous transformation matrices, the end-effector transformation matrix Te = T1·T2·T3·T4·T5 =

[ C1·C234·C5 + S1·S5   −C1·C234·S5 + S1·C5   −C1·S234   C1·(d5·S234 + a3·C23 + a2·C2)
  S1·C234·C5 − C1·S5   −S1·C234·S5 − C1·C5   −S1·S234   S1·(d5·S234 + a3·C23 + a2·C2)
  −S234·C5              S234·S5               −C234      d1 − a2·S2 − a3·S23 − d5·C234
  0                     0                      0          1 ]
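The chained multiplication can be checked numerically. The sketch below builds each link transform from the standard DH rule and multiplies them; the DH values are taken from the table above (lengths in mm), and the all-zero joint test pose is an arbitrary illustrative configuration whose sign conventions should be checked against the physical arm:

```python
import math

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg link transform as a 4x4 nested list."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [[ct, -ca * st,  sa * st, a * ct],
            [st,  ca * ct, -sa * ct, a * st],
            [0.0,      sa,       ca,      d],
            [0.0,     0.0,      0.0,    1.0]]

def matmul(A, B):
    # Plain 4x4 matrix product, enough for chaining link transforms
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def forward_kinematics(q, dh_rows):
    T = [[float(i == j) for j in range(4)] for i in range(4)]  # identity
    for qi, (d, a, alpha) in zip(q, dh_rows):
        T = matmul(T, dh_transform(qi, d, a, alpha))
    return T

# DH rows (d, a, alpha) from the table above, lengths in mm
DH = [(96, 0, math.pi / 2), (0, 120, 0), (0, 90, 0),
      (0, 25, 0), (36, 0, math.pi / 2)]
# All joints at zero: links a2 + a3 + a4 stretch along the base x-axis
T_e = forward_kinematics([0, 0, 0, 0, 0], DH)
```

The last column of T_e gives the end-effector position; with all joints at zero it reaches x = 120 + 90 + 25 = 235 mm, offset by the base height d1 and the tool length d5.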

Fig. 3.18. Representation model of robotic arm in ROS Moveit software


Moreover, inverse kinematics deals with the opposite problem: it involves determining the joint angles or joint variables required to place the end-effector at a specific position and orientation in Cartesian space. It establishes a relationship between the desired end-effector pose and the corresponding joint configurations. Inverse kinematics is crucial for controlling the robot to reach specific positions and orientations in its workspace.

The solution to the inverse kinematics problem reduces to a search for the arguments q1, q2, q3, q4, q5 based on the gripper's position and orientation. These six equations in five unknowns may have no solution; but taking the 4th-column elements of the Te matrix as P5x, P5y, P5z respectively, the solution can be written as follows:

(where a = d1 − P5z − d5·C234 and b = P5x·C1 + P5y·S1 − d5·S234)

q1 = tan⁻¹(P5y / P5x)

Note that for this solution P5x > 0 and (−π/2) < q1 < (π/2).

q2 = atan2(a·(a2 + a3·C3) − b·a3·S3, a·a3·S3 + b·(a2 + a3·C3))

q3 = arccos((a² + b² − a2² − a3²) / (2·a2·a3))

q4 = q234 − q2 − q3

q5 = C234·q1 − atan2(x5y, x5x)
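A position-only sketch of this solution is given below. It handles the waist angle and the planar two-link part (q1 to q3) for the elbow-down branch, using the link lengths from the DH table; the wrist angles q4 and q5, which depend on the desired orientation, are omitted:

```python
import math

def inverse_kinematics(px, py, pz, a2=120.0, a3=90.0, d1=96.0):
    """Joint angles q1..q3 that place the wrist centre at (px, py, pz).

    Planar two-link solution after the waist rotation q1; elbow-down
    branch. Lengths in mm, matching the DH table above.
    """
    q1 = math.atan2(py, px)
    # Project the target into the arm's vertical plane
    r = math.hypot(px, py)
    z = pz - d1
    c3 = (r * r + z * z - a2 * a2 - a3 * a3) / (2 * a2 * a3)
    if abs(c3) > 1:
        raise ValueError("target out of reach")
    q3 = math.acos(c3)  # elbow angle from the law of cosines
    q2 = math.atan2(z, r) - math.atan2(a3 * math.sin(q3),
                                       a2 + a3 * math.cos(q3))
    return q1, q2, q3

# Round-trip check: a reachable point straight ahead of the base
q1, q2, q3 = inverse_kinematics(150.0, 0.0, 150.0)
```

Feeding the resulting angles back through the planar forward equations should reproduce the commanded point, which is a useful self-test before driving the servos.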

By representing all this in MATLAB using the Graphical User Interface (GUI) tool, the program's input, as demonstrated, was the location of the robotic arm's end-effector; its outputs were the angles and positions of the arm's joints. Additionally, a 3D simulation of the robotic arm's motion was displayed in the GUI.
Fig. 3.19. The GUI program calculates forward and inverse kinematics variables.

Chapter IV
Electric Circuit Design
4.1 Motors

4.1.1 DC Motor
Motor sizing is an important procedure for finding the optimal motors for the required tasks.

For the wheels, which carry the whole mobile robot, a silver metal-geared DC motor does the job, as the metal gears in its construction enhance the motor's durability, strength, and resistance to wear and tear.

Fig. 4.1. DC geared motor

There are several versions with different specifications; the one used here operates on 6 V, has a torque of 3.2 kg·cm, and a rotation speed of up to 133 RPM.

Metal gearing is more robust and wear-resistant than plastic gearing and has higher strength and torque capability, allowing the motor to handle more substantial loads and offer better precision and reliability, not to mention stable and smooth rotation.

4.1.2 MG995 Servo Motor


The robotic arm has 3 MG995 metal-gear servo motors. This is a popular and widely used servo motor in the hobbyist and robotics community. It is a standard-sized servo, meaning it has a form factor and mounting pattern that are common in the industry. It provides the following features:
• High-speed rotation for quick response
• Operating voltage range: 4.8 V to 7.2 V
• Stall torque: 9.4 kg·cm (4.8 V); 11 kg·cm (6 V)
• Rotational range: 180°
• Current draw at idle: 10 mA
• Current at maximum load: 1200 mA

Fig. 4.2. MG995 servo motor

So it is concluded that the motor is suitable for lifting the arm links and the load they should carry, making it functional relative to the unit size.
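Hobby servos like the MG995 and SG90 are positioned by pulse width on a 50 Hz PWM signal. A sketch of the conventional angle-to-duty-cycle mapping follows; the 1-2 ms pulse range over a 20 ms period is the usual convention, and individual units often need trimming:

```python
def servo_duty_cycle(angle_deg, min_pulse_ms=1.0, max_pulse_ms=2.0,
                     period_ms=20.0):
    """Duty cycle (%) that positions a 180-degree hobby servo.

    0 deg maps to min_pulse_ms and 180 deg to max_pulse_ms on a
    50 Hz (20 ms period) signal.
    """
    angle_deg = max(0.0, min(180.0, angle_deg))  # clamp to travel range
    pulse = min_pulse_ms + (max_pulse_ms - min_pulse_ms) * angle_deg / 180.0
    return pulse / period_ms * 100.0

mid = servo_duty_cycle(90)  # centre position
```

Centring each servo this way before mounting the links makes the zero of the servo coincide with the middle of each joint's range of motion.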

4.1.3 SG90 Servo motor


Another commonly used servo motor is the SG90, two of which are used in the arm: one for link 3, which spins the end effector 180 degrees, and the other for the gears of the end effector's gripper. It provides these features:

 Voltage: +5 V on average
 Torque: 2.5 kg·cm
 Operating speed: 0.1 s/60°
 Gear type: plastic
 Rotation: 0°-180°
 Weight of motor: 9 g

Fig. 4.3. Tower Pro SG90 servo motor


It is therefore a good fit for low loads and for controlling small parts at low voltage and low cost.

4.2 Sensor Fusion


The sensors in this project are vital: the robot continuously reads their feedback to adjust its direction of operation. The main sensors used are described below.

4.2.1 HC-SR04 Ultrasonic Sensor


Many mobile robots (including this one) use the HC-SR04 ultrasonic sensor to measure an object's distance via sonar. With an accuracy of 0.3 cm (0.1 in), this sensor can read from 2 cm to 400 cm (0.8 in to 157 in), which is sufficient for most hobbyist projects. The module includes an ultrasonic transmitter and receiver and operates at an ultrasonic frequency of around 40 kHz.

Fig. 4.4. HC-SR04 Ultrasonic Sensor
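The sensor reports distance indirectly: the controller times how long the echo pin stays high and converts that time of flight into centimetres. A sketch of the conversion, assuming a speed of sound of 343 m/s:

```python
def echo_to_distance_cm(echo_high_s, speed_of_sound_cm_s=34300.0):
    """Distance in cm from the echo pulse width of an HC-SR04.

    echo_high_s: time the echo pin stayed high, in seconds.
    The pulse covers the round trip, so the result is halved.
    """
    return echo_high_s * speed_of_sound_cm_s / 2.0

# A 1 ms echo pulse corresponds to roughly 17 cm
d = echo_to_distance_cm(0.001)
```

On the actual robot, the microcontroller produces the trigger pulse and measures the echo width; only the arithmetic is shown here.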

4.2.2 IR Sensor
This infrared obstacle/object detecting sensor is quite simple to utilize. A potentiometer is included to adjust the sensitivity. The output is a digital signal, so it is easy to interface with any microcontroller such as the Maker UNO, Mega, or even the Raspberry Pi. By using infrared reflection, this sensor provides quick, easy, non-contact obstacle detection. Because it relies on light reflection, different surfaces yield varied detection results, and any other infrared source can disturb the detection.
Fig. 4.5. IR sensor

At the front of the module are an infrared emitter and receiver pair. When an object comes into range, it reflects the emitted light back; the reflection is picked up by the receiver and passed through an on-board comparator circuit. The output pin goes to logic LOW when the reflected signal crosses the set threshold, and the green LED illuminates to signify detection. The sensitivity and detection range can be increased by rotating the onboard potentiometer clockwise. The module is compatible with both 3.3 V and 5 V power input, making it a good fit for obstacle avoidance.

4.2.3 Line Tracker Sensor


For guiding purposes, a line tracker sensor is used so that the robot is able to find its way toward the goal. This is a high-performance line finder designed for line-following robots. It consists of two parts: an IR-emitting LED and an IR-sensitive phototransistor. It outputs a digital signal to a microcontroller so the robot can reliably follow a black line on a white background, or vice versa. The sensor is Arduino compatible.

Further, it is small in size, operates on a 5 V DC power supply, and has an indicator LED, digital outputs, and a detection distance of up to 3 cm.
Fig. 4.6. Arduino compatible line tracker sensor

4.3 Modules

4.3.1 Arduino Mega2560


A module is required to control the main sensors and actuators and to receive commands from the main controller.

Hence comes the role of the Arduino Mega2560 microcontroller board, which is assigned here to control the DC motors, sensors, and servos. The main reason for this choice is its large number of digital general-purpose input/output pins and analog pins, which are needed for the many motors.
Fig. 4.7. Arduino Mega2560

The ATmega2560 microcontroller is in charge of this board. It has 54 digital input/output pins (of which 15 can be used as PWM outputs), 16 analog inputs, and 4 hardware serial ports (UARTs). Additionally, it has an ICSP header, a power jack, a USB port, a 16 MHz crystal oscillator, and a reset button. The board contains most of what is needed to support the microcontroller; it can be powered by connecting it to a PC via a USB cable, or by using an AC-DC adapter or a battery. By adding a base plate, the board can be shielded from an unplanned electrical discharge.

4.3.2 L298N Motor Driver Module


Motor driver boards are electronic modules designed to control the movement of motors in various applications. They are commonly used in robotics, automation, and other projects involving motorized components, and each is designed to drive specific types of motors, such as DC motors, stepper motors, or servo motors.

The L298N motor driver module is a high-power driver module for DC and stepper motors. It consists of an L298 motor driver IC and a 78M05 5 V regulator. The L298N module can control up to four DC motors, or two DC motors with direction and speed control.
Fig. 4.8. L298N Motor Driver Module

It operates on a 5-35 V motor supply at up to 2 A per channel, with a maximum power of 25 W; each motor requires two direction input pins, two output pins, and an enable pin for PWM speed control.

In addition, it is provided with a small heat sink and overtemperature protection. Since each driver is attached to only two DC motors, the robot requires two drivers.
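Each channel of the driver is steered by its two direction inputs and one PWM enable pin. A sketch of the usual mapping from a signed speed command to those three signals, following the common IN1/IN2/EN wiring (pin names here are the conventional ones, to be matched to the actual wiring diagram):

```python
def l298n_channel(speed):
    """Map a signed speed in [-1.0, 1.0] to (IN1, IN2, duty%) for one channel.

    IN1=1, IN2=0 drives forward; swapped levels drive in reverse;
    both inputs low lets the motor coast.
    """
    speed = max(-1.0, min(1.0, speed))  # clamp the command
    if speed > 0:
        return 1, 0, speed * 100.0
    if speed < 0:
        return 0, 1, -speed * 100.0
    return 0, 0, 0.0

fwd = l298n_channel(0.5)  # half speed forward
```

The same mapping is applied per wheel, with the duty cycles coming from the mecanum kinematics.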

4.3.3 Raspberry Pi 4
Now for the mind of the whole system, which coordinates all actions: the main controller. The one used here is the Raspberry Pi 4 single-board computer, which provides the following features that make it a good candidate for this project.

Processor: Broadcom BCM2711 quad-core ARM Cortex-A72 processor running at 1.5 GHz.

With this overall performance, you can work on a Raspberry Pi much as you would on a standard desktop computer. Naturally, it depends on the operating system and software being used, but for this small single-board computer it represents a major power boost overall. The Raspberry Pi 4's sole disadvantage is that it gets much warmer than its predecessors.

For RAM, it offers options of 2 GB, 4 GB, or 8 GB of LPDDR4-3200 SDRAM.
As for connectivity, it offers two USB 3.0 ports and two USB 2.0 ports, plus Gigabit Ethernet and Bluetooth. While storage is still handled by an SD card, the team added USB 3.0 to make hooking up SSDs faster.

Fig. 4.9. Raspberry Pi 4 microcontroller

The Raspberry Pi 4 B offers 802.11ac Wi-Fi, the same as the Raspberry Pi 3 B+, but sports Bluetooth 5.0. To improve PCB routing, the Raspberry Pi team moved the Gigabit Ethernet jack from the bottom right of the board to the top right; the board is also compatible with the Power over Ethernet HAT.

In addition, it has a 40-pin GPIO header for connecting various devices and peripherals, and a USB-C power input with support for 5 V/3 A.

4.3.4. Voice Detection Module


A device is required to capture incoming voice-command keywords in order to trigger the functions programmed in the main controller.

Hence an external audio card is added to the system. The one below fits the requirement: it is an external 7.1-channel sound card connected via USB Type-A, with a stereo 3.5 mm jack and a mono microphone input. It is compatible with the USB 2.0 full-speed specification (12 Mbps) and is powered from the USB port by default, so it does not require an external power adapter.

It also supports common operating systems such as Windows, macOS, and Linux, which will be used and explained in the next few sections.
Fig. 4.10. External USB 7.1 audio card

Paired with the audio card is a wireless microphone to transfer voice data to it; for this purpose the K35 microphone is used.

Fig. 4.11. K35 Wireless microphone

Its main feature is a 3.5 mm jack connector, which plugs into the audio card and is recognized directly. Other features, apart from being portable and wireless, are worth mentioning: noise cancellation, an operating frequency of 2.40 GHz, a transmission distance (without walls) of up to 20 m, a signal-to-noise ratio of 65 dB, and a sensitivity of about 42 dB ± 2 dB.
4.3.5 Logic Level Converter
This chip solves the problem of connecting and sending data between 5 V logic devices, such as the Arduino, and 3.3 V logic devices, such as the Raspberry Pi. It sits between the two sides and shifts 5 V signals down to 3.3 V levels that can be safely fed to the 3.3 V device. The 74LVC245 can be used with digital signals and works well with SPI, serial, parallel bus, and other logic interfaces.

Fig.4.12. Logic level converter

Looking at the IC schematic, we find that the bi-directional logic level converter is actually a very simple device. There is basically one level-shifting circuit on the board, repeated four times to create four level-shifting channels. Each circuit uses a single N-channel MOSFET and a couple of pull-up resistors to realize bi-directional level shifting.

Fig.4.13. Logic level converter scheme (Bi-Directional Logic Level Converter Hookup
Guide - SparkFun Learn, n.d.)
4.4 Wiring Diagram
For this part of the electric design, the main goal is to handle the power in the most efficient way. Not all components operate at the same voltage, nor do they draw the same current, so an efficient wiring diagram had to be designed. Starting with the power, the main components are the batteries, the Arduino, the DC motors, and the motor drivers; these are the essential components to make the wheeled mobile robot function.

Four batteries rated 3.7 V each were used, connected to the motor drivers as a 14.8 V power input to drive the DC motors efficiently. As previously mentioned, the motor drivers accept motor-supply voltages well above this, so 14.8 V is not a problem for them to handle. The DC motors draw around 0.4 A at maximum speed and maximum load, which suits the efficiency of our circuit design. In the following figure, the circuit design and wiring are illustrated.

Fig. 4.14. Wiring diagram
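The battery arithmetic above can be summarised in a small power-budget check. The cell count and per-motor current are the values quoted in this section; the 2.2 Ah cell capacity is an assumed figure for illustration only:

```python
def pack_voltage(cells, cell_v=3.7):
    """Nominal voltage of a series battery pack."""
    return cells * cell_v

def runtime_hours(capacity_ah, currents_a):
    """Rough runtime estimate: pack capacity divided by total draw.

    Ignores discharge curves and converter losses; a ballpark only.
    """
    return capacity_ah / sum(currents_a)

v_motor_pack = pack_voltage(4)            # 4 cells in series for the drivers
# Four DC motors at 0.4 A each; 2.2 Ah capacity is an assumed value
hours = runtime_hours(2.2, [0.4] * 4)
```

A similar line for the two-cell servo pack (7.4 V nominal, with the MG995 peak currents) shows why the servos were given their own regulated supply.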

During circuit simulation, four 3.7 V batteries did not allow the simulation to run, because the supply was more than what the simulated circuit required; so we altered the design in the simulation for it to work.
The servo motors operate on 3.8-5.0 V and draw a current of 1.5-2.0 A depending on the load, so handling the power for the servo motors as well as the DC motors was a challenge.

For this matter, only two of the four batteries were used to power the servo motors, giving a total of 7.4-8.0 V, with voltage regulators connected to the two batteries to regulate the voltage and prevent any failure in the operation of the servos. Each MG995 motor is connected to its own regulator; the small SG90 motors draw far lower current than the MG995, so operating two of them on one regulator is sufficient.

The Arduino board needs 5 V to operate, although its power input can handle up to 12 V, so connecting the Arduino board to the roughly 8 V coming from the two batteries poses no problem.

That leaves a few components, such as the IR sensors and the ultrasonic sensor for the robotic-arm control technique, which both draw Vcc from the Arduino board. The microphone and the voice module operate on the voltage that the Raspberry Pi outputs. As for the Raspberry Pi itself, an external power supply such as a power bank is used, because the board's only power input is its USB Type-C port.

Fig. 4.15. Powering Raspberry pi (Start up Your Raspberry Pi | Coding Projects for
Kids and Teens, n.d.)
Chapter V
Control Algorithm
5.1 Preface
This chapter discusses each aspect of the control algorithm: the voice-recognition model, navigation, and the pick-up technique. As previously discussed, the robot carries out functions in a dynamic environment, but with fixed tasks and commands and fixed coordinates to commute between. In the following sections we explain the control method for each task.

5.2 Speech-recognition
The speech-recognition concept in our project depends on the robot recognizing the voice commands given to it, using the Picovoice platform, an on-device voice AI platform. Related to this are large language models (LLMs), machine learning models that can comprehend and generate human language text: an LLM is a machine learning program that has been fed enough examples to recognize and interpret human language or other types of complex data, and such models work by analyzing massive data sets of language. The voice commands given to the robot are pre-trained so that the robot can recognize them later, even across different voices and accents.

A keyword spotter can also be used to create always-listening voice commands. The main benefit of always-listening voice commands versus follow-on commands is user convenience, as the user is not required to utter the wake phrase first. A dedicated keyword spotter is needed because matching the stream of speech-to-text output against a list of phrases or keywords would result in dreadfully low accuracy while using an unnecessary amount of computing resources; for power-sensitive applications like wearable devices, the power requirements alone are a complete non-starter.
Fig. 5.1 STT model (Strategy Guide for Voice AI-powered Applications, n.d.)

(https://picovoice.ai/blog/a-strategy-guide-for-voice-applications/)

For this matter, we gave the robot a name so that it is easier to detect the keywords; in our case the robot's name is MARVY. The robot is given a voice command preceded by the wake-up keyword, so that it differentiates between the words spoken before and after that keyword. In our case the keyword is "Wake-up MARVY". Everything said before the keyword is ignored by the robot, and after the keyword come the commands the robot is to execute. Other pre-trained commands are "Follow Line", "Pick Up", "Go Back", and "Stop". Under each command is a script that guides the robot on what to do and how to do it, whether the code for the ultrasonic sensor to scan for an object to pick up or the script for the robotic arm to execute the movement successfully. The same concept goes for following the line and going back to the initial position.
5.3 Line Following
In order for the robot to accomplish its task successfully, it needs to follow a line taped to the ground from point A (where it picks the object) to point B (where it drops it). Of course, that is not always the case in such robotic applications; sometimes a robot uses techniques such as autonomous navigation and localization with various sensors and complicated algorithms. In our project we chose to keep it simple and easy to deploy, as long as it carries out its function just right.

The robot carries out the line-following technique using infrared sensors fixed to the front of the wheeled mobile robot, so it can detect the presence of the black line on the ground.

The concept of the IR sensor is that it has an emitter and a receiver of IR waves: the emitter is simply an IR LED (Light Emitting Diode) and the detector is an IR photodiode sensitive to the same wavelength emitted by the LED. When IR light falls on the photodiode, its resistance and output voltage change in proportion to the magnitude of the IR light received. That is why a black line is used: the lighter floor reflects the emitted IR light back to the receiver, while the black line absorbs it, so a drop in reflected light tells the robot that a sensor is over the line. The robot can then adjust based on whether it is directly over the line or next to it, fixing its orientation to make sure it follows the line at all times.

Fig. 5.2. Line following technique. (Buddies & Buddies, 2024)
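The adjust-and-recentre behaviour described above reduces to a small decision rule on the digital sensor outputs. This sketch assumes two sensors straddling the tape, each reading True when it sees the line; the action names are illustrative:

```python
def steer(left_on_line, right_on_line):
    """Steering decision for a two-sensor line follower.

    The sensors sit either side of the tape; a sensor reading True
    means the robot has drifted that way, so it turns back toward it.
    """
    if left_on_line and right_on_line:
        return "stop"        # junction or end marker: both sensors covered
    if left_on_line:
        return "turn_left"   # line drifted left, steer left to recentre
    if right_on_line:
        return "turn_right"  # line drifted right, steer right
    return "forward"         # line still between the sensors

action = steer(False, False)
```

Running this rule in a fast loop, with the turn commands fed to the wheel motors, keeps the robot centred on the tape from point A to point B.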


Chapter VI
Implementation Process
6.1 Preface.
In this stage, all the mechanical work from the mechanical design chapter comes to life: the designed 3D chassis is manufactured and put together, the robotic arm is 3D printed and mounted, and the wheels are mounted alongside the DC motors and the servo motors for the arm. We then explain the process of assembling the chassis, the arm, the components, and the motors.

6.2 Manufacturing process.


As previously mentioned, the chassis body is made of plywood. For the box shape of the robot, we went with the press-fit assembly method to join the chassis body parts together. The plywood sheets were cut using a laser cutting machine, with teeth all around the edges of the parts so they can be put together with the press-fit technique.

Fig. 6.1. Laser cut process.

This method gives the robot durability and stability compared with other approaches, such as supporting the upper plate on spacers or vertical columns. The chassis parts are then fastened together with bolts and nuts, in addition to already being stable from the press-fit method.

Fig. 6.2. Press-fit chassis with bolts.

6.3 Assembly.
The next step is fixing the components in place. The position of each component depends on the circuit wiring and analysis explained above. Each component is fixed with bolts and nuts in pre-determined positions with pre-designed holes, from fixing the motor mounts and wheel mountings to fixing the electric components such as the motor drivers and electronic boards.

Fig. 6.3. Fixing components with bolts.


Assembling the DC motors with the mecanum wheels is also critical: because of the way mecanum wheels work, each pair of wheels with the same roller orientation has to be placed diagonally for the robot to function well. The DC motors were tested first to make sure they operate within the power requirements. The motor mounts were designed and 3D printed to hold them in place. The motor couplings were also 3D printed, which was a challenge in itself, as the coupling had to be designed to withstand the torque output of the motors.

Fig. 6.4. DC motor mounts and wheels.

The robotic arm is then assembled: each link is fastened to the previous link, and the motors are fastened in place. Each motor is then adjusted so that the zero of the servo lies in the middle of the link's range of motion, which helps in the coding algorithm for the pick-and-place technique. The other aspect to care for when assembling the robotic arm is making sure its working area is free of blind spots, so we made sure the surrounding area is free of components and that the wires of each motor are not loose but tied together and fixed to the arm itself.
Fig. 6.5. Assembling robotic arm.
Chapter VII
Testing and Results

Chapter VIII
Cost Analysis
8.1 Preface
In this chapter, the main target is a thorough analysis of the cost of our project. Robotics applications in particular are usually loaded with components, whether sensors or boards, that help the robot perform its function. Taking into consideration every decision we made and on what basis, we made this analysis to evaluate whether we took the optimal decisions given the resources we had, or whether we could have made better ones. For example, we could have chosen from a variety of engineering materials other than plywood for the chassis. This chapter illustrates the whole process.

8.2 Analysis
When starting work on the project, every decision was based on research carried out
beforehand. One of the main aspects of this research was the cost of each component,
alongside its capabilities and functions.

The first decision was choosing the material. As previously discussed in project report
part I, the first choice was stainless steel sheets. During manufacturing we found that
stainless steel would be expensive, so we switched to MDF, which we later decided to
use only for testing and as a model because it was not as durable as we expected.
Although increasing the MDF thickness from 3 mm to 5 mm was discussed, we ultimately
chose the 6 mm Finnish plywood panel for its characteristics, as shown in chapter two.
Fig. 8.1. MDF chassis.

The second decision was based on the motor sizing performed for the robot using the
calculated weight of the chassis and the arm. Choosing stainless steel would have added
weight, which in turn would have required bigger motors and therefore a higher cost.

After the motor selection, the robotic arm was designed and printed; based on research
into market availability, we found that 3D printing the arm parts was the best choice.
The next step was choosing the basic components for the robot's functions, and even
that took time and cost more than our research suggested. Some components burned out
during testing and had to be replaced, and others proved unsuitable for the application
and had to be exchanged. For example, we chose the Arduino Uno instead of the Mega at
first, but then noticed it did not have enough GPIOs for the rest of the components.
Also, some of the voltage regulators burned out from excessive input voltage after we
discovered that the nominally 3.7 V batteries actually output about 4 V. After some
changes to the power circuit, we finally got it right.
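Both surprises described here, the GPIO shortage and the real battery voltage, are the kind of thing a quick pre-build check can catch. A minimal sketch, using approximate pin counts for the two boards and an assumed per-cell open-circuit voltage:

```python
# Approximate usable GPIO counts (digital + analog pins).
BOARD_GPIO = {"uno": 20, "mega": 70}

def enough_pins(board, pins_needed):
    """True if the board has at least pins_needed GPIOs available."""
    return BOARD_GPIO[board] >= pins_needed

def pack_voltage(cells, per_cell_v=4.0):
    """Open-circuit pack voltage; a 'nominal 3.7 V' Li-ion cell reads
    roughly 4 V when fully charged (assumed figure)."""
    return cells * per_cell_v

print(enough_pins("uno", 28))   # False -> motivated the switch to the Mega
print(pack_voltage(3))          # 12.0, not the 11.1 the nominal rating suggests
```

Comparing the pack's charged voltage (not its nominal label) against each regulator's maximum input rating would have flagged the burned regulators before power-up.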

Another issue we faced was whether to choose expensive, higher-accuracy sensors or
cheaper ones and compensate in the coding stage or by manual tuning. For example, we
could have used a LIDAR sensor instead of ultrasonic and IR sensors; it would have
cost more, but it would also have been easier to work with. During the build, many
enhancements had to be made to one part or another, which is a mandatory part of the
process. Another example is fabricating a suitable motor coupling for the wheels. We
first tried to make the coupling supplied with the wheels work, but it did not go as
expected. We then bought metal couplings, but unfortunately they were not suitable for
the encoder disk that had to be mounted on the motor shaft. That led us to design our
own coupling, which was a hassle in itself: more than one design was made, and through
trial and error we finally reached the optimal design.

Table 8.1 below shows a detailed cost analysis of all the components and materials
used in building the robot.

Table 8.1. Detailed cost analysis.

Item                               Cost          Item                          Cost

Chassis and robotic arm                          Electronic components
MDF                                120 L.E.      Arduino Uno                   380 L.E.
3D printed arm                     1400 L.E.     Arduino Mega                  1500 L.E.
DC motors                          1800 L.E.     Raspberry Pi 4GB              6200 L.E.
Servo motors                       1900 L.E.     IR & ultrasonic sensors       690 L.E.
Motor mounts                       240 L.E.      Regulators & capacitors       130 L.E.
Wheels                             1600 L.E.     Breadboard & jumpers          210 L.E.
Plywood                            420 L.E.      Motor drivers                 470 L.E.
Couplers (metal & printed)         540 L.E.      Mic & module                  240 L.E.
Other material                                   Batteries                     640 L.E.
Bolts, nuts, tape & heat shrink    ≅ 200 L.E.    Logic level converter         110 L.E.
Total                              18,520 L.E.
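As a sanity check, the itemized entries of Table 8.1 can be summed programmatically. The values below are copied from the table; since the bolts/nuts line is an approximation and minor consumables are estimated, the itemized sum differs slightly from the reported grand total:

```python
# Itemized costs from Table 8.1 (L.E.), left and right columns.
chassis_and_arm = [120, 1400, 1800, 1900, 240, 1600, 420, 540, 200]
electronics = [380, 1500, 6200, 690, 130, 210, 470, 240, 640, 110]

total = sum(chassis_and_arm) + sum(electronics)
print(total)  # 18790 itemized, close to the reported 18,520 L.E. total
```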
Chapter IX
Conclusion and Future Work
9.1. Conclusion
In this project, the design, modelling, simulation, and implementation of a multitasking
voice-controlled robot have been presented. From the work done in this project, the
following conclusions can be drawn:

 Previous work has been reviewed and summarized.
 The chassis is designed to withstand its environment.
 Robot tasks can be carried out smoothly.
 The voice-recognition technique can be used in a manner that facilitates the function
of the robot.
 WMR modelling has been carried out for the mecanum (omni-directional) wheels,
proving their suitability.
 A robotic arm is designed to facilitate carrying objects and transporting them.
 An obstacle avoidance technique using ultrasonic sensors has been explained and
implemented.
 Path planning techniques give the robot the level of autonomy required to navigate
such a dynamic environment.
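The path-planning conclusion above can be illustrated with a minimal grid search. This is a generic breadth-first sketch, not the project's actual planner; the grid, obstacles, and coordinates are invented for the example:

```python
from collections import deque

def shortest_path(grid, start, goal):
    """Breadth-first search on a grid of strings ('#' = obstacle).
    Returns the shortest path length in steps, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    frontier, seen = deque([(start, 0)]), {start}
    while frontier:
        (r, c), dist = frontier.popleft()
        if (r, c) == goal:
            return dist
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != "#" and (nr, nc) not in seen):
                seen.add((nr, nc))
                frontier.append(((nr, nc), dist + 1))
    return None  # goal unreachable

grid = ["....",
        ".##.",
        "...."]
print(shortest_path(grid, (0, 0), (2, 3)))  # 5
```

On unweighted grids BFS already returns optimal paths; heuristic planners such as A* (see reference [23]) speed up the same search on larger maps.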

9.2. Future Work


A lot of work was done to get the robot functioning as planned, applying the concept of
the project and making it real. However, much more is required to enhance
functionality, optimization, and quality of life.

From the work done in this project, the following concepts may be pursued in new
versions:

 Improve the materials used: for better durability and a more cohesive structure,
metallic materials could be cut and used; they might be a little heavier but more
stable and efficient. Recommended materials vary with the application and
environment, but common choices for such a robot are aluminum alloy or stainless
steel.
 Utilize ROS and LiDAR approaches for voice recognition and mapping: to create a
more natural and effective human-robot interaction, LiDAR emits laser pulses and
measures their reflections to provide accurate distance measurements, from which
the environment can be mapped in great detail. Furthermore, robot control and
navigation become more effective and user-friendly when voice recognition and
LiDAR are integrated into ROS. Combining the natural interface of voice commands
with the accuracy of LiDAR sensing yields a robotic system that is effective,
approachable, and flexible: it improves user interaction, boosts navigation
accuracy, and expands the range of applications for the robot.

 Add more gadgets with more functionality: as mentioned, such a project could take on
different jobs such as firefighting, scanning for defects or mines, or even painting.
With the right equipment designed and attached, the robot can be reprogrammed,
adding more functions to the microcontroller and mapping them to triggering voice
lines.

 Consider using an AI model with a camera: AI models and image processing have
become a major approach in many fields, robotics in particular. In projects like
this, integrating voice recognition with AI-based camera models in ROS provides an
effective and user-friendly way to control and navigate the robot; the camera is
important for mapping and navigation because it allows preset locations to be saved
via the trained AI model and adapts to different locations. This approach uses
sophisticated computer vision techniques for dynamic navigation and detailed
environmental understanding, as well as natural language processing for simple
interaction, offering improved adaptability, versatility, and usability for a broad
range of uses.

 Improve the interface for interacting with the user: for quality-of-life purposes,
it is preferable to develop and enhance a user-friendly, streamlined interface. This
is an important feature for both marketing and quality of use.
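The LiDAR mapping idea in the list above reduces to simple geometry: each range reading at a known beam angle becomes a point in the robot frame. A toy sketch with invented angles and readings (a real driver would supply these from the scan message):

```python
import math

def scan_to_points(ranges, angle_min=0.0, angle_step=math.pi / 4):
    """Convert range readings (metres) at evenly spaced beam angles into
    (x, y) points in the robot frame; None marks a beam with no return."""
    points = []
    for i, r in enumerate(ranges):
        if r is not None:
            a = angle_min + i * angle_step
            points.append((round(r * math.cos(a), 3), round(r * math.sin(a), 3)))
    return points

def nearest_obstacle(ranges):
    """Closest valid return, useful as a simple collision check."""
    valid = [r for r in ranges if r is not None]
    return min(valid) if valid else None

scan = [2.0, None, 1.0, 4.0]
print(scan_to_points(scan))
print(nearest_obstacle(scan))  # 1.0
```

Accumulating such points over time, with the robot's pose from odometry, is the essence of the occupancy mapping that ROS packages automate.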

References
[1] Ahmad, S. (2021). On the design and fabrication of a voice-controlled mobile
robot. International Conference on Informatics in Control, Automation and
Robotics (ICINCO 2021) (p. 6). Dubai: Mechatronics Engineering Technology,
Higher Colleges of Technology.
[2] Nallathambi, D. J., et al. (2018). Voice controlled robotic manipulator with
decision tree and contour identification techniques. International Conference on
Intelligent Systems (IS) (p. 4). Chennai, India: Department of Computer Science
and Engineering, SSN College of Engineering.
[3] Abed, A. A., et al. (2016). Design and implementation of wireless voice
controlled mobile robot. Al-Qadisiyah Journal for Engineering Sciences.
[4] Kabir, H., et al. (2011). Development of a voice controlled robotic arm.
Proceedings of the Conference on Engineering Research, Innovation and Education
2011. Bangladesh: Chittagong University of Engineering & Technology.
[5] Maruthi Naik, R. K., et al. (2022). Fire fighting robot with a voice.
International Journal of Engineering Research & Technology.
[6] Sarada, B., et al. (2023). Voice controlled robot using Bluetooth.
International Journal of Engineering Research & Technology.
[7] Patil, D. D., et al. (2014). Design and development of voice/tele-operated
intelligent mobile robot. International Journal of Engineering Research and
Applications.
[8] Maksymova, S., et al. (2017). Software for voice control robot: Example of.
Open Access Library Journal.
[9] Abaas, T. F., Khleif, A. A., & Abbood, M. Q. (2020). Inverse kinematics
analysis and simulation of a 5 DOF robotic arm using MATLAB. Al-Khwarizmi
Engineering Journal.
[10] Choi, Y. S., et al. (2008). Laser pointers and a touch screen: Intuitive
interfaces for autonomous mobile manipulation for the motor impaired.
[11] Microsoft Speech Application Programming Interface (API) and SDK, Version
5.1, Microsoft Corporation, http://www.microsoft.com/speech
[10] Iliukhin, V. N., Mitkovskii, K. B., Bizyanova, D. A., & Akopyan, A. A.
(2017). The modeling of inverse kinematics for 5 DOF manipulator. Samara
National Research University.
[12] Jamil, A. S., Hassan, N. F., & Obaida, T. H. (2023). Real-time voice
recognition and modification by convolutional neural network. ResearchGate.
https://www.researchgate.net/publication/373629934_Real-time_Voice_Recognition_and_Modification_By_Convolutional_Neural_Network
[13] Meng, Z., Liu, H., & Alfred, C. (2023). Optimizing voice recognition
informatic robots for effective communication in outpatient settings. Cureus.
https://doi.org/10.7759/cureus.44848
[14] Sharma, M. D., & Kumar, P. (2023). Application of artificial intelligence
for voice recognition. ResearchGate.
https://www.researchgate.net/publication/368543413_Application_of_Artificial_Intelligence_for_Voice_Recognition/related
[15] Li, S., Liu, Y., Chen, Y., Feng, H., Shen, P., & Wu, Y. (2023). Voice
interaction recognition design in real-life scenario mobile robot applications.
Applied Sciences, 13(5), 3359. https://doi.org/10.3390/app13053359
[16] Rendyansyah, R., Prasetyo, A. P. P., & Sembiring, S. (2022). Voice command
recognition for movement control of a 4-DOF robot arm. Elkha, 14(2), 118.
https://doi.org/10.26418/elkha.v14i2.57556
[17] Kulkarni, M. (2023). Voice controlled robot. International Journal for
Research in Applied Science and Engineering Technology, 11(5), 2918–2922.
https://doi.org/10.22214/ijraset.2023.51864
[18] Qin, M., Kumar, R., Shabaz, M., Agal, S., Singh, P., & Ammini, A. C. (2023).
Broadcast speech recognition and control system based on Internet of Things
sensors for smart cities. Journal of Intelligent Systems, 32(1).
https://doi.org/10.1515/jisys-2023-0067
[19] Rashid, M. M., Ali, M. Y., & Sailan, S. (2023). Service robot using voice
recognition. Nucleation and Atmospheric Aerosols.
https://doi.org/10.1063/5.0112691
[20] Khoon, T. N., Sebastian, P., & Saman, A. B. S. (2012). Autonomous fire
fighting mobile platform. Procedia Engineering, 41, 1145–1153.
https://doi.org/10.1016/j.proeng.2012.07.294
[21] Naeem, B., Saeed-Ul-Hassan, & Yousuf, N. (2023). An AI based voice
controlled humanoid robot. Research Square.
https://doi.org/10.21203/rs.3.rs-2424215/v1
[22] Luo, C. (2023). A voice recognition sensor and voice control system in an
intelligent toy robot system. Journal of Sensors, 2023, 1–8.
https://doi.org/10.1155/2023/4311745
[23] Li, D., Shi, X., & Dai, M. (2024). An improved path planning algorithm based
on A* algorithm. In Lecture Notes in Electrical Engineering (pp. 187–196).
https://doi.org/10.1007/978-981-99-9239-3_19
[24] M, P. K. (2022). Speech recognition robot using endpoint detection
algorithm. International Journal of Engineering and Management Research, 12(6),
206–210. https://doi.org/10.31033/ijemr.12.6.28
[25] Kumar, K. S. (2022). Voice controlled robot vehicle using Arduino.
International Journal for Research in Applied Science and Engineering
Technology, 10(6), 2786–2791. https://doi.org/10.22214/ijraset.2022.44458
[26] Kanode, P., & Kanade, S. (2020). Raspberry Pi project: voice controlled
robotic assistant for senior citizens. International Research Journal of
Engineering and Technology (IRJET), 7(10), 1044–1049.
[27] Zhang, J., Wang, M., Li, C. Y., Li, X. M., & Xu, W. (2012). Design of
intelligent-tracking car based on infrared photoelectric sensor and speech
recognition technology. Journal of Chongqing University of Technology (Natural
Science), 7.
[28] Rendyansyah, R., Prasetyo, A. P. P., & Sembiring, S. (2022b). Voice command
recognition for movement control of a 4-DOF robot arm. Elkha, 14(2), 118.
https://doi.org/10.26418/elkha.v14i2.57556
[29] https://www.unifiedalloys.com/blog/stainless-grades-families
[30] https://www.upmet.com/sites/default/files/datasheets/316-316l.pdf
[31] https://www.fireextinguisheronline.com.au/blog/post/what-are-dry-chemical-fire-extinguishers
[32] Abdullah, M. A., Mansor, M. R., Tahir, M. M., & Ngadiman, M. N. (2012).
Design, analysis and fabrication of chassis frame for UTeM Formula Varsity race
car.
[33] Liu, Z., Li, Z., Sun, Y., Liu, A., & Jing, S. (2022). Fusion of binocular
vision, 2D LiDAR and IMU for outdoor localization and indoor planar mapping.
Measurement Science and Technology, 34(2), 025203.
https://doi.org/10.1088/1361-6501/ac9ed0
[34] Korendiy, V., Kachur, O., Boikiv, M., & Yaniv, O. (2023). Analysis of
kinematic characteristics of a mobile caterpillar robot with a SCARA-type
manipulator. ResearchGate. https://doi.org/10.23939/tt2023.02.056
[35] AlHaza, T., Alsadoon, A., Alhusinan, Z., Jarwali, M., & Alsaif, K. A.
(2015). New concept for indoor fire fighting robot. Procedia - Social and
Behavioral Sciences, 195, 2343–2352. https://doi.org/10.1016/j.sbspro.2015.06.191
[36] Hendzel, Z., & Rykała, Ł. (2017). Modelling of dynamics of a wheeled mobile
robot with mecanum wheels with the use of Lagrange equations of the second kind.
International Journal of Applied Mechanics and Engineering, 22(1), 81–99.
https://doi.org/10.1515/ijame-2017-0005
[37] Amodei, D., Ananthanarayanan, S., Anubhai, R., & Zhu, Z. (2015). Deep Speech
2: End-to-end speech recognition in English and Mandarin. ResearchGate.
https://www.researchgate.net/publication/286513561_Deep_Speech_2_End-to-End_Speech_Recognition_in_English_and_Mandarin
[38] Agbeyangi, A., Alashiri, O. A., Odiete, J., & Adenekan, O. (2020). An
autonomous obstacle avoidance robot using ultrasonic sensor. Journal of Computer
Science and Its Application, 27(1). https://doi.org/10.4314/jcsia.v27i1.15
[39] Yang, X., Liu, H., & Zhang, S. (2023). Overview of path planning algorithms.
Recent Patents on Engineering, 18.
https://doi.org/10.2174/1872212118666230828150857
[40] Ata, A., & Myo, T. (2008). Optimal trajectory planning and obstacle
avoidance for flexible manipulators using generalized pattern search.
ResearchGate.
https://www.researchgate.net/publication/255592185_Optimal_trajectory_planning_and_obstacle_avoidance_for_flexible_manipulators_using_generalized_pattern_search
[41] Xu, X., Cao, Y., & Li, X. (2023). Improved DQN algorithm for path planning
of autonomous mobile robots. Research Square.
https://doi.org/10.21203/rs.3.rs-3566583/v1
[42] Zhang, H. (2023). Path planning for storage robots using the RRT algorithm.
Theoretical and Natural Science, 26, 279–285.
https://doi.org/10.54254/2753-8818/26/20241118
Appendices
Appendix A: Plywood Mechanical Properties
Appendix B: Datasheets of Components
Geared DC motor
Different DC geared motor data:

Gearbox Length (mm)  Stage Number  Reduction Ratio                     Rated Torque (kg.cm)  Breaking Torque (kg.cm)
14.7                 3             8.2/1, 12/1                         0.6                   1.5
16.2                 4             20.4/1, 25/1, 28/1, 31/1, 35/1      1.0                   2.5
17.7                 5             50/1, 67/1, 70/1, 75/1, 87/1        1.5                   3.0
19.2                 6             128/1, 173/1, 187/1, 217/1          2.0                   5.0
20.7                 7             266/1, 319/1, 438/1, 467/1, 542/1   2.5                   6.0
22.2                 8             798/1, 898/1, 1085/1, 1220/1        2.5                   6.0
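The pattern in the table above, torque rising with the reduction ratio, follows the usual gearbox rule of thumb: output torque scales with the ratio while output speed scales with its inverse, minus gearbox losses. A small sketch using assumed motor figures and an assumed per-gearbox efficiency, not values from the datasheet:

```python
def geared_output(motor_torque_kgcm, motor_rpm, ratio, efficiency=0.8):
    """Estimate gearbox output torque (kg.cm) and speed (rpm) for a given
    reduction ratio; efficiency is an assumed lumped figure."""
    return motor_torque_kgcm * ratio * efficiency, motor_rpm / ratio

# Hypothetical bare motor: 0.05 kg.cm at 6000 rpm, through a 50:1 gearbox.
torque, speed = geared_output(0.05, 6000, 50, efficiency=0.8)
print(torque, speed)  # roughly 2.0 kg.cm at 120.0 rpm
```

This is why the higher-stage gearboxes in the table carry higher rated torques: more stages allow larger total ratios, at the cost of speed and some efficiency per stage.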

MG995 servo motor

The device includes 30 cm of color-coded wire leads with a 3 × 1 pin female header
connector with 0.1" pitch that is compatible with the majority of receivers.

Specifications

• Weight: 55 g

• Dimension: 40.7 x 19.7 x 42.9 mm approx.

• Stall torque: 8.5 kgf·cm (4.8 V), 10 kgf·cm (6 V)

• Operating speed: 0.2 s/60º (4.8 V), 0.16 s/60º (6 V)

• Operating voltage: 4.8 V to 7.2 V

• Dead band width: 5 μs

• Stable and shock proof double ball bearing design

• Temperature range: 0 ºC – 55 ºC
HC-SR04 Ultrasonic Sensor Module

Electrical Parameters Value

Operating Voltage 3.3Vdc ~ 5Vdc

Quiescent Current <2mA

Operating Current 15mA

Operating Frequency 40 kHz

Operating Range & Accuracy 2 cm ~ 400 cm (1 in ~ 13 ft) ± 3 mm

Sensitivity -65dB min

Sound Pressure 112dB

Effective Angle 15°

Connector 4-pins header with 2.54mm pitch

Dimension 45mm x 20mm x 15mm

Weight 9g
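These figures translate to code through the sensor's timing relation: the width of the echo pulse is the round-trip time of the sound burst, and the commonly quoted datasheet rule of thumb is 58 µs per centimetre. A minimal conversion sketch:

```python
def echo_to_cm(pulse_us):
    """Convert an HC-SR04 echo pulse width (microseconds) to distance in cm
    using the datasheet rule of thumb: distance = pulse / 58."""
    return pulse_us / 58.0

print(echo_to_cm(580))    # 10.0 cm
print(echo_to_cm(23200))  # 400.0 cm, the sensor's rated maximum range
```

Readings implying more than 400 cm (pulses longer than about 23.2 ms) fall outside the rated range above and are best discarded in the obstacle-avoidance code.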

Arduino Compatible Line Tracking Sensor

Electrical Parameters Value

Operating Voltage 3.3Vdc ~ 5Vdc

Operating Current 16-20mA

Maximum Detection Distance 10mm ± 2mm

Connector 3-pins header

Dimension 30mm x 10mm

Weight 7g
IR Sensor Module
Raspberry Pi 4 Model Pinout
L298N Dual H-Bridge Motor Driver Specifications

Electrical Parameters Value

Input Voltage 3.2 Vdc ~ 35 Vdc

Storage Temperature -25 ℃ ~ +130 ℃

Max Power Consumption 20W (@ Temperature = 75 ℃)

Dimensions 3.4cm x 4.3cm x 2.7cm

Max Current 2A

Operating current range ~ 36mA

Arduino Mega2560 Specifications


Abstract (Arabic)
This study illustrates the concept of using voice recognition technology to manage and
direct robots, aiming to handle tasks in workplaces with ease and creativity. The
design employs various techniques, such as obstacle avoidance, tracking to a specific
point via a line follower, and carrying objects and placing them where required via
the robotic arm, among other capabilities.

Once the mechanical design and the circuits have been assembled, the microcontrollers
are programmed and interconnected, making the robot familiar with phrases in our
language so that it can recognize and analyze utterances and identify the programmed
commands. Improving the accuracy of the voice-recognition elements is preferable, to
increase efficiency and reduce potential errors.

We therefore built a robot that recognizes a wide range of utterances, using suitable
materials and a suitable design, equipped with tools for certain field functions and
programmed to execute instructions well; it stands as a prototype that can inspire
real field upgrades.

Keywords: mobile robot, multitasking, artificial intelligence, voice recognition,
robotics, line follower, robotic arm, voice commands.

MSA
October University for Modern Science & Arts
Faculty of Engineering
Department of Mechatronics System Engineering

Multitasking Voice-Controlled Robot

A Graduation Project
Submitted as Part of B.Sc. Degree Requirements in Mechatronics System Engineering
(Part II)

Prepared By

Student: Ali Mohamed Ali Ahmed Barakat    ID: 203679
Student: Mohamed Ahmed Abdelhay Aly       ID: 195499

Supervised By

Assoc. Prof. Mohamed Salah Eldin Shibat Elhamed

Spring 2024
