Cargo Delivery System With Drone: Gazi University Engineering Faculty Electrical-Electronics Engineering Department
GROUP:
1- Emrullah ÜLKER (151110072)
2- Oğuzhan BULUT (151112008)
3- Utku ÇELİKER (151112009)
4- Oğuz Kaan ULUIŞIK (161110071)
COUNSELOR:
Asst. Prof. Mehmet KARAKAYA
CONTENTS
1. PROBLEM STATEMENT
   1.1. Need Statement
   1.2. Objective
   1.3. Background and Related Works
2. REQUIREMENTS SPECIFICATION
   2.1. The Requirements
      2.1.1. Mission Requirements
      2.1.2. Design Requirements
   2.2. Constraints
      2.2.1. Sustainability
      2.2.2. Health, Environment and Safety
      2.2.3. Legal
      2.2.4. Ethics
      2.2.5. Design Constraints
   2.3. Standards
   2.4. Evaluated Concepts and Configurations
   2.5. Calculation and Amount of Unit Weights
   2.6. Final Conceptual Design Configuration
3. DESIGN
   3.1. Level 0
   3.2. Level 1
      3.2.1. Dimensional Parameters of Final Design
      3.2.2. Structural Features and Capabilities of the Final Design
   3.3. Level 2 and Device Selection
4. DESIGN VERIFICATION
   4.1. Test Results
      4.1.1. Testing the GPS
      4.1.2. Testing the ESC and Motors
      4.1.3. Testing the FCB
      4.1.4. Testing the DSP
   4.2. Verifications
      4.2.1. Design Constraints
      4.2.2. Mission Requirements
      4.2.3. Design Requirements
   4.3. Standards
6. ADMINISTRATION
   6.1. Team Organization
   6.2. Timeline
   6.3. Work Packages
7. TEST PLAN
REFERENCES
ADDITIONS
SYMBOLS AND ABBREVIATIONS
The symbols and abbreviations used in this study are presented below with their explanations.

Symbols — Descriptions
°C — Degree Celsius
cm — Centimeter
gr — Gram
m — Meter
m² — Square meter

Abbreviations — Descriptions
A — Ampere
DSP — Digital Signal Processor
ESC — Electronic Speed Controller
FCB — Flight Control Board
GPS — Global Positioning System
KV — rpm per volt
Li-Po — Lithium-Polymer battery
mAh — Milliampere-hour
Prop — Propeller
rpm — Revolutions per minute
TL — Turkish Lira
UAV — Unmanned Aerial Vehicle
V — Volt
1. PROBLEM STATEMENT
This detailed design report presents the design process followed during the design stages of the "Cargo Delivery System with Drone" project of the "EEE492" course.

1.1. Need Statement
The main problem addressed in the project is that shipping systems are heavily dependent on human beings, so the accidents and mistakes encountered can negatively affect people, products and the environment.

The job descriptions and rules set by our counselors for the design of the transport system, current scientific studies in this field, the need for an appropriate and affordable cost, and the environmental conditions in which the vehicle (drone) is intended to be used all played a major role in the design.
1.2. Objective
The main purpose of the design is to minimize the human factor in product transportation and to contribute to the industry by increasing occupational safety in transportation systems. The vehicle was designed in accordance with the user goals, taking human health, environmental factors and hazards into consideration. The project also aims to ensure that the vehicle reaches its target safely along the shortest possible route, thus providing fast and easy use and increasing energy savings.
In the simulation environment, a virtual drone was created by compiling ArduCopter with MAVProxy on the MAVLink infrastructure. The created virtual simulation drone is monitored through Mission Planner. Image 2 shows the route the drone follows.
2.2. Constraints
2.2.1 Sustainability
Sustainability is the ability to continue the project even after it is completed; in other words, it ensures the permanence of the project. Keeping the electronic devices and software used in the project up to date ensures its sustainability.
2.2.2 Health, Environment and Safety
Health and environmental criteria have always been prioritized in the project. Personal protective equipment was used when working with soldering irons, heaters and cutting tools during the assembly phase. The health and environmental risks that the product may cause during operation were examined and minimized. Occupational Health and Safety criteria served as the guiding framework in these three areas.
2.2.3 Legal
Persons who harm living beings, property or the environment by using the design outside its intended purpose, or outside the specified standards and conditions, will be subject to material and moral legal sanctions. The manufacturer will likewise face legal sanctions if the tests and trials required during production are not performed and this endangers the user or the environment. In addition, in the event of unauthorized copying or production of the vehicle's electronic and mechanical design, our team, as the patent owner, has the right to pursue material and moral legal action against the real or legal persons carrying out this act. This legal process may result in compensation or imprisonment for the other party.
2.2.4 Ethics
During the project, care was taken to act in accordance with originality and ethical principles. Topics such as citing the sources used in the research, awareness of responsibility, confidentiality and security were handled in accordance with ethical rules. In addition, group members always acted with courtesy and respect towards each other.
1) The developed vehicle should reach the given coordinates via the GPS module with the lowest error rate and energy expenditure, following the shortest route.
2) The motors of the vehicle must be able to lift 5 x 5 x 5 cm loads.
3) The package should be dropped within an area of 1 m².
4) After the drone has delivered the cargo, it should return to its starting point autonomously.
5) The system cost should be limited to a maximum of 3000 TL.
6) Electronic circuits must function fully between -20 °C and +40 °C.
IEEE802.11p / LTE-V2V: an umbrella protocol covering vehicle communication systems, in which sensor data can be transferred at high speed with low loss.
3. DESIGN
3.1 Level 0
[Level 0 block diagram: Camera → Digital Signal Processor (DSP) → Control unit; Communication unit; Power unit; Battery]
Camera
The system needs image data as it will operate on a computer-vision basis, so the camera is an essential component of this system. At this stage, it was decided to work with a 480p camera in order not to use much memory. The camera only sends data to the DSP module, and this data is digital.
[Diagram: Camera → 480p vision data → DSP]
DSP
The DSP is the unit that processes the data it receives from the camera. At this stage, it was decided to use a Raspberry Pi board for the DSP unit. This board, which will be programmed in Python, will process the image data and transmit it to the central control unit in the most appropriate format.
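The memory concern behind the 480p choice can be made concrete with a quick estimate of the uncompressed camera-to-DSP data rate. A small sketch; the 30 fps frame rate and 3-byte BGR pixel format are our assumptions, not figures from the report:

```python
# Rough uncompressed data-rate estimate for the camera -> DSP link.
# Assumptions (not from the report): 3 bytes per pixel (BGR), 30 fps.
def frame_bytes(width, height, bytes_per_pixel=3):
    """Size of one uncompressed frame in bytes."""
    return width * height * bytes_per_pixel

def data_rate_mb_per_s(width, height, fps=30):
    """Uncompressed video data rate in megabytes per second."""
    return frame_bytes(width, height) * fps / 1e6

# 640x480 (the chosen 480p mode) vs. the sensor's full 2592x1944 stills
rate_480p = data_rate_mb_per_s(640, 480)
rate_full = data_rate_mb_per_s(2592, 1944)
print(round(rate_480p, 1), round(rate_full, 1))  # 27.6 453.5
```

Even uncompressed, the 480p stream is more than an order of magnitude lighter than full-resolution frames, which supports the choice above.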
Power Unit
[Diagram: Battery → energy → Power unit → other components]
The power unit is responsible for transmitting the necessary power to the motors for movement to take place. In drones, this element is called the ESC (Electronic Speed Controller).
[Diagram: Control unit → control signal → ESC → power]
Communication Unit
The communication unit is responsible for communication between the ground control center and the drone, as well as position determination using coordinates, since it contains a GPS module. Here, the ground control center acts as the controller. The unit detects simple commands coming from the controller and transmits them to the control unit. It is also responsible for sending data from the sensor group — battery charge level, altitude, motor rotation speed, etc. — back to the controller.
It was decided to use the NRF24 module for bi-directional communication within the unit. The communication band is 2.4 GHz, and communication between two stations can reach up to 250 meters in an unobstructed environment. The module can work integrated with Arduino. In addition, a PCB design can also be made.
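The telemetry fields mentioned above (battery level, altitude, motor speed) have to fit into the NRF24's small payload (32 bytes per packet). A sketch of how they might be packed with Python's struct module; the field layout is our own illustration, not a protocol defined in the report:

```python
import struct

# Hypothetical telemetry packet: battery %, altitude in cm, motor rpm.
# "<BhH" = little-endian: unsigned byte, signed short, unsigned short
# (5 bytes total), comfortably inside the NRF24's 32-byte payload limit.
PACKET_FMT = "<BhH"

def pack_telemetry(battery_pct, altitude_cm, motor_rpm):
    """Serialize the three telemetry fields into a compact byte payload."""
    return struct.pack(PACKET_FMT, battery_pct, altitude_cm, motor_rpm)

def unpack_telemetry(payload):
    """Recover (battery_pct, altitude_cm, motor_rpm) from a payload."""
    return struct.unpack(PACKET_FMT, payload)

payload = pack_telemetry(87, 1250, 9600)
print(len(payload), unpack_telemetry(payload))  # 5 (87, 1250, 9600)
```

Keeping the packet a fraction of the 32-byte limit leaves room for extra fields (e.g. a status flag or checksum) later.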
[Diagram: Remote Control ↔ Communication Unit]

Control Unit
The most important part of the drone is its brain, the control unit. This unit is called the
“Flight Control Board (FCB)”. Market review has shown that some types have built-in
gyroscope and GPS modules.
Flight control card is the unit that receives data from sensors, evaluates them and
makes final decisions. This unit also evaluates the commands that will come from the
control and implements them immediately.
Since our drone will be processing images, the FCB takes the processed image data from the DSP module and decides whether the drone is where it should be and whether it sees the signs or markers that enable it to perform its task. Depending on this, it decides whether to approach, reduce altitude, or cancel the task and return to the starting point.
Since the flight control card works in conjunction with the ESC, the flight control
card also monitors the speed of the motors and drives the ESC to change the speed data
where necessary.
[Diagram: Communication unit → FCB → ESC]
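The decision sequence described above (approach the marker, descend, drop, or abort and return) can be sketched as a simple rule function. This is only an illustration of the described logic; the function, thresholds and action names are our assumptions, not the actual FCB firmware:

```python
def fcb_decision(marker_visible, offset_px, altitude_m,
                 center_tol_px=30, drop_altitude_m=1.0):
    """Return the next action for the drone based on DSP marker data.

    marker_visible: did the DSP find the landing marker in the frame?
    offset_px: horizontal pixel distance of the marker from frame centre.
    altitude_m: current altitude above ground.
    Thresholds (center_tol_px, drop_altitude_m) are illustrative values.
    """
    if not marker_visible:
        return "cancel_and_return"   # abort the task and fly home
    if abs(offset_px) > center_tol_px:
        return "approach"            # move toward the marker
    if altitude_m > drop_altitude_m:
        return "reduce_altitude"     # descend while centred over the marker
    return "drop_cargo"              # centred and low enough to release

print(fcb_decision(True, 120, 10.0))  # approach
```

In the real system each returned action would map to motor commands issued through the ESC, as described in the next paragraph.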
Raspberry Pi
The first step in the design is to choose a controller that distributes the tasks to be performed and supervises their execution. A controller that is compatible with the other subsystems to be used, operates at low power, has an acceptable cost, and is easy to program adds extra advantages to the design.
Arduino family:
- Belongs to the microcontroller family. It contains units such as processor, memory, input/output, ROM and RAM, and does not need added peripherals; therefore it can control and perform operations alone.
- Used to control simple systems.
- Open source, easy to program, easy to learn, with a more user-friendly programming interface. Does not require installing an extra operating system.
- Low cost and easier to obtain. Can be used in simple systems to further reduce costs.

Raspberry Pi:
- Belongs to the microprocessor family. It is a central processing unit; it can manage and decide on operations but cannot perform them alone. Peripherals must be added externally.
- Used for controlling complex systems.
- Programming is complex and may take a long time to learn. Requires installation of its own operating system.
- Costly and more difficult to obtain. Used in projects where high performance is required.

Table 3. Arduino and Raspberry Pi Comparison
As seen in the table, the Arduino family has advantages over the Raspberry Pi family in cost and ease of use. However, the tasks required by the project demand a certain level of performance, and Arduino was removed from the options because it could not meet these requirements.
DSP
The Raspberry Pi was chosen as the DSP in our drone, and we decided to use the Raspberry Pi 4. Its specifications:
- Broadcom BCM2711, quad-core Cortex-A72 (ARM v8) 64-bit SoC @ 1.5 GHz
- 2 GB / 4 GB LPDDR4-3200 SDRAM
- 2.4 / 5.0 GHz 802.11ac wireless networking, Bluetooth 5.0, BLE support
- 2x USB 3.0, 2x USB 2.0 ports
- 2x micro-HDMI ports (4K 60 fps supported)
- 2-lane MIPI DSI display port
- 2-lane MIPI CSI camera port
- H.265 (4Kp60 decode), H.264 (1080p60 decode, 1080p30 encode)
- Graphics processor with OpenGL ES 3.0 support
- Micro SD card slot for the operating system and data storage
- 5 V power input (minimum 3 A) via USB-C
- 5 V power input (minimum 3 A) via GPIO
- Weight: 50 grams
Important Note: GPIO pins on Raspberry Pi work at 3.3V level. It is not compatible with
5V.
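Because the GPIO pins are 3.3 V-only, any 5 V sensor output has to be level-shifted before reaching the Pi. One common approach (our illustration, not a circuit from the report) is a resistor divider; a quick check with assumed resistor values:

```python
def divider_out(v_in, r_top, r_bottom):
    """Output voltage of a resistor divider: v_in applied across r_top + r_bottom,
    output taken across r_bottom."""
    return v_in * r_bottom / (r_top + r_bottom)

# Assumed values: 1 kOhm over 2 kOhm drops a 5 V signal to ~3.33 V,
# right at the Raspberry Pi's 3.3 V GPIO level.
v_out = divider_out(5.0, 1000, 2000)
print(round(v_out, 2))  # 3.33
```

For faster signals a dedicated level-shifter IC would be preferable, since the divider's impedance limits edge speed.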
Camera
The system needs image data as it will operate on a computer vision basis. Therefore, we
took care that it is both cheap and high quality. As a result of the comparisons made from
different sales sites on the Internet, it was decided to use the following camera.
Selected camera specifications:
- Synchronization: compatible with Raspberry Pi 3-4
- Sensor type: OmniVision OV5647 (5 MP)
- Resolution: 2592 x 1944 pixels (5 MP)
- Video shooting: 1080p, 720p, 640x480p
- Size: 25 x 20 x 9 mm
- Angle of vision: 54 x 41 degrees
- Price: 129.99 TL
- Estimated weight: 20 grams
Flight control board options compared:

Pixhawk PX4 v2.4.8 Black (36x36 mm, Banggood.com, 689.83 TL):
- Processor: STM32F407, 32-bit Cortex-M7 ARM @ 168 MHz
- Sensors: ICM-20689 accelerometer, MS5607 barometer, IST8310 compass (built-in compass, support for up to three I2C)
- Kit contents: buzzer, Neo-M8N GPS module, GPS holder, power module with shell, Minim OSD, expand splitter module, PPM module with shell, 433 MHz 100 mW / 500 mW telemetry, 2x outer support, cables

Pixhawk PX4 v2.4.8 (Banggood.com, 542.92 TL):
- Processor: STM32F427, 32-bit Cortex-M4 @ 168 MHz, 2 MB flash memory
- Sensors: L3GD20 3-axis 16-bit digital gyroscope, LSM303D 3-axis 14-bit accelerometer/magnetometer, MPU6000 6-axis accelerometer/magnetometer, MS5611 high-precision barometer
- Kit contents: PIX security key, flight control shell, buzzer, 6-pin to 6-pin cable, 4-pin to 4-pin cable, 3-pin DuPont cable
Communication Unit
The communication unit is built into the flight control card. There is no need for an
external communication unit or GPS transceiver.
Ground Control
There is no need to design, purchase, or assemble an external controller; the interface of the flight control board is suitable for communication with the computer.
Sensors
The sensors built into our flight control board are sufficient for us, and it was decided to use the following sensor as an option:
1) Optical reflection sensor (to measure motor speed)
Body
Frame options compared (weight, axle distance, suggestions/specifications, price):

1) Weight 160 g, axle distance 330 mm, price 96.53 TL. Kit contents: 4x Emax XA2212 1400KV brushless motors, 4x Hobbywing SKYWALKER 20A brushless ESCs, 4x 8045 propellers, 1x 3S-4S LiPo battery, 1x CC3D flight controller, PCB frame board. Optimized frame design, making the wiring of ESCs and battery safer and easier; plenty of mounting space for autopilot systems.

2) Weight 118 g, axle distance 250 mm, price 132.50 TL. FC mounting hole: 30x30 mm; camera mounting distance: 27 mm.

3) Weight 460 g, axle distance 450 mm, price 225.50 TL. Flight control: APM 2.6 / Pixhawk 2.4.6 flight controller; motor: EMAX MT2213 935KV; ESC: 20-30A. The ESC and power cable can be connected directly.

Motor options and prices:
1. motor: 48.50 TL
2. motor: 52.97 TL
3. motor: 65.74 TL
4. motor: 77.57 TL
5. motor: 90.81 TL
Weight is an important factor in motor selection. The calculated weight, including the drone body, was 534 grams; with internal sensors it is predicted to be 600 grams. To allow a 15% margin of error, the maximum weight of the drone was taken as 700 grams, and the motor selection was made accordingly: the 1st motor was selected. Depending on the later choice of propeller and battery, the motor could change, but since the selected motor's capacity is well above the drone's weight values, the choice will not be affected by the weight of the propeller, battery and other connection elements.
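The weight budget above can be checked in a few lines (the 534 g, 600 g and 15% figures are taken directly from the text):

```python
calculated_weight_g = 534   # body + components, from the report
with_sensors_g = 600        # estimate including internal sensors
margin = 0.15               # 15% margin of error

max_weight_g = with_sensors_g * (1 + margin)
print(round(max_weight_g))  # 690, rounded up to 700 g in the report
```

Any candidate motor set's combined thrust therefore has to comfortably exceed 700 g, which is the criterion used for the selection above.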
Here L is the length, P the pitch, and B the number of blades. For example, for a 6 x 4.5 propeller (also written 6045), 6 represents the length and 4.5 the pitch; length units are in inches. As another example, the designation 5 x 4 x 3, or 5040 x 3, represents a propeller 5 inches long, with 4 inches of pitch and 3 blades.

Sometimes you may see "BN" on a propeller, which indicates a stubby-nosed (bullnose) propeller. You may also see "R" or "C", which refer to the direction of rotation: "R" means "reversed", and the propeller should be attached to the motor rotating clockwise; if there is a "C", it must be attached to the motor rotating counterclockwise.

As the drone team, we decided to use the 2-blade 7045 propeller, i.e., a propeller with the features above. The propeller's 7 gram weight is included in the calculated weight, so the calculation holds. The price of our product is 28.13 TL.
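The L×P×B naming convention described above can be captured in a small helper. A sketch (the function name and return format are ours, not part of any standard library):

```python
def parse_prop(code, blades=2):
    """Parse a propeller designation like '7045', '6045' or '5040x3'.

    First two digits: length in inches (x10); next two digits: pitch in
    inches (x10). An optional 'x3' suffix gives the blade count
    (2 blades assumed by default).
    """
    if "x" in code:
        code, blade_str = code.split("x")
        blades = int(blade_str)
    length = int(code[:2]) / 10.0
    pitch = int(code[2:4]) / 10.0
    return length, pitch, blades

print(parse_prop("7045"))    # (7.0, 4.5, 2) -- the propeller chosen here
print(parse_prop("5040x3"))  # (5.0, 4.0, 3)
```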
Battery
Considering the motor to be used in the drone, the most suitable battery for us is the ZOP Power 11.1V 1500mAh 40C 3S LiPo battery. Its price is 140.96 TL and its weight is 115 grams. Note that the operating voltage of the motor must match the operating voltage of the battery.
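The battery's headline numbers translate into current and rough endurance limits as follows. The 10 A average draw below is an assumed figure for illustration, not a measurement from the report:

```python
capacity_ah = 1.5   # 1500 mAh, from the battery spec
c_rating = 40       # 40C continuous discharge rating
voltage = 11.1      # 3S LiPo nominal voltage

max_current_a = capacity_ah * c_rating           # discharge capability
assumed_draw_a = 10.0                            # hypothetical average draw
flight_time_min = capacity_ah / assumed_draw_a * 60

print(max_current_a, round(flight_time_min, 1))  # 60.0 9.0
```

The 60 A capability is far above what four small motors draw, so the C rating is not the limiting factor; capacity (and hence flight time) is.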
4. DESIGN VERIFICATION
4.1 Test Results
4.1.1 Testing the GPS
Using the location-marking and target-location identification code included in the ArduPilot open-source library, the target points were determined in the Gazebo simulation program. Afterwards, the coordinates were set, the Python program shown in the Appendix was entered, and the program was run. The GPS motion tests of the MAVLink software were verified simultaneously with GazeboSim, with the help of one-to-one scale maps.
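Verifying the GPS motion amounts to checking that the simulated drone's final coordinates land close to the commanded waypoint. A standalone sketch of that check using the haversine distance; the coordinates and the 10 m acceptance radius are made-up test values, not data from our runs:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS coordinates."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical commanded waypoint vs. position reported by the simulator
target = (39.9334, 32.8597)
reached = (39.93345, 32.85975)
error_m = haversine_m(*target, *reached)
print(error_m < 10)  # True: within a 10 m acceptance radius
```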
4.1.2 Testing the ESC and Motors
In the schematic above, a BLDC-ESC complex was modeled. Although we did not use the components one by one, the circuit response is similar to the real one. The PLECS simulator was used for the simulation, and we thank Gazi University for this feature.

First of all, we modeled the ESC, which has the same topology as a DC-AC inverter. We used this topology because BLDC motors are driven by sinusoidal pulse-width modulation (SPWM) generators and controlled by changing the output frequency. In SPWM, the carrier signal sets the switching frequency of the MOSFETs and the reference sine signal changes the output frequency. As the reference frequency increases, the stator magnetic field changes faster and the motor rotates faster. However, when the output frequency changes, the motor reactance changes, and thus the phase currents change.

The graph shown at the top is the sinusoidally modulated carrier signal, and the signal at the bottom is the switching waveform of the inverter MOSFETs. Using this method, we can create more efficient sinusoidal waveforms, and this technique reduces the motor's vibrations.
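The SPWM scheme described above — a sinusoidal reference compared against a high-frequency triangular carrier — can be reproduced numerically. A minimal sketch; the frequencies and the 0.8 modulation index are illustrative choices, not values from the PLECS model:

```python
import math

def triangle(t, f_carrier):
    """Triangular carrier between -1 and 1 at frequency f_carrier."""
    phase = (t * f_carrier) % 1.0
    return 4 * phase - 1 if phase < 0.5 else 3 - 4 * phase

def spwm_samples(f_ref=50.0, f_carrier=5000.0, m=0.8, n=100000):
    """Switching signal: high when the sine reference exceeds the carrier."""
    period = 1.0 / f_ref
    out = []
    for i in range(n):
        t = i * period / n
        ref = m * math.sin(2 * math.pi * f_ref * t)
        out.append(1 if ref > triangle(t, f_carrier) else 0)
    return out

s = spwm_samples()
duty = sum(s) / len(s)
print(round(duty, 2))  # averages ~0.5 over one full sine period
```

The instantaneous duty cycle tracks (1 + m·sin)/2, so averaging over a full reference period gives 0.5, while the local duty sweeps sinusoidally — which is exactly what filters into a sinusoidal phase voltage.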
In BLDC motor control, the frequency is changed, and the angular speed is directly proportional to the frequency. However, the most difficult aspect of controlling BLDC motors is that as frequency and velocity increase, the current decreases. On the other hand, while the drone operates at high loads, the ESC will try to produce more frequent switching signals and more current, so we need a closed-loop controller. In SPWM, using PID directly does not operate correctly, but more efficient control algorithms exist.

At the motor testing step, we measured the parameters and identified the worst conditions. In high-current systems, the most important topic is heating. When the drone is operating it carries a load, which requires force and therefore electric current. To prevent burning, we designed a thermal fuse in the PLECS simulator.

When the switch turns off in thermal mode, the PLECS simulator does not turn it back on. We can see that as the current increases, the components heat up and the fuse cuts the circuit.

Suppose this ESC is producing a current that causes 105 °C. When we decrease the frequency, the motor reactance decreases and the phase current increases; thus our component is heated by the current. If the current flows for a long time, the component might melt, so in an emergency the circuit must be cut by a fuse.
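The thermal-fuse behaviour modelled in PLECS can be illustrated with a toy simulation: temperature rises with I²·R (Joule) heating, cools proportionally to the temperature above ambient, and the fuse opens once it crosses the 105 °C threshold mentioned above. The thermal coefficients below are invented for illustration, not extracted from the PLECS model:

```python
def simulate_fuse(current_a, steps=200, trip_temp_c=105.0):
    """Toy thermal-fuse model: returns (tripped, final_temperature).

    Heating grows with I^2 (Joule heating); cooling is proportional to
    the temperature above ambient. Coefficients are illustrative only.
    """
    ambient = 25.0
    temp = ambient
    k_heat, k_cool = 0.05, 0.1  # invented thermal coefficients
    for _ in range(steps):
        if temp >= trip_temp_c:
            return True, temp    # fuse opens: the circuit is cut
        temp += k_heat * current_a ** 2 - k_cool * (temp - ambient)
    return False, temp

print(simulate_fuse(5.0)[0])   # False: low current, fuse stays closed
print(simulate_fuse(20.0)[0])  # True: high current trips the fuse
```

At 5 A the model settles at an equilibrium well below the threshold; at 20 A the equilibrium would lie far above it, so the fuse trips within a few steps — the same qualitative behaviour observed in the simulator.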
4.1.3 Testing the FCB
We used the Pixhawk 4 board as the FCB, that is, the flight control board. We tested it with the GazeboSim and ArduPilot software supported by the control board, using the Python language.

The Pixhawk 4 control board is responsible for the stable performance of the drone's landing, take-off, speed-adjustment and movement tasks. It processes the data from the sensors it contains and sends this data to its own software, thus adjusting the drone's motor speeds. These tasks were successfully tested in the simulation software used. The Python code used is given in Appendix FCB Code.
At 4 meters, we used a 15x10 cm blue plastic layer because our camera could not catch the 8x8 cm paper. At 10 meters, we also used a 15x10 cm blue plastic layer. Although our program works well at 10 meters with the 15x10 cm plastic layer, we decided to use a 1 m² blue layer to guarantee the stable operation of the program.
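Why the 8×8 cm paper was lost at 4 m while the 15×10 cm layer was not can be estimated from the camera's 54° horizontal field of view (from the camera table) and the 640-pixel frame width. A rough pinhole-camera calculation; the pixel-count interpretation is our own reasoning, not a measurement from the tests:

```python
import math

def target_width_px(target_m, distance_m, fov_deg=54.0, frame_px=640):
    """Approximate on-screen width in pixels of a flat target (pinhole model)."""
    scene_width_m = 2 * distance_m * math.tan(math.radians(fov_deg / 2))
    return target_m / scene_width_m * frame_px

paper_px = target_width_px(0.08, 4.0)  # 8 cm paper at 4 m
layer_px = target_width_px(0.15, 4.0)  # 15 cm layer at 4 m
print(round(paper_px), round(layer_px))
```

The paper covers only about a dozen pixels at 4 m — around the limit of what the erode/dilate mask pipeline leaves intact — while the larger layer roughly doubles that footprint, consistent with the observation above.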
4.2 Verifications
As a result, almost all of the desired conditions and constraints could be verified through the calculations, assumptions and simulations performed. The cost constraint that could not be met depends on economic conditions and varies with the market prices of electronic components.
4.3 Standards
IEEE802.11p / LTE-V2V:
The PX4 electronic control board on which the sensors reside was manufactured in accordance with the IEEE802.11p / LTE-V2V standard.

ISO 45001:
Since the drone did not physically enter the construction phase, the related safety measures and precautions could not be taken.

ISO 27001:
The protection of our information and data was ensured with installed antivirus programs. Bluetooth and Wi-Fi connections will be used for communication during drone setup and mission assignment. During the mission itself, since the connection distance will be exceeded, there will be no communication device. The communication tools used during setup and mission assignment comply with the IEEE802.11p / LTE-V2V standard.
In this study, the aim is to develop an autonomous drone for use in cargo delivery. With the autonomy of the device, the workforce required is reduced to almost zero, and accidents that may occur during today's cargo transportation processes are prevented. The vehicle flies to the GPS coordinates entered into it, thanks to the control board and Python programming, then determines the point where it should leave the load via the camera and positions itself. It then releases the cargo with its mechanical arm, again under programmed software control. After completing the cargo drop, it determines the return route to the starting point (which can be the cargo distribution center) and returns. The units to be used were determined with the tests performed, and design plans were made; the units were then tested one by one and made ready for integration.
In this testing process, the motor and ESC were tested together for the safety of the autonomous drone. It was decided to add a thermal fuse for drone safety against any overload condition. From the test results, we determined the approximate safe operating interval for the drone.

In the second part, we performed the drone simulation test. In the last step, we developed the image and video signal processing, and the results exceeded what was required. The drone's responses are fast enough for real-time operation with the video-processing component.
After the successful completion of the tests with the integrated systems, the project was made ready for completion. During the project construction process, ethical values were complied with and the design was shaped with health, environment and safety in mind. Accordingly, the project is envisaged to be completed in the time targeted in its plans.
6. ADMINISTRATION
6.1 Team Organization
The project group consists of four people. Each group member is intended to have full knowledge of the project, and detailed research topics were given to the group members in line with their interests and abilities. These people and their research topics are given below.
Organization chart:
- Team Leader: Oğuzhan Bulut
- Company Consultant / Counselor
- Team members: Emrullah Ülker, Utku Çeliker, Oğuz Kaan Uluışık
- Electronics and Structural Design Engineer: Utku Çeliker
- Software and Algorithm Engineer: Emrullah Ülker
6.2 Timeline
In this section, the work to be done during the design process is divided into processes and shown on a work-time chart. A Gantt chart is preferred to show this schedule; it is given in Table 1 in the "Additions" section. This schedule will be followed during the project. The processes completed from October until the report delivery period are marked in red, and the process planned afterwards is marked in black on the timeline.
7. TEST PLAN
In this section, the tests to be performed on all components of the system to be produced are planned. The plans include how and when each test step will take place and the process to be followed. Three types of test plans will be made: 1) Unit testing, 2) Integration test, 3) Acceptance test.
7.1.2. Raspberry Pi Tests
Camera is integrated into Raspberry Pi device. Display connection is made with an HDMI
cable. The camera hardware is powered up and tested.
GPS is integrated into the Raspberry Pi device. Display connection is made with an HDMI
cable. GPS positions are requested and the results obtained are compared with the current
GPS positions.
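The GPS comparison step assumes the module's positions can be read out and converted to decimal degrees. Many hobby GPS receivers (including the Neo-M8N mentioned earlier) output NMEA 0183 sentences; a hedged sketch of parsing one GGA sentence — the sample sentence is a standard textbook example, not data from our tests:

```python
def parse_gga(sentence):
    """Extract (latitude, longitude) in decimal degrees from a $GPGGA sentence."""
    fields = sentence.split(",")
    lat_raw, lat_hem = fields[2], fields[3]  # ddmm.mmmm, N/S
    lon_raw, lon_hem = fields[4], fields[5]  # dddmm.mmmm, E/W

    def to_decimal(raw, head_digits):
        # NMEA packs degrees and decimal minutes together
        degrees = int(raw[:head_digits])
        minutes = float(raw[head_digits:])
        return degrees + minutes / 60.0

    lat = to_decimal(lat_raw, 2) * (1 if lat_hem == "N" else -1)
    lon = to_decimal(lon_raw, 3) * (1 if lon_hem == "E" else -1)
    return lat, lon

gga = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
lat, lon = parse_gga(gga)
print(round(lat, 4), round(lon, 4))  # 48.1173 11.5167
```

The decoded decimal degrees can then be compared directly against reference positions, as the test above requires.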
REFERENCES
1. 1564728874059.html
2. https://www.mathworks.com/videos/motor-control-part-3-bldc-speed-control-using-
pwm-1578294719821.html
3. https://www.researchgate.net/publication/335210748_Implementation_of_Autonomou
s_Visual_Detection_Tracking_and_Landing_for_ARDrone_20_Quadcopter
4. https://github.com/tizianofiorenzani/how_do_drones_work/tree/master/scripts
5. https://www.evansville.edu/majors/eecs/downloads/projects2016/CameronRobertsRep
ort.pdf
7. http://www.set-science.com/manage/uploads/ISAS2019-
ENS_0042/SETSCI_ISAS2019-ENS_0042_0011.pdf.
8. https://www.mathworks.com/videos/image-and-signal-processing-with-matlab-
simulink-82478.html
9. https://stackabuse.com/introduction-to-image-processing-in-python-with-opencv/
10. https://www.raspberrypi.org/documentation/hardware/raspberrypi/bcm2711/rpi_DATA_2711_
1p0_preliminary.pdf
11. https://projects.raspberrypi.org/tr-TR/projects/raspberry-pi-setting-up/5
12. https://github.com/ArduPilot/pymavlink
13. https://dronekit-python.readthedocs.io/en/latest/guide/copter/guided_mode.html
14. https://maker.robotistan.com/raspberry-pi-dersleri-12-dc-motor-kontrolu/
ADDITIONS
Table 1. Business Packages and Activities (Gantt chart; columns: start date, finish date, duration in months, spanning November–June)
1. Creating Conceptual Design 20.10.2020 15.12.2020 2
1.1 Literature Review 3
1.2 Conceptual Design 2
2. Detailed Design and Material Selection 15.12.2020 15.02.2021 2
2.1 Detailed Design 2
2.2 Material Selection and Mechanical Design 1
2.3 Updating the Design 1
3. Prototype Manufacturing and System Integration 15.02.2021 10.05.2021 3
3.1 Prototype Mechanical Manufacturing and Assembly 2
3.2 Electric Electronic System Integration 2
3.3 Creating Software and Algorithm 3
4. Creating the Final Report and Planning the Tests 10.05.2021 25.06.2021 1.5
4.1 Unit Tests 1
Communication Test 1
4.2 Integration Tests 1
Systematic Movement and Orientation Tests 1
Load Carrying Tests 1
System Function Tests 1
4.3 Acceptance Test 1
EK-1 EK-2
import cv2 from dronekit import connect, VehicleMode,
from collections import deque LocationGlobalRelative, Command
import numpy as np import time
boyut = 16
from pymavlink import mavutil
ddq = deque(maxlen = boyut)
import argparse
mavi1 = (84, 98, 0)
mavi2 = (179, 255, 255) parser = argparse.ArgumentParser(description='commands')
vid = cv2.VideoCapture(0) parser.add_argument('--connect')
vid.set(3,960) args = parser.parse_args()
vid.set(4,480) connection_string = args.connect
while True: print("Connection to the vehicle on %s"%connection_string)
goo, img1 = vid.read() vehicle = connect(connection_string, wait_ready=True)
if goo:
blur = cv2.GaussianBlur(img1, (11,11), 0)
hsv = cv2.cvtColor(blur, cv2.COLOR_BGR2HSV) EK 3
mask = cv2.inRange(hsv, mavi1, mavi2) Drone yükseklik ve hız ayarları yapılır:
mask = cv2.erode(mask, None, iterations = 2) def arm_and_takeoff(tgt_altitude):
mask = cv2.dilate(mask, None, iterations = 2) print("Arming motors")
(contours,_) = cv2.findContours(mask.copy(), cv2.RETR_EXTERNAL, while not vehicle.is_armable:
cv2.CHAIN_APPROX_SIMPLE) time.sleep(1)
center = None vehicle.mode = VehicleMode("GUIDED")
if len(contours) > 0: vehicle.armed = True
c = max(contours, key = cv2.contourArea) while not vehicle.armed: time.sleep(1)
rect = cv2.minAreaRect(c) print("Takeoff")
((x,y), (width,height), rotation) = rect vehicle.simple_takeoff(tgt_altitude)
s = "x: {}, y: {}, width: {}, height: {}, rotation: #-- wait to reach the target altitude
{}".format(np.round(x),np.round(y),np.round(width),np.round(height),np.round(rotation while True:
)) altitude = vehicle.location.global_relative_frame.alt
print(s) if altitude >= tgt_altitude -1:
rect1 = cv2.boxPoints(rect) print("Altitude reached")
rect1 = np.int64(rect1) break
m = cv2.moments(c) time.sleep(1)
center = (int(m["m10"]/m["m00"]), int(m["m01"]/m["m00"])) arm_and_takeoff(10)
cv2.drawContours(img1, [rect1], 0, (0,0,255),3) #-- set the default speed
vehicle.airspeed = 7

The GPS position the drone is to fly to is given as latitude, longitude and altitude:

print("go to wp1")
wp1 = LocationGlobalRelative(35.9872609, -95.8753037, 10)
vehicle.simple_goto(wp1)

Defining the drone movements:

def set_velocity_body(vehicle, vx, vy, vz):
    msg = vehicle.message_factory.set_position_target_local_ned_encode(
        0,
        0, 0,
        mavutil.mavlink.MAV_FRAME_BODY_NED,
        0b0000111111000111,  # -- BITMASK -> consider only the velocities
        0, 0, 0,             # -- POSITION
        vx, vy, vz,          # -- VELOCITY
        0, 0, 0,             # -- ACCELERATIONS
        0, 0)
    vehicle.send_mavlink(msg)
    vehicle.flush()

if ...:    # condition TO BE DETERMINED FROM THE CAMERA POSITION AFTER IMPLEMENTATION
    set_velocity_body(vehicle, gnd_speed, 0, 0)
elif ...:  # condition TO BE DETERMINED FROM THE CAMERA POSITION AFTER IMPLEMENTATION
    set_velocity_body(vehicle, -gnd_speed, 0, 0)
elif ...:  # condition TO BE DETERMINED FROM THE CAMERA POSITION AFTER IMPLEMENTATION
    set_velocity_body(vehicle, 0, -gnd_speed, 0)
elif ...:  # condition TO BE DETERMINED FROM THE CAMERA POSITION AFTER IMPLEMENTATION
    set_velocity_body(vehicle, 0, gnd_speed, 0)

Return of the drone to the starting point:

vehicle.mode = VehicleMode("RTL")

EK-4

import cv2
from collections import deque
import numpy as np

boyut = 16
ddq = deque(maxlen=boyut)
mavi1 = (84, 98, 0)       # lower HSV bound for blue
mavi2 = (179, 255, 255)   # upper HSV bound for blue
vid = cv2.VideoCapture(0)
vid.set(3, 960)           # frame width
vid.set(4, 480)           # frame height
while True:
    goo, img1 = vid.read()
    if goo:
        blur = cv2.GaussianBlur(img1, (11, 11), 0)
        hsv = cv2.cvtColor(blur, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, mavi1, mavi2)
        mask = cv2.erode(mask, None, iterations=2)
        mask = cv2.dilate(mask, None, iterations=2)
        (contours, _) = cv2.findContours(mask.copy(), cv2.RETR_EXTERNAL,
                                         cv2.CHAIN_APPROX_SIMPLE)
        center = None
        if len(contours) > 0:
            c = max(contours, key=cv2.contourArea)
            rect = cv2.minAreaRect(c)
            ((x, y), (width, height), rotation) = rect
            s = "x: {}, y: {}, width: {}, height: {}, rotation: {}".format(
                np.round(x), np.round(y), np.round(width),
                np.round(height), np.round(rotation))
            print(s)
            rect1 = cv2.boxPoints(rect)
            rect1 = np.int64(rect1)
            m = cv2.moments(c)
            center = (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))
            cv2.drawContours(img1, [rect1], 0, (0, 0, 255), 3)
            cv2.circle(img1, center, 5, (255, 255, 255), -1)
            cv2.putText(img1, s, (25, 50), cv2.FONT_HERSHEY_COMPLEX_SMALL, 1,
                        (255, 255, 255), 2)
        ddq.appendleft(center)
        for i in range(1, len(ddq)):
            if ddq[i - 1] is None or ddq[i] is None:
                continue
            cv2.line(img1, ddq[i - 1], ddq[i], (0, 255, 255), 1)
        cv2.imshow("Orijinal Tespit", img1)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
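The detection loop above derives the target center from the contour moments (m10/m00, m01/m00). The same arithmetic can be checked on a synthetic binary mask with plain NumPy; the sketch below is illustrative only and not part of the original listing (the helper name is ours).

```python
import numpy as np

def mask_centroid(mask):
    """Centroid of a binary mask via raw image moments.

    m00 = number of set pixels, m10 = sum of x over set pixels,
    m01 = sum of y over set pixels -- the same quantities cv2.moments()
    reports for a filled contour of that shape.
    """
    ys, xs = np.nonzero(mask)
    m00 = len(xs)
    if m00 == 0:
        return None  # mirrors center = None when no contour is found
    return (int(xs.sum() / m00), int(ys.sum() / m00))

# A 100x100 mask with a filled 20x20 square whose top-left corner is (40, 30):
mask = np.zeros((100, 100), dtype=np.uint8)
mask[30:50, 40:60] = 1
print(mask_centroid(mask))  # (49, 39) -- int() truncates the 49.5 / 39.5 means
```

Because `int()` truncates, the reported center can be half a pixel left/above the true centroid, which is harmless at the accuracy needed here.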
EK-5

import RPi.GPIO as GPIO
from time import sleep

GPIO.setmode(GPIO.BOARD)
Motor1A = 16
Motor1B = 18
Motor1E = 22
GPIO.setup(Motor1A, GPIO.OUT)
GPIO.setup(Motor1B, GPIO.OUT)
GPIO.setup(Motor1E, GPIO.OUT)

print("Ileri hareket")           # forward movement
GPIO.output(Motor1A, GPIO.HIGH)
GPIO.output(Motor1B, GPIO.LOW)
GPIO.output(Motor1E, GPIO.HIGH)
sleep(2)

print("Geri hareket")            # backward movement
GPIO.output(Motor1A, GPIO.LOW)
GPIO.output(Motor1B, GPIO.HIGH)
GPIO.output(Motor1E, GPIO.HIGH)
sleep(2)

print("Motor durdu")             # motor stopped
GPIO.output(Motor1E, GPIO.LOW)
GPIO.cleanup()

FCB Code

from dronekit import Command, connect, VehicleMode, LocationGlobalRelative
import time
from pymavlink import mavutil

iha = connect("127.0.0.1:14550", wait_ready=True)

def takeoff(irtifa):
    while iha.is_armable is not True:
        print("İHA arm edilebilir durumda değil.")   # UAV is not yet armable
        time.sleep(1)
    print("İHA arm edilebilir.")                     # UAV is armable
    iha.mode = VehicleMode("GUIDED")
    iha.armed = True
    while iha.armed is not True:
        print("İHA arm ediliyor...")                 # arming...
        time.sleep(0.5)
    print("İHA arm edildi.")                         # armed
    iha.simple_takeoff(irtifa)
    while iha.location.global_relative_frame.alt < irtifa * 0.9:
        print("İha hedefe yükseliyor.")              # climbing to target altitude
        time.sleep(1)

def gorev_ekle():
    global komut
    komut = iha.commands
    print("Komutlar yükleniyor...")                  # uploading commands
    komut.clear()
    # TAKEOFF
    komut.add(Command(0, 0, 0,
        mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT,
        mavutil.mavlink.MAV_CMD_NAV_TAKEOFF,
        0, 0, 0, 0, 0, 0, 0, 0, 10))
    # WAYPOINT
    komut.add(Command(0, 0, 0,
        mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT,
        mavutil.mavlink.MAV_CMD_NAV_WAYPOINT,
        0, 0, 0, 0, 0, 0, -35.36265286, 149.16514170, 20))
    # RTL
    komut.add(Command(0, 0, 0,
        mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT,
        mavutil.mavlink.MAV_CMD_NAV_RETURN_TO_LAUNCH,
        0, 0, 0, 0, 0, 0, 0, 0, 0))
    # DOĞRULAMA (verification: dummy final command used to detect mission completion)
    komut.add(Command(0, 0, 0,
        mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT,
        mavutil.mavlink.MAV_CMD_NAV_RETURN_TO_LAUNCH,
        0, 0, 0, 0, 0, 0, 0, 0, 0))
    komut.upload()

takeoff(10)
time.sleep(1)
gorev_ekle()
komut.next = 0
iha.mode = VehicleMode("AUTO")
while True:
    next_waypoint = komut.next
    print(f"Sıradaki komut {next_waypoint}")         # next command index
    time.sleep(1)
    if next_waypoint == 4:
        print("Görev bitti.")                        # mission finished
        break
print("Döngüden çıkıldı.")                           # exited the loop

Test Plan Name: Drone Unit Tests
Test Identification Number: 5.1
Definition: Unit tests and results of all sub-elements in the cargo drone system
Type: White Box ☒   Black Box ☐
Experiment Performer: Project team    Date:

03 - Camera-GPS module operation
     Expected: The camera should be able to be turned on and off, and the GPS should deliver accurate data.
     Result: ✓
     Comments: Camera operation was observed and GPS coordinates were read at the expected values.

04 - Target recognition software
     Expected: The code written in Python should run without errors.
     Result: ✓
     Comments: The code worked efficiently and detected the object defined as a blue cross from different angles and distances.

Overall Results: ✓
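The FCB code commands the drone to a GPS waypoint; in practice, arrival at a fix can be checked by the ground distance between the current position and the target. A haversine sketch of that check, using the waypoint latitude/longitude from the listing (the helper itself is our illustration and does not appear in the original code):

```python
import math

def ground_distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS-84 fixes (haversine)."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# Distance from a fix 0.001 deg north of the mission waypoint:
d = ground_distance_m(-35.36265286, 149.16514170, -35.36165286, 149.16514170)
print(round(d, 1))  # about 111 m (one millidegree of latitude)
```

A threshold of a few metres on this distance would bound the GPS margin of error noted in the unit-test results above.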
Test Plan Name: Integration Test-2
Test Identification Number: 5.2.2 and 5.2.3

UD01 - Making the drone ready for flight
       Expected: Successful take-off and landing of the drone.
       Result: ✓
       Comments: The drone correctly performed the ascent/descent commands given to it through the Python interface. Its height was measured with a tape measure and verified against the commanded values.

UD02 - Directional movements
       Expected: Correct recognition of compass directions and movement.
       Result: ✓
       Comments: East/West and North/South movements are defined as X(+/-) and Y(+/-). Movements in these directions were tested and their controls matched against the compass.

UD03 - Movement to given location
       Expected: Reaching the given GPS position.
       Result: ✓
       Comments: The drone completed the movement to the entered GPS location. The distance difference was compared with the measured GPS datasheet values; the margin of error is due to GPS. The system works properly.

UD04 - Fixing the starting point and return
       Expected: Returning by the shortest route to the starting point stored as a GPS coordinate.
       Result: ✓
       Comments: The starting position was memorized; the position read with the print function coincided with the origin. The return command in the library brought the drone back to the starting point successfully.

UD05 - Image detection and movement to the area
       Expected: Recognizing the area marked with the cross and approaching it so that the route is perpendicular to it.
       Result: ✓
       Comments: The camera detected the cross-marked area upon its entering the field of view. The drone then moved so as to position itself vertically above the cross.

Overall Results: ✓
P: Pass; F: Fail; CA: Could not apply
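The directional-movement test maps East/West to X(+/-) and North/South to Y(+/-) before calling set_velocity_body(). A minimal sketch of that mapping, following the table's stated convention (the helper name and the assumption that the body frame is aligned with the compass are ours, not from the report):

```python
def direction_to_velocity(direction, gnd_speed):
    """Map a compass direction to a (vx, vy) pair for the body-velocity command.

    Per the test table's convention: East/West -> X(+/-), North/South -> Y(+/-).
    gnd_speed is the chosen ground speed in m/s.
    """
    table = {
        "east":  (gnd_speed, 0),
        "west":  (-gnd_speed, 0),
        "north": (0, gnd_speed),
        "south": (0, -gnd_speed),
    }
    return table[direction]

print(direction_to_velocity("west", 2.0))  # (-2.0, 0)
```

Note that MAV_FRAME_BODY_NED is a body-fixed frame, so this mapping holds only while the drone's heading stays aligned with the chosen reference, which is what the compass check in UD02 verified.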
Test Plan Name: Integration Test-1
Test Identification Number: 5.2.1
Experiment Performer: Project team    Date:
Test Plan Name: Integration Test-3
Test Identification Number: 5.2.4
Definition: Test of the drone reaching the cargo drop area by means of a mechanical arm.
Type: White Box ☒   Black Box ☐
Experiment Performer: Project team    Date:
Hardware Version:    Time:

G01 - Descent
      Expected: The drone should verify through software that it has reached the coordinates given to it and begin its landing motion.
      Result: ✓
      Comments: The drone verifies its location in the Python interface both through GPS and via its camera. With both outputs being "true", it performed the descent in a controlled manner and reached the desired height.

G02 - Cargo handle opening and closing
      Expected: The cargo arm takes/releases the 5x5x5 cm load given to it.
      Result: ✓
      Comments: The drone, positioned perpendicularly above the cargo on the ground, successfully received/released the cargo.

Overall Results: ✓
P: Pass; F: Fail; CA: Could not apply
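The cargo-handle test opens and closes a gripper; with a hobby servo driven from the Raspberry Pi, the opening angle maps to a PWM duty cycle. The mapping below is a generic sketch only; the pin, the 50 Hz frequency and the 1.0-2.0 ms pulse range are assumptions, not values taken from the report:

```python
def servo_duty_cycle(angle_deg, freq_hz=50, min_us=1000, max_us=2000):
    """Duty cycle (%) for a hobby servo: 1.0 ms pulse at 0 deg, 2.0 ms at 180 deg."""
    if not 0 <= angle_deg <= 180:
        raise ValueError("angle out of range")
    pulse_us = min_us + (max_us - min_us) * angle_deg / 180.0
    period_us = 1_000_000 / freq_hz  # 20,000 us at 50 Hz
    return 100.0 * pulse_us / period_us

print(servo_duty_cycle(0))    # 5.0  (1 ms of a 20 ms period) -> gripper closed
print(servo_duty_cycle(180))  # 10.0 (2 ms of a 20 ms period) -> gripper open
```

With RPi.GPIO, the returned value would be passed to `GPIO.PWM(pin, 50).ChangeDutyCycle(...)` in the same wiring style as the EK-5 listing.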
GAZI IS THE FUTURE…