
19.521 - Group Project


Interim Report
Semi-Autonomous Unmanned Aerial Vehicle
Philipp Karagiannakis
Colin McGuire
Niall McKenzie
Neil Sinclair
December 19, 2008

Mentor: Professor R. Stewart


Abstract
The SUAVE project aims to create a Semi-Autonomous Unmanned Aerial Vehicle for
use in observation and surveillance applications. A quadcopter helicopter based on a
commercial open-source project called MikroKopter will be used as a foundation. Further
systems will be added to increase the functionality of the quadcopter, including an object
detection system, GPS-powered navigation and a bi-directional wireless communication
system using WiFi and GPRS. A camera will be integrated into the system to provide
an example of a potential surveillance application.

Currently the project is on schedule and all of the required hardware has been obtained.
The system design has been finalised and a work plan created to implement it in the
coming semester.

Contents

1 Introduction
1.1 Aims and Objectives
1.2 Project Organisation
1.3 Report Structure

2 Applications
2.1 Microwave Cavity Resonator
2.2 Air Pollution Sensor (Inner-city)
2.3 Remote Volcano Sensing
2.4 Camera Surveillance

3 Background
3.1 How a Quadcopter Works
3.2 Proportional Integral Derivative Control
3.3 Communication System
3.3.1 WiFi
3.3.2 GPRS
3.4 Object Detection
3.4.1 Sensing
3.4.2 Imaging

4 Design Methodology
4.1 System Overview
4.2 Navigation and Autonomous Control System
4.2.1 Requirements 1 and 3 - Send and Receive Commands and Flight Data to the Communication System
4.2.2 Requirement 2 - Receive and Pre-Process Data
4.2.3 Requirement 4 - Process Available Data and Decide on Movements Required to Control the MikroKopter
4.2.4 Requirement 5 - Disengage Manual Control and Control Quadcopter in Autonomous Mode
4.2.5 Requirement 6 - Monitor Current Situation and Prepare for Autonomous Control
4.3 Communications
4.4 Object Detection

5 Equipment
5.1 Flight Control
5.2 Navigation System Equipment
5.2.1 Central Processing Unit
5.2.2 GPS Module
5.2.3 Magnetometer
5.3 Communications System
5.3.1 Freescale Coldfire Microcontroller
5.3.2 Fargo Maestro 100 GPRS Modems
5.3.3 S103 WiFi Module
5.4 Surveillance Module and Compression
5.5 Object Detection

6 Progress to date
6.1 Construction and Test Flights
6.2 Selection of Navigation Equipment and Testing
6.2.1 Component Testing
6.3 Communications
6.4 Object Detection

7 Future Work and Goals
7.1 Navigation System
7.2 Communication System
7.3 Camera Surveillance
7.4 Object Detection

8 Conclusion

A Risk Assessment

B Gantt Chart

List of Figures

1 quadcopter with mounted camera[2]
2 quadcopter showing direction of each propeller[3]
3 Thrust comparison of 10” and 12” propellers
4 PID controller[8]
5 Illustration of using infrared or ultrasound to detect the presence of objects[11]
6 RoboRealm example, showing the original image (left) and the edge detected image (right)[12]
7 RoboRealm example, showing the side-filled image (left) and the eroded image (right)[12]
8 RoboRealm example, showing the final image overlaid with the original[12]
9 System Overview
10 Basic Navigation System Connections
11 Navigation Processor Data Inputs and Outputs
12 The three levels of control: go to command, object avoidance and basic flight stabilisation
13 Control loops used to achieve autonomous flight: (a) yaw controls the direction of the vehicle, (b) pitch controls the horizontal movement, (c) throttle controls the height of the vehicle
14 Transmission Structure of SUAVE communications
15 Reception Structure of SUAVE communications
16 Freescale MCF528x features[28]
17 Freescale Coldfire Microcontroller and Development board
18 Fargo Maestro 100 GPRS modems[29]
19 S103 WiFi module[31]
20 Polar response of a Parallax Ping Sensor, tested with a 12” square piece of white card[33]
21 MikroKopter kit before construction
22 Fully constructed MikroKopter
23 Magnetometer test rig schematic
24 Circuit diagram of a Sharp Distance Measuring Sensor in use
25 Screen captures of the output pin of the Sharp Distance Sensor

1 Introduction
Autonomous vehicles create substantial and varied engineering challenges. With ever-increasing
processing power and the reduction in size, weight and cost of electronic devices,
this challenge can now be extended into the sky.

Designing and constructing an autonomous aerial vehicle is a substantial challenge
and requires knowledge of several areas, both mechanical and electrical. Expertise in
digital communications, control engineering, embedded system design and integration,
power economy and data processing are all required. This multidisciplinary aspect and
the novelty of flight appealed to all of the group members. A quad rotor helicopter
provides a stable and agile platform for surveillance. It avoids some of the complexities
involved with maintaining stable flight compared to a standard helicopter making it an
excellent choice for an initial autonomous aerial vehicle project.

1.1 Aims and Objectives


The aim of the project is to design and construct a Semi-autonomous Unmanned Aerial
Vehicle (SUAVE) for observation and surveillance.

The platform is based on a quad-rotor design which offers superior stability, ease of
control and payload capacity compared to a standard helicopter. A quad-rotor design is
also extremely agile and is free to move in any direction, making it ideal for surveillance
and observation.

A navigation system will incorporate basic object avoidance to allow safe autonomous
flight between predetermined points, and wireless communications will facilitate the
transfer of control commands and sensor data between the vehicle and a base station.

The following objectives have been identified for this project:

• Construct a quad rotor helicopter and achieve manually controlled stable flight of
the vehicle using radio frequency (RF) control.

• Create a control system capable of autonomous hover.

• Create a system to facilitate wireless bi-directional data communication and control.

• Integrate a basic object detection system with an improved control system to avoid
collisions.

• Incorporate a navigation system to allow autonomous flight and navigation.

• Integrate a surveillance device and pre-process the data for transmission over the
communications channel.

1.2 Project Organisation


The project is divided into four manageable sections. Each group member is responsible
for one section, although substantial overlap exists between them. The sections were
assigned according to each member's interests. A Gantt chart (Appendix B) was created
containing the parallel work streams to provide a schedule for the group to follow. To keep
all group members and the project mentor informed of progress, formal weekly meetings
are scheduled and an online record of the group's progress, in the form of a blog, is kept
at http://project.rawxing.org

1.3 Report Structure


The aim of this report is to inform the reader of the current progress as of 18th December
2008, provide an overview of the design to be implemented and highlight the future work
required to meet the project objectives.

Section 2 details the possible applications of an autonomous quadrotor helicopter and
the chosen application for the SUAVE project.

Section 3 provides background information to aid the reader in understanding how a
quadrotor helicopter operates, as well as providing an overview of the technologies which
will be referred to in this report.

Section 4 gives an overview of the current system design which will be implemented.
It is divided into the following sections: Navigation and Autonomous Control, Communications,
Object Detection System and Camera and Video Compression. Each section
will discuss the architecture of the individual system, the connections to other systems
and the equipment selected.

Section 5 details the equipment selected for each subsystem. Section 6 highlights the
work accomplished to date in all sections and Section 7 discusses future work and goals
and concludes the report.

Appendix A contains the specific risk assessment for this project as of 18th December
2008.

2 Applications
A number of options were considered as a possible final application for the quadcopter.
Below is a list of some of the applications investigated and the reasons why they were
eliminated.

2.1 Microwave Cavity Resonator


An idea put forward by Dr Ian Glover, a Strathclyde lecturer, involved testing the
reflectivity of air contained in the boundary layer, the first few hundred metres above
the earth's surface. The reflectivity of air affects the transmission of radio waves. The
proposed method of measurement involved building a microwave cavity resonator. The
resonator chamber would allow air to pass through it and, by testing the capacitance of
the air, the reflectivity could be determined. The rotating propellers posed a problem
because they would force air past the cavity, affecting the measurement. A different
method was to actively monitor the temperature and humidity of the surrounding air
and perform a calculation using the two variables. The problem with this is obtaining
an instantaneous and accurate measurement of the humidity. Once again the rotating
blades would affect the measurements.

2.2 Air Pollution Sensor (Inner-city)


It was thought that the vehicle could be used to navigate a pre-determined flight path
and measure several air pollutant levels en route. The resulting measurements could then
be used as an indication of the air pollution in a particular area of a city. However, after
some correspondence with the Technical Officer at the Land and Environmental Services
of Glasgow City Council, the idea was abandoned. The main reasons came down to a
problem with the measurement sensors: they were either large and bulky, or a passive
measurement was needed. A passive measurement would once again be affected by the
moving vehicle and is usually taken over a longer period.

2.3 Remote Volcano Sensing


Andrew McGonigle, a Scottish scientist working for the University of Sheffield, has
developed a novel means of determining whether or not a volcano is due to erupt[1]. It
involves sending a Radio Controlled (RC) helicopter into the crater of a volcano and
measuring the level of various gases. This information can then be processed once the
helicopter returns. We had hoped to develop an autonomous version of this capable
of transmitting the data back for real-time processing. Once again the sensors were
the main reason this idea was abandoned: they were found to be too expensive and
there was nowhere to test the vehicle, i.e. a local active volcano.

2.4 Camera Surveillance


Possibly the most obvious use for an aerial platform is as a mount for a camera,
so that it may be used as a mobile surveillance device, as shown in Figure 1. A decision
was made to build the quadcopter for use as a platform for a camera which would be
connected via a standard serial connection such as RS232. The standardised connection
could then be used to attach different devices and essentially enable the quadcopter to
be used for a multitude of applications.

Figure 1: quadcopter with mounted camera[2]

3 Background
3.1 How a Quadcopter Works
A quad-rotor helicopter avoids much of the complexity needed to fly a standard helicopter
by eliminating the need for variable pitch rotor blades and a thrust stabilising tail rotor.
Instead it uses four identical propellers which are evenly spaced; they each produce an
equal amount of lift and torque with each pair of opposing propellers being spun in the
same direction. This gives us a pair of clockwise and anti-clockwise propellers. This is
shown in Figure 2 below. By varying the thrust produced by the propellers, movement of
the helicopter in any direction is possible. One of the clockwise rotating arms is defined
as being the front of the vehicle and all movement is in relation to it.

Figure 2: quadcopter showing direction of each propeller[3]

As all the propellers are spinning at the same angular velocity, a yaw-stabilising tail
rotor like the one found on a standard helicopter is not needed. Yaw is achieved by
increasing the torque produced by one pair of rotating motors (either the clockwise pair
or the anti-clockwise pair) and decreasing the torque produced by the other pair. Similarly,
the pitch of the rotors need not be altered. Instead, movements in the pitch (forward/backward)
and roll (left/right) axes can be achieved separately, simply by altering the thrust
produced by a pair of motors, and all this can be done without affecting the yaw of the
vehicle. The front and rear propellers control the pitch of the vehicle while in flight and
the right and left propellers control the roll of the vehicle.
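As a rough illustration of this scheme, the four motor demands can be derived from throttle, yaw, pitch and roll commands with a simple mixing function. This is a sketch only: the sign conventions and the assumption that the front/rear arms form the clockwise pair are ours, not the MikroKopter firmware's actual mixer.

```c
/* Hypothetical motor mixing for a "+" configuration quadcopter.
 * Assumptions: front/rear spin clockwise, left/right anti-clockwise;
 * positive pitch tilts the nose down, positive yaw adds clockwise torque. */
typedef struct { int front, rear, left, right; } MotorOutputs;

MotorOutputs mix(int throttle, int pitch, int roll, int yaw)
{
    MotorOutputs m;
    m.front = throttle - pitch + yaw;  /* clockwise pair carries +yaw */
    m.rear  = throttle + pitch + yaw;
    m.left  = throttle + roll  - yaw;  /* anti-clockwise pair carries -yaw */
    m.right = throttle - roll  - yaw;
    return m;
}
```

Note that the yaw terms cancel across the two pairs, so a yaw command leaves the total thrust, and therefore the height of the vehicle, unchanged.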

The Flight Control (FC) board is the main controlling board on the quadcopter; all
the necessary sensors and microprocessors used to achieve and maintain stable flight are
located on this board. The gyroscopes (gyros, also known as rotational speed sensors) are
considered to be the most important of the onboard sensors. The programmed microcontroller
uses their measurements to compensate for external influences such as wind. The
X, Y and Z axes are each assigned their own gyro, which measures the degrees per second
the vehicle turns in that plane.

Acceleration is measured using three accelerometers; once again each
axis is assigned its own sensor. It is possible to fly the vehicle without these
accelerometers; however, it would not be possible for the quadcopter to return to a level
state by itself. By including these sensors it is possible for the pilot to release the remote
control (RC) sticks and the vehicle will stabilise itself. The final sensor is a barometer,
which allows the vehicle to maintain a specific flying height.

Due to the nature of the project, a strict power-to-weight budget was designed.
From initial estimations, the maximum payload for the vehicle would be about 400 grams.
The motors used in RC vehicles vary greatly, so it was desirable to find a light motor with
low power consumption. The main requirement was that it provided 1000 rpm V⁻¹. A
brushless motor fulfils these specifications. Brushless motors are more complicated to
operate because they need to change the DC supply voltage of the battery into a phased AC
(usually three-phase) with a controlled power output, so that the speed of the motors can
be accurately controlled. In total there are four Brushless Controllers (BL-Ctrl), one for
each motor. There are numerous BL-Ctrls on the market. To achieve a very stable level
of flight it is desirable to change the throttle value very rapidly (in under 0.5 ms); this is
best done using an Inter-Integrated Circuit (I²C) bus interface.

Control of the vehicle is achieved through the use of an RF transmitter and receiver pair. A
transmitter with a minimum of four channels is required to control the quadcopter in the
various planes, and it must use Pulse Position Modulation (PPM). The receiver demodulates
the instructions given by the pilot and converts them into electrical signals. A receiver
with a summing signal is required for the MikroKopter (MK). This signal contains all of
the channels sent by the transmitter and is made available to the FC for processing.

The design of the frame is limited by only one factor: it must be as light as possible.
The frame designed took on a basic cross shape similar to many open-source projects[4].
Bearing in mind that the frame must be lightweight, the materials considered
include aluminium, carbon fibre and polystyrene; it is also possible to use a combination of
these. The most cost-effective material for this project is aluminium because it is readily
available and cheap.

The batteries needed to power all the onboard electronics must have a low weight-to-power
ratio so that maximum flight time can be achieved. The types of battery
considered suitable for use in RC vehicles are Lithium polymer (LiPo), Nickel cadmium
(NiCd) and Nickel metal hydride (NiMH). LiPo types are the most advanced
and offer the most power. They are volatile and must be handled with care, but they
can have a capacity of over 10,000 mAh. NiMH batteries are a slightly older technology
but are still capable of capacities around 5000 mAh; however, if a poor-quality battery
is purchased, bad cells can develop quickly at higher capacities. NiMH batteries are
cheaper than LiPo batteries and are also less volatile. NiCd batteries are the cheapest
overall and have similar characteristics to NiMH but are unable to achieve an equal
capacity. A decent flying time is desired to test the software and hardware which will
be added, and due to the powerful motors and large propellers, the best weight-to-power
ratio with a high capacity can be achieved using LiPo batteries.

The propellers play a major role in the take-off weight and the speed at which the vehicle
will fly. The diameter of the propellers as well as their pitch must be considered. Due
to the slow-spinning motors, a low pitch is desired. After using a thrust calculator it was
determined that a good combination would be 12” propellers with 4.5” pitch. A snapshot
of the thrust calculator at 4000 rpm for 10” and 12” propellers is shown below in Figure 3.
The effect of increasing the blade diameter by 2” is to double the static thrust developed.

3.2 Proportional Integral Derivative Control


Proportional Integral Derivative (PID) control is a common method used to control
processes and robotic systems[5]. It allows automatic control of an output, depending on
the current output and various inputs. It consists of three individual controllers combined
together: Proportional, Integral and Derivative, as shown in Figure 4.

Figure 3: Thrust comparison of 10” and 12” propellers

Figure 4: PID controller[8]

As shown in Figure 4, the error signal is the desired input (the setpoint) minus the
current output. This error signal is processed by the PID controller and the outputs of
each element are combined to create the output which controls the process.

Proportional control can be thought of as a gain: the error signal is multiplied by
the gain Kp. It influences the amount the input has to change to cause a variation of
0-100% in the output[6].

Integral control uses feedback (most commonly positive) to reset the controller[7].
This is achieved by integrating the error signal and multiplying it by the gain Ki. This
has the effect of controlling the output so that it stabilises at a constant steady state.

Derivative control differentiates the error signal and multiplies it by the gain Kd.
This influences the response to the rate of change of the error signal.

Combining these three elements creates an effective controller which can be implemented
in software.
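A minimal discrete-time version of such a controller might look like the following. This is an illustrative sketch, not the MikroKopter flight-control code; the gains and the loop period dt are placeholders to be tuned per axis.

```c
/* Minimal discrete PID controller sketch.
 * dt is the control-loop period in seconds. */
typedef struct {
    double kp, ki, kd;   /* the three gains */
    double integral;     /* accumulated error */
    double prev_error;   /* error from the previous step */
} Pid;

double pid_update(Pid *p, double setpoint, double measured, double dt)
{
    double error = setpoint - measured;          /* input minus current output */
    p->integral += error * dt;                   /* integral term accumulates */
    double derivative = (error - p->prev_error) / dt;
    p->prev_error = error;
    return p->kp * error + p->ki * p->integral + p->kd * derivative;
}
```

Each control axis (yaw, pitch, roll, height) would run its own instance of this loop with its own gains.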

3.3 Communication System


3.3.1 WiFi

The WiFi version that will be implemented is 802.11b, which has the following characteristics[9]:

• Release Date - October 1999

• Frequency band - 2.4 GHz

• Typical Throughput - 4.5 Mbps

• Bit Rate - 11 Mbps

• Range (Indoor) - ∼38 m

These characteristics make it ideally suited to our application: short-range communications
that do not require extremely large data rates. This version of WiFi is very
commonly used as a wireless LAN technology.

The main issue with 802.11b is that other devices also operate in the 2.4 GHz
frequency band, including microwave ovens, Bluetooth devices, baby monitors and
cordless phones. When these devices operate in the same region as 802.11b equipment
they cause interference in the WiFi signal.

3.3.2 GPRS

Whilst GSM is excellent for communications networks that involve the transmission and
reception of voice calls, it is not designed for sending or receiving data, as its data rate
is not high enough for data transmission.

GPRS (General Packet Radio Service) improves the performance of GSM to allow
it to send and receive data. GPRS is one of the communications technologies classed
as 2.5G, which are designed to move the performance of 2G networks closer to that of
3G networks. The bit rate of GPRS for a pair of Class 10 modems is 16-24 kbps for
uploading and 32-48 kbps for downloading[10].

Because GPRS is part of the standard cellular network, there is currently wide
coverage, especially in urban areas. This means that if a remote device is controlled
over the cellular network, it can be controlled from anywhere with network coverage,
no matter how far away.

3.4 Object Detection


Robots have no innate spatial awareness, and as such need to be constantly
updated with information about their surroundings. When the quadcopter is flying from
Point A to Point B, it is important that it does not collide with any obstacles en route.
There are two potential approaches to Object Detection: Sensing and Imaging.

3.4.1 Sensing

The Sensing approach uses either ultrasound or infrared waves to detect the presence of
objects. A short pulse is emitted from a transmitter and a detector then receives the
reflected signal. Either the time delay or the magnitude of the signal is then used
to calculate the distance to any objects within range of the sensor. This approach, as
illustrated in Figure 5, is the same as that used in the touchless hand driers found in
public toilets.
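For the time-delay case, converting an echo time to a distance is a one-line calculation. A sketch, assuming sound travels at 343 m/s (air at roughly 20 °C; the true value varies with temperature):

```c
/* The pulse travels to the object and back, so the one-way
 * distance is half the round-trip distance. */
double echo_to_distance_m(double echo_time_s)
{
    const double speed_of_sound_m_s = 343.0;  /* assumed, ~20 degrees C air */
    return echo_time_s * speed_of_sound_m_s / 2.0;
}
```

A 10 ms echo therefore corresponds to an object a little over 1.7 m away.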
There are three types of output that can be taken from a distance sensor: Analogue
Range, Pulse Width Modulation (PWM) Range and Distance Switch.

Analogue Range is a variable voltage output (between ground and the supply voltage)
that is dependent on the distance to the object. This can be quantised using an
Analogue-to-Digital Converter (ADC) and, with the aid of a look-up table, converted to
a distance. A PWM Range output works in a similar manner, whereby the pulse width
is varied depending on the distance. The Distance Switch is the most basic of the three:
when an object is within range of the sensor, the output is toggled (either high to low
or vice versa).

Figure 5: Illustration of using infrared or ultrasound to detect the presence of objects[11]
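The Analogue Range output could be turned into a distance along these lines. This is a sketch: the calibration values below are invented for illustration, and a real sensor (which is typically non-linear) needs its own calibration table.

```c
/* Convert a quantised analogue-range ADC reading to a distance using a
 * small calibration look-up table with linear interpolation between
 * entries. Table values are hypothetical, not from any real datasheet. */
typedef struct { int adc; double cm; } CalPoint;

static const CalPoint table[] = {
    { 100, 80.0 }, { 200, 40.0 }, { 400, 20.0 }, { 800, 10.0 }
};
#define NPOINTS (sizeof table / sizeof table[0])

double adc_to_cm(int adc)
{
    /* Clamp readings outside the calibrated range. */
    if (adc <= table[0].adc)           return table[0].cm;
    if (adc >= table[NPOINTS - 1].adc) return table[NPOINTS - 1].cm;

    for (unsigned i = 1; i < NPOINTS; i++) {
        if (adc <= table[i].adc) {
            double t = (double)(adc - table[i - 1].adc)
                     / (table[i].adc - table[i - 1].adc);
            return table[i - 1].cm + t * (table[i].cm - table[i - 1].cm);
        }
    }
    return table[NPOINTS - 1].cm;  /* unreachable */
}
```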

3.4.2 Imaging

The use of a camera allows the quadcopter to "see" the surrounding environment.
RoboRealm has open-source software[12] showing how a simple webcam can be used to
detect objects and calculate a safe route. Firstly, edge detection is applied to an image,
as seen in Figure 6.

Figure 6: RoboRealm example, showing the original image (left) and the edge detected
image (right)[12]

A technique called "side-fill" is used whereby a vertical line is taken from the bottom
of the image and each pixel is filled white until an edge is detected, at which point
the rest of the line is filled black. The white area of the image is then eroded such that
any area left is large enough for the vehicle to pass through. These steps are shown in
Figure 7.

Figure 7: RoboRealm example, showing the side-filled image (left) and the eroded image
(right)[12]

The image is smoothed and the white pixel with the greatest y value is found; it
is this pixel that the vehicle aims for. In this example, as seen in Figure 8, this is at an
x value of 193.

Figure 8: RoboRealm example, showing the final image overlaid with the original[12]
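The core of the side-fill step can be sketched as follows (illustrative only; RoboRealm's pipeline also includes the erosion and smoothing stages, omitted here). For each column, free pixels are counted upward from the bottom of the edge-detected image until an edge is hit, and the vehicle steers toward the column with the tallest free region.

```c
#define IMG_W 8
#define IMG_H 6

/* Given a binary edge image (1 = edge, 0 = free), return the column
 * with the most free space above the bottom of the frame, i.e. the
 * x value the vehicle should aim for. Image sizes here are tiny for
 * illustration; a webcam frame would be e.g. 320x240. */
int side_fill_target(const int edges[IMG_H][IMG_W])
{
    int best_col = 0, best_height = -1;
    for (int x = 0; x < IMG_W; x++) {
        int height = 0;
        /* Scan upward from the bottom row until an edge is detected. */
        for (int y = IMG_H - 1; y >= 0 && !edges[y][x]; y--)
            height++;
        if (height > best_height) {
            best_height = height;
            best_col = x;
        }
    }
    return best_col;
}
```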

4 Design Methodology
This section details the architecture of the overall system, and the design of and
interconnections between internal subsystems.

4.1 System Overview


A diagram of the complete system is shown in Figure 9. The navigation system controls
the actual flight of the quadcopter and passes flight telemetry (current position, heading,
height, battery level, gyroscope and accelerometer values) to the communications system
to send back to the user. The communications system receives control commands to tell
the quadcopter where to go (latitude/longitude/height) from the user and passes them
to the navigation system. It also encodes and transmits sensor data back to the user.
The object detection system will detect objects in the quadcopter's path and inform the
navigation system, which will take the appropriate action.

Figure 9: System Overview

Communication between the systems will be via an I²C bus. The navigation system
will act as the master device and will ask the other two systems for data when required.
This will be discussed further in Section 4.2.2.

The designs of the navigation, communications and object detection systems are detailed
in the sections below.

4.2 Navigation and Autonomous Control System


The navigation system is centred around a microcontroller, hereafter referred to as the
NAVPRO. This is the centre of the whole system, controlling the overall operation of the
Unmanned Aerial Vehicle (UAV) and acting as master to the communication and object
detection systems. The MikroKopter Flight Controller (MKFC) will be used to maintain
stable flight when manually and autonomously controlled.

To meet the requirements specified in Section 1.1, a Global Positioning System (GPS)
module and magnetometer are required for autonomous flight. The GPS module allows the
quadcopter to be aware of its current location (by providing a latitude and longitude)
and the magnetometer allows the quadcopter to be aware of its current orientation, so
that it knows in which direction to start moving.

A diagram of the navigation system and internal and external connections is shown
in Figure 10.

Figure 10: Basic Navigation System Connections

The primary requirements of the navigation processor are:

R1 - Receive instructions from the communications system to determine the mode of
operation (manual flight or autonomous flight).

R2 - Receive and pre-process the following data:

• Current position from the GPS module.
• Current orientation from the magnetometer.
• Object alerts from the object detection system.
• Current height, fuel (battery level), and pitch, yaw and roll accelerometer and
integral data from the MKFC.

R3 - Send the current flight telemetry (as specified in R2) to the communication
system.

R4 - Process the available data as specified in R2 to decide on the movements required
to control the MikroKopter.

R5 - Disengage manual control and send control commands to the MikroKopter FC
when in autonomous mode.

R6 - Monitor the current situation and prepare for autonomous control when in manual
mode.

Each requirement also requires a specific hardware and software design, detailed below.
The flow of data to and from the NAVPRO is shown in Figure 11.

Figure 11: Navigation Processor Data Inputs and Outputs

4.2.1 Requirements 1 and 3 - Send and Receive Commands and Flight Data
to the Communication System

Communication between the Communications Processor (COMPRO) and the Navigation
Processor (NAVPRO) will be via the I²C bus. There is also an option to use the second
asynchronous serial port on the NAVPRO if the loading on the I²C bus becomes too
high, but the I²C bus is the preferred option.

4.2.2 Requirement 2 - Receive and Pre-Process Data

Three separate communication methods will be used to allow the NAVPRO to obtain all
data required for autonomous flight, as shown in Figure 11. All the current data values
will be stored in RAM for use in fulfilling Requirement 4.

As the I²C bus will be used to allow four devices to communicate, timing will be very
important. The COMPRO will act as the master device and will request data from each
device in a predefined sequence at a regular interval. This will ensure the NAVPRO has
all the data required to make a decision and fly the quadcopter when required.

GPS-320FW GPS Module

The GPS-320FW module utilises an asynchronous serial connection to send and receive
data. Therefore one of the NAVPRO SCI modules will be used to obtain positional
data and initialise the GPS module when needed. The current position will be updated
at 1 Hz, which is the maximum rate supported by the GPS-320FW. The serial interface
protocol is based on the National Marine Electronics Association's NMEA 0183 ASCII
interface specification, which consists of strings of ASCII characters containing all
relevant positional data. The NAVPRO will decode these messages and extract the
current position from them.
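As an illustration of the decoding step, the sketch below extracts the latitude and longitude from a $GPGGA sentence (one of the standard NMEA 0183 fix messages). The field positions follow the NMEA specification, but the function names and buffer size are illustrative choices, not final firmware.

```c
#include <stdlib.h>
#include <string.h>

/* Convert an NMEA coordinate field (ddmm.mmmm) plus hemisphere
 * character into signed decimal degrees. */
static double nmea_to_degrees(const char *field, char hemi)
{
    double raw = atof(field);            /* e.g. 4807.038            */
    int deg = (int)(raw / 100.0);        /* 48                       */
    double minutes = raw - deg * 100.0;  /* 7.038                    */
    double result = deg + minutes / 60.0;
    return (hemi == 'S' || hemi == 'W') ? -result : result;
}

/* Extract latitude and longitude from a $GPGGA sentence.
 * Returns 0 on success, -1 if the sentence is not a GGA message. */
int parse_gpgga(const char *sentence, double *lat, double *lon)
{
    char buf[128];
    char *fields[16];
    int n = 0;

    if (strncmp(sentence, "$GPGGA,", 7) != 0)
        return -1;

    strncpy(buf, sentence, sizeof buf - 1);
    buf[sizeof buf - 1] = '\0';

    /* Split on commas; walk manually rather than use strtok, which
     * would silently skip the empty fields real receivers emit. */
    char *p = buf;
    while (n < 16 && p) {
        fields[n++] = p;
        p = strchr(p, ',');
        if (p) *p++ = '\0';
    }
    if (n < 6)
        return -1;

    *lat = nmea_to_degrees(fields[2], fields[3][0]);
    *lon = nmea_to_degrees(fields[4], fields[5][0]);
    return 0;
}
```

The same splitting approach extends to the other NMEA sentence types if they are needed later.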

CMPS-03 Magnetometer

The CMPS-03 magnetometer will be interfaced via the I²C bus due to its ease of use
and the low refresh rate required.
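To illustrate how little processing the compass data needs, the sketch below combines the two CMPS-03 bearing registers (registers 2 and 3 hold the bearing as 0–3599, in tenths of a degree, per the datasheet) and computes the signed heading error a yaw controller would consume. The function names are illustrative.

```c
/* Combine the CMPS-03 high/low bearing registers (registers 2 and 3,
 * value 0-3599 in tenths of a degree) into degrees from north. */
double cmps03_bearing_deg(unsigned char high, unsigned char low)
{
    unsigned int tenths = ((unsigned int)high << 8) | low;
    return tenths / 10.0;
}

/* Signed shortest-path error between a desired and current heading,
 * in the range (-180, 180]; suitable as the input to a yaw loop. */
double heading_error_deg(double target, double current)
{
    double err = target - current;
    while (err > 180.0)   err -= 360.0;
    while (err <= -180.0) err += 360.0;
    return err;
}
```

Wrapping the error this way means the craft always turns through the smaller angle, e.g. from 350° to 10° it yaws +20° rather than -340°.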

Object Detection System

The Object Detection System (ODS) will communicate with the NAVPRO via the I²C bus.
It will send data with required actions (e.g. change in direction, stop) to the
NAVPRO. The NAVPRO will send commands to request data from the object detection
system and to manage the I²C bus. A relatively low data transfer rate, roughly 6 times
per second, will be required to communicate with the ODS.

MikroKopter Flight Controller

Communication with the MikroKopter FC will use the SPI interface. This will
allow the NAVPRO to send control commands to the MikroKopter flight controller and
receive the current battery level, height, nick (pitch), roll and yaw accelerometer
values and gyroscope positions. These values are already sent by the MKFC for
debugging using the MikroKopter tool; the same values will be sent to the COMPRO.

The MikroKopter flight controller software will be modified to output the flight data
and receive control commands over the SPI interface. The control commands will mimic
those sent via the RF transmitter, i.e. provide a value for pitch, yaw, thrust and roll.
The software will also contain a function to either use control commands from the PPM
receiver or disable that input and use the NAVPRO.
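The shape of such a command is sketched below as a hypothetical SPI frame with the four stick channels and a simple checksum. The field layout, sync byte and checksum are placeholders for illustration, not the actual MikroKopter protocol.

```c
#include <stdint.h>

/* Hypothetical SPI command frame mimicking the four RF stick
 * channels; the layout is illustrative only. */
typedef struct {
    int8_t pitch;    /* -128..127, like a stick deflection */
    int8_t roll;
    int8_t yaw;
    uint8_t thrust;  /* 0..255 */
} stick_cmd_t;

/* Serialise a command into a 6-byte frame: sync byte, four channel
 * bytes, and an XOR checksum so the MKFC can reject corrupted
 * transfers. */
void encode_stick_frame(const stick_cmd_t *cmd, uint8_t frame[6])
{
    frame[0] = 0xA5;                 /* sync/start marker */
    frame[1] = (uint8_t)cmd->pitch;
    frame[2] = (uint8_t)cmd->roll;
    frame[3] = (uint8_t)cmd->yaw;
    frame[4] = cmd->thrust;
    frame[5] = frame[0] ^ frame[1] ^ frame[2] ^ frame[3] ^ frame[4];
}

/* Validate a received frame; returns 1 if the checksum matches. */
int stick_frame_ok(const uint8_t frame[6])
{
    return frame[5] == (frame[0] ^ frame[1] ^ frame[2] ^ frame[3] ^ frame[4]);
}
```

Even a one-byte checksum like this lets the receiving side drop a frame corrupted by SPI noise rather than act on it.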

Communications System

The NAVPRO will receive commands from the COMPRO over the I²C bus. These
commands will be sent periodically.

4.2.3 Requirement 4 - Process Available Data and Decide on Movements
Required to Control the MikroKopter

The NAVPRO will use all the available data to decide on the best course of action to
control the MikroKopter. There are three levels of control required, as shown in Figure 12.

The highest level is the go to command. This command will be received by the
NAVPRO from the COMPRO and gives the ultimate destination of the quadcopter. It
is made up of three parts: GPS latitude, GPS longitude and height. It will be stored
in a memory location and compared with the current GPS latitude, longitude, height
and orientation to decide on what action is required.

The next level is object avoidance. The NAVPRO must make sure there is no
threat to the quadcopter in its current situation while meeting the go to command; if
there is, it must act to stop the quadcopter.

Figure 12: The three levels of control: go to command, object avoidance and basic flight
stabilisation

The lowest level is the basic flight stabilisation. This underpins the two layers above
and is used to carry out the go to command and satisfy the object avoidance. Initially
all autonomous movements of the quadcopter will be made by rotating and pitching the
vehicle. A set of control loops based on PID controllers (Section 1.2) will be used to
control the pitch, yaw and height of the craft by providing pseudo control values to the
MKFC. The MKFC will stabilise the roll of the vehicle. A block diagram of the control
loops is shown in Figure 13. The ‘system’ block refers to the physical movement of the
aircraft. There may be some options to use the control loops on the MKFC to help
implement this requirement. This will be investigated further.
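A minimal form of the PID update used in each loop is sketched below; the struct name, gains and anti-windup clamp are placeholders to be tuned against the real craft, not values from the project.

```c
/* Minimal PID controller of the kind described above. */
typedef struct {
    double kp, ki, kd;
    double integral;
    double prev_error;
    double i_limit;    /* anti-windup clamp on the integral term */
} pid_ctrl_t;

/* One control-loop step: returns the actuator demand (e.g. a pseudo
 * stick value for the MKFC) given the setpoint, the measurement and
 * the elapsed time dt in seconds. */
double pid_update(pid_ctrl_t *pid, double setpoint, double measured, double dt)
{
    double error = setpoint - measured;

    pid->integral += error * dt;
    if (pid->integral >  pid->i_limit) pid->integral =  pid->i_limit;
    if (pid->integral < -pid->i_limit) pid->integral = -pid->i_limit;

    double derivative = (error - pid->prev_error) / dt;
    pid->prev_error = error;

    return pid->kp * error + pid->ki * pid->integral + pid->kd * derivative;
}
```

One instance of this struct per controlled quantity (yaw, pitch, throttle) matches the three loops in Figure 13.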

4.2.4 Requirement 5 - Disengage Manual Control and Control the Quadcopter
in Autonomous Mode

The MKFC has two sets of input command variables for thrust, yaw, pitch and roll:
‘externalStick’ and ‘stick’. The ‘externalStick’ set of variables will be used for autonomous
commands and will be set to the values sent over the SPI channel from the NAVPRO.

There will be two methods of engaging autonomous control, through the manual RF
transmitter or via the advanced communications link (see Section 4.3). The software
on the MKFC will be modified to let one of the RF PPM channels toggle autonomous
mode and select whether to use primary or secondary control commands. The MKFC
will then communicate to the NAVPRO which mode it is in. An alternative will be
via the COMPRO. When the COMPRO sends a signal to engage autonomous mode the
NAVPRO will contact the MKFC to disengage manual control, hold position and wait
for further instructions.

Figure 13: Control loops used to achieve autonomous flight. (a) yaw: controls the
direction of the vehicle. (b) pitch: controls the horizontal movement. (c) throttle:
controls the height of the vehicle

4.2.5 Requirement 6 - Monitor Current Situation and Prepare for Autonomous
Control

Requirement 5 demands that autonomous mode must be able to be switched on at any
time; therefore the NAVPRO will carry out its calculations to maintain the current
position at all times, while leaving the MKFC in control via RF.

4.3 Communications
The design of the communications section of the project is in the form shown in
Figures 14 and 15.

Figure 14: Transmission Structure of SUAVE communications

Figure 15: Reception Structure of SUAVE communications

The methodology is such that the data captured by the camera is passed to the
microcontroller over its serial output, and this data is then stored within the
microcontroller. A large amount of RAM is required to store the data from the camera
due to the resolution and the number of frames created by the camera. Once the data
has been compressed, it is output through another serial connection to the GPRS modem
or WiFi module for wireless transmission.

When navigation data is sent from the base station, the quadcopter has to
receive this information and move appropriately. The data received will be a
series of bytes that make up a packet; this packet will then be broken up to obtain the
navigation data. The navigation data will then be transmitted over the I²C bus to
another Coldfire board and processed to determine what movements the user requests
from the quadcopter.
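A possible decoding of such a packet is sketched below. The layout (a command byte, latitude and longitude as little-endian 32-bit fixed-point values in degrees × 10⁷, and a 16-bit height in centimetres) is entirely hypothetical, since the packet format has not yet been fixed.

```c
#include <stdint.h>

/* Decoded navigation command; the field set mirrors the go to
 * command described in Section 4.2.3. */
typedef struct {
    uint8_t cmd;
    double  lat_deg;
    double  lon_deg;
    double  height_m;
} nav_cmd_t;

/* Read a little-endian signed 32-bit value from a byte stream. */
static int32_t read_i32_le(const uint8_t *p)
{
    return (int32_t)((uint32_t)p[0] | ((uint32_t)p[1] << 8) |
                     ((uint32_t)p[2] << 16) | ((uint32_t)p[3] << 24));
}

/* Break an 11-byte packet into its navigation fields. */
void decode_nav_packet(const uint8_t pkt[11], nav_cmd_t *out)
{
    out->cmd      = pkt[0];
    out->lat_deg  = read_i32_le(&pkt[1]) / 1e7;  /* degrees * 1e7  */
    out->lon_deg  = read_i32_le(&pkt[5]) / 1e7;
    out->height_m = ((uint16_t)pkt[9] | ((uint16_t)pkt[10] << 8)) / 100.0;
}
```

Fixed-point degrees avoid sending floating-point values over the air while preserving roughly centimetre-level resolution.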

4.4 Object Detection
The question arises as to how often objects must be searched for and how often the results
of the searches are returned to the Navigation module. Consideration must be given to
the maximum speed and stopping distance of the quadcopter, and the range of the
sensors. For example, if the quadcopter is flying at 5 m/s and the sensors have a range
of 1 m, then searching must be performed at least 5 times a second in the direction of
travel. Alternatively, continuous searching may be performed and the results
communicated to the Navigation module at least 5 times a second. These calculations take
no account of the stopping distance, so the refresh rate may in fact have to be higher.
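The rate calculation above can be captured in a small helper that also accounts for the stopping distance; the function below is an illustrative sketch, not project code.

```c
/* Minimum sensing rate (searches per second) for a given flight
 * speed, sensor range and stopping distance: the sensor must fire
 * at least once within the portion of its range that still leaves
 * room to stop. With 5 m/s, 1 m range and zero stopping distance
 * this reproduces the 5 Hz figure above. */
double min_search_rate_hz(double speed_ms, double range_m, double stop_dist_m)
{
    double usable = range_m - stop_dist_m;
    if (usable <= 0.0)
        return -1.0;  /* cannot stop in time at any search rate */
    return speed_ms / usable;
}
```

A 0.5 m stopping distance halves the usable range and therefore doubles the required rate to 10 Hz.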

If the decision is made to communicate only when objects are detected, the problem of
a busy communications bus (assuming a shared bus – the magnetometer uses I²C and the
GPS module uses an asynchronous serial link) may delay the transmission. This delay
could prove disastrous for the vehicle. It is due to this problem that continuous searching
will be used for this application and that the Navigation module will periodically contact
the Object Detection System asking for a status update.

As mentioned in Section 3.4, the infrared and ultrasound sensors can provide variable
outputs (be that voltage or pulse width); however, finding a microcontroller with the
capability of multiple ADCs and PWM captures would prove problematic. To overcome
this, one pin could be used as an input, i.e. either a PWM capture pin or a serial
connection to an ADC, and each sensor then toggled using separate pins. This has the
advantage of allowing a microcontroller with a lower specification, but removes the
ability to have continuous sensing in all directions. If, however, distance-switching
sensors are used, single pins can be used.

5 Equipment
5.1 Flight Control
As the flight controller is key to the ability to perform stable, autonomous flight, a
substantial amount of time was spent deciding whether to build a flight controller or
buy an off-the-shelf model.

The advantage of building a flight controller is the flexibility to meet the exact
specifications of the project. The disadvantage is the substantial amount of time
required to design and construct the flight controller, which could instead be spent on
creating advanced navigation functionality. For this reason, several off-the-shelf flight
controllers were considered using the following factors:

• Ease of integration and modification: number and types of input-output ports,
reprogrammability, open source.

• Performance: flight history and community opinions, stability etc.

• Cost

Several contenders were considered:

Wolferl 3.1 UAVP - Universal Aerial Video Platform

The UAVP [13] is an international open source project to create a reasonably priced
quadcopter. Its flight controller is called Wolferl. Wolferl utilises three gyros, a 3-axis
linear accelerometer, a compass sensor and a barometric sensor. The addition of
a compass sensor is an advantage over competing flight controllers, but there are some
disadvantages:

• A populated and tested “Universal Aerial Video Platform” V3.1x flight controller
board can be purchased for £265 [14], which is expensive.

• An 8-bit PIC16F876 is the processor for the unit and is already heavily utilised,
which would make it difficult to add functionality to the flight controller.

These disadvantages made it unsuitable for the project.

Paparazzi

Paparazzi is an open source hardware and software platform for autonomous aircraft [15].
It is designed for aeroplanes, but there is evidence that it has been adapted for a
quadcopter [16]. The Paparazzi control board is called Tiny [17] and incorporates a GPS
module and a powerful LPC2148 microcontroller. It is available preassembled with a GPS
module, but no gyros, accelerometers or barometric sensors (IMU), for £160 [18]. To use
the Tiny, an IMU would be required, which is likely to raise the total price to an
unacceptable level. The GPS integration is advantageous, but there is little example
software for quadcopters available, which means the software would have to be written
from scratch. The high price and the lack of skeleton software made Paparazzi
unattractive.

MikroKopter

The MikroKopter flight controller (MKFC) [19] is an open source flight controller
which is part of the MikroKopter project. The MikroKopter project includes frame
designs, a flight controller, electronic speed controllers, a magnetometer and a GPS-based
navigation board, although the latter two are not open source. The MKFC was selected
for the SUAVE project for the following reasons:

• The MKFC is an open source software project which allows the code to be changed
and adapted to suit the needs of this project.

• A high quality software application is available (Flight Tool) for debugging, testing
and changing the flight parameters and settings.

• A strong and enthusiastic community exists [20] which helped ensure the MKFC
met our specifications.

• The MKFC has several synchronous and asynchronous input-output ports which
allow interfacing with external hardware.

• The MKFC costs £170 [21] which is cheaper than other flight controllers especially
considering the high performance evident from various videos and user comments
on the internet [22].

5.2 Navigation System Equipment
5.2.1 Central Processing Unit

A central processing unit is required to perform the navigation calculations and control
the navigation peripherals as detailed in Section 4.2. The Freescale Coldfire
MCF5282CVF66 microcontroller (MCU) was selected for the following reasons:

• The MCU has a similar specification to the LPC2148 used in the Tiny flight con-
troller which performs autonomous navigation (Section 5.1):

– 32 bit
– 66 MHz clock speed
– 512 kB flash ROM
– 64 kB RAM

• The MCU also has a number of input-output ports which are required for interfacing
with the other systems and peripherals. These are: 1 x SPI (synchronous serial
port), 3 x SCI (asynchronous serial port) and 1 x I²C (Inter-Integrated Circuit) port.

• A development board was available within the department.

5.2.2 GPS Module

Several GPS modules were considered, including those used in other autonomous vehicle
projects such as the LEA-4P [16], but these proved prohibitively expensive despite good
performance. After researching alternatives, the RF Solutions GPS-320FW was selected
[23]. The GPS-320FW was chosen for its excellent specification, integrated ceramic
antenna and asynchronous TTL-level interface despite the low price (£24) [24]. It
features −155 dBm tracking sensitivity, 5 m CEP accuracy (positional error) and a 1 Hz
update rate, which make it suitable for the SUAVE project.

5.2.3 Magnetometer

The CMPS-03 magnetic compass [25] was selected as it was used successfully in
a robotics project last year [26]. It can be interfaced with using I²C or by capturing and
decoding a pulse-width modulated signal, creating flexibility in the system. The module
also preprocesses its data and outputs the current heading as a number of degrees from
north (0.0° to 359.9°), which simplifies the coding required on the navigation processor.

5.3 Communications System
The hardware platforms within the project were chosen with the knowledge that they
would be connected together to work as a whole in the final implementation. Once
connected, the hardware will have the ability to capture video (camera), process the
information (Freescale Coldfire MCF5282CVF66) and transmit it wirelessly (GSM
modem or GPRS modem).

5.3.1 Freescale Coldfire Microcontroller

The choice of processing devices used for the project had to fulfil the requirements of
both the communications and other aspects of the project. The physical requirements
of the communications processing device were significant processing power, a large
flash memory, I²C, two SCI ports, a TCP/IP stack and a real-time operating system [27].
These are the requirements because receiving video from the camera and transmitting it
via the communications hardware needs to be completed in real time with little delay.
The choices were a Freescale Coldfire microcontroller or a Gumstix microcontroller, as
both have high processing power.

The features of this microcontroller can be viewed in Figure 16.

Figure 16: Freescale MCF528x features[28]

The Freescale microcontroller board that will be implemented into the quadcopter
design is a MCF5282CVF66. This is shown in Figure 17.

Figure 17: Freescale Coldfire Microcontroller and Development board

A Freescale Coldfire microcontroller was chosen. Due to the department's strong
relationship with Freescale, these were already stored within the university for use
in student projects. The MCF5282CVF66 also has an excellent learning and prototyping
environment in the form of Freescale CodeWarrior. The choice of microcontroller was
also made quick and easy by the Freescale C & I Microselector tool.

5.3.2 Fargo Maestro-100 GPRS Modems

The requirement of the GPRS modem is that it has the ability to convert the RS232
signal from the microcontroller into a wireless transmission in the required format, and
vice versa. The data rates involved have to be high enough to handle a constant
transmission of video data, the modems should be able to handle RS232 data, and they
should be as small and light as possible.

The GPRS modems that will be used are Fargo Maestro-100s, as these fit the purpose
required for the project and have already been bought by the university. The modems
that will be used are shown in Figure 18.

Figure 18: Fargo Maestro-100 GPRS modems [29]

The Fargo Maestro-100 has the following features[30]:

• Dualband GSM 900/1800 MHz.

• Supports data, SMS, voice and fax.

• Max power output: 2 W (900 MHz), 1 W (1800 MHz).

• GPRS Class B, Class 10 (4Rx+1Tx or 3Rx+2Tx) at maximum speed.

• SIM Toolkit Class 2.

• AT command set.

And the following interfaces:

• SIM Holder.

• 15-pin Sub D connector (serial and audio connection).

• 4-pin power supply connector.

• External Antenna.

5.3.3 S103 WiFi Module

The WiFi module that will be used is an S103 WLAN Compact Serial Module, as this
is small and light and should automatically perform all the conversions required to
change the RS232 output from the microcontroller into a WiFi data transmission and
vice versa. The module is shown in Figure 19.

Figure 19: S103 Wifi module[31]

The S103 WLAN Compact Serial Module has the following features[31]:

• UDP version configurable baud rate (9600, 38400, 57600, 115200 bps).

• TCP version configurable baud rate (9600, 38400, 57600 bps).

• Full-duplex (serial data in and out).

• Hardware flow control.

• Firmware upgrade through RS232 port.

• Easy set-up via console or web page.

• LED status and activity indicators for easy installation, monitoring and diagnostics.

• Link to WLAN Access Point (infrastructure mode).

5.4 Surveillance Module and Compression

Deciding to use a camera introduced an interesting set of problems. Firstly, not only
does the camera need to be light and robust, but it must also have a suitable viewing
range. The biggest challenge will be encoding the video data so that it may be
transmitted back to a base station over a WiFi or GPRS link. The live video quality
must be kept as high as possible, and the latency with which the base station receives
and decodes the data must be as low as possible.

There are numerous cheap cameras and webcams on the market. These are mostly
of the CMOS (Complementary Metal Oxide Semiconductor) and CCD (Charge-Coupled
Device) types. This design helps to keep the cost of these cameras low; however, the
majority only have a very limited ability to capture video from a distance. A second
problem is that the video data must be compressed so that it may be transmitted over
a low-rate data link.

A second approach is to purchase an integrated system which combines video capture
and compression. Such a device is the CAM-VGA 100 module offered by Round
Solutions [32]. It is an integrated serial camera board module that can be attached to a
wireless host and used as a video camera or a JPEG-compressed still camera. It provides
an RS232 serial interface and a JPEG compression engine, acting as a low-cost,
low-power camera module for use with GPRS/GSM modules. The main application of
such a device is in mobile phones or PDAs, and thus the picture quality and viewing
range will be limited.

If a ready-made system such as the one described above were purchased, the data rates
would be constrained to those designed by the manufacturer. The solution which offers
the most flexibility would be to design a camera board module with our own camera
and video codec. This would involve buying a CMOS camera board and implementing
a video codec. There are a number of codecs which fit our purpose, including MPEG-4,
MJPEG, the Strathclyde Compression Transform (SCT), H.263 and H.264. MPEG-4,
H.263 and H.264 are all more complex to implement than MJPEG but offer superior
compression. Not enough is known about the SCT, and some more investigation will
need to be carried out before a final decision is made on which compression technique
will be used.

5.5 Object Detection

When using the sensing approach detailed in Section 3.4, an imaginary protective
barrier could be formed around the quadcopter, and if an object is detected within this
barrier, evasive action is taken. However, this leads to a problem: if the vehicle is
required to navigate a corridor that is narrower than the barrier surrounding the vehicle,
the system would not allow the vehicle to pass and would force it to find a different route.
In this case, using a method that allows the exact distance to be detected would be
advantageous. What needs to be considered is the complexity of the navigation system;
does the system need to adapt its trajectory on-the-fly, or does it stop and re-evaluate
its potential options? Initially it was decided that non-adaptive navigation would be an
easier route to take; if time is sufficient, adaptive navigation will be implemented.

As mentioned in Section 3.4, ultrasound or infrared could be used to sense whether an
object is within range of the quadcopter. Ultrasound detectors appear to need a highly
reflective surface, and also have a poor polar response to flat surfaces, as shown in
Figure 20.

Use of an imaging device is, by default, a far more complex option; however, it can
return much more usable results and can allow the navigation module to be more efficient
in its evasive actions. When an object is encountered and the vehicle starts to rotate to
avoid it, constant information can be passed to the navigation module to inform it when
the object has been cleared. The use of imaging also has significant limitations: one
camera requires a substantial amount of processing power (compared to ultrasound or
infrared sensors), and object detection is limited to one direction, which means the
quadcopter must rotate if it is to see potentially viable routes. It is for these reasons
that distance sensing will be used in all directions.

Figure 20: Polar response of a Parallax Ping Sensor, tested with a 12” square piece of
white card [33]

During development, a Microchip PIC microcontroller will be used for the Object
Detection System to minimise the complexity of having multiple systems running
on one processor. In order for the Object Detection System to communicate with
the Navigation module, the PIC selected must have I²C or RS232 capability. The
microcontroller must also have room for multiple inputs; for these reasons, the
Microchip PIC16F876 was chosen [34]. A PIC microcontroller has an advantage over
a conventional microcontroller development board (like the Freescale Coldfire board
being used in the Communications and Navigation modules) in that it does not have any
unnecessary components (such as DACs and switches) and has a very low power consumption.

6 Progress to date
This section details the work accomplished as of 17th December 2008. The work
accomplished falls into four areas: selection and construction of the quadcopter,
selection and testing of navigation equipment, selection of communications equipment,
and selection and testing of object detection equipment. Each of these is detailed below.

6.1 Construction and Test Flights

The complete MK set was purchased and a kit containing all the components shown in
Figure 21 was received. Building the frame was the first step of construction. The MK
came with preassembled control boards and brushless motor controllers. All the surface
mounted components were already soldered on.

Figure 21: MikroKopter kit before construction

However, several bulkier pieces such as electrolytic capacitors, switches, the barometric
sensor and the header connectors still required soldering. The wiring assembly used
to carry the power and control signals was prepared. All the parts were then connected
together and a successful dry test of the electronics was undertaken. All the separate
components were then brought together and placed on the frame. The propellers were
mounted and a pre-flight check was carried out. Using the MK Tool, the RC transmitter
channels were configured to match the MK receiver and a successful maiden test flight
was made[35]. Figure 22 below shows the MK after construction.

Figure 22: Fully constructed MikroKopter

6.2 Selection of Navigation Equipment and Testing

Two core elements of the navigation and control system had to be selected: the standard
flight controller which performs basic stabilisation of the quadcopter and the advanced
navigation system which will add the functionality required to meet the objectives in
Section 1.1.

As described in Section 5.1, a substantial amount of time was spent deciding whether
to build a flight controller or buy an off-the-shelf model.

To ensure the specifications of the GPS unit, magnetometer and microcontroller are
suitable to meet the project objectives, several modules were considered, as described
in Section 5.2.

6.2.1 Component Testing

Currently only the CMPS-03 magnetometer has been tested. It was tested by interfacing
it via I²C with a Softec HCS12E128 starter kit. This starter kit was used due to the
availability of I²C on the microcontroller, the author's familiarity with the board and
the opportunity to reuse I²C code from a previous project. A Freescale PBMCUSLK
development board with an LCD module was used as an output device to display the
compass heading. The MC9S12E128 drove the LCD module via the Serial Peripheral
Interface (SPI). Jump leads were constructed to connect the required pins on the
MC9S12E128 to the CMPS-03 and PBMCUSLK. A circuit diagram is shown in
Figure 23.

Figure 23: Magnometer test rig schematic

The magnetometer functioned as expected and a 3-digit number representing degrees
from north was displayed on the LCD. A possible problem emerged during testing:
when the compass is tilted, an error occurs in the heading. The further the magnetometer
is tilted from horizontal, the larger the error. This will have to be addressed, possibly
by adjusting the value using the gyroscope readings.

6.3 Communications
When confronted with the challenge of communications between a mobile aerial vehicle
and a base station there are many aspects that need to be considered. The data rate,
range, cost and availability were the main properties that were compared when investi-
gating communication systems. Another challenge is finding a processing device that has
the processing power to take the data in, process it and transmit it via some
communications hardware. All of the processing has to be completed without a high
level of lag, as the data needs to be transmitted in real time.

After consideration the choice of communications protocol was reduced to WiFi, GSM
(GPRS), HSDPA and RF. The investigation came to the following conclusions:
• High Speed Download Packet Access (HSDPA) is too expensive to implement due
to the costs of the hardware and the monthly subscription costs for the use of 3G
services. As a final implementation rather than a proof of concept this would be
the ideal choice of communications standard due to its high data rates and large
range. The large coverage is due to the wide availability of 3G, it is available in
most urban areas making it ideal for surveillance use.

• RF is cheap to implement and has a range of up to 10 km. The range is large
enough to control the quadcopter, as the battery life is only 10-15 minutes, which
limits the distance that it can cover. The main problem with RF is that it is not as
readily available as other standards, so it only covers the distance from an RF
transmitter. Another issue is that the data rate is only up to 10 kbps, which is not
high enough for good-quality video transmission. The MikroKopter is equipped with
a basic RF receiver that can be used for control, but this will not be upgraded for
data communications.

• GSM (GPRS) is cheap to implement due to the maturity of the technology. The
range of GPRS is large as it works over the cellular network, so it will work anywhere
there is a GSM signal; most urban areas have very good signal quality. The data
rate of the GPRS Class 10 modems is 16-24 kbps upload and 32-48 kbps download.
In theory this will be enough to send a decent-quality video stream. GPRS
offers the ideal balance of range, cost and availability and will be implemented in
this project.

• WiFi is a good choice for proof of concept, as it is relatively cheap to implement
and has a wide enough range for testing purposes. The data rate of the WiFi
transceiver is 11 Mbps, which is large enough for good-quality video transmission.
For proof of concept, WiFi will be used before GPRS is implemented, as it has a
wide enough range for testing. WiFi is more suited to close-range applications and
is therefore easier to demonstrate at the Trade Fair.
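As a rough feasibility check on the GPRS figures above, the helper below computes the largest average compressed frame size an uplink can sustain at a given frame rate; the 5 frames-per-second figure in the note below is an assumption, not a project requirement.

```c
/* Largest sustainable average compressed frame size, in bytes, for a
 * given uplink rate (kbps) and video frame rate (frames/second).
 * This ignores protocol overhead, so real budgets are tighter. */
double max_frame_bytes(double uplink_kbps, double fps)
{
    return (uplink_kbps * 1000.0 / 8.0) / fps;
}
```

At the quoted 24 kbps upload and an assumed 5 fps, this allows roughly 600 bytes per frame, which confirms that heavy compression and a low frame rate will be needed over GPRS.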

SUAVE control will initially use RF, as this comes as standard on the MikroKopter.
This will be upgraded to WiFi, as this is the best choice for a proof of concept and is
easy to demonstrate. Finally, the communications to and from the quadcopter will be
implemented over GSM; this is the best choice of standard for the final implementation
of the project, as it increases the possible applications of the quadcopter.

6.4 Object Detection

A Sharp GP2Y0D310K Distance Measuring Sensor was obtained and some basic testing
was carried out on it. When the sensor was set up in the configuration shown in
Figure 24, it was found that when no object was within range, the output pin was at a
fixed 5 V (the supply voltage), and if an object came within range, a pulse was seen at
the output. However, this pulse was only 3 mV in magnitude and as such could be
regarded as zero; these outputs are shown in Figure 25. It should be noted that the
output signal was taken from the pin LED FB, as the pin Vo was very noisy.

Figure 24: Circuit diagram of a Sharp Distance Measuring Sensor in use

The output from the sensor was fed into a PIC16F876 microcontroller, and if an object
was detected to be within range of the sensor, an LED would light. As yet, no testing
has been done to determine a polar pattern for this sensor.

Figure 25: Screen captures of the output pin of the Sharp Distance Sensor

7 Future Work and Goals
7.1 Navigation System
The GPS unit and MKFC will first be tested to make sure they can be interfaced with
and that the required data can be sent to and from them. This will be done using the
Freescale MC9S12C32, as its development environment is less complex than that of the
Freescale MCF5282CVF66.

The control loops specified in Section 4.2.3 will then be created in software on the
Freescale MCF5282CVF66 for autonomous flight. These will be implemented one at a
time to test their performance, starting with the height, then the yaw and pitch.

The first objective will be to achieve autonomous stationary hover at a constant
height. This will then be expanded to include autonomous take-off and landing.

The final stage will involve integrating the ODS and COMPRO systems to allow the
quadcopter to avoid objects and go to a GPS position. The control loops will then be
improved to allow stable autonomous flight in any direction.

7.2 Communication System

The design of the communications section of the project has been completed. The future
work required is to implement a bi-directional link using WiFi. This is necessary for
the final implementation of the project because, without the ability to communicate,
the quadcopter cannot transmit data back to a base station. More specifically, the
ability to communicate via WiFi and then GPRS is required to extend the range of the
quadcopter and the applications that can be completed using this equipment.

The tasks that still have to be completed are:

• Test the output of the WiFi module.

• Connect the WiFi module to a laptop wirelessly.

• Test the range and effectiveness of the WiFi module using the quadcopter.

• Implement an override to stop autonomous mode and enter manual mode.

• Implement the GPRS modems in the quadcopter design in place of the WiFi
module.

• Test the output of the GPRS modems.

• Test the range and effectiveness of the GPRS modems using the quadcopter.

Once these are completed, the communications used within the quadcopter will allow
it to be used for various applications.
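The manual override could be driven by a simple command sent from the base station over the link. The sketch below uses hypothetical command strings ("OVERRIDE", "RESUME"), since the actual command protocol has not yet been defined.

```c
#include <string.h>

typedef enum { MODE_AUTONOMOUS, MODE_MANUAL } flight_mode;

/* Apply one command received over the WiFi (later GPRS) link and return
 * the new flight mode. The command strings are assumptions for this
 * sketch, not part of a finalised protocol. */
flight_mode apply_command(flight_mode current, const char *cmd)
{
    if (strcmp(cmd, "OVERRIDE") == 0)
        return MODE_MANUAL;      /* operator takes manual control */
    if (strcmp(cmd, "RESUME") == 0)
        return MODE_AUTONOMOUS;  /* hand back to the autopilot */
    return current;              /* unknown commands are ignored */
}
```

Ignoring unrecognised commands keeps a corrupted packet from changing the flight mode unexpectedly.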

7.3 Camera Surveillance


Now that a final application has been decided upon, the next step is to implement the
camera and coding system. This will then be tested to ensure that the video data can
be encoded and decoded while maintaining reasonable quality. Finally, the ability to
transmit the coded data over the communications channel needs to be tested; this
final stage will involve close collaboration with the communications module.

7.4 Object Detection


The use of ultrasound has not been fully explored, but initial investigations show
that it requires a highly reflective surface in order to operate. Also, all suitable
ultrasound detectors researched use PWM [33] as the default output method, which
would increase the complexity of the overall system.

Sensors with a larger range (of the order of 1 m) will be sought and tested. Ideally,
analogue range sensors will be used and converted into distance switches, so that, if
time permits, the actual distance between the sensor and an object can be obtained and
used to improve the Navigation module's reaction to objects.

A communications link between the Navigation module and the Object Detection
System will also have to be devised; the link will be either serial or I2C. To test this, a
simple link will be set up between the main PIC and either another PIC (in the case of
I2C) or a computer (in the case of serial).
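One way to exercise such a link is with a small framed message. The three-byte layout below (start byte, sensor bitmask, XOR checksum) is purely an illustrative assumption for testing, not a finalised protocol between the PICs.

```c
#include <stdint.h>
#include <stdbool.h>

#define FRAME_START 0xA5  /* hypothetical start-of-frame marker */

/* Build a 3-byte frame carrying the object-detection bitmask (one bit
 * per sensor). The layout is an assumption for link testing only. */
void ods_build_frame(uint8_t sensors, uint8_t frame[3])
{
    frame[0] = FRAME_START;
    frame[1] = sensors;
    frame[2] = (uint8_t)(FRAME_START ^ sensors);  /* XOR checksum */
}

/* Validate a received frame; on success store the bitmask and return
 * true, otherwise reject the frame. */
bool ods_parse_frame(const uint8_t frame[3], uint8_t *sensors)
{
    if (frame[0] != FRAME_START ||
        frame[2] != (uint8_t)(frame[0] ^ frame[1]))
        return false;
    *sensors = frame[1];
    return true;
}
```

The same frame can be sent over either serial or I2C, so the framing code need not change when the final transport is chosen.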

8 Conclusion
In order to meet the specifications in Section 1.1, a Gantt chart was created listing the
main tasks required to complete each objective; it can be found in Appendix B. The
project is currently on schedule, and some work streams, such as testing the
magnetometer (Section 6.2), are ahead of schedule.

The overall system architecture and the designs of the individual internal systems have
been finalised, and equipment choices have been made. All the equipment required to
create the autonomous vehicle is now present apart from the camera. Each group member
has planned the work required to implement the design of their individual section over
the coming semester, as shown in the Gantt chart.

The MikroKopter has been constructed and is currently undergoing flight tests to
judge the potential performance of the quadcopter and allow experimentation with the
flight controller parameters. This period will also allow the team to improve their piloting
skills in order to meet the upcoming milestone of manually controlled flight.

Four weeks have been set aside at the end of semester two (6th April 2009 to 3rd May
2009) to allow for the final report to be written and the exhibition at the trade show
event to be planned.

References
[1] BBC, Volcano scientist wins accolade.
http://news.bbc.co.uk/1/hi/scotland/edinburgh_and_east/7735284.stm
Accessed: 26th November 2008

[2] MikroKopter, en / MikroKopter (2008)


http://www.mikrokopter.de/ucwiki/en/MikroKopter/
Accessed: 15th December 2008

[3] RC Helicopter, Quad rotor helicopter design


http://www.rchelicopter.com/category/rc-helicopter-construction-design/rc-helicopter-quad-rotor-design
Accessed: 10th November 2008

[4] Quadcopter, Quadcopter Home (2008)


http://www.quadcopter.org
Accessed: 11th December

[5] Petropolakis, Dr L., EE405 Robotics Notes, (University of Strathclyde, 2007)

[6] Ogata, Katsuhiko, Modern Control Engineering, (Prentice Hall, 2002)

[7] Shaw, John, The PID Control Algorithm (2006)


http://learncontrol.com/pid/description.htm
Accessed: 17th December 2008

[8] Radhesh, PID Controller Simplified (2008)


http://radhesh.wordpress.com/2008/05/11/pid-controller-simplified/
Accessed: 17th December 2008

[9] IEEE, IEEE 802.11 WIRELESS LOCAL AREA NETWORKS (2008)


http://www.ieee802.org/11/
Accessed: 3rd December 2008

[10] Mobile Phones UK, GPRS (General Packet Radio Service), HSCSD & EDGE
(2008)
http://www.mobile-phones-uk.org.uk/gprs.htm
Accessed: 12th December 2008

[11] SHARP, GP2Y0D310K Datasheet (2006)

[12] RoboRealm, Obstacle Avoidance (2008)
http://www.roborealm.com/tutorial/Obstacle_Avoidance/slide010.php
Accessed: 27th November 2008

[13] Guindehi, Amir, Universal Aerial Platform (2008)


http://uavp.ch/moin/FrontPage
Accessed: 17th December 2008

[14] Quadroufo, UAVP Flight Controller + 5 Sensors


http://www.quadroufo.com/product_info.php?products_id=29
Accessed: 17th December 2008

[15] Paparazzi Community, Paparazzi (2008)


http://paparazzi.enac.fr/wiki/Index.php/Main_Page
Accessed: 17th December 2008

[16] –, Booz (2008)


http://paparazzi.enac.fr/wiki/Booz
Accessed: 17th December 2008

[17] –, Tiny V2 (2008)


http://paparazzi.enac.fr/wiki/Tiny v2
Accessed: 17th December 2008

[18] PPZUAV, Tiny (2008)


http://ppzuav.com/osc/catalog/product_info.php?cPath=4_2&products_id=35
Accessed: 17th December 2008

[19] MikroKopter.de, FlightController Manual - English, translated by IngoBusker and
french-copter (2007)
http://www.mikrokopter.de/ucwiki/en/FlightCtrlManual
Accessed: 17th December 2008

[20] Holger Buss & Ingo Busker, MikroKopter Forum - International Area.
http://forum.mikrokopter.de/forum-17.html
Accessed: 28th November 2008

[21] Quadrokopter Schweiz, Flight-Ctrl V1.3


http://www.flyinghigh.ch/index.php/home.html?page=shop.product_details&flypage=flypage.tpl&product_id=4&category_id=2
Accessed: 17th December 2008

[22] –, MikroKopter - Von der ersten Schraube bis zum Flug
http://www.mikrokopter.com/ucwiki/VideoAbspielen?id=106
Accessed: 15th November 2008

[23] RF Solutions, GPS-320FW Datasheet (2008)

[24] RF Solutions, Board Level GPS Receivers (2007)


http://www.rfsolutions.co.uk/acatalog/Board_Level_GPS_Receiver_Module.html
Accessed: 17th December 2008

[25] Rapid, CMPS-03 Datasheet (2003)

[26] Ali Alvi et al., Advanced Robot Control. (University of Strathclyde, 2008)

[27] Freescale Semiconductor, MCF5282 Integrated Microcontroller Product Brief (2006)

[28] Freescale Semiconductor, MCF528X Product Summary Page


http://www.freescale.com/webapp/sps/site/prod_summary.jsp?code=MCF528X&fsrch=1
Accessed: 18th November 2008

[29] ECT soft, Hardware Specifications


http://www.ectsoft.net/ect-products.html
Accessed: 1st December 2008

[30] RF Solutions, GSM20 and GSM 100T Modem Datasheet (2004)

[31] Wireless Products, S103 WLAN Compact Serial Module Datasheet (2007)

[32] RF Solutions, CAM-VGA100 User Manual (2004)


http://www.areasx.com/files/articoli/8122/CAM-VGA100-User-ManualV2-0.pdf
Accessed: 17th December 2008

[33] Parallax, PING))) Ultrasonic Distance Sensor (#28015) Datasheet (2008)

[34] Microchip, PIC16F87X Datasheet (2001)

[35] Suave Project Blog, Construction and test flight


http://suave.rawxing.org/archives/84
Accessed: 10th December 2008

A Risk Assessment

SPECIFIC PROJECT RISK ASSESSMENT FORM

Section A

Department Electronic and Electrical Engineering.


Project/Works Title Semi-autonomous unmanned aerial vehicle.
Description of Work Activity Build and test quad rotor RC autonomous helicopter.

Section B

Description of significant hazards: (e.g. slipping/tripping; fire; work at height;


pressure systems; electricity; dust; fumes; manual handling; noise; poor lighting;
low temperatures; vehicles; moving parts of machinery; etc)
Flying vehicle; possibility of fire and fumes; rotating propellers.

Section C

Groups who may be at risk: Staff/Students with Special Needs ❒ Inexperienced Staff x

Academic Staff ❒ Office Staff ❒ Maintenance Staff ❒ Technicians ❒

Postgraduates ❒ Undergraduates ❒ Lone Workers ❒ Contractors ❒

Visitors ❒ Members of Public x Other ❒ Specify _________________________

Section D

Assess the foreseeable risks from the hazards identified and state the controls to
be implemented to eliminate or minimise the risk where it is reasonably
practicable to do so.

Give priority to those risks which affect large numbers of people and/or could
result in serious harm. Apply the following principles, if possible, in the
following order:

• Remove the risk completely
• Try a less risky option
• Prevent access to hazard (eg guarding)
• Organise work to reduce exposure to hazard
• Issue personal protective equipment
• Provide welfare facilities (eg washing facilities for removal of contamination)

This section should be used to specify the scheme of work which should include
details of the significant controls, safety procedures and working methods which
need to be adhered to. It should also refer to existing safety procedures, eg
already contained in the departmental safety regulations or University local
rules. The following should also be considered in the risk assessment process:

Do the control measures - meet the standards set by the legal requirements?
- comply with a recognised industry standard?
- represent good practice?
- reduce risk as far as reasonably practicable?

Flying Vehicle - The vehicle will be flown outdoors and indoors. When flying indoors
the room will be closed to the general public. During outdoor testing an open area that
is closed to the general public will be sought. Good weather conditions are required,
i.e. no wind or rain. Public liability insurance will be taken out.

Fire/Fumes - Lithium polymer batteries are being used; these are dangerous if
handled incorrectly. The manufacturer's guidelines regarding recharging will be
adhered to. A balanced charger with a timer cut-off has been purchased. A
non-flammable charging surface is required (a Pyrex bowl with sand). Fire
extinguishers of class B and C are at the end of the hall if needed. Any damaged
battery will be disposed of.

Rotating Propellers - Everyone will be kept clear of the vehicle once the motors
have been started. Safety goggles will be worn when the vehicle is being tested. If
required, a guard will be placed over the propellers. The transmitter instructions for
activating the vehicle will be adhered to.

Section E

Accreditation

Assessor:

Signature Print Date

Supervisor:

Signature Print Date

Signature of all persons receiving a copy of this risk assessment.

Philipp Karagiannakis 10/12/2008


Signature Print Date
Colin McGuire 10/12/2008
Signature Print Date
Niall McKenzie 10/12/2008
Signature Print Date
Neil Sinclair 10/12/2008
Signature Print Date

Signature Print Date

Signature Print Date

Signature Print Date

Date of next assessment(s)

1. 2. 3.

Note:

1. The completed risk assessment form must be kept by the assessor and supervisor with a
copy given to Departmental Safety Convener.
2. The assessment must be reviewed and where appropriate revised if there is reason to
suspect it is no longer valid or there has been a significant change to the task procedure.
3. The responsibility for ensuring that the risk assessment is suitable and sufficient
lies with the Supervisor not with the Departmental Safety Convener.
4. A copy of this assessment must be retained by the Department for as long as is
relevant (in case of the form being used as a COSHH risk assessment this must be kept
for 30 years and in some cases 40 years after the project or works are terminated).

B Gantt Chart

SUAVE - Gantt Chart

Name Start Finish
Official Project Milestones & Paperwork 08/11/2008 03/05/2009
Write Final Report 06/04/2009 03/05/2009
Prepare Interim Presentation 01/02/2009 09/02/2009
Write Interim Report 08/12/2008 21/12/2008
Statement of Intent 08/11/2008 08/11/2008
Final Report & Presentation 03/05/2009 03/05/2009
Individual Interviews 14/11/2008 14/11/2008
Individual Interviews 19/02/2009 19/02/2009
Interim Presentation 12/02/2009 12/02/2009
Interim Report 21/12/2008 21/12/2008
Subsystem Development 03/11/2008 06/04/2009
Video System 08/12/2008 06/04/2009
Investigation of Video Codec 08/12/2008 22/12/2008
Video Codec Implementation on Board 22/12/2008 02/02/2009
Video Codec Implementation at the Ground Station 05/01/2009 16/02/2009
Integration with Communications System 22/12/2008 06/04/2009
Skeleton Vehicle 03/11/2008 21/12/2008
Order Parts 09/11/2008 09/11/2008
Quadcopter Purchase Decision 03/11/2008 17/11/2008
Vehicle Construction and Test Flights 16/11/2008 08/12/2008
Tuning for manually controlled stable flight 08/12/2008 21/12/2008
Manually Controlled Stable Flight 21/12/2008 21/12/2008
Object Detection System 11/11/2008 06/04/2009
Investigative Options 11/11/2008 21/01/2009
Simple test of IK 01/12/2008 19/12/2008
Test of Ultrasound 26/01/2009 11/02/2009
Test of IR 26/01/2009 11/02/2009
Choice of Sensor 11/02/2009 11/02/2009
I2C/Serial Link 19/01/2009 30/01/2009
I2C Finished 30/01/2009 30/01/2009
Integration with Navigation 11/02/2009 06/04/2009
Basic Detection 13/03/2009 13/03/2009
Autonomous Flight and Navigation 22/12/2008 05/04/2009
GPS & Magnetometer Testing & Integration 22/12/2008 02/01/2009
Design and Implement Control Algorithms 04/01/2009 20/02/2009
Integrate Object Detection and Communications 15/02/2009 06/03/2009
Achieve Autonomous Flight Between Two Points 30/03/2009 30/03/2009
Design and Implement Navigation Algorithms 23/02/2009 05/04/2009
Achieve Stable Autonomous Hover 20/02/2009 20/02/2009
Bi-directional Data Communication 11/11/2008 06/04/2009
Communication Standards Comparison 11/11/2008 17/11/2008
WiFi Hardware Comparison 17/11/2008 01/12/2008
GSM Hardware Comparison 17/11/2008 01/12/2008
Microcontroller Decision 03/12/2008 12/12/2008
Exams 19/12/2008 25/01/2009
WiFi Implementation 12/12/2008 08/02/2009
Testing the output from the WiFi module 12/12/2008 19/12/2008
Connecting the WiFi module wirelessly to a laptop 25/01/2009 27/01/2009
Obtain the navigation data from the packets 27/01/2009 02/02/2009
Pass the navigation data through the IIC 02/02/2009 03/02/2009
Testing the range and reliability of the WiFi module 03/02/2009 08/02/2009
Implement an override to stop autonomous mode 03/02/2009 08/02/2009
GPRS Implementation 09/02/2009 06/04/2009
Testing the output from the GPRS modems 09/02/2009 16/02/2009
Transmission from the modems 16/02/2009 26/02/2009
Reception to the modems 26/02/2009 06/03/2009
Obtain navigation data from the packets 06/03/2009 20/03/2009
Pass the navigation data through the IIC 20/03/2009 27/03/2009
Testing the range and reliability of the GPRS modems 27/03/2009 03/04/2009
Implement an override to stop autonomous mode 04/04/2009 06/04/2009

