Interim Report
The project is currently on schedule and all of the required hardware has been obtained. The system design has been finalised and a work plan has been created to implement it in the coming semester.
Contents
1 Introduction 1
1.1 Aims and Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 Project Organisation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
1.3 Report Structure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
2 Applications 3
2.1 Microwave Cavity Resonator . . . . . . . . . . . . . . . . . . . . . . . . . 3
2.2 Air Pollution Sensor (Inner-city) . . . . . . . . . . . . . . . . . . . . . . . 3
2.3 Remote Volcano Sensing . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
2.4 Camera Surveillance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
3 Background 5
3.1 How a Quadcopter Works . . . . . . . . . . . . . . . . . . . . . . . . 5
3.2 Proportional Integral Derivative Control . . . . . . . . . . . . . . . . . . 7
3.3 Communication System . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
3.3.1 WiFi . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
3.3.2 GPRS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
3.4 Object Detection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
3.4.1 Sensing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
3.4.2 Imaging . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
4 Design Methodology 13
4.1 System Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
4.2 Navigation and Autonomous Control System . . . . . . . . . . . . . . . . 13
4.2.1 Requirements 1 and 3 Send and Receive commands and flight data
to the communication system . . . . . . . . . . . . . . . . . . . . 16
4.2.2 Requirement 2 Receive and Pre-Process Data . . . . . . . . . . . 16
4.2.3 Requirement 4 - Process Available Data and Decide on Movements
Required to Control the MikroKopter . . . . . . . . . . . . . . . . 17
4.2.4 Requirement 5 Disengage Manual Control and Control quadcopter
in Autonomous Mode . . . . . . . . . . . . . . . . . . . . . . . . . 18
4.2.5 Requirement 6 Monitor Current Situation and Prepare for Au-
tonomous Control . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
4.3 Communications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
4.4 Object Detection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
5 Equipment 22
5.1 Flight Control . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
5.2 Navigation System Equipment . . . . . . . . . . . . . . . . . . . . . . . . 24
5.2.1 Central Processing Unit . . . . . . . . . . . . . . . . . . . . . . . 24
5.2.2 GPS Module . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
5.2.3 Magnetometer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
5.3 Communications System . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
5.3.1 Freescale Coldfire Microcontroller . . . . . . . . . . . . . . . . . . 25
5.3.2 Fargo Maestro - 100 GPRS Modems . . . . . . . . . . . . . . . . 26
5.3.3 S103 WiFi Module . . . . . . . . . . . . . . . . . . . . . . . . . . 27
5.4 Surveillance Module and Compression . . . . . . . . . . . . . . . . . . . . 28
5.5 Object Detection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
6 Progress to date 31
6.1 Construction and Test Flights . . . . . . . . . . . . . . . . . . . . . . . . 31
6.2 Selection of Navigation Equipment and Testing . . . . . . . . . . . . . . 32
6.2.1 Component Testing . . . . . . . . . . . . . . . . . . . . . . . . . . 32
6.3 Communications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
6.4 Object Detection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
7 Future Work and Goals 37
8 Conclusion 39
A Risk Assessment 43
B Gantt Chart 47
List of Figures
1 quadcopter with mounted camera[2] . . . . . . . . . . . . . . . . . . . . . 4
2 quadcopter showing direction of each propeller[3] . . . . . . . . . . . . . 5
3 Thrust comparison of 10” and 12” propellers . . . . . . . . . . . . . . . . 8
4 PID controller[8] . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
5 Illustration of using infrared or ultrasound to detect the presence of objects[11] 11
6 RoboRealm example, showing the original image (left) and the edge de-
tected image (right)[12] . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
7 RoboRealm example, showing the side-filled image (left) and the eroded
image (right)[12] . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
8 RoboRealm example, showing the final image overlaid with the original[12] 12
9 System Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
10 Basic Navigation System Connections . . . . . . . . . . . . . . . . . . . 14
11 Navigation Processor Data Inputs and Outputs . . . . . . . . . . . . . . 15
12 The three levels of control: go to command, object avoidance and basic
flight stabilisation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
13 Control loops used to achieve autonomous flight. (a) yaw: controls the
direction of the vehicle. (b) pitch: controls the horizontal movement. (c)
throttle: controls the height of the vehicle . . . . . . . . . . . . . . . . . 19
14 Transmission Structure of SUAVE communications . . . . . . . . . . . . 20
15 Reception Structure of SUAVE communications . . . . . . . . . . . . . . 20
16 Freescale MCF528x features[28] . . . . . . . . . . . . . . . . . . . . . . . 25
17 Freescale Coldfire Microcontroller and Development board . . . . . . . . 26
18 Fargo Maestro- 100 GPRS modems[29] . . . . . . . . . . . . . . . . . . . 26
19 S103 Wifi module[31] . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
20 Polar response of a Parallax Ping Sensor, tested with a 12” square piece
of white card[33] . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
21 MikroKopter kit before construction . . . . . . . . . . . . . . . . . . . . 31
22 Fully constructed MikroKopter . . . . . . . . . . . . . . . . . . . . . . . 32
23 Magnetometer test rig schematic . . . . . . . . . . . . . . . . . . . . . 33
24 Circuit diagram of a Sharp Distance Measuring Sensor in use . . . . . . . 35
25 Screen captures of the output pin of the Sharp Distance Sensor . . . . . . 36
1 Introduction
Autonomous vehicles create substantial and varied engineering challenges. With ever in-
creasing processing power and the reduction in size, weight and cost of electronic devices,
this challenge can now be extended into the sky.
The platform is based on a quad-rotor design which offers superior stability, ease of
control and payload capacity compared to a standard helicopter. A quad-rotor design is
also extremely agile and is free to move in any direction, making it ideal for surveillance
and observation.
A navigation system will incorporate basic object avoidance to allow safe autonomous
flight between predetermined points and wireless communications will facilitate the trans-
fer of control commands and sensor data between the vehicle and a base station.
• Construct a quad rotor helicopter and achieve manually controlled stable flight of
the vehicle using radio frequency (RF) control.
• Integrate a basic object detection system with an improved control system to avoid
collisions.
• Incorporate a navigation system to allow autonomous flight and navigation.
• Integrate a surveillance device and pre-process the data for transmission over the
communications channel.
Section 4 gives an overview of the current system design which will be implemented.
It is divided into the following sections: Navigation and Autonomous Control, Commu-
nications, Object Detection System and Camera and Video Compression. Each section
will discuss the architecture of the individual system, the connections to other systems
and the equipment selected.
Section 6 highlights the work accomplished to date in all areas, Section 7 discusses future work and goals, and Section 8 concludes the report.
Appendix A contains the specific risk assessment for this project as of 18th December
2008.
2 Applications
A number of options were considered as a possible final application for the quadcopter.
Below is a list of some of the applications investigated and the reasons why they were
eliminated.
capable of transmitting the data back for real time processing. Once again the sensors were the main reason this idea was abandoned: they were found to be too expensive, and there was no suitable place to test the vehicle, i.e. no local active volcano.
3 Background
3.1 How a Quadcopter Works
A quad-rotor helicopter avoids much of the complexity needed to fly a standard helicopter
by eliminating the need for variable pitch rotor blades and a thrust stabilising tail rotor.
Instead it uses four identical propellers which are evenly spaced; they each produce an
equal amount of lift and torque with each pair of opposing propellers being spun in the
same direction. This gives us a pair of clockwise and anti-clockwise propellers. This is
shown in Figure 2 below. By varying the thrust produced by the propellers, movement of
the helicopter in any direction is possible. One of the clockwise rotating arms is defined
as being the front of the vehicle and all movement is in relation to it.
As all the propellers are spinning at the same angular velocity, a yaw stabilising tail
rotor like the one found on a standard helicopter is not needed. Yaw is achieved by in-
creasing the torque produced by a pair of rotating motors – either the clockwise pair or the
anti-clockwise pair and decreasing the torque produced by the other pair. Similarly the
pitch of the rotors need not be altered. Instead, movements in the pitch (forward/back-
ward) and roll (left/right) axes can be achieved separately, simply by altering the thrust
produced by a pair of motors and all this can be done without affecting the yaw of the
vehicle. The front and rear propellers control the pitch of the vehicle while in flight and
the right and left propellers control the roll of the vehicle.
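To illustrate how these differential thrust adjustments combine, a minimal motor-mixing sketch in C is given below; the variable names, value range and sign conventions are illustrative assumptions and do not represent the MikroKopter firmware.

/* Illustrative quadcopter motor mixer (not the MikroKopter firmware).
   throttle, pitch, roll and yaw are signed demands: the front and rear
   (clockwise) propellers handle pitch, the left and right (anti-clockwise)
   propellers handle roll, and yaw biases the clockwise pair against the
   anti-clockwise pair. */
typedef struct {
    int front;   /* clockwise      */
    int rear;    /* clockwise      */
    int left;    /* anti-clockwise */
    int right;   /* anti-clockwise */
} MotorOutputs;

static int clamp(int value, int lo, int hi)
{
    if (value < lo) return lo;
    if (value > hi) return hi;
    return value;
}

MotorOutputs mix(int throttle, int pitch, int roll, int yaw)
{
    MotorOutputs m;
    m.front = clamp(throttle + pitch + yaw, 0, 255);
    m.rear  = clamp(throttle - pitch + yaw, 0, 255);
    m.left  = clamp(throttle + roll  - yaw, 0, 255);
    m.right = clamp(throttle - roll  - yaw, 0, 255);
    return m;
}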
The Flight Control (FC) is the main controlling board on the quadcopter and all
the necessary sensors and microprocessors used to achieve and maintain stable flight are
located on this board. The gyroscopes (gyro, also known as rotational speed sensors) are
considered to be the most important of the onboard sensors. The programmed microcon-
troller uses the measurements from them to compensate for external influences such as
wind. The X, Y and Z axes are each assigned their own gyro, which measures the rate, in degrees per second, at which the vehicle turns about that axis.
Acceleration along each axis is measured using three accelerometers; once again, each axis is assigned its own accelerometer. It is possible to fly the vehicle without these accelerometers; however, the quadcopter would then be unable to return to a level state by itself. By including these sensors it is possible for the pilot to release the remote
control (RC) sticks and the vehicle will stabilise itself. The final sensor is a barometer
and it allows the vehicle to maintain a specific flying height.
Due to the nature of the project a strict power to weight ratio budget was designed.
From initial estimations, maximum payload for the vehicle would be about 400 grams.
The motors used in RC vehicles vary greatly so it was desired to find a light motor with
low power consumption. The main requirement was that it provided 1000 rpm V−1 . A
Brushless motor fulfils the specifications. Brushless motors are more complicated to op-
erate because they need to change the DC supply voltage of the battery into a phased AC
(usually three phase) with a controlled power output so that the speed of the motors can
be accurately controlled. In total there are four Brushless Controllers (BL-Ctrl) one for
each motor. There are numerous BL-Ctrls on the market. To achieve a very stable level
of flight it is desired to update the throttle value very rapidly (in under 0.5 ms); this is best done using an Inter-Integrated Circuit (I2C) bus interface.
Control of the vehicle is achieved through use of a RF transmitter and receiver pair. A
transmitter with a minimum of four channels is required to control the quadcopter in the
various planes. It must use Pulse Position Modulation (PPM). The receiver demodulates
the instructions given by the pilot and converts them into electrical signals. A receiver
with a summing signal is required for the MikroKopter (MK). This signal contains all of
the channels sent by the transmitter and is made available to the FC for processing.
The design of the frame is only limited by one factor: it must be as light as possible.
The frame designed took on a basic cross shape similar to many open source projects[4].
Bearing in mind that the frame must be lightweight some of the materials considered
include: aluminium, carbon fibre or polystyrene; it is possible to have a combination of
these. The most cost effective material for this project is aluminium because it is readily
available and cheap.
The batteries needed to power all the onboard electronics have to have a low weight to
power ratio so that maximum flight time may be achieved. The different types of battery
considered suitable for use in RC vehicles are: Lithium polymer (Lipo), Nickel-cadmium
(NiCd) and Nickel metal hydride battery (NiMh). The Lipo types are the most advanced
and offer the most power. They are volatile and must be handled with care but they
can have a capacity of over 10,000 mAh. NiMh batteries are a slightly older technology
but they are still capable of capacities around 5000 mAh. However if a poor quality of
battery is purchased then bad cells can develop quickly when the capacity is increased.
NiMh batteries are cheaper than Lipo batteries and are also less volatile. NiCd batteries
are the cheapest overall and have similar characteristics to that of NiMh but are unable
to achieve an equal capacity. A decent flying time is desired to test the software and
hardware which will be added and due to the powerful motors and large propellers the
best weight to power ratio with a high capacity can be achieved using the Lipo batteries.
The propellers play a major role on the take-off weight and the speed that the vehicle
will fly at. The diameter of the propellers as well as their pitch must be considered. Due
to the slow spinning motors a low pitch is desired. After using a thrust calculator it was
determined that a good combination for the propellers would be: 12” propellers with
4.5” pitch. A snapshot of the thrust calculator at 4000 rpm for 10” and 12” propellers
is shown below in Figure 3. Increasing the blade diameter by 2” doubles the static thrust developed.
Figure 3: Thrust comparison of 10” and 12” propellers
As shown in Figure 4, the error signal is the input (setpoint) minus the current output. This error signal is processed by the PID controller and the outputs of each element are combined to create the output used to control the process.
Proportional control can be thought of as a gain. The error signal is multiplied by
the gain; Kp . It influences the amount the input has to change to cause a variation of
0-100% in the output[6].
Integral control uses feedback (most commonly positive) to reset the controller[7].
This is achieved by integrating the error signal and multiplying it by the gain, Ki . This
has the effect of controlling the output so it stabilises at a constant steady-state.
Derivative control differentiates the error signal and multiplies it by the gain, Kd. This influences the rate of change of the output signal.
Combining these three elements creates an effective controller which can be imple-
mented in software.
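A minimal discrete-time sketch in C of how the three terms could be combined in software is given below; the structure, function name and gains are placeholder assumptions rather than code or values chosen for this project.

/* Simple discrete PID controller sketch (illustrative only). */
typedef struct {
    float kp, ki, kd;   /* proportional, integral and derivative gains */
    float integral;     /* accumulated error                           */
    float prev_error;   /* error from the previous sample              */
} Pid;

float pid_update(Pid *pid, float setpoint, float measurement, float dt)
{
    float error = setpoint - measurement;                 /* error = input - output */
    pid->integral += error * dt;                          /* integral of the error  */
    float derivative = (error - pid->prev_error) / dt;    /* rate of change         */
    pid->prev_error = error;

    return pid->kp * error
         + pid->ki * pid->integral
         + pid->kd * derivative;                          /* combined output        */
}

In practice the integral term would also be limited to prevent wind-up when the motors saturate.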
3.3 Communication System
3.3.1 WiFi
The WiFi version that will be implemented is 802.11b; it has the following characteristics[9]:
These characteristics make 802.11b well suited to our application: short range communication that does not require extremely high data rates. This version of WiFi is very commonly used as a wireless LAN technology.
The main issue with 802.11b is that there are other devices that operate on the 2.4
GHz frequency band. These devices include microwaves, Bluetooth, baby monitors and
cordless phones. When these devices operate in the same region as 802.11b they cause
interference to occur in the WiFi signal.
3.3.2 GPRS
Whilst GSM is excellent for communications networks that involve the transmission and reception of voice calls, it is not designed for sending or receiving data, as its data rate is not high enough for data transmission.
GPRS (General Packet Radio Service) improves the performance of GSM to allow
it to send and receive data. GPRS is one of the communications technologies classed as 2.5G; these are designed to move the performance of 2G networks closer to that of 3G networks. The bit rate of GPRS for a pair of Class 10 modems is 16-24 kbps for uploading and 32-48 kbps for downloading[10].
Due to GPRS being part of the standard cellular network, there is currently wide
coverage, especially in urban areas. This means that if a remote control device is being
controlled over the cellular network then it can be controlled from anywhere else with
network coverage, no matter how far away.
3.4.1 Sensing
The Sensing approach uses either Ultrasound or Infrared waves to detect the presence of
objects. A short pulse is emitted from a transmitter and then a detector will receive the
reflected signal. Either the time delay or the magnitude of the signal will then be used
to calculate the distance to any objects within range of the sensor. This approach, as
illustrated in Figure 5, is the same as is used with touchless hand driers found in public
toilets.
There are three types of outputs that can be taken from a distance sensor: Analogue
Range, Pulse Width Modulation (PWM) Range and Distance switch.
Analogue Range is a variable voltage output (between ground and the supply volt-
age) that is dependent on the distance to the object. This can then be quantised using an Analogue-to-Digital Converter (ADC) and, with the aid of a look-up table, a distance realised. A PWM range output works in a similar manner, whereby the pulse width is
Figure 5: Illustration of using infrared or ultrasound to detect the presence of objects[11]
varied dependent on distance. The Distance Switch is the most basic of the three, and
when an object is within range of the sensor, the output is toggled (either High to Low
or vice-versa).
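As an illustration, a minimal sketch in C of converting an Analogue Range output into a distance via an ADC and a look-up table is given below; the read_adc() helper, the table values and the 10-bit resolution are assumptions made for illustration rather than figures from a particular sensor's datasheet.

#include <stdint.h>

/* Hypothetical look-up table: ADC codes against distance in cm. The real
   values would be taken from the chosen sensor's response curve. */
static const uint16_t adc_threshold[] = { 900, 700, 500, 350, 250, 180 };
static const uint8_t  distance_cm[]   = {  10,  20,  30,  40,  60,  80 };
#define TABLE_SIZE (sizeof(adc_threshold) / sizeof(adc_threshold[0]))

/* read_adc() is assumed to be provided by the microcontroller's
   peripheral code and to return a 10-bit conversion result. */
extern uint16_t read_adc(uint8_t channel);

/* Returns an approximate distance in cm, or 0xFF if nothing is in range. */
uint8_t sensor_distance_cm(uint8_t channel)
{
    uint16_t code = read_adc(channel);
    for (uint8_t i = 0; i < TABLE_SIZE; i++) {
        if (code >= adc_threshold[i])
            return distance_cm[i];
    }
    return 0xFF;   /* out of range */
}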
3.4.2 Imaging
The use of a camera allows the quadcopter to ”see” the surrounding environment. RoboRealm provides an example[12] showing how a simple webcam can be used to detect objects and calculate a safe route. Firstly, edge detection is applied to an image, as seen
in Figure 6.
Figure 6: RoboRealm example, showing the original image (left) and the edge detected
image (right)[12]
A technique called ”side-fill” is used whereby each vertical line is traversed from the bottom of the image, with each pixel filled white until an edge is detected, at which point the rest of the line is filled black. The white area of the image is then eroded such that any area left is large enough for the vehicle to pass through. These steps are shown in Figure 7.
Figure 7: RoboRealm example, showing the side-filled image (left) and the eroded image
(right)[12]
The image is smoothed and the white pixel with the greatest y value is found and it
is this pixel that the vehicle aims for. In this example, as seen in Figure 8, this is at the
x value of 193.
Figure 8: RoboRealm example, showing the final image overlaid with the original[12]
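A minimal sketch in C of the side-fill and goal-selection steps is given below; the image size, the pixel conventions (row 0 at the top, non-zero meaning an edge) and the function names are illustrative assumptions based on the tutorial description rather than the RoboRealm implementation.

#include <stdint.h>

#define IMG_W 320
#define IMG_H 240   /* row 0 is the top of the image */

/* edges: non-zero where an edge was detected.
   fill:  output mask, 1 = free space, 0 = blocked. */
void side_fill(const uint8_t edges[IMG_H][IMG_W], uint8_t fill[IMG_H][IMG_W])
{
    for (int x = 0; x < IMG_W; x++) {
        int y = IMG_H - 1;
        /* walk up the column from the bottom until an edge is met */
        while (y >= 0 && edges[y][x] == 0) {
            fill[y][x] = 1;
            y--;
        }
        /* everything above the first edge is treated as blocked */
        while (y >= 0) {
            fill[y][x] = 0;
            y--;
        }
    }
}

/* After erosion and smoothing, the column whose free space reaches highest
   in the frame would be chosen as the point for the vehicle to aim for. */
int choose_target_column(const uint8_t fill[IMG_H][IMG_W])
{
    int best_x = IMG_W / 2, best_top = IMG_H;
    for (int x = 0; x < IMG_W; x++) {
        int y = IMG_H - 1;
        while (y >= 0 && fill[y][x])
            y--;
        if (y + 1 < best_top) { best_top = y + 1; best_x = x; }
    }
    return best_x;
}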
4 Design Methodology
This section details the architecture of the overall system, and the design of and inter-
connections between internal subsystems.
Communication between the systems will be via an I2 C bus. The navigation system
will act as the master device and will ask the other two systems for data when required.
This will be discussed further in Section 4.2.2.
The design of the navigation, communications and object detection systems are de-
tailed in the sections below.
and autonomously controlled.
To meet the requirements specified in Section 1.1, a Global Positioning System (GPS) module and a magnetometer are required for autonomous flight. The GPS module allows the quadcopter to be aware of its current location (by providing a latitude and longitude) and the magnetometer allows the quadcopter to be aware of its current orientation, so that it knows in which direction to start moving.
A diagram of the navigation system and internal and external connections is shown
in Figure 10.
• Current height, fuel (battery level), pitch, yaw and roll accelerometer and integral data from the MKFC.
R3 - Send the current flight telemetry (as specified in R2) to the communication
system.
R6 - Monitor current system and prepare for autonomous control when in manual
mode.
Each requirement also requires a specific hardware and software design, detailed be-
low. The flow of data to and from NAVPRO is shown in Figure 11.
4.2.1 Requirements 1 and 3 Send and Receive commands and flight data to
the communication system
Three separate communication methods will be used to allow the NAVPRO to obtain all
data required for autonomous flight as shown in Figure 11. All the current data values
will be stored in RAM for use to fulfil Requirement 4.
As the I2C bus will be used to allow four devices to communicate, timing will be very important. The NAVPRO will act as the master device and will request data from each device in a predefined sequence at regular intervals. This will ensure the NAVPRO has all the data required to make a decision and fly the quadcopter when required.
The GPS-320FW module utilises an asynchronous serial connection to send and re-
ceive data. Therefore one of the NAVPRO SCI modules will be used to obtain positional
data and initialise the GPS module when needed. The current position will be updated
at 1 Hz, which is the maximum rate supported by the GPS-320FW. The serial interface protocol is based on the National Marine Electronics Association's NMEA 0183 ASCII specification, in which each message is a string of ASCII characters containing all the relevant positional data. The NAVPRO will decode these messages and extract the current position from them.
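A minimal sketch in C of extracting a position from an NMEA sentence is given below; it assumes a well-formed GPGGA message with a valid fix (so that no field is empty), omits checksum validation, and is illustrative rather than the planned NAVPRO code.

#include <stdlib.h>
#include <string.h>

typedef struct {
    double lat;   /* decimal degrees, south negative */
    double lon;   /* decimal degrees, west negative  */
} Position;

/* Convert an NMEA "ddmm.mmmm" field plus hemisphere into decimal degrees. */
static double nmea_to_degrees(const char *field, char hemisphere)
{
    double value   = atof(field);
    double degrees = (int)(value / 100.0);
    double minutes = value - degrees * 100.0;
    double result  = degrees + minutes / 60.0;
    return (hemisphere == 'S' || hemisphere == 'W') ? -result : result;
}

/* Parse a "$GPGGA,time,lat,N,lon,E,..." sentence in place.
   Note: strtok() skips empty fields, so a valid fix is assumed.
   Returns 0 on success, -1 on an unrecognised sentence. */
int parse_gpgga(char *sentence, Position *pos)
{
    char *fields[8];
    int n = 0;
    for (char *tok = strtok(sentence, ","); tok != NULL && n < 8;
         tok = strtok(NULL, ","))
        fields[n++] = tok;

    if (n < 6 || strcmp(fields[0], "$GPGGA") != 0)
        return -1;

    pos->lat = nmea_to_degrees(fields[2], fields[3][0]);
    pos->lon = nmea_to_degrees(fields[4], fields[5][0]);
    return 0;
}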
CMPS-03 Magnetometer
The CMPS-03 magnetometer will be interfaced via the I2C bus due to its ease of use and the low refresh rate required.
The Object Detection System (ODS) will communicate with the NAVPRO via the I2C bus. The ODS will send data indicating the required actions (e.g. a change in direction, or stop) to the NAVPRO. The NAVPRO will send commands to request data from the ODS and to manage the I2C bus. A relatively low data transfer rate, roughly six requests per second, will be required to communicate with the ODS.
Communication with the MikroKopter FC will be using the SPI interface. This will
allow the NAVPRO to send control commands to the MikroKopter flight controller and
receive the current battery level, height, and nick (pitch), roll and yaw accelerometer and gyroscope values. These values are already sent by the MKFC for debugging using the MikroKopter tool; the same values will be sent to the COMPRO.
The MikroKopter flight controller software will be modified to output the flight data
and receive control commands over the SPI interface. The control commands will mimic
those sent via the RF transmitter i.e. provide a value for pitch, yaw, thrust and roll.
The software will also contain a function to either use control commands from the PPM receiver, or to disable this input and use commands from the NAVPRO.
Communications System
The MikroKopter will receive commands from the COMPRO over the I2 C bus. These
commands will be sent periodically.
4.2.3 Requirement 4 - Process Available Data and Decide on Movements Required to Control the MikroKopter
The NAVPRO will use all the available data to decide on the best course of action to
control the MikroKopter. There are three levels of control required as shown in Figure 12.
The highest level is the Go to Command. This command will be received by the
NAVPRO from the COMPRO and will be the ultimate destination of the quadcopter. It
is made up of three parts: GPS latitude, GPS longitude and height. This will be stored in memory and compared with the current GPS latitude, longitude, height and orientation to decide what action is required.
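A minimal sketch in C of comparing the stored go to position with the current position is given below, producing a distance and bearing using an equirectangular approximation; the function name and the approximation itself are illustrative assumptions rather than the algorithm finally chosen.

#include <math.h>

#define EARTH_RADIUS_M 6371000.0
#define DEG_TO_RAD     0.017453292519943295
#define RAD_TO_DEG     57.29577951308232

/* Approximate distance (metres) and bearing (degrees from north) from the
   current position to the go to position; adequate over short ranges. */
void compare_with_target(double cur_lat, double cur_lon,
                         double tgt_lat, double tgt_lon,
                         double *distance_m, double *bearing_deg)
{
    double north = (tgt_lat - cur_lat) * DEG_TO_RAD * EARTH_RADIUS_M;
    double east  = (tgt_lon - cur_lon) * DEG_TO_RAD * EARTH_RADIUS_M
                   * cos(cur_lat * DEG_TO_RAD);

    *distance_m  = sqrt(north * north + east * east);
    *bearing_deg = atan2(east, north) * RAD_TO_DEG;
    if (*bearing_deg < 0.0)
        *bearing_deg += 360.0;   /* normalise to 0-360 degrees */
}

The resulting bearing would then be compared with the magnetometer heading to decide how far the vehicle must yaw before pitching towards the target.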
The next level is object avoidance. The NAVPRO must make sure there is no threat to the quadcopter in its current situation while carrying out the go to command; if there is, it must act to stop the quadcopter.
Figure 12: The three levels of control: go to command, object avoidance and basic flight
stabilisation
The lowest level is the basic flight stabilisation. This underpins the two layers above
and is used to carry out the go to command and satisfy the object avoidance. Initially
all autonomous movements of the quadcopter will be made by rotating and pitching the vehicle. A set of control loops based on PID controllers (Section 3.2) will be used to control the pitch, yaw and height of the craft by providing pseudo control values to the
MKFC. The MKFC will stabilise the roll of the vehicle. A block diagram of the control
loops is shown in Figure 13. The ‘system’ block refers to the physical movement of the
aircraft. There may be some options to use the control loops on the MKFC to help
implement this requirement. This will be investigated further.
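As an illustration of the yaw loop in Figure 13(a), a minimal sketch in C is given below; it shows only a proportional term with the heading error wrapped into the range -180° to +180°, and the gain and output limits are placeholder assumptions (the full PID form of Section 3.2 would be used in practice).

/* Wrap the heading error and convert it into a pseudo yaw stick value. */
float yaw_command(float target_heading_deg, float compass_heading_deg)
{
    const float kp = 2.0f;                      /* placeholder gain */
    float error = target_heading_deg - compass_heading_deg;

    while (error > 180.0f)  error -= 360.0f;    /* take the shorter way round */
    while (error < -180.0f) error += 360.0f;

    float out = kp * error;
    if (out > 100.0f)  out = 100.0f;            /* saturate the stick value */
    if (out < -100.0f) out = -100.0f;
    return out;
}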
4.2.4 Requirement 5 Disengage Manual Control and Control Quadcopter in Autonomous Mode
The MKFC has two sets of input command variables for thrust, yaw, pitch and roll: ‘externalStick’ and ‘stick’. The ‘externalStick’ set of variables will be used for autonomous commands and will be set to the values sent over the SPI channel from the NAVPRO.
There will be two methods of engaging autonomous control, through the manual RF
transmitter or via the advanced communications link (see Section 4.3). The software
on the MKFC will be modified to let one of the RF PPM channels toggle autonomous
mode and select whether to use primary or secondary control commands. The MKFC
will then inform the NAVPRO of the mode it is in. An alternative will be
via the COMPRO. When the COMPRO sends a signal to engage autonomous mode the
NAVPRO will contact the MKFC to disengage manual control, hold position and wait
for further instructions.
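A minimal sketch in C of how the command source could be selected is given below; the channel index, threshold and variable names (mirroring the ‘stick’ and ‘externalStick’ sets described above) are assumptions made for illustration and do not reflect the actual MikroKopter source code.

/* Illustrative command-source selection (not the real MikroKopter code).
   ppm_channels[] holds the decoded receiver channels, stick[] the manual
   demands and external_stick[] the values received over SPI from the
   NAVPRO. Index order: 0 = thrust, 1 = yaw, 2 = pitch, 3 = roll. */
#define AUTONOMY_CHANNEL   5      /* assumed spare channel on the transmitter */
#define AUTONOMY_THRESHOLD 128    /* assumed mid-point of the channel range   */

extern int ppm_channels[8];
extern int stick[4];
extern int external_stick[4];

void select_command_source(int active_command[4], int *autonomous_flag)
{
    *autonomous_flag = (ppm_channels[AUTONOMY_CHANNEL] > AUTONOMY_THRESHOLD);

    for (int i = 0; i < 4; i++)
        active_command[i] = *autonomous_flag ? external_stick[i] : stick[i];
}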
Figure 13: Control loops used to achieve autonomous flight. (a) yaw: controls the direction of the vehicle. (b) pitch: controls the horizontal movement. (c) throttle: controls the height of the vehicle.
4.2.5 Requirement 6 Monitor Current Situation and Prepare for Autonomous
Control
4.3 Communications
The design of the communications section of the project is in the form shown in Figure 14
and 15.
The methodology is such that the data captured by the camera is passed to the microcontroller over its serial output and stored within the microcontroller. A large amount of RAM is required to store the data from the camera due to the resolution and the number of frames produced. Once the data has been compressed, it is output through another serial connection to the GPRS modem or WiFi module for wireless transmission.
When navigation data is sent from the base station, the quadcopter has to receive this information and move appropriately. The data received will be a series of bytes that make up a packet; this packet will then be broken up to obtain the navigation data. The navigation data will then be transmitted over the I2C bus to another Coldfire board and processed to determine what movements the user requests from the quadcopter.
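A minimal sketch in C of unpacking a navigation command from the received bytes and forwarding it over the I2C bus is given below; the packet layout, the slave address and the i2c_write() helper are assumptions made for illustration, since the SUAVE packet format has not yet been fixed.

#include <stdint.h>

/* Assumed packet layout: [0xAA][cmd][lat_hi][lat_lo][lon_hi][lon_lo]
   [alt_hi][alt_lo][checksum], nine bytes in total. */
#define PACKET_LEN  9
#define START_BYTE  0xAA

/* i2c_write() is assumed to be provided by the Coldfire peripheral code. */
extern int i2c_write(uint8_t address, const uint8_t *data, uint8_t len);
#define NAV_I2C_ADDRESS 0x20   /* placeholder address of the navigation board */

/* Validate a received packet and forward the navigation payload.
   Returns 0 on success, -1 on a malformed packet. */
int forward_navigation_packet(const uint8_t packet[PACKET_LEN])
{
    uint8_t sum = 0;
    for (int i = 0; i < PACKET_LEN - 1; i++)
        sum += packet[i];

    if (packet[0] != START_BYTE || sum != packet[PACKET_LEN - 1])
        return -1;                           /* bad framing or checksum */

    /* Strip the start byte and checksum; send the command and payload. */
    return i2c_write(NAV_I2C_ADDRESS, &packet[1], PACKET_LEN - 2);
}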
4.4 Object Detection
The question arises as to how often objects must be searched for and how often the results of the searches are returned to the Navigation module. Consideration must be taken of the maximum speed and stopping distance of the quadcopter, and the range of the sensors. For example, if the quadcopter is flying at 5 m s−1 and the sensors have a range of 1 m, then searching must be performed at least 5 times a second in the direction of
travel. Alternatively, continuous searching may be performed and the results are com-
municated to the Navigation module at least 5 times a second. These calculations take
no consideration of the stopping distance, so the refresh rate may in fact have to be higher.
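A short sketch in C of this refresh-rate calculation, extended with a stopping-distance term as suggested above, is given below; the braking model and the example figures are illustrative assumptions.

#include <stdio.h>

/* Minimum object-detection refresh rate: the vehicle must detect an object
   and stop within the sensor range, so only (range - stopping distance) of
   travel may occur between searches. */
double min_refresh_hz(double speed_ms, double sensor_range_m, double stopping_dist_m)
{
    double usable_range = sensor_range_m - stopping_dist_m;
    if (usable_range <= 0.0)
        return -1.0;   /* the vehicle cannot stop within the sensor range */
    return speed_ms / usable_range;
}

int main(void)
{
    /* The example from the text: 5 m/s with a 1 m range and no stopping
       distance gives 5 searches per second; allowing 0.5 m in which to
       stop doubles the required rate. */
    printf("%.1f Hz\n", min_refresh_hz(5.0, 1.0, 0.0));   /*  5.0 Hz */
    printf("%.1f Hz\n", min_refresh_hz(5.0, 1.0, 0.5));   /* 10.0 Hz */
    return 0;
}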
If the decision is made to only communicate when objects are detected, a busy communications bus (assuming a shared bus – the magnetometer uses I2C and the GPS module uses an asynchronous serial link) may delay the transmission. This delay
could prove disastrous for the vehicle. It is due to this problem that continuous searching
will be used for this application and that the Navigation Module will periodically contact
the Object Detection System asking for a status update.
As mentioned in Section 3.4, the infrared and ultrasound sensors can provide variable outputs (be that voltage or pulse width); however, finding a microcontroller with enough ADC channels and PWM capture inputs for every sensor would prove problematic. To overcome this, one pin could be used as an input, i.e. either a PWM capture pin or a serial connection to an ADC, with each sensor enabled in turn using separate pins. This has the advantage of allowing a microcontroller with a lower specification, but removes the ability to sense continuously in all directions. If, however, distance switching sensors are used, a single input pin per sensor is sufficient.
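A minimal sketch in C of the shared-input scheme described above is given below, enabling one sensor at a time on a single ADC input; the enable_sensor(), disable_sensor(), read_adc() and delay_ms() helpers are assumed to be provided by the chosen microcontroller's support code.

#include <stdint.h>

#define NUM_SENSORS 4   /* e.g. front, rear, left and right facing */

/* Placeholder hardware helpers assumed to exist elsewhere. */
extern void     enable_sensor(uint8_t index);    /* drive the sensor's enable pin */
extern void     disable_sensor(uint8_t index);
extern uint16_t read_adc(uint8_t channel);       /* shared analogue input */
extern void     delay_ms(uint16_t ms);

static uint16_t reading[NUM_SENSORS];

/* Poll each sensor in turn on the single shared input. Called from the main
   loop often enough to meet the refresh rate derived above. */
void poll_sensors(void)
{
    for (uint8_t i = 0; i < NUM_SENSORS; i++) {
        enable_sensor(i);
        delay_ms(2);                 /* allow the sensor output to settle */
        reading[i] = read_adc(0);    /* all sensors share ADC channel 0   */
        disable_sensor(i);
    }
}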
5 Equipment
5.1 Flight Control
As the flight controller is key to the ability to perform stable, autonomous flight, a sub-
stantial amount of time was spent deciding whether to build a flight controller or buy an
off-the-shelf model.
The advantage of building a flight controller is the flexibility to meet the exact spec-
ifications of the project. The disadvantage is the substantial amount of time required
to design and construct the flight controller which could be spent on creating advanced
navigation functionality. For this reason, several off-the-shelf flight controllers were con-
sidered using the following factors:
• Ease of integration and modification: number and types of input-output ports, reprogrammability, open source availability.
• Cost
The UAVP [13] is an international open source project to create a reasonably priced
quadcopter. The flight controller is called Wolferl. Wolferl utilizes three gyros, a 3-
axis linear accelerometer, a compass sensor and a barometric sensor. The addition of
a compass sensor is an advantage over competing flight controllers but there are some
disadvantages:
• A populated and tested “Universal Aerial Video Platform” V3.1x Flight Controller Board can be purchased for £265[14], which is expensive.
• An 8-bit PIC16F876 is the processor for the unit and is already heavily utilised, which would make it difficult to add functionality to the flight controller.
Paparazzi
Paparazzi is an open source hardware and software platform for autonomous aircraft [15].
It is designed for airplanes but there is evidence that it has been adapted for a quadcopter
[16]. The paparazzi control board is called Tiny [17] and incorporates a GPS module and
a powerful LPC2148 microcontroller. It is available preassembled with GPS module but
no gyros, accelerometers and barometric sensors (IMU) for £160 [18]. To use the Tiny,
an IMU will be required which is likely to raise the total price to an unacceptable level.
The GPS integration is advantageous but there is little example software for quadcopters
available, which means the software would have to be written from scratch. The high price and the lack of skeleton software made Paparazzi unattractive.
MikroKopter
The MikroKopter flight controller (MKFC) [19] is an open source flight controller
which is part of the MikroKopter project. The MikroKopter project includes frame
designs, a flight controller, electronic speed controllers, a magnetometer and a GPS based navigation board, although the latter two are not open source. The MKFC was selected for the
SUAVE project for the following reasons:
• The MKFC is an open source software project which allows the code to be changed
and adapted to suit the needs of this project.
• A high quality software application is available (Flight Tool) for debugging, testing
and changing the flight parameters and settings.
• A strong and enthusiastic community exists [20] which helped ensure the MKFC
met our specifications.
• The MKFC has several synchronous and asynchronous input-output ports which
allow interfacing with external hardware.
• The MKFC costs £170 [21] which is cheaper than other flight controllers especially
considering the high performance evident from various videos and user comments
on the internet [22].
5.2 Navigation System Equipment
5.2.1 Central Processing Unit
A central processing unit is required to perform the navigation calculations and control the
navigation peripherals as detailed in Section 4.2. The Freescale Coldfire MCF5282CVF66 microcontroller (MCU) was selected for the following reasons:
• The MCU has a similar specification to the LPC2148 used in the Tiny flight con-
troller which performs autonomous navigation (Section 5.1):
– 32 bit
– 66/80 MHz clock speed
– 512 kB ROM
– 64 kB RAM
• The MCU also has a number of input-output ports which are required for interfacing with the other systems and peripherals. These are: 1x SPI (synchronous serial port), 3x SCI (asynchronous serial port) and 1x I2C (Inter-Integrated Circuit) port.
5.2.2 GPS Module
Several GPS modules were considered, including those used in other autonomous vehicle projects such as the LEA-4P [16], but these proved prohibitively expensive despite good performance. After researching alternatives, the RF Solutions GPS-320FW was selected [23].
The GPS-320FW was chosen due to its excellent specification, integrated ceramic antenna and asynchronous TTL level serial interface, despite its low price (£24) [24]. It features -155 dBm tracking sensitivity, 5 m CEP accuracy (positional error) and a 1 Hz update rate, which make it suitable for the SUAVE project.
5.2.3 Magnetometer
The CMPS-03 magnetic compass [25] was selected as it was used successfully in a robotics project last year[26]. It can be interfaced with using I2C or by capturing and decoding a pulse width modulated signal, which creates flexibility in the system. The module also preprocesses its data and outputs the current heading as a number of degrees from north (0.0° to 359.9°), which simplifies the coding required on the navigation processor.
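A minimal sketch in C of reading the heading over I2C is given below; the register number, bus address and the generic i2c helper functions are assumptions to be checked against the CMPS-03 datasheet and the navigation processor's driver.

#include <stdint.h>

#define CMPS03_ADDR      0xC0   /* assumed 8-bit I2C address of the module     */
#define CMPS03_REG_WORD  2      /* assumed register pair holding 0-3599 tenths */

/* Generic I2C helpers assumed to be provided by the microcontroller's
   peripheral library. Both return 0 on success. */
extern int i2c_write(uint8_t addr, const uint8_t *data, uint8_t len);
extern int i2c_read(uint8_t addr, uint8_t *data, uint8_t len);

/* Returns the heading in tenths of a degree (0-3599), or -1 on error. */
int cmps03_heading_tenths(void)
{
    uint8_t reg = CMPS03_REG_WORD;
    uint8_t buf[2];

    if (i2c_write(CMPS03_ADDR, &reg, 1) != 0)   /* set the register pointer */
        return -1;
    if (i2c_read(CMPS03_ADDR, buf, 2) != 0)     /* read high then low byte  */
        return -1;

    return ((int)buf[0] << 8) | buf[1];
}

Dividing the returned value by ten gives the 0.0° to 359.9° heading quoted above.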
5.3 Communications System
The hardware platforms within the project were chosen with the knowledge that they
would be connected together to work as a whole in the final implementation. Once con-
nected the hardware will have the ability to take video (camera), process the information
(Freescale Coldfire MCF5282CVF66) and transmit it wirelessly (GSM modem or GPRS modem).
The choice of processing devices used for the project had to fulfil the requirements of both
the communications and other aspects of the project. The physical requirements of the
communications processing device were that it had significant processing power, a large flash memory, an I2C port, two SCI ports, a TCP/IP stack and a real time operating system[27]. These
are the requirements because receiving video from the camera and transmitting it via
the communications hardware needs to be completed in real time with little delay. The
choices for this were a Freescale Coldfire microcontroller or a Gumstix microcontroller as
they both have high processing power.
The Freescale microcontroller board that will be implemented into the quadcopter
design is a MCF5282CVF66. This is shown in Figure 17.
Figure 17: Freescale Coldfire Microcontroller and Development board
The decision was made to use a Freescale Coldfire microcontroller. Due to the department's strong relationship with Freescale, these boards were already held within the university for use in student projects. The MCF5282CVF66 has an excellent learning and prototyping environment in the form of Freescale CodeWarrior. The choice of microcontroller was also made quick and easy by the Freescale C & I Microselector tool.
The requirement from the GPRS modem is that it has the ability to change the RS232
signal from the microcontroller into a wireless transmission in the format required and
vice versa. The data rates involved have to be high enough to be able to handle a constant
transmission of video data, they should be able to handle RS232 data, and they should
be as small and light as possible.
The GPRS modems that will be used are Fargo Maestro-100 units, as these fit the purpose required for the project and have already been bought by the university. The modems that will be used are shown in Figure 18.
The Fargo Maestro-100 has the following features[30]:
• SimToolKit Class 2.
• AT command set.
• SIM Holder.
• External Antenna.
The WiFi module that will be used is a S103 WLAN Compact Serial Module as this
is small, light and should automatically do all the conversions required to change the
RS232 output from the microcontroller to the WiFi data transmission and vice versa.
The module is shown in Figure 19.
The S103 WLAN Compact Serial Module has the following features[31]:
• UDP version configurable baud rate (9600, 38400, 57600, 115200 bps).
• LED status and activity indicators for easy installation, monitoring and diagnostics.
There are numerous cheap cameras and web cams on the market. These are mostly
of the CMOS (Complementary Metal Oxide Semiconductor) and CCD (Charge-Coupled Device) types. This design helps to keep the cost of these cameras low; however, the
majority of these only have a very limited ability to capture video from a distance. A
second problem is that the video data must be compressed so that it may be transmitted
over a low rate data link.
range will be limited.
If a ready-made system, such as the one described above, was purchased, the data rates would be constrained to those chosen by the manufacturer. The solution which offers the most flexibility would be to design a camera board module with our own camera and video codec. This would involve buying a CMOS camera board and implementing a video codec. There are a number of codecs which fit our purpose; these include MPEG4, MJPEG, the Strathclyde Compression Transform (SCT), H.263 and H.264. MPEG4,
H.263 and H.264 are all more complex to implement than MJPEG but offer superior com-
pression. Not enough is known about the SCT and some more investigation will need to
be carried out before a final decision is made on which compression technique will be used.
5.5 Object Detection
Use of an imaging device is, by default, a far more complex option; however, it can
return much more usable results and can allow the navigation module to be more efficient
in its evasive actions. When an object is encountered and the vehicle starts to rotate to
avoid the object, constant information can be passed to the navigation module to inform
it if the object has been cleared. The use of imaging is, however, quite limited: a single camera requires a substantial amount of processing power (compared to ultrasound or infrared sensors), and object detection is restricted to one direction, which means the quadcopter must rotate if it is to see potentially viable routes. It is for these reasons that distance sensors will be used for object detection.
Figure 20: Polar response of a Parallax Ping Sensor, tested with a 12” square piece of
white card[33]
During development, a Microchip PIC microcontroller will be used for the Object Detection System, so as to minimise the complexity of having multiple systems running on the one processor. In order for the Object Detection System to communicate with the Navigation module, the PIC selected must support I2C or RS232. The microcontroller must also have enough pins for multiple inputs; for these reasons, the Microchip PIC16F876 was chosen[34]. A PIC microcontroller has the advantage over a conventional microcontroller development board (like the Freescale Coldfire board being used in the Communications and Navigation modules) that it does not carry any unnecessary components (such as DACs and switches) and has a very low power consumption.
6 Progress to date
This section details the work accomplished as of 17th December 2008. The work accomplished falls into four areas: selection and construction of the quadcopter, selection and testing of navigation equipment, selection of communications equipment, and selection and testing of object detection equipment. Each of these areas is detailed below.
However, several bulkier pieces such as electrolytic capacitors, switches, the baromet-
ric sensor and the header connectors still required soldering. The wiring assembly used
to carry the power and control signals was prepared. All the parts were then connected
together and a successful dry test of the electronics was undertaken. All the separate
components were then brought together and placed on the frame. The propellers were
mounted and a pre-flight check was carried out. Using the MK Tool, the RC transmitter
channels were configured to match the MK receiver and a successful maiden test flight
was made[35]. Figure 22 below shows the MK after construction.
Figure 22: Fully constructed MikroKopter
6.2 Selection of Navigation Equipment and Testing
To ensure the specifications of the GPS unit, magnetometer and microcontroller are suitable to meet the project objectives, several modules were considered, as described in Section 5.2.
6.2.1 Component Testing
Currently only the CMPS-03 magnetometer has been tested. It was tested by interfacing it via I2C with a Softec HCS12E128 starter kit. This starter kit was used due to the availability of I2C on the microcontroller, the author's familiarity with the board and the opportunity to reuse I2C code from a previous project. A Freescale PBMCUSLK development board with an LCD module was used as an output device to display the compass
heading. The MC9S12E128 drove the LCD module via the Serial Peripheral Interface
(SPI). Jump leads were constructed to connect the required pins on the MC9S12E128 to
the CMPS03 and PBMCUSLK. A circuit diagram is shown in Figure 23.
6.3 Communications
When confronted with the challenge of communications between a mobile aerial vehicle
and a base station there are many aspects that need to be considered. The data rate,
range, cost and availability were the main properties that were compared when investi-
gating communication systems. Another challenge is finding a processing device that has
the processing power to take the data in, process it and transmit it via some communi-
cations hardware. All of the processing has to be completed without a high level of lag
as it needs to be transmitted in real time.
After consideration the choice of communications protocol was reduced to WiFi, GSM
(GPRS), HSDPA and RF. The investigation came to the following conclusions:
• High Speed Download Packet Access (HSDPA) is too expensive to implement due
to the costs of the hardware and the monthly subscription costs for the use of 3G
services. As a final implementation rather than a proof of concept this would be
the ideal choice of communications standard due to its high data rates and large
range. The large coverage is due to the wide availability of 3G; it is available in most urban areas, making it ideal for surveillance use.
• GSM (GPRS) is cheap to implement due to the maturity of the technology. The
range of GPRS is large as it works over the cellular network, so it will work anywhere there is a GSM signal; most urban areas have very good signal quality. The data rate of the GPRS Class 10 modems is 16-24 kbps upload and 32-48 kbps download.
In theory this will be large enough to send a decent quality video stream. GPRS
offers the ideal balance of range, cost and availability and will be implemented in
this project.
SUAVE control will initially use RF as this comes as standard on the MikroKopter.
This will be upgraded to WiFi as this is the best choice for a proof of concept and is
easy to demonstrate. Finally, the communications to and from the quadcopter will be implemented in GSM; this is the best choice of standard for the final implementation of the project as it increases the possible applications of the quadcopter.
The output from the sensor was fed into a PIC16F876 microcontroller and if an object
was detected to be within range of the sensor, an LED would light. As yet, no testing
Figure 25: Screen captures of the output pin of the Sharp Distance Sensor
7 Future Work and Goals
7.1 Navigation System
The GPS unit and MKFC will first be tested to make sure they can be interfaced with and that the required data can be sent to and from them. This will be done using the Freescale MC9S12C32, as its development environment is less complex than that of the Freescale MCF5282CVF66.
The control loops specified in Section 4.2.3 will then be created in software on the Freescale MCF5282CVF66 for autonomous flight. These will be implemented one at a time to test their performance, starting with the height, then the yaw and pitch.
The final stage will involve integrating the ODS and COMPRO systems to allow the
quadcopter to avoid objects and go to a GPS position. The control loops will then be
improved to allow stable autonomous flight in any direction.
• Testing the range and effectiveness of the WiFi module using the quadcopter.
• Implement the GPRS modems into the quadcopter design in place of the WiFi
module.
• Test the output of the GPRS modems.
Once these are completed, the communications used within the quadcopter will allow
it to be used for various applications.
Sensors with a larger range (in the order of 1 m) will be sought and tested. Ideally,
Analogue Range sensors will be used, and converted to Distance switches. This is so that
if time is sufficient, the actual distance between the sensor and object can be realised and
can be used to improve the Navigation module’s reaction to objects.
A communications link between the navigation module and the Object Detection
System will also have to be devised; the link will be either Serial or I2 C. To test this, a
simple link will be set up between the main PIC and either another PIC (in the case of
I2 C) or a computer (in the case of Serial).
8 Conclusion
In order to meet the specifications in Section 1.1, a Gantt Chart was created listing the
main tasks required to complete each objective. This can be found in Appendix B. Currently the project is on schedule, and some work streams, such as testing the magnetometer (Section 6.2), are even ahead of schedule.
The overall system architecture and the individual designs of the internal systems have been finalised and equipment choices have been made. All the equipment required to create the autonomous vehicle is now present apart from the camera. Each group member
has planned the work required to implement the design of their individual sections over
the coming semester as shown in the Gantt chart.
The MikroKopter has been constructed and is currently undergoing flight tests to
judge the potential performance of the quadcopter and allow experimentation with the
flight controller parameters. This period will also allow the team to improve their piloting
skills in order to meet the upcoming milestone of manually controlled flight.
Four weeks have been set aside at the end of semester two (6th April 2009 to 3rd May
2009) to allow for the final report to be written and the exhibition at the trade show
event to be planned.
References
[1] BBC, Volcano scientist wins accolade.
http://news.bbc.co.uk/1/hi/scotland/edinburgh and east/7735284.stm
Accessed: 26th November 2008
[10] Mobile Phones UK, GPRS (General Packet Radio Service), HSCSD & EDGE
(2008)
http://www.mobile-phones-uk.org.uk/gprs.htm
Accessed: 12th December 2008
[12] RoboRealm, Obstacle Avoidance (2008)
http://www.roborealm.com/tutorial/Obstacle Avoidance/slide010.php
Accessed: 27th November 2008
[20] Busker, Holger Buss & Ingo, MikroKopter Forum - International Area.
http://forum.mikrokopter.de/forum-17.html
Accessed: 28th November 2008
[22] –, MikroKopter - Von der ersten Schraube bis zum Flug
http://www.mikrokopter.com/ucwiki/VideoAbspielen?id=106
Accessed: 15th November 2008
[26] Ali Alvi et al., Advanced Robot Control. (University of Strathclyde, 2008)
[31] Wireless Products, S103 WLAN Compact Serial Module Datasheet (2007)
A Risk Assessment
Section A
Section B
Section C
Groups who may be at risk: Staff/Students with Special Needs ❒ Inexperienced Staff x
Section D
Assess the foreseeable risks from the hazards identified and state the controls to
be implemented to eliminate or minimise the risk where it is reasonably
practicable to do so.
Give priority to those risks which affect large numbers of people and/or could
result in serious harm. Apply the following principles, if possible, in the
following order:
This section should be used to specify the scheme of work which should include
details of the significant controls, safety procedures and working methods which
need to be adhered to. It should also refer to existing safety procedures, eg
already contained in the departmental safety regulations or University local
rules. The following should also be considered in the risk assessment process:
Do the control measures - meet the standards set by the legal requirements?
- comply with a recognised industry standard?
- represent good practice?
- reduce risk as far as reasonably practicable?
Flying Vehicle- Vehicle will be flown outdoors and indoors. When flying indoors the
room will be closed to the general public. During outdoor testing an open area that is
closed to the general public will be sought. Good weather conditions are required i.e.
no wind or rain. Public liability insurance will be taken out
Fire/ Fumes- Lithium polymer batteries are being used; these are dangerous if
handled incorrectly. Manufacturer’s guidelines regarding recharging will be adhered
to. A balanced charger with a timer cut-off has been purchased. Non-flammable
surface is required (Pyrex bowl with sand). Fire extinguishers of class B and C are at
the end of the hall if needed. The battery will be disposed of if damaged.
Rotating Propellers- All people will be kept away from the vehicle once the motors have been started. Safety goggles will be worn when the vehicle is being
tested. If required a guard will be in place over the propellers. The transmitter
instructions for activating the vehicle will be adhered to.
Section E
Accreditation
Assessor:
Supervisor:
1. 2. 3.
Note:
1. The completed risk assessment form must be kept by the assessor and supervisor with a
copy given to Departmental Safety Convener.
2. The assessment must be reviewed and where appropriate revised if there is reason to
suspect it is no longer valid or there has been a significant change to the task procedure.
3. The responsibility for ensuring that the risk assessment is suitable and sufficient
lies with the Supervisor not with the Departmental Safety Convener.
4. A copy of this assessment must be retained by the Department for as long as is
relevant (in case of the form being used as a COSHH risk assessment this must be kept
for 30 years and in some cases 40 years after the project or works are terminated).
B Gantt Chart
Official Project Milestones & Paperwork 08/11/2008 03/05/2009
Write Final Report 06/04/2009 03/05/2009
Prepare Interim Presentation 01/02/2009 09/02/2009
Write Interim Report 08/12/2008 21/12/2008
Statement of Intent 08/11/2008 08/11/2008
Final Report & Presentation 03/05/2009 03/05/2009
Individual Interviews 14/11/2008 14/11/2008
Individual Interviews 19/02/2009 19/02/2009
Interim Presentation 12/02/2009 12/02/2009
Interim Report 21/12/2008 21/12/2008
Subsystem Development 03/11/2008 06/04/2009
Video System 08/12/2008 06/04/2009
Investigation of Video Codec 08/12/2008 22/12/2008
Video Codec Implementation on Board 22/12/2008 02/02/2009
Video Codec Implementation at the Groun 05/01/2009 16/02/2009
Integration with Communications System 22/12/2008 06/04/2009
Skeleton Vehicle 03/11/2008 21/12/2008
Order Parts 09/11/2008 09/11/2008
Quadcopter Purchase Decision 03/11/2008 17/11/2008
Vehicle Construction and Test Flights 16/11/2008 08/12/2008
Tuning for manually controlled stable fligh 08/12/2008 21/12/2008
Manually Controlled Stable Flight 21/12/2008 21/12/2008
Object Detection System 11/11/2008 06/04/2009
Investigate Options 11/11/2008 21/01/2009
Simple test of IK 01/12/2008 19/12/2008
Test of Ultrasound 26/01/2009 11/02/2009
Test of IR 26/01/2009 11/02/2009
Choice of Sensor 11/02/2009 11/02/2009
I2C/Serial Link 19/01/2009 30/01/2009
I2C Finished 30/01/2009 30/01/2009
Integration with Navigation 11/02/2009 06/04/2009
Basic Detection 13/03/2009 13/03/2009
Autonomous Flight and Navigation 22/12/2008 05/04/2009
GPS & Magnetometer Testing & Integration 22/12/2008 02/01/2009
Design and Implement Control Algorithms 04/01/2009 20/02/2009
Integrate Object Detection and Communic 15/02/2009 06/03/2009
Achieve Autonomous Flight Between Two 30/03/2009 30/03/2009
Design and Implement Navigation Algorith 23/02/2009 05/04/2009
Achieve Stable Autonomous Hover 20/02/2009 20/02/2009
Bi-directional Data Communication 11/11/2008 06/04/2009
Communication Standards Comparison 11/11/2008 17/11/2008
WiFi Hardware Comparison 17/11/2008 01/12/2008
GSM Hardware Comparison 17/11/2008 01/12/2008
Microcontroller Decision 03/12/2008 12/12/2008
Exams 19/12/2008 25/01/2009
WiFi Implementation 12/12/2008 08/02/2009
Testing the output from the WiFi modul 12/12/2008 19/12/2008
Connecting the WiFi module wirelessly t 25/01/2009 27/01/2009
Obtain the navigation data from the pac 27/01/2009 02/02/2009
Pass the navigation data though the IIC 02/02/2009 03/02/2009
Testing the range and reliability of the 03/02/2009 08/02/2009
Implement an override to stop autonom 03/02/2009 08/02/2009
GPRS Implementation 09/02/2009 06/04/2009
Testing the output from the GPRS mod 09/02/2009 16/02/2009
Transmission from the modems 16/02/2009 26/02/2009
Reception to the modems 26/02/2009 06/03/2009
Obtain navigation data from the packets 06/03/2009 20/03/2009
Pass the navigation data though the IIC 20/03/2009 27/03/2009
Testing the range and reliability of the G 27/03/2009 03/04/2009
Implement an override to stop autonom 04/04/2009 06/04/2009