Line Following Robot Guide
M. Zafri Baharuddin1, Izham Z. Abidin1, S. Sulaiman Kaja Mohideen1,
Yap Keem Siah1, Jeffrey Tan Too Chuan2
1 Department of Electrical Engineering
2 Department of Mechanical Engineering
Universiti Tenaga Nasional, Km7, Jalan Kajang-Puchong, 43009 Kajang, Selangor, Malaysia
[email protected]
ABSTRACT
Navigation is important to many envisioned applications of mobile robots. Navigation tools range from expensive, high accuracy devices to cheap, low accuracy ones, and the complexity of these tools depends on the navigation requirements: the more complex the requirements, the more expensive the tools required. A cheap and simple navigation tool is the line following sensor. However, the challenges posed by this navigation technique can still be considerable. A straight or wavy line is simple to navigate, whereas T-junctions, 90 degree bends and grid junctions are difficult to negotiate. This is due to physical kinematic constraints arising from the motor response, the position and the turning radius of the robot. This paper presents a proposed line sensor configuration to improve the navigation reliability of a differential drive line following robot.
1. INTRODUCTION
A line follower robot is basically a robot designed to follow a 'line' or path predetermined by the user. This line or path may be as simple as a physical white line on the floor, or as complex as path marking schemes such as embedded lines, magnetic markers and laser guide markers. In order to detect these specific markers or 'lines', various sensing schemes can be employed [1]. These schemes vary from simple low cost line sensing circuits to expensive vision systems. The choice of scheme depends on the sensing accuracy and flexibility required. From the industrial point of view, line following robots have been implemented in semi- to fully-autonomous plants. In this environment, these robots function as material carriers, delivering products from one manufacturing point to another where rail, conveyor and gantry solutions are not possible [1].
Apart from line following capabilities, these robots should also be able to navigate junctions and decide which junctions to turn at and which to ignore. This requires the robot to have 90 degree turning and junction counting capabilities. To add to the complexity of the problem, sensor positioning also plays a role in optimizing the robot's performance for the tasks mentioned earlier [2].
This paper presents a simple set of experiments on sensor positioning and a control strategy that enable junction counting and accurate 90 degree turns. These strategies are tested on a basic test robot base (refer to figure 1) on a test pitch based upon the ROBOCON 2006 [3] requirements, as shown in figure 2.
Figure 1: Advanced Line Follower (ALF) robot
Figure 2: ROBOCON 2006 pitch requirement
The supplied motor card can control the basic movements of the motor via logic signals, which makes it very versatile for microcontroller control applications. The controller card is also equipped with speed feedback and speed control facilities, which rely on voltage signals given to the motor card. Apart from that, the motor card also has a built-in, manually controlled speed selector.
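As an illustration only, the following C sketch shows how a microcontroller might drive such a card through two direction logic signals and a speed reference. The pin names, the PWM-approximated speed signal and the gpio_write()/pwm_set_duty() helpers are assumptions for illustration; the actual card interface is not detailed in this paper.

/* Minimal sketch of driving the DC motor card through logic signals.
 * Pin names and helpers are assumptions, not the actual card interface. */
#include <stdint.h>

typedef enum { MOTOR_STOP, MOTOR_FORWARD, MOTOR_REVERSE } motor_dir_t;

/* Assumed hardware hooks: write a digital output pin, set a PWM duty cycle. */
extern void gpio_write(uint8_t pin, uint8_t level);
extern void pwm_set_duty(uint8_t channel, uint8_t percent);

#define LEFT_IN1  0   /* hypothetical direction pins for the left motor */
#define LEFT_IN2  1
#define LEFT_PWM  0   /* hypothetical speed (voltage reference) channel */

static void left_motor(motor_dir_t dir, uint8_t speed_percent)
{
    /* Direction is selected by two logic signals; speed by the voltage
     * reference (approximated here with a PWM duty cycle). */
    gpio_write(LEFT_IN1, dir == MOTOR_FORWARD);
    gpio_write(LEFT_IN2, dir == MOTOR_REVERSE);
    pwm_set_duty(LEFT_PWM, (dir == MOTOR_STOP) ? 0 : speed_percent);
}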
Figure 3: The Master controller with the DC motor card connected
In terms of the colour chosen for the LED, the literature [4] shows that red is the best choice due to its high spectral intensity compared to other colours. The basic line sensor circuit is shown in figure 4.
Based upon figure 4, the output of the sensor circuit is either logic high or logic low, depending on whether the sensor senses a line or not. However, the circuit shown in figure 4 requires the user to repeatedly retune it whenever the ambient light conditions change. Apart from that, the threshold between line and no line is so fine that different ambient light conditions affect its performance [2].
However, for the ALF robot, a new breed of tunable sensors was introduced, based upon direct sensor reading. This type of sensor requires a microcontroller (PIC16F688) to perform onboard processing, after which the result is conveyed to the master controller. This facility improves the flexibility of the sensor if it is applied to a different controller in the future. Apart from that, it reduces the computing burden on the master controller. Figure 3 shows the actual circuit used for the ALF robot.
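To make this division of work concrete, the following C sketch illustrates the kind of per-sensor thresholding a slave sensor controller such as the PIC16F688 could perform before reporting to the master. The adc_read() and send_to_master() helpers, the channel count and the comparison direction are assumptions for illustration, not the actual ALF firmware.

/* Sketch of slave-side sensor processing: compare each analogue reading
 * against a tuned threshold and report one bit per sensor to the master. */
#include <stdint.h>

#define NUM_SENSORS 4                           /* SLL, SL, SR, SRR (assumed) */

extern uint16_t adc_read(uint8_t channel);      /* assumed ADC helper         */
extern void     send_to_master(uint8_t bits);   /* assumed link to master     */

static uint16_t threshold[NUM_SENSORS];         /* filled by auto tuning      */

void sample_and_report(void)
{
    uint8_t bits = 0;

    for (uint8_t i = 0; i < NUM_SENSORS; i++) {
        /* Reading below the tuned threshold is treated here as "line seen";
         * the actual sense depends on the sensor circuit. */
        if (adc_read(i) < threshold[i])
            bits |= (uint8_t)(1u << i);
    }
    send_to_master(bits);                       /* 1 = line detected */
}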
Based upon figure 5, the navigational strategy can be divided into several blocks: the line following algorithm, the junction detection and counting algorithm, and the 90 degree turn algorithm. Each of these blocks was designed in modular form in order to optimize memory utilisation. The following outlines the details of the blocks shown in figure 5.
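As a rough illustration of this modular arrangement, the following C sketch organises the three blocks as a simple state machine. The state names and step functions are hypothetical placeholders that only mirror the structure described above, not the authors' firmware.

/* Illustrative arrangement of the navigation blocks as a state machine. */
typedef enum {
    STATE_LINE_FOLLOW,
    STATE_JUNCTION,        /* junction detection and counting */
    STATE_TURN_90
} nav_state_t;

/* Each step function runs one iteration of its block and returns
 * the next state to execute (placeholders for the later sections). */
extern nav_state_t line_follow_step(void);
extern nav_state_t junction_step(void);
extern nav_state_t turn_90_step(void);

void navigation_loop(void)
{
    nav_state_t state = STATE_LINE_FOLLOW;

    for (;;) {
        switch (state) {
        case STATE_LINE_FOLLOW: state = line_follow_step(); break;
        case STATE_JUNCTION:    state = junction_step();    break;
        case STATE_TURN_90:     state = turn_90_step();     break;
        }
    }
}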
Figure 6: Line following algorithm flow chart
Referring to figure 6, the direction of the robot is determined by the status of the sensors mentioned earlier. Based upon which sensor does not detect the line, a corresponding corrective action signal is sent to the motor card. This corrective action continues until both sensors detect the line again. The corresponding corrective action signals are shown in figure 6.
In the event that both sensors fail to detect a line, the robot stops and reverses until either one of the sensors detects a line, after which the line following algorithm is activated again. By incorporating this idea into the line following algorithm, ALF will never miss a line.
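A minimal C sketch of this behaviour is shown below, assuming hypothetical sensor and motor helper functions. The steering sense (correct toward the side whose sensor still sees the line) is an assumption made for illustration, since the exact mapping in figure 6 is not reproduced in the text.

/* Sketch of the line following logic: steer away from the side that lost
 * the line, and reverse when both sensors miss it. Helpers are assumed. */
#include <stdbool.h>

extern bool sensor_left(void);    /* SL: true when the line is seen */
extern bool sensor_right(void);   /* SR: true when the line is seen */
extern void drive_forward(void);
extern void steer_left(void);
extern void steer_right(void);
extern void reverse(void);

void line_follow_action(void)
{
    bool sl = sensor_left();
    bool sr = sensor_right();

    if (sl && sr) {
        drive_forward();   /* both sensors on the line: go straight */
    } else if (!sl && sr) {
        steer_right();     /* left sensor lost the line: correct right */
    } else if (sl && !sr) {
        steer_left();      /* right sensor lost the line: correct left */
    } else {
        reverse();         /* line lost: back up until a sensor reacquires it */
    }
}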
Figure 7: Junction detection Algorithm
Based upon figures 7 and 8, ALF requires two sensors, SLL and SRR, to detect and count the junctions. It is worth noting that junction detection and counting require both of these sensors to see the junction at the same time; if only one of these sensors sees the junction, the controller will not register it as a junction. Hence the placement of both of these sensors is critical. This issue is elaborated in Section 5.2 of this paper.
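The requirement that both outer sensors see the junction simultaneously can be sketched as follows. The helper functions and the edge-detection flag (added so that one junction is not counted twice) are assumptions for illustration.

/* Sketch of junction counting: register a junction only when SLL and SRR
 * both see the cross line at the same time. */
#include <stdbool.h>
#include <stdint.h>

extern bool sensor_outer_left(void);   /* SLL */
extern bool sensor_outer_right(void);  /* SRR */

static uint8_t junction_count = 0;
static bool    on_junction    = false;

/* Returns true when the requested junction number has been reached. */
bool junction_counter_step(uint8_t target_junction)
{
    bool both = sensor_outer_left() && sensor_outer_right();

    if (both && !on_junction) {
        on_junction = true;            /* rising edge: new junction found */
        junction_count++;
    } else if (!both) {
        on_junction = false;           /* cleared the junction */
    }
    return junction_count >= target_junction;
}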
Referring to figure 9, the success of this algorithm depends upon the sensor placement and on the response of the sensors when detecting the last junction. To avoid any overshoot, the robot stops first before executing the 90-degree turn algorithm. Based upon experimental observation, this strategy yields an accurate turning sequence with less post-turn line correction. The details of the sensor placement are elaborated in Section 5.3 of this paper.
5. EXPERIMENTAL RESULTS AND DISCUSSION
Based upon the pitch requirements, a basic two-wheeled mobile robot with advanced line follower (ALF) capabilities was constructed as shown in figure 10. This robot is equipped with a line sensing array with an embedded microcontroller to enable auto tuning and line-surface differentiation.
5.1 Auto tuning of sensor array
Figure 11 illustrates the line sensor array used in ALF. The auto tuning function flow chart of
the sensor array is as shown in figure 10.
[Auto tuning flow chart: tuning may be skipped; otherwise the analogue values of all sensors are read over the bare surface, then over the line.]
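One possible reading of this flow chart, expressed as a C sketch, is shown below: sample every sensor over the plain surface, then over the line, and take the midpoint as that sensor's threshold. The adc_read() helper, the channel count and the midpoint rule are assumptions for illustration; the real ALF firmware interface is not given in the paper.

/* Sketch of auto tuning: per-sensor threshold from surface and line samples. */
#include <stdint.h>

#define NUM_SENSORS 4

extern uint16_t adc_read(uint8_t channel);   /* assumed ADC helper */
extern uint16_t threshold[NUM_SENSORS];      /* used by the line sensing code */

void auto_tune(void)
{
    uint16_t surface[NUM_SENSORS];

    /* Step 1: the array is placed over the bare surface and sampled. */
    for (uint8_t i = 0; i < NUM_SENSORS; i++)
        surface[i] = adc_read(i);

    /* Step 2: the array is placed over the line; the midpoint between the
     * two readings becomes that sensor's threshold. */
    for (uint8_t i = 0; i < NUM_SENSORS; i++)
        threshold[i] = (uint16_t)((surface[i] + adc_read(i)) / 2);
}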
ALF's basic control strategy is based upon a hierarchical control concept in which the robot has a master controller that functions as the main coordinator [6]. This master controller requires the services of slave controllers to perform detailed functions for specific requirements, e.g. line sensing and motor control. The advantage of this method is that distributed processing is possible, which reduces the computational burden on the master controller. The modules used for ALF are the line sensor array and the motor controller, as shown in figure 11 and figure 12 respectively.
Figure 11: Auto tuning sensor array
Figure 12: Top view of ALF
5.2 Sensor Array Design
In order to achieve successful navigation, the number of sensors used and their locations play an important role. An inadequate number of sensors results in reduced sensor resolution and could even prevent the robot from following a line. Based upon the test pitch requirements discussed earlier, it was deduced that the sensor array requires more than two sensors. To enable efficient line following, a minimum of two sensors is required (refer to figure 13(c)). However, two sensors are not enough to distinguish a single line from a junction; hence various possible arrangements were considered. These are shown in figure 13.
Figure 13: Different types of sensor array (a) Matrix type sensor array, (b) Single line sensor
array, (c) Basic two-sensor configuration
Referring to figures 13(a) and 13(b), the matrix and single-line configurations are possible solutions for junction detection. However, considering the matrix configuration in relation to ALF's movement on the test pitch, this configuration uses too many sensors. For example, there are eight sensors located on top of the vertical line, which is superfluous for line navigation purposes. In addition, there are four sensors arranged vertically at the extreme left and right of the sensor array, which in this case are supposed to function as junction detectors. This is also considered excessive for junction detection.
Looking at the single-line sensor array (figure 13(b) and figure 15), it utilizes the two middle sensors, SR and SL, for line navigation, and the two outer sensors, SRR and SLL, for junction detection. Comparing the matrix and the single-line sensor arrays, the single-line array is more suitable because it uses fewer sensors while still being able to perform line navigation and junction detection.
In terms of the two outer sensors, care must be taken not to place them too far from or too near to the line navigation sensors. If they are placed too far from the navigational sensors, the robot may fail to recognise a junction if it enters it at an awkward angle. If they are placed too near, the junction sensors may accidentally detect the line itself rather than the junction. This is illustrated in figure 14.
[Figure 14: outer sensor placement relative to the line; (b) shows the critical entry angle θC]
In order to avoid the sensor positioning dilemma mentioned earlier, inequality (1) may be used to determine the required spacing between the sensors.
w < S_{RR} - S_{LL} < \frac{w}{\tan\theta_C}          (1)
where S_{RR} - S_{LL} is the distance between the leftmost and rightmost sensors in mm, w is the width of the line being detected in mm, and θC is the critical entry angle of the robot. Please refer to figure 14(b) for the angle θC.
The width of the line is set by the ROBOCON pitch requirements [3] to be 30 mm. Setting the distance between the outermost sensors to 85 mm and rearranging inequality (1) to calculate the critical entry angle:
\theta_C < \tan^{-1}\left(\frac{w}{S_{RR} - S_{LL}}\right)          (2)
This yields a critical angle θC of 19.44 degrees. If the robot attempts to enter the line at an angle above this value, it will not be able to detect the junction.
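For clarity, the quoted value follows directly from evaluating inequality (2) with w = 30 mm and S_{RR} - S_{LL} = 85 mm:

\theta_C < \tan^{-1}\left(\frac{w}{S_{RR} - S_{LL}}\right) = \tan^{-1}\left(\frac{30\ \mathrm{mm}}{85\ \mathrm{mm}}\right) \approx 19.44^{\circ}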
Figure 15: Front and bottom view of sensor array used in this study
5.3 Sensor Placement for 90 Degree Turns
Two turning methods were tested, as explained in the following subsections:
5.3.1 One motor moving forward and the other motor stationary (Method I)
Referring to figure 16, this method requires the sensors to be placed at half the distance between the left and right wheels (the l/2 dimension in figure 16, where l is the wheel-to-wheel distance), measured along the direction of motion. At this placement, when the robot turns into the junction it lands exactly on the new line, and the sensor array is perpendicular to the line after the 90-degree turn is completed. This ensures a smooth transition from one line to another when turning at junctions, as illustrated in figure 17.
Figure 16: Location of sensors for 90 degree turns
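A minimal C sketch of Method I is shown below: stop, pivot about the stationary wheel by driving only the other motor, and resume once the navigation sensors sit on the new line. The helper names, and the step of first rotating clear of the current line before looking for the new one, are assumptions for illustration.

/* Sketch of Method I: one motor moving forward, the other stationary. */
#include <stdbool.h>

extern bool sensor_left(void);             /* SL: true when the line is seen */
extern bool sensor_right(void);            /* SR: true when the line is seen */
extern void stop_motors(void);
extern void drive_left_motor_only(void);   /* right wheel stationary */
extern void drive_right_motor_only(void);  /* left wheel stationary  */

void turn_90_method_one(bool turn_right)
{
    stop_motors();                         /* stop first to avoid overshoot */

    /* Pivot turn: one motor forward, the other stationary. */
    if (turn_right)
        drive_left_motor_only();
    else
        drive_right_motor_only();

    /* First rotate off the line the robot is currently on... */
    while (sensor_left() || sensor_right())
        ;
    /* ...then keep rotating until both navigation sensors sit on the new line. */
    while (!(sensor_left() && sensor_right()))
        ;

    stop_motors();                         /* hand control back to line following */
}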
5.3.3 Overall experiment observation
In general, ALF was tested using all the navigational strategies discussed in this paper. Observations made for every proposed strategy show that the robot is capable of navigating the line with no difficulty. Introducing ambient lighting to the test pitch does not affect ALF's line following capability. The same can be said of the junction navigation algorithms.
However, if the same strategies are employed on the same robot base at a much higher robot velocity, the robot has difficulties with line and junction navigation. This is due to a compound problem consisting of the sensor position and the speed control during navigational correction. With the current configuration, any navigational correction at high speed causes the sensors to overshoot the desired position. This problem may be solved by implementing a few optional strategies. One of these is to incorporate speed control into the navigational algorithm. Apart from that, much finer navigation can be achieved if a second sensor array is placed further ahead of the existing one. Both of these options must be implemented together to improve line following capabilities at high robot velocities.
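One possible form of the speed control option is sketched below: reduce the commanded speed whenever a corrective steering action is active, so the sensors do not overshoot the line at high velocity. The set_speed_percent() helper and the speed values are hypothetical.

/* Sketch of speed control during navigational correction. */
#include <stdbool.h>
#include <stdint.h>

extern void set_speed_percent(uint8_t percent);

#define CRUISE_SPEED   80u   /* hypothetical straight-line speed    */
#define CORRECT_SPEED  40u   /* hypothetical speed while correcting */

void update_speed(bool correcting)
{
    /* Slow down whenever a corrective steering action is in progress. */
    set_speed_percent(correcting ? CORRECT_SPEED : CRUISE_SPEED);
}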
6. CONCLUSION
This paper has discussed and suggested standard methods of placing sensors for junction tracking and junction turning. The proposed methods were tested on a test pitch with a test robot base. The test results show that the proposed strategy implemented for ALF navigation is successful. However, some navigational problems occur at higher velocities. These problems can be solved by introducing a second sensor array for minute corrections and by adding speed control to the final navigational strategy. These options will be looked into and implemented in future development of ALF.
ACKNOWLEDGEMENT
The authors acknowledge financial and technical support for this research received from
Universiti Tenaga Nasional.
REFERENCES
1. Aziz, A.R.A., Designing and Constructing an Automatic Steering Vehicle (AGV), in Dept of Electrical Engineering. 2004, Universiti Tenaga Nasional: Malaysia. p. 50.
2. Bong, D.M.K., Automatic Guided Vehicle System (AGV), in Dept of Electrical Engineering. 2004, Universiti Tenaga Nasional: Malaysia. p. 41.
3. SIRIM, Asia Pacific Robot Contest 2006. 2006.
4. Ashari, S., Line, Gap and Colour Sensors, in Dept of Electrical Engineering. 2004, Universiti Tenaga Nasional: Malaysia. p. 70.
5. Mohd Zafri Baharuddin, S.S.K.M., Izham Zainal Abidin, Mobile Robotics Group Resources Page. 2004, Mobile Robotics Group at Universiti Tenaga Nasional.
6. Osman, J.H.S., Computer Controlled Systems. 2nd Edition. 2005: Universiti Teknologi Malaysia.