
A Line-Tracker

Arnold Jonk & Rein van den Boomgaard & Arnold W.M. Smeulders
Faculty of Mathematics, Computer Science, Physics and Astronomy
University of Amsterdam, The Netherlands
fjonk,rein,[email protected]
A line-tracker is presented that is flexible and robust. Linepoints are detected locally in
order to construct a tracker that is fast and able to use knowledge about the line to be
tracked. This is especially needed in cases where the line is of poor image quality. An
application is shown in which two trackers are combined into a parallel line-tracker.

1 Introduction
Interpretation of line drawings depends largely on the quality of line-recognition, because lines form the building blocks of these types of drawings. Because of its importance, considerable effort has been put into line-recognition. There are two main approaches to line-recognition: line-detection and line-tracking. Line-detection is a parallel process of global image operations such as segmentation, edge-detection and clustering, in principle using little a priori
information about where to find the line. The difficulty here is to assure connectivity in the detected
line-segments (see [Jonk95]). Line-tracking is a sequential process starting from an a priori located point on the line. The difficulty here is to reach the end of the line. Line-tracking may use
an adaptive model of the profile, and hence provides more information to the line-recognition step.
When lines are of very poor quality, global image techniques are very costly, ineffective, or give
rise to a large number of false hits (see figure 1 for an example). In such cases line-trackers
are the preferred method.

Figure 1: Example of a line that is difficult to detect with global image operators. Note that the line on the left is
drawn on the backside of the paper, which explains its vague appearance.

2 Line modelling and detection


The model of the line consists of two parts, namely a model describing the geometry of the
line and a model describing the way the line is projected on the grey-valued image (the profile).

2.1 Shape modelling


Line-tracking methods model lines differently from line-detection applications. In [Dori93]
curves are modelled as a series of arcs and vectors. The vector model and arc model can be
useful for detecting lines: vectors and arcs can be effectively detected and then clustered into
complete line-descriptions. For line-tracking, however, another approach is needed.
Describing the line as a spline yields a stable and precise description of a line. This is the
reason most line-tracking applications use spline-descriptions. A B-spline is described by its set

of controlpoints $\{\tilde{p}_0, \tilde{p}_1, \ldots, \tilde{p}_n\}$. The formula for interpolating a B-spline (see [Pavlidis87]) is
given by:

$$L_k(t) = \sum_{i=0}^{k} \tilde{p}_i \, N_{i,m}(t) \qquad (1)$$

where

$$N_{i,0}(t) = \begin{cases} 1 & t_i \le t < t_{i+1} \\ 0 & \text{otherwise} \end{cases} \qquad (2)$$

$$N_{i,m}(t) = \frac{t - t_i}{t_{i+m} - t_i}\, N_{i,m-1}(t) + \frac{t_{i+m+1} - t}{t_{i+m+1} - t_{i+1}}\, N_{i+1,m-1}(t). \qquad (3)$$
In these equations m stands for the order of the spline, and k for the number of nodes. In our
line-tracking method, a B-spline is used to describe the line.
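To make the shape model concrete, the following Python sketch evaluates equations (1)-(3) with the Cox-de Boor recursion. It is a minimal illustration, not the paper's implementation; the function names and the uniform knot vector are assumptions.

import numpy as np

def basis(i, m, t, knots):
    # Cox-de Boor recursion for N_{i,m}(t), equations (2) and (3)
    if m == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + m] > knots[i]:
        left = (t - knots[i]) / (knots[i + m] - knots[i]) * basis(i, m - 1, t, knots)
    if knots[i + m + 1] > knots[i + 1]:
        right = (knots[i + m + 1] - t) / (knots[i + m + 1] - knots[i + 1]) * basis(i + 1, m - 1, t, knots)
    return left + right

def spline_point(t, control_points, m=3):
    # Equation (1): L_k(t) = sum_i p_i N_{i,m}(t), here for 2D control points
    p = np.asarray(control_points, dtype=float)
    knots = np.arange(len(p) + m + 1, dtype=float)   # uniform knot vector (an assumption)
    return sum(p[i] * basis(i, m, t, knots) for i in range(len(p)))

For a cubic spline (m = 3) with k + 1 control points, t should lie in the interval [t_m, t_{k+1}), where the basis functions sum to one.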

2.2 Profile modelling & detection


In this section we describe a method to detect linepoints. Linepoints are the centre-points
of the line. To this end, first the model of the line is described, and then we describe a two-step
detection method. This method is detailed in [Steger96].
In figure 2.a a grey-value landscape is presented; figure 2.b shows a model of this landscape. The line is modelled as a 1D parabolic profile that is swept along a curved line. Finding the

Figure 2: a. Grey value landscape of a line. b. Model of a line.


linepoints is done in two basic steps. First, the local direction of the line is estimated. Then, in
the direction orthogonal to the line (let this direction be n(t)), the centre of the line's profile is
sought.
The local direction of the line can be estimated by looking at the second-order structure of
the image $z(x, y)$. As is clear from the example, the second-order derivative is minimal in the
line's direction. Because lines are noisy, the data is convolved with the derivatives of a Gaussian
smoothing kernel. The size of the smoothing kernel must be tuned to the expected width of the
lines to get an optimal response (a technique called edge focussing). The direction in which the
second derivative of $z(x, y)$ takes on its maximum absolute value will be used as the direction
n(t). This direction can be determined by calculating the eigenvalues and eigenvectors of the
Hessian matrix

$$H(x, y) = \begin{pmatrix} z_{xx} & z_{xy} \\ z_{xy} & z_{yy} \end{pmatrix} \qquad (4)$$

The direction n(t) is obtained by using the eigenvector corresponding to the eigenvalue of maximum absolute value.
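As an illustration of this step, the sketch below estimates n(t) at a single pixel by computing second-order Gaussian derivatives with scipy and taking the eigenvector of the Hessian (equation 4) with the largest absolute eigenvalue. Function and parameter names are mine; the paper does not prescribe an implementation.

import numpy as np
from scipy.ndimage import gaussian_filter

def line_normal_direction(image, y, x, sigma=2.0):
    # second-order Gaussian derivatives of z(x, y); axis 0 is y, axis 1 is x
    z = image.astype(float)
    zyy = gaussian_filter(z, sigma, order=(2, 0))
    zxx = gaussian_filter(z, sigma, order=(0, 2))
    zxy = gaussian_filter(z, sigma, order=(1, 1))
    H = np.array([[zxx[y, x], zxy[y, x]],
                  [zxy[y, x], zyy[y, x]]])          # Hessian of equation (4)
    eigvals, eigvecs = np.linalg.eigh(H)            # symmetric 2x2 eigen-decomposition
    idx = np.argmax(np.abs(eigvals))
    return eigvecs[:, idx]                          # n(t): (nx, ny), across the line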
Detection of the centre of the line is now straightforward. Figure 2.b shows our model of
the line's profile, a parabolic shaped profile. This is an effective first-order approximation of
the observed real-world profiles.
The parabolic profile (with x the (local) coordinate in the direction n(t)) is defined with the
equation:

$$f(x) = \begin{cases} h \left(1 - (x/w)^2\right) & -w \le x \le w \\ 0 & \text{otherwise} \end{cases} \qquad (5)$$

with w being the width, and h the height of the line.
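For reference, equation (5) translates directly into a short Python function; the function name is mine.

import numpy as np

def parabolic_profile(x, w, h):
    # Equation (5): parabolic profile of width w and height h, zero outside [-w, w]
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) <= w, h * (1.0 - (x / w) ** 2), 0.0)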


The estimated 1D-derivatives (with Gaussian smoothing) are:

$$z(x, \sigma, w, h) = (g_\sigma \ast f)(x) \qquad (6)$$

$$z'(x, \sigma, w, h) = (g'_\sigma \ast f)(x) \qquad (7)$$

$$z''(x, \sigma, w, h) = (g''_\sigma \ast f)(x). \qquad (8)$$

It is clear from these equations that $z'(x, \sigma, w, h) = 0 \Leftrightarrow x = 0$ for all $\sigma$, and that $z''(x, \sigma, w, h)$
will take on its maximum negative value at $x = 0$. It is therefore possible to determine the
precise location of the line for all $\sigma$.
To apply the above analysis to discrete signals, two modifications need to be made. The
first one is the choice of the convolution in discrete space, for which the integrated Gaussian is
selected. The location of the line centre can then be determined with subpixel accuracy.
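The subpixel step can be sketched as follows, in the spirit of [Steger96]: along n(t) the grey values are approximated by their second-order Taylor expansion, and the zero of the first directional derivative gives the line centre. The gradient and Hessian are assumed to come from Gaussian derivative filters as in the previous sketch; the acceptance test and the names are my own.

import numpy as np

def subpixel_centre(x, y, grad, hessian):
    # grad = (zx, zy), hessian = [[zxx, zxy], [zxy, zyy]] at pixel (x, y)
    eigvals, eigvecs = np.linalg.eigh(hessian)
    n = eigvecs[:, np.argmax(np.abs(eigvals))]       # direction n(t) across the line
    t = -(grad @ n) / (n @ hessian @ n)              # zero of the directional derivative
    offset = t * n
    if np.all(np.abs(offset) <= 0.5):                # accept only if it stays inside the pixel
        return x + offset[0], y + offset[1]          # subpixel line centre
    return None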

3 The Line-Tracker
In figure 3 an overview is given of the basic steps of the tracking algorithm. The boxes denote
steps in the line-tracking algorithm: an initializing step, a step to retrieve a list of possible next
extensions, and so on.
Figure 3: Overview of the tracking algorithm, with its main parameters. The flow chart shows the steps Start, find extensions, select extension, extend, and stop?/terminate, together with the data passed between them (point and direction, spline, stepsize, curvature, evaluated line-points, extension point).


As input to the tracking algorithm, one point and a direction are needed. The data further
consists of the grey-value image and the parameters needed to steer the extension routine (detailed
later).
The increment of the line-description is done in two basic steps. First the set of possible
extensions is determined. Then each extension is evaluated, in order to be able to select the
optimal extension.
To extend a line with a certain stepsize, first potential extension points are detected. Then
each point is evaluated by trying to find evidence in the image for the line that is formed by
extending the spline with that point. This method of finding extension points is preferred over
pixel-by-pixel extension for two reasons. First, jumping along the tracked line gives better
adaptive control over the curvature of the line. Second, jumping along the line is more
robust, since small line-ruptures can be handled.
As can be seen in figure 4.a, extending the spline and looking for linepoints orthogonal to
the extension is too simple an approach. The size of the extension is limited to the radius of the
circle describing the maximal allowed curvature ($d_0$ in the figure).

In figure 4.b another approach is presented. Here, the possible new linepoints are sought on
the circle with diameter d, between the intersections of this circle with the circles describing the
maximum admissible curvature p. Using the points on this circle also ensures that the length of the
new line-fragment becomes shorter with increasing curvature.
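The construction of figure 4.b can be sketched as follows: candidate points are sampled on the circle around the spline endpoint, restricted to an arc around the current tangent direction. Here the arc is bounded by a maximum turning angle, which stands in for the intersection with the curvature circles; all names and the sampling density are assumptions.

import numpy as np

def candidate_arc(endpoint, tangent, step, max_turn, n_samples=32):
    # sample candidate extension points on the circle of radius `step` around the endpoint,
    # within +/- max_turn radians of the current tangent direction
    c = np.asarray(endpoint, dtype=float)
    phi0 = np.arctan2(tangent[1], tangent[0])
    angles = np.linspace(phi0 - max_turn, phi0 + max_turn, n_samples)
    return c + step * np.stack([np.cos(angles), np.sin(angles)], axis=1)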

Figure 4: The maximally admissible curvature of the line is indicated by the circle with diameter p. (a) Seeking points orthogonal to the extension-direction severely restricts the stepsize. (b) Therefore, possible new linepoints are sought on the circle with diameter d.

The extension of a spline is calculated by obtaining the derivative vector $\vec{d}$ of the spline at
its endpoint. The initial extension is a vector of length p (see figure 5.a). Linepoints are detected
in the circular subimage using the detection method described in section 2.2. Detection results
in a set of linepoints S (figure 5.b). Now, the list of potential extension-points E is derived

Figure 5: Definition of parameters. (a) The center c and stepsize p. (b) The set S of detected linepoints is a subset

of the hatched area. (c) The set E of potential extension-points is a subset of the cross-hatched area.

from S by defining

$$E = \{\, v \mid v \in S \ \wedge\ d(v, c) \ge p - \Delta p \,\} \qquad (9)$$

with $d(v, c)$ the Euclidean distance between points $v$ and $c$, and $\Delta p$ the maximum distance between a detected line-point and the circle. This is visualized in figure 5.(c).
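Equation (9) amounts to a simple filter on the detected linepoints; a minimal sketch, with the symbol written out as delta_p:

import numpy as np

def extension_candidates(S, c, p, delta_p):
    # Equation (9): keep detected linepoints whose distance to the centre c
    # is at least p - delta_p, i.e. points lying close to the circle of radius p
    S = np.asarray(S, dtype=float)
    d = np.linalg.norm(S - np.asarray(c, dtype=float), axis=1)
    return S[d >= p - delta_p]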
Now, each point in E needs to be evaluated, in order to be able to select the point that is
the most likely extension of the line. Observe figure 6 to appreciate the need for an evaluation
algorithm for potential extension points.
Figure 6: An evaluation based on evidence in the image is needed to distinguish between both detected linepoints.
The evaluation is done by extending the spline $L_n$ (representing a spline with n nodes) with
each point $e \in E$, and then checking the evidence in the image for the lines thus hypothesized.
This is done by calculating the minimum squared distance between the extended spline and the
detected line-points. See figure 7 for an example.
Figure 7: The squared difference-function applied to an example. Figure (b) will produce a (much) larger error
than figure (a).
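One plausible reading of this evaluation is sketched below: the hypothesized extension is sampled densely, and for every detected linepoint the squared distance to its nearest spline sample is accumulated. A low cost means much image evidence for the hypothesized line, as in figure 7.a; the function name and this particular formulation are mine.

import numpy as np

def extension_cost(spline_samples, detected_points):
    # spline_samples: dense 2D samples of the hypothesized extension
    # detected_points: the set S of detected linepoints
    L = np.asarray(spline_samples, dtype=float)
    S = np.asarray(detected_points, dtype=float)
    d2 = ((S[:, None, :] - L[None, :, :]) ** 2).sum(axis=2)   # pairwise squared distances
    return float(d2.min(axis=1).sum())                        # sum of nearest squared distances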

The line tracker needs a stopping condition. Primarily, the tracker will halt if it cannot find
an extension point. In addition, the tracker must be restrained from jumping to another line (figure
8.a). This is achieved by imposing a maximum on the allowed value of $c(L_n, S)$. Also, the
line-tracker should halt when the line-structure changes abruptly (figure 8.b). This is tackled
by allowing only continuous changes in the parameters of the line: an extension may not add a
segment with significantly different contrast and/or width.

Figure 8: (a) the tracked line terminates, but another, similar, line falls within the stepsize. (b) Two different,
connected, lines. The line-tracker should terminate tracking when it reaches the connection.
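A minimal sketch of these stopping tests is given below. The thresholds and the way contrast and width are attached to a segment are assumptions, not part of the paper.

def should_stop(extension_found, cost, last_segment, new_segment,
                max_cost, max_contrast_change, max_width_change):
    # stop when no extension is found, when the evidence cost c(L_n, S) is too high
    # (to avoid jumping to another line), or when contrast/width change abruptly
    if not extension_found:
        return True
    if cost > max_cost:
        return True
    if abs(new_segment["contrast"] - last_segment["contrast"]) > max_contrast_change:
        return True
    if abs(new_segment["width"] - last_segment["width"]) > max_width_change:
        return True
    return False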

The tracker can fail in a number of circumstances. When linepoints remain undetected, for
example because the line's thickness is outside the parameters supplied to the algorithm, or its
profile cannot be approximated by a parabolic shape, the tracker will terminate incorrectly. The
design of the algorithm is such that small gaps can be handled. A gap larger than the stepsize,
however, will still result in termination of the tracker.

4 Application of the line-tracker


In this section a detector based on two line-trackers is described. The task at hand is to track
two lines that are largely parallel. The lines need to be tracked until one of the lines is lost. See
figure 9 for an example, and figure 10 for the pseudo-code of the algorithm.

Figure 9: Example of two parallel lines, with one ending in an arrowhead.


The input of the parallel line-tracker consists of two points and a direction. These parameters are sufficient to initialize both trackers, for the lines are presumed parallel.
Both lines (called $B_n^1$ and $B_n^2$) are extended with their appropriate stepsize. This results in
two sets of possible new points, one for each line. The algorithm needs to decide which of the points
will be selected to extend the lines (see figure 11). This decision is based on two factors, namely
the quality of a hypothesis and the consistency in parallelism of the two lines.

procedure ParallelTracker (Point P1, Point P2, Vector Dir)
    initialize(L, P1, Dir), initialize(R, P2, Dir)
    LeftStepsize, RightStepsize ← DefaultStepsize
    do
        Left ← get extensions(L, LeftStepsize)
        Right ← get extensions(R, RightStepsize)
        Min ← ∞
        for Le ∈ Left
            for Ri ∈ Right
                This ← eval(L + Le, R + Ri)
                if This < Min
                    LeMin ← Le
                    RiMin ← Ri
                    Min ← This
                endif
        if Min ≠ ∞
            extend(L, LeMin)
            extend(R, RiMin)
            LeftStepsize, RightStepsize ← adjust stepsize(L, R, DefaultStepsize)
        endif
    while Left ≠ EMPTY .AND. Right ≠ EMPTY
    return L, R

Figure 10: Pseudo-code of the parallel line tracker


The quality of the hypothesis of a new linepoint was described in the previous section. The
consistency in parallelism is measured using the first-derivative vectors of both spline-descriptions
between the last nodes in the following way:

$$P(B_n^1, B_n^2) = \int_{n-2}^{n} \bigl|\, B_n^{1\,\prime}(i) - B_n^{2\,\prime}(i) \,\bigr|^2 \, di. \qquad (10)$$

Note that we only need the last three nodes of the spline-descriptions because cubic splines
are used.
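Equation (10) can be approximated numerically as below, given callables that return the first-derivative vector of each spline at a parameter value; these callables, and the simple Riemann sum, are assumptions of the sketch.

import numpy as np

def parallelism_cost(dB1, dB2, n, samples=50):
    # Equation (10): integrate the squared difference of the first-derivative
    # vectors of both splines over the last segment [n-2, n]
    i_values = np.linspace(n - 2, n, samples)
    diffs = np.array([dB1(i) - dB2(i) for i in i_values])
    step = i_values[1] - i_values[0]
    return float(((diffs ** 2).sum(axis=1) * step).sum())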
Combining these two factors is done by selecting the two potential linepoints of both lines
maximizing the following equation:

$$\max_{e_1 \in E_1,\ e_2 \in E_2} \ g_p\bigl(P(\mathrm{ext}(B_n^1, e_1), \mathrm{ext}(B_n^2, e_2))\bigr) \cdot g_c\bigl(c(\mathrm{ext}(B_n^1, e_1), S_1)\bigr) \cdot g_c\bigl(c(\mathrm{ext}(B_n^2, e_2), S_2)\bigr). \qquad (11)$$


Figure 11: Extending the B-spline and locating valid backline-points can produce several hypotheses. Using constraints provided by the galley, the correct point is found.

In this equation, $\mathrm{ext}(L_n, e)$ denotes the spline $L_n$ extended with the point $e$. $c(\mathrm{ext}(B_n^1, e_1), S_1)$ is a function


that measures the quality of an extension as described in the previous section. Note that usually
there will be only one potential linepoint for each line, in which case those points are immediately selected. There are several ways to combine the terms in equation 11. Multiplication
was chosen (instead of summation), after experiments demonstrated that it was best capable of
trading off the quality of the line-detection, and the consistency in parallelism.
Independently extending the two lines can lead to problems. This is because the lines are
required to be parallel, but not straight. Observe figure 12.a: when the same stepsize is used on
both lines, the detected points will not remain exactly opposite each other. Figure 12.b shows
a solution to this problem. Before every extension, the normal at the endpoint of the spline
is intersected with the (straight) extension of the opposing spline. This produces an intersection
point. The distance between the endpoint and the intersection point is subtracted from the
extension of the inner spline.
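One plausible reading of this construction is sketched below: the normal at the outer spline's endpoint is intersected with the straight extension of the inner spline, and the signed offset along the inner line is subtracted from its default stepsize. All names, and the choice of which spline is the "outer" one, are assumptions.

import numpy as np

def adjusted_inner_step(outer_end, outer_tangent, inner_end, inner_tangent, default_step):
    outer_end = np.asarray(outer_end, dtype=float)
    inner_end = np.asarray(inner_end, dtype=float)
    t = np.asarray(inner_tangent, dtype=float)
    t /= np.linalg.norm(t)                                   # direction of the inner line
    n = np.array([-outer_tangent[1], outer_tangent[0]], dtype=float)
    n /= np.linalg.norm(n)                                   # normal at the outer endpoint
    # solve inner_end + s*t = outer_end + r*n for the offset s along the inner line
    A = np.column_stack([t, -n])
    s, _ = np.linalg.solve(A, outer_end - inner_end)
    return default_step - s                                  # Delta subtracted from the step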

Figure 12: (a) When tracking two curved, parallel lines, a uniform stepsize will cause the tracked points to diverge. (b) Calculating the size of the step needed on the inner track.

This synchronization of the stepsize is done in the subroutine adjust stepsize in the pseudo-code of figure 10.

Increments are made until the stop condition is met. The stop condition is reached when one
of the lines cannot be extended further. This can happen, for example, when one of the lines
terminates or runs out of the image. Additional stop criteria can be envisioned; for example, an
upper bound can be enforced on the divergence of the lines. In the current algorithm, parallelism
is only used as a selection criterion when multiple extensions are available.

5 Experiments
Using our algorithm, we undertook experiments to test the validity of our approach. They showed
that the algorithm succeeds in finding the set of parallel lines except in very specific circumstances.
In figure 13 we present two of the tested situations, and the results obtained by the algorithm. In the presented experiments, two different lines are tracked. One line is drawn on the
backside of the paper, while the other is a thick line drawn on the front. In figure 13.a a difficult
but successfully tracked situation is shown. Here, at the location where the lines diverge (making the
parallelism criterion less effective), the image is of very poor quality due to paper rupture. Line
detection is also hampered by a crossing object. The tracker is, however, still able to detect the
line.
In figure 13.b a failed experiment is shown. Here, the line-detection step fails to find the
line drawn on the backside of the paper. The figure displays the detected linepoints (white dots,
along the centre-line of the arrowhead) and the search area (indicated by the grey dots). The tracker
cannot operate when no linepoints are detected.

6 Comparison with profile matching


There are other techniques for tracking lines. In this section, our method is compared with
a profile matching algorithm.
In [Noordmans97] an algorithm based on profile matching is described. Consecutive cross-sections are matched against the model of the line (2D or 3D). This model is updated after each match.

Figure 13: Examples of tracked parallel lines. (a) Curved line with a close-up of a low-quality image part. (b)
Example of a tracking error.

The stepsize between consecutive matches is typically very small. The advantage of
this method is the (potential) accuracy of the estimated parameters. Very specific structures can
be followed, for example a 3D ellipse. A drawback of the profile-matching method is its sensitivity to noise and gaps in the line.
The example images in this paper show lines of very poor quality. Their profiles are not
suited for high precision profile matching, and many gaps in the lines, as well as crossing lines,
can be expected. Experiments showed that in this domain profile matching does not perform
well.

7 Conclusions
A line-tracking algorithm was presented. We have shown the usefulness of the linepoint
detection method in cooperation with our line-tracker. The line-tracker is designed such that it is
robust against line-ruptures, and can accurately control the maximal allowed curvature of the
tracked line.
A strong point of this algorithm is its flexibility. This flexibility was demonstrated in an
example, where two line-trackers were combined to form a parallel line-tracker.

References
[Dori93] D. Dori & Y. Liang & I. Chai. Sparse Pixel Recognition of Primitives in Engineering
Drawings. Machine Vision and Applications, 6, pp. 69-82, 1993.
[Jonk95] A. Jonk & A. Smeulders. An axiomatic approach to clustering line-segments. ICDAR, 1995.
[Noordmans97] H.J. Noordmans & A.W.M. Smeulders. High Accuracy Tracking of 2D/3D
Curved Line Structures by Consecutive Cross-Section Matching. Submitted to Pattern
Recognition Letters, 1997.
[Pavlidis87] T. Pavlidis. Algorithms for Graphics and Image Processing. Computer Science
Press, 1987.
[Steger96] C. Steger. Extracting curvilinear structures: A differential geometric approach. In
B. Buxton and R. Cipolla, editors, Fourth European Conference on Computer Vision,
volume 1064 of Lecture Notes in Computer Science, pages 630-641. Springer-Verlag,
1996.
