
2
Adaptive Image Filtering

Carl-Fredrik Westin
Ron Kikinis
Harvard Medical School

Hans Knutsson
Linköping University

1 Introduction
2 Multidimensional Spatial Frequencies and Filtering
  2.1 Spatial Frequency • 2.2 Filtering • 2.3 Unsharp Masking
3 Random Fields and Wiener Filtering
  3.1 Autocorrelation and Power Spectrum • 3.2 The Wiener Filter
4 Adaptive Wiener Filters
  4.1 Local Adaptation • 4.2 Nonlinear Extension by a Visibility Function
5 Anisotropic Adaptive Filtering
  5.1 Anisotropic Adaptive Filtering in Two Dimensions • 5.2 Multidimensional Anisotropic Adaptive Filtering • 5.3 Adaptation Process • 5.4 Estimation of Multidimensional Local Anisotropy Bias • 5.5 Tensor Mapping • 5.6 Examples of Anisotropic Filtering in 2D and 3D
References

1 Introduction

Adaptive filters are commonly used in image processing to enhance or restore data by removing noise without significantly blurring the structures in the image. The adaptive filtering literature is vast and cannot adequately be summarized in a short chapter. However, a large part of the literature concerns one-dimensional (1D) signals [1]. Such methods are not directly applicable to image processing, and there are no straightforward ways to extend 1D techniques to higher dimensions, primarily because there is no unique ordering of data points in dimensions higher than one. Since higher-dimensional medical image data are not uncommon (2D images, 3D volumes, 4D time-volumes), we have chosen to focus this chapter on adaptive filtering techniques that can be generalized to multidimensional signals.

This chapter assumes that the reader is familiar with the fundamentals of 1D signal processing [2]. Section 2 addresses spatial frequency and filtering. 2D spatial signals and their Fourier transforms are shown to illuminate the similarities to signals in one dimension. Unsharp masking is described as an example of simple image enhancement by spatial filtering. Section 3 covers random fields and is intended as a primer for the Wiener filter, which is introduced in Section 3.2. The Wiener formulation gives a lowpass filter with a frequency characteristic adapted to the noise level in the image: the higher the noise level, the more smoothing of the data. In Section 4 adaptive Wiener formulations are presented. By introducing local adaptation of the filters, a solution more suitable for nonstationary signals such as images can be obtained. For example, by using a "visibility function," which is based on local edge strength, the filters can be made locally adaptive to structures in the image so that areas with edges are less blurred. Section 5 is about adaptive anisotropic filtering. By allowing filters to change from circularly/spherically symmetric (isotropic) to shapes that are closely related to the image structure, more noise can be suppressed without severe blurring of lines and edges. A computationally efficient way of implementing shift-variant anisotropic filters based on a nonlinear combination of shift-invariant filter responses is described.

2 Multidimensional Spatial Frequencies and Filtering

At a conceptual level, there is a great deal of similarity between 1D signal processing and signal processing in higher dimensions. For example, the intuition and knowledge gained from working with the 1D Fourier transform extends fairly straightforwardly to higher dimensions. For overviews of signal processing techniques in 2D see Lim [3], or Granlund and Knutsson for higher dimensional signal processing [4].

2.1 Spatial Frequency

The only difference between the classical notions of time frequencies and spatial frequencies is the function variable used: instead of time, the variable is spatial position in the latter case.


A multidimensional sinusoidal function can be written as

$$ f = \cos(\mathbf{x}^T\hat{\mathbf{e}}), \tag{1} $$

where x is the spatial position vector and ê is a normalized vector defining the orientation of the wave (x, ê ∈ ℝⁿ). The signal is constant on all hyperplanes normal to ê. For any 1D function g, the multidimensional function

$$ f = g(\mathbf{x}^T\hat{\mathbf{e}}) \tag{2} $$

will have a Fourier transform in which all energy is concentrated on a line through the origin with direction e. For example, for a plane in three dimensions given by

$$ f = \delta(\mathbf{x}^T\hat{\mathbf{e}}), \tag{3} $$

where δ denotes the Dirac function, the Fourier transform will be concentrated on a line with orientation normal to the plane, and the function along this line will be constant.

Examples of 2D sinusoidal functions and their Fourier transforms are shown in Fig. 1. The top figures show a transform pair of a sinusoid of fairly low frequency. The Fourier transform, which contains two Dirac impulse functions, is shown to the left and may be expressed as

$$ F_1 = \delta(\mathbf{u}-\boldsymbol{\omega}_1) + \delta(\mathbf{u}+\boldsymbol{\omega}_1), \tag{4} $$

where u denotes the 2D frequency variable and ω₁ the spatial frequency of the signal. The bottom figures show the transform pair of a signal consisting of the same signal in F₁ plus another sinusoidal signal with frequency ω₂:

$$ F_2 = F_1 + \delta(\mathbf{u}-\boldsymbol{\omega}_2) + \delta(\mathbf{u}+\boldsymbol{\omega}_2). \tag{5} $$
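As a quick numerical illustration of Eqs. (1)-(5), the sketch below samples a 2D sinusoid and checks that its discrete Fourier transform is dominated by a symmetric pair of peaks. It assumes only numpy; the grid size, frequency, and orientation are arbitrary choices made for the example.

```python
import numpy as np

# Sample f(x) = cos(w * x^T e_hat) on an N x N grid (cf. Eq. 1, with the
# scalar spatial frequency w folded into the orientation vector e_hat).
N = 128
w = 2 * np.pi * 8 / N                                      # roughly 8 cycles across the image
e_hat = np.array([np.cos(np.pi / 6), np.sin(np.pi / 6)])   # wave orientation

yy, xx = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
f = np.cos(w * (e_hat[0] * xx + e_hat[1] * yy))

# The 2D Fourier transform concentrates the energy in two peaks at +/- the
# spatial frequency, the discrete counterpart of the Dirac pair in Eq. (4).
F = np.fft.fftshift(np.fft.fft2(f))
peak_idx = np.argsort(np.abs(F).ravel())[-2:]
print(np.unravel_index(peak_idx, F.shape))   # two bins placed symmetrically about the center
```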

FIGURE 1 Transform pairs: F₁ and f₁ (top), F₁ + F₂ and f₁ + f₂ (bottom). (Top) Sinusoidal signal with low spatial frequency. (Bottom) Sum of the top signal and a sinusoid with a higher spatial frequency.

2.2 Filtering

Linear filtering of a signal can be seen as a controlled scaling of the signal components in the frequency domain. Reducing the components in the center of the frequency domain (low frequencies) gives the high-frequency components an increased relative importance, and thus highpass filtering is performed. Filters can be made very selective: any of the Fourier coefficients can be changed independently of the others. For example, let H be a constant function minus a pair of Dirac functions symmetrically centered in the Fourier domain at a distance |ω₁| from the center,

$$ H = 1 - \left(\delta(\mathbf{u}-\boldsymbol{\omega}_1) + \delta(\mathbf{u}+\boldsymbol{\omega}_1)\right). \tag{6} $$

This filter, known as a notch filter, will leave all frequency components untouched, except the component that corresponds to the sinusoid in Fig. 1, which will be completely removed. A weighting of the Dirac functions will control how much of the component is removed. For example, the filter

$$ H = 1 - 0.9\left(\delta(\mathbf{u}-\boldsymbol{\omega}_1) + \delta(\mathbf{u}+\boldsymbol{\omega}_1)\right) \tag{7} $$

will reduce the signal component to 10% of its original value. The result of the application of this filter to the signal F₁ + F₂ (Fig. 1, bottom) is shown in Fig. 2. The lower-frequency component is almost invisible.

Filters for practical applications have to be more general than "remove sinusoidal component cos(ωᵀx)." In image enhancement, filters are designed to remove noise that is spread out all over the frequency domain. It is a difficult task to design filters that remove as much noise as possible without removing important parts of the signal.
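A minimal sketch of the weighted notch of Eq. (7), assuming numpy; the single frequency bin used here is a discrete stand-in for the Dirac pair, and the function name is ours:

```python
import numpy as np

def notch_filter(f, freq, weight=0.9):
    """Attenuate the +/- freq component of a 2D signal (cf. Eq. 7).

    freq gives the component's position in integer DFT bins along each axis;
    weight is the fraction removed (0.9 leaves 10% of the component).
    """
    F = np.fft.fft2(f)
    H = np.ones(f.shape)
    r, c = int(freq[0]) % f.shape[0], int(freq[1]) % f.shape[1]
    H[r, c] -= weight                                   # bin at +freq
    H[(-r) % f.shape[0], (-c) % f.shape[1]] -= weight   # mirror bin at -freq
    return np.real(np.fft.ifft2(H * F))
```

Applied to the two-component signal of Fig. 1 (bottom), this reduces the selected sinusoid to one-tenth of its amplitude, as in Fig. 2.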
2.3 Unsharp Masking

Unsharp masking, an old technique known to photographers, is used to change the relative highpass content in an image by subtracting a blurred (lowpass filtered) version of the image [5]. This can be done optically by first developing an unsharp picture on a negative film and then using this film as a mask in a second development step. Mathematically, unsharp masking can be expressed as

$$ \hat{f} = \alpha f - \beta f_{lp}, \tag{8} $$

where α and β are positive constants, α ≥ β. When processing digital image data, it is desirable to keep the local mean of the image unchanged. If the coefficients in the lowpass filter f_lp are normalized, i.e., their sum equals 1, the following formulation of unsharp masking ensures an unchanged local mean in the image:

$$ \hat{f} = \frac{1}{\alpha-\beta}\left(\alpha f - \beta f_{lp}\right). \tag{9} $$

By expanding the expression in the parentheses, αf − βf_lp = αf_lp + α(f − f_lp) − βf_lp, we can write Eq. (9) as

$$ \hat{f} = f_{lp} + \frac{\alpha}{\alpha-\beta}(f - f_{lp}), \tag{10} $$

which provides a more intuitive way of writing unsharp masking. Further rewriting yields

$$ \hat{f} = f_{lp} + \gamma(f - f_{lp}) \tag{11} $$
$$ = f_{lp} + \gamma f_{hp}, \tag{12} $$

where γ can be seen as a gain factor of the high frequencies.

FIGURE 2 Notch filtering of the signal f₁ + f₂, the sum of the sinusoids: H(F₁ + F₂) and h * (f₁ + f₂). The application of the filter h in Eq. (7) reduces the low-frequency component to one-tenth of its original value.

For γ = 1 the filter is the identity map, f_lp + f_hp = f_lp + (f − f_lp) = f, and the output image equals the input image. For γ > 1 the relative highpass content in the image is increased, resulting in higher contrast; for γ < 1 the relative highpass content is decreased and the image gets blurred. This process can be visualized by looking at the corresponding filters involved. In the Fourier domain, the lowpass image f_lp can be written as the product of a lowpass filter H_lp and the Fourier transform of the original image,

$$ F_{lp} = H_{lp}F. \tag{13} $$

Figure 3 (top left) shows H_lp and the highpass filter that can be constructed thereof, 1 − H_lp (top right). Figure 3 (bottom left) shows the identity filter from adding the two top filters, and (bottom right) a filter where the highpass component has been amplified by a factor of 2. Figure 4 shows a slice of a CT data set through a skull (left), and the result of unsharp masking with γ = 2, i.e., a doubling of the highpass part of the image (right). The lowpass filter used (f_lp in Eq. (11)) was a Gaussian filter with a standard deviation of 4, implemented on a 21×21 grid.

A natural question arises: how should the parameters for the lowpass filter be chosen? It would be advantageous if the filter could adapt to the image automatically, either through an a priori model of the data or by some estimation process.
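As a concrete sketch of Eq. (11), assuming numpy and scipy are available (the Gaussian width and gain mirror the Figure 4 example but are otherwise arbitrary choices):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(f, gamma=2.0, sigma=4.0):
    """Unsharp masking, Eq. (11): f_hat = f_lp + gamma * (f - f_lp).

    The Gaussian kernel is normalized, so the local mean is preserved;
    gamma > 1 boosts the highpass content, gamma < 1 blurs the image.
    """
    f = np.asarray(f, dtype=float)
    f_lp = gaussian_filter(f, sigma)      # lowpass (blurred) version of the image
    return f_lp + gamma * (f - f_lp)
```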

FIGURE 3 Visualization of the filters involved in unsharp masking: lowpass filter H_lp, highpass filter 1 − H_lp, identity filter H_lp + H_hp, and H_lp + 2H_hp. See also Plate 1.

FIGURE 4 Filtering CT data using unsharp masking. The highpass information of the original image (left) is twice as high in the result image (right). Note how details have been amplified. This technique works well due to the lack of noise in the image.

There are two main categories of such adaptive filters: filters for image enhancement and filters for image restoration. The two categories mainly differ in the view of the data that is to be filtered. The method of unsharp masking belongs to the first category, image enhancement: the image is made crisper by increasing the image contrast. The input image was not considered to be degraded in any way, and the purpose of the algorithm was just to improve the appearance of the image. In image restoration, as the name implies, the image data are modeled as being degraded by a (normally unknown) process, and the task at hand is to "undo" this degradation and restore the image. The models of image degradation commonly involve random noise processes. Before we introduce the well-known Wiener filter, a short description of stochastic processes in multiple dimensions is given. In multiple dimensions stochastic processes are customarily referred to as random fields.

3 Random Fields and Wiener Filtering

A central problem in the application of random fields is the estimation of various statistical parameters from real data. The Wiener filter that will be discussed later requires knowledge of the spectral content of the image signal and the background noise. In practice these are, in general, not known and have to be estimated from the image.

3.1 Autocorrelation and Power Spectrum

A collection of an infinite number of random variables defined on an n-dimensional space (x ∈ ℝⁿ) is called a random field. The autocorrelation function of a random field f(x) is defined as the expected value of the product of two samples of the random field,

$$ R_{ff}(\mathbf{x},\mathbf{x}') = E\{f(\mathbf{x})f(\mathbf{x}')\}, \tag{14} $$

where E denotes the statistical expectation operator. A random field is said to be stationary if the expectation value and the autocorrelation function are shift-invariant, i.e., the expectation value is independent of the spatial position vector x, and the autocorrelation is a function only of s = x − x':

$$ R_{ff}(\mathbf{s}) = E\{f(\mathbf{x}+\mathbf{s})f(\mathbf{x})\}. \tag{15} $$

The power spectrum of a stationary random process is defined by the Fourier transform of the autocorrelation function,

$$ S_{ff} = \mathcal{F}(R_{ff}). \tag{16} $$

Since the autocorrelation function is always symmetric, the power spectrum is always a real function.

A random process n(x) is called a white noise process if

$$ R_{nn}(\mathbf{s}) = \sigma_n^2\,\delta(\mathbf{s}). \tag{17} $$

From Eq. (16), the power spectrum of a white noise process is a constant:

$$ S_{nn} = \sigma_n^2. \tag{18} $$

The Wiener-Khinchin theorem [6] states that

$$ E\{f(\mathbf{x})^2\} = R(0) = \frac{1}{(2\pi)^n}\int S(\mathbf{u})\,d\mathbf{u}, \tag{19} $$

where n is the dimension of the signal (the theorem follows directly from Eq. (16) by taking the inverse Fourier transform of S(u) for s = 0). This means that the integral of the power spectrum of any process is positive. It can also be shown that the power spectrum is always nonnegative [6].

The cross-correlation function of two random processes is defined as

$$ R_{fg}(\mathbf{x},\mathbf{x}') = E\{f(\mathbf{x})g(\mathbf{x}')\}. \tag{20} $$

3.2 The Wiener Filter

The restoration problem is essentially one of optimal filtering with respect to some error criterion. The mean-squared error (MSE) criterion has formed the basis for most published work in this area [7-9].

In the case of linear stationary estimation we wish to estimate the process f with a linear combination of values of the data g. This can be expressed by a convolution operation

$$ \hat{f} = h * g, \tag{21} $$

where h is a linear filter. A very general statement of the estimation problem is: given a set of data g, find the estimate f̂ of an image f that minimizes some distance ‖f − f̂‖.

By using the mean-squared error, it is possible to derive the estimate with the principle of orthogonality [6]:

$$ E\{(f - \hat{f})g\} = 0. \tag{22} $$

Inserting the convolution expression from Eq. (21) thus gives

$$ E\{(f - h * g)g\} = 0. \tag{23} $$

When the filter operates in its optimal condition, the estimation error (f − f̂) is orthogonal to the data g. In terms of the correlation functions R_fg and R_gg, Eq. (23) can be expressed as [2]

$$ R_{fg} = h * R_{gg}. \tag{24} $$

By Fourier transforming both sides of the equation, we get S_fg = H S_gg, resulting in the familiar Wiener filter

$$ H = \frac{S_{fg}}{S_{gg}}. \tag{25} $$

When the data are the sum of the image and stationary white noise of zero mean and variance σ_n²,

$$ g = f + n \tag{26} $$
$$ R_{nn} = \sigma_n^2\,\delta(\mathbf{s}), \tag{27} $$

then

$$ R_{fg} = R_{ff} \tag{28} $$
$$ R_{gg} = R_{ff} + R_{nn}, \tag{29} $$

and the stationary Wiener filter is given by

$$ H = \frac{S_{ff}}{S_{ff} + \sigma_n^2}. \tag{30} $$

For a Gaussian signal, the Wiener filter, although linear, is the optimal estimator among all estimators. The Wiener filter optimizes the trade-off between smoothing the signal discontinuities and removal of the noise.
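A frequency-domain sketch of the stationary Wiener filter of Eq. (30), assuming numpy. The text assumes S_ff and the noise level are known; as a simplifying assumption of this sketch, S_ff is estimated crudely by subtracting the flat noise power from the periodogram of the observed image.

```python
import numpy as np

def stationary_wiener(g, sigma_n):
    """Stationary Wiener filter H = S_ff / (S_ff + sigma_n^2), Eq. (30)."""
    g = np.asarray(g, dtype=float)
    G = np.fft.fft2(g)
    S_gg = np.abs(G) ** 2 / g.size                   # periodogram of the noisy data
    S_ff = np.maximum(S_gg - sigma_n ** 2, 0.0)      # rough estimate of the signal spectrum
    H = S_ff / (S_ff + sigma_n ** 2 + 1e-12)         # lowpass-like: small where noise dominates
    return np.real(np.fft.ifft2(H * G))
```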
4 Adaptive Wiener Filters

The standard formulation of the Wiener filter has met limited success in image processing because of its lowpass characteristics, which give rise to unacceptable blurring of lines and edges. If the signal is a realization of a non-Gaussian process, such as in natural images, the Wiener filter is outperformed by nonlinear estimators. One reason why the Wiener filter blurs the image significantly is that a fixed filter is used throughout the entire image: the filter is space invariant.

A number of attempts to overcome this problem have adopted a nonstationary approach where the characteristics of the signal and the noise are allowed to change spatially [10-23].

4.1 Local Adaptation

The Wiener filter in Eq. (30) defines a shift-invariant filter, and thus the same filter is used throughout the image. One way to make the filter spatially variant is by using a local, spatially varying model of the noise parameter σ_n. Such a filter can be written as

$$ H = \frac{S_{ff}}{S_{ff} + \sigma_n^2(\mathbf{x})}. \tag{31} $$

This filter formulation is computationally quite expensive since the filter changes from pixel to pixel in the image.

Lee derived an efficient implementation of a noise-adaptive Wiener filter by modeling the signal locally as a stationary process [14, 3] that results in the filter

$$ \hat{f}(\mathbf{x}) = m_f(\mathbf{x}) + \frac{\sigma_f^2(\mathbf{x})}{\sigma_f^2(\mathbf{x}) + \sigma_n^2}\left(g(\mathbf{x}) - m_f(\mathbf{x})\right), \tag{32} $$

where m_f is the local mean of the signal g, and σ_f² is the local signal variance.
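A sketch of Lee's locally adaptive filter, Eq. (32); the uniform local window (and its size) is a choice of this sketch, and scipy is assumed to be available.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(g, sigma_n, size=7):
    """Lee's noise-adaptive Wiener filter, Eq. (32)."""
    g = np.asarray(g, dtype=float)
    m_f = uniform_filter(g, size)                                    # local mean
    var_g = np.maximum(uniform_filter(g * g, size) - m_f ** 2, 0.0)  # local variance of g
    var_f = np.maximum(var_g - sigma_n ** 2, 0.0)                    # local signal variance
    return m_f + var_f / (var_f + sigma_n ** 2 + 1e-12) * (g - m_f)
```

In flat regions the local signal variance is small and the output approaches the local mean; near edges it dominates and the original data pass through, which is why this filter blurs less than the stationary formulation.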

A local mean operator can be implemented by a normalized lowpass filter where the support region of the filter defines the locality of the mean operator. It is interesting to note that this formulation gives an expression very similar to the expression of unsharp masking in Eq. (11), although the latter is not locally adaptive. Figure 5 shows the result of Lee's filter on MR data through the pelvis.

4.2 Nonlinear Extension by a Visibility Function

The Wiener filter that results from a minimization based on the MSE criterion can only relate to second-order statistics of the input data and no higher [1]. The use of a Wiener filter or a linear adaptive filter to extract signals of interest will therefore yield suboptimal solutions. By introducing nonlinearities in the structure, some of these limitations can be taken care of.

Abramatic and Silverman [12] modified the stationary white noise Wiener solution (Eq. (30)) by introducing a visibility function α, 0 ≤ α ≤ 1, which depends on the magnitude of the image gradient vector, where α is 0 for "large" gradients and 1 in areas of no gradient. They showed that the generalized Backus-Gilbert criterion yields the solution

$$ H_\alpha = \frac{S_{ff}}{S_{ff} + \alpha\sigma^2} \tag{33} $$

for an optimal filter. The Backus-Gilbert method [24, 25] is a regularization method that differs from others in that it seeks to maximize the stability of the solution rather than, in the first instance, its smoothness. The Backus-Gilbert method seeks to make the mapping from f̂ and f as close to the identity as possible in the limit of error-free data. Although the Backus-Gilbert philosophy is rather different from standard linear regularization methods, in practice the differences between the methods are small: a stable solution is almost inevitably smooth.

Equation (33) shows a very explicit trade-off between resolution and stability. α = 0 gives a filter that is the identity mapping (maximum resolution) and α = 1 gives the smoother Wiener solution. By choosing α to be spatially variant, i.e., a function of the position in the image, α = α(x), a simple adaptive filter is obtained. For large gradients, the alpha function cancels the noise term and the function H becomes the identity map. This approach has the undesired feature that the filter changes from point to point in a way that is generally computationally burdensome. Abramatic and Silverman [12] proposed a "signal equivalent" approach giving a filter that is a linear combination of the stationary Wiener filter and the identity map:

$$ H_\alpha = \alpha\frac{S_{ff}}{S_{ff} + \sigma^2} + (1 - \alpha). \tag{34} $$

Note that H_α equals the Wiener solution (Eq. (30)) for α = 1, and for α = 0 the filter becomes the identity map. It is interesting to note that Eq. (34) can be rewritten as

$$ H_\alpha = \frac{S_{ff}}{S_{ff} + \sigma^2} + (1 - \alpha)\frac{\sigma^2}{S_{ff} + \sigma^2}. \tag{35} $$

Equation (35) shows that Abramatic and Silverman's model can be seen as a linear combination of a stationary lowpass component and a nonstationary highpass component [16]. Inserting H for the stationary Wiener solution, we can write the filter in Eq. (35) as

$$ H_\alpha = H + (1 - \alpha)(1 - H). \tag{36} $$
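A spatial-domain sketch of Eq. (36) with a spatially varying α(x): the stationary Wiener result is blended with the original image, and α is derived from a smoothed gradient magnitude. The particular normalization of α is an assumption of this sketch, not prescribed by the text; scipy is assumed to be available.

```python
import numpy as np
from scipy.ndimage import gaussian_gradient_magnitude

def visibility_adaptive(f, f_wiener, grad_sigma=2.0):
    """Pixel-wise version of H_alpha = H + (1 - alpha)(1 - H), Eq. (36).

    f_wiener is the output of the stationary Wiener filter H applied to f.
    alpha is close to 1 in flat areas (full Wiener smoothing) and close to 0
    at strong edges (identity map, no smoothing).
    """
    f = np.asarray(f, dtype=float)
    grad = gaussian_gradient_magnitude(f, grad_sigma)
    alpha = 1.0 - grad / (grad.max() + 1e-12)
    return f_wiener + (1.0 - alpha) * (f - f_wiener)
```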

FIGURE 5 Filtering MR data through the pelvis using Lee's method: original MR data (left) and filtered MR data (right). Note how the background noise in the original image has effectively been suppressed in the result image. Note how the motion-related artifacts are reduced, but blurring is introduced.

5 Anisotropic Adaptive Filtering

5.1 Anisotropic Adaptive Filtering in Two Dimensions

On the basis of the characteristics of the human visual system, Knutsson et al. [16] argued that local anisotropy is an important property in images and introduced an anisotropic component in Abramatic and Silverman's model (Eq. (36)):

$$ H_{\alpha,\gamma} = H + (1 - \alpha)\left(\gamma + (1 - \gamma)\cos^2(\varphi - \theta)\right)(1 - H), \tag{37} $$

where the parameter γ controls the level of anisotropy, φ defines the angular direction of the filter coordinates, and θ is the orientation of the local image structure. The specific choice of weighting function cos²(φ − θ) was imposed by its ideal interpolation properties; the directed anisotropy function could be implemented as a steerable filter from three fixed filters [16],

cos²(φ), cos²(φ − π/3), and cos²(φ − 2π/3)

(these filters span the same space as the three filters 1, cos 2φ, sin 2φ, which also can be used). Freeman and Adelson later applied this concept to several problems in computer vision [26].

Knutsson et al. estimated the local orientation and the degree of anisotropy with three oriented Hilbert transform pairs, so-called quadrature filters, with the same angular profiles as the three basis functions describing the steerable weighting function. Figure 6 shows one of these Hilbert transform pairs. In areas of the image lacking a dominant orientation, γ is set to 1, and Eq. (37) reverts to the isotropic Abramatic and Silverman solution. The more dominant the local orientation, the smaller the γ value and the more anisotropic the filter.

5.2 Multidimensional Anisotropic Adaptive Filtering

The cos²(·) term in Eq. (37) can be viewed as a squared inner product of a vector defining the main directionality of the signal (θ) and the frequency coordinate vectors. Denoting this main direction ê₁ and an orthogonal direction ê₂, Eq. (37) can be rewritten as

$$ H_\gamma = H + (1 - H)\left[\gamma_1(\hat{e}_1^T\hat{u})^2 + \gamma_2(\hat{e}_2^T\hat{u})^2\right], \tag{38} $$

where u is the 2D frequency variable, and the parameters γ₁ and γ₂ define the anisotropy of the filter. For γ₁ = γ₂ = 1 − α, the filter becomes isotropic (γ = 1 in Eq. (37)), and for γ₁ = 1 − α and γ₂ = 0 the filter becomes maximally anisotropic (γ = 0 in Eq. (37)) and mainly favors signals oriented as ê₁.

The inner product notation in Eq. (38) allows for a direct extension to multiple dimensions:

$$ H_\gamma = H + (1 - H)\sum_{k=1}^{N}\gamma_k(\hat{e}_k^T\hat{u})^2, \tag{39} $$

where N is the dimension of the signal (N = 2 for images and N = 3 for volumes). However, two important elements remain: how to define the orthogonal directions ê_k and the coefficients γ_k so that the equation describes a useful adaptive filter.

By expressing the squared inner product (ê_kᵀû)² as the inner product of two outer products, ⟨ê_k ê_kᵀ, û ûᵀ⟩,

$$ H_\gamma = H + (1 - H)\sum_{k=1}^{N}\langle \gamma_k\,\hat{e}_k\hat{e}_k^T,\ \hat{u}\hat{u}^T\rangle \tag{40} $$
$$ = H + (1 - H)\langle \mathbf{C}, \mathbf{U}\rangle, \tag{41} $$

we can define a term C as a "control" tensor,

$$ \mathbf{C} = \sum_{k=1}^{N}\gamma_k\,\hat{e}_k\hat{e}_k^T. \tag{42} $$

The tensor C controls the filter adaptation by weighting the components in the outer product description of the Fourier domain U according to its "shape". For a 3D signal this shape can be thought of as an ellipsoid with principal axes ê_k. Similar to the 2D case in Eq. (37), where θ describes the main orientation of the local spectrum, the tensor C should describe the major axes of the local spectrum in the Fourier domain. The consequence of the inner product in Eq. (41) is a reduction of the high-frequency components in directions where the local spectrum is weak. In those directions the adaptive filter will mainly act as the stationary Wiener component H, which has lowpass characteristics.
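The sketch below builds one shift-invariant instance of the 2D filter in Eq. (38) in the Fourier domain, for a fixed orientation θ; in the adaptive method the orientation and the γ coefficients vary with the local image structure. H_radial stands for any radial lowpass profile (for example Eq. (61) further on), and the discretization of the frequency grid is our choice.

```python
import numpy as np

def anisotropic_filter_2d(shape, H_radial, theta, gamma1=1.0, gamma2=0.0):
    """Fourier-domain anisotropic filter H_gamma of Eq. (38) for one orientation.

    gamma1 weights the direction e1 (along theta) and gamma2 the orthogonal
    direction e2; gamma1 = gamma2 = 1 - alpha gives an isotropic filter,
    gamma1 = 1, gamma2 = 0 a maximally anisotropic one.
    """
    u = np.fft.fftfreq(shape[0]) * 2 * np.pi
    v = np.fft.fftfreq(shape[1]) * 2 * np.pi
    uu, vv = np.meshgrid(u, v, indexing="ij")
    rho = np.hypot(uu, vv)
    H = H_radial(rho)                       # stationary lowpass component

    # Unit frequency direction u_hat; at rho = 0 the direction is undefined,
    # but there (1 - H) is 0 for a lowpass H passing DC, so it does not matter.
    with np.errstate(invalid="ignore", divide="ignore"):
        ux, uy = uu / rho, vv / rho
    ux, uy = np.nan_to_num(ux), np.nan_to_num(uy)

    e1 = np.array([np.cos(theta), np.sin(theta)])
    e2 = np.array([-np.sin(theta), np.cos(theta)])
    directional = gamma1 * (e1[0] * ux + e1[1] * uy) ** 2 \
                + gamma2 * (e2[0] * ux + e2[1] * uy) ** 2
    return H + (1.0 - H) * directional
```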

FIGURE 6 Visualization of a quadrature filter (Hilbert transform pair) used in the estimation of local anisotropy; panels show the spatial domain (real part, imaginary part, complex) and the frequency domain (real part). (Top) The plots show the filter in the spatial domain: the real part (left) and the imaginary part (right). It can be appreciated that the real part can be viewed as a line filter and the imaginary part as an edge filter. The color coding is: green, positive real; red, negative real; blue, positive imaginary; and orange, negative imaginary. (Bottom) The left plot shows the magnitude of the filter with the phase of the filter color coded. The right plot shows the quadrature filter in the Fourier domain. Here the filter is real and zero on one half of the Fourier domain. See also Plate 2.

5.3 Adaptation Process

Before we describe the adaptation process, we will introduce a set of fixed filters

$$ H_k(\mathbf{u}) = (1 - H)(\hat{n}_k^T\hat{u})^2, \tag{43} $$

where the n̂_k define the directions of the filters and û = u/|u|.

It turns out that the adaptive filter in Eq. (39) can be written as a linear combination of these fixed filters and the stationary Wiener solution, where the weights are defined by inner products of a control tensor C and the filter-associated dual tensors M_k:

$$ H_c = H + \sum_{k=1}^{N(N+1)/2}\langle \mathbf{M}_k, \mathbf{C}\rangle H_k, \tag{44} $$

where the tensors M_k define the so-called dual tensor basis to the one defined by the outer products of the filter directions. The dual tensors M_k are defined by

$$ \langle \mathbf{M}_k, \mathbf{N}_l\rangle = \delta_{kl}, $$

where N_k = n̂_k n̂_kᵀ are the outer products of the filter directions. The number of fixed filters needed, N(N + 1)/2, is the number of independent coefficients in the tensor C, which is described by a symmetric N×N matrix for a signal of dimension N. This gives that 3 filters are required in two dimensions, and 6 filters in three dimensions.

The rest of this section is somewhat technical and may be omitted on a first reading.

Equation (44) can be validated by inserting the expressions for the control tensor C and the fixed filter functions into Eq. (44):

$$ H_c = H + (1 - H)\sum_{k=1}^{N(N+1)/2}\Big\langle \mathbf{M}_k,\ \sum_{i=1}^{N}\gamma_i\,\hat{e}_i\hat{e}_i^T\Big\rangle(\hat{n}_k^T\hat{u})^2. \tag{45} $$

By expressing the squared inner product (n̂_kᵀû)² as the inner product of two outer products, ⟨n̂_k n̂_kᵀ, û ûᵀ⟩, and switching the order of the terms in the first inner product,

$$ H_c = H + (1 - H)\sum_{k=1}^{N(N+1)/2}\Big\langle \sum_{i=1}^{N}\gamma_i\,\hat{e}_i\hat{e}_i^T,\ \mathbf{M}_k\Big\rangle\,\langle \hat{n}_k\hat{n}_k^T,\ \hat{u}\hat{u}^T\rangle, \tag{46} $$

and reordering the summation,

$$ H_c = H + (1 - H)\Big\langle \sum_{i=1}^{N}\gamma_i\,\hat{e}_i\hat{e}_i^T,\ \underbrace{\sum_{k=1}^{N(N+1)/2}\mathbf{M}_k\,\langle \mathbf{N}_k,\ \hat{u}\hat{u}^T\rangle}_{=\,\hat{u}\hat{u}^T}\Big\rangle \tag{47} $$
$$ = H + (1 - H)\Big\langle \sum_{i=1}^{N}\gamma_i\,\hat{e}_i\hat{e}_i^T,\ \hat{u}\hat{u}^T\Big\rangle \tag{48} $$
$$ = H + (1 - H)\sum_{i=1}^{N}\gamma_i(\hat{e}_i^T\hat{u})^2, \tag{49} $$

which is equal to the adaptive filter we wanted to construct (Eq. (39)). The under-braced sum reduces to û ûᵀ because of the dual basis relation between the two bases M_k and N_k.

5.4 Estimation of Multidimensional Local Anisotropy Bias

Knutsson [27] has described how to combine quadrature filter responses into a description of local image structure using tensors. His use of tensors was primarily driven by the urge to find a continuous representation of local orientation. The underlying issue here is that orientation is a feature that maps back to itself modulo π under rotation, whereas direction is a feature that maps back to itself modulo 2π. That is why vector representations work well in the latter case (e.g., representing velocity) but not in the former case.

Knutsson used spherically separable quadrature filters [28],

$$ Q(\mathbf{u}) = R(\rho)D_k(\hat{u}), \tag{50} $$

where u is the vector-valued frequency variable, ρ = |u|, û = u/|u|, and R(ρ) and D_k(û) are the radial and the directional functions, respectively:

$$ D_k(\hat{u}) = \begin{cases}(\hat{u}^T\hat{n}_k)^2 & \text{if}\ \hat{u}^T\hat{n}_k > 0\\ 0 & \text{otherwise,}\end{cases} \tag{51} $$

where n̂_k is the filter direction, i.e., D_k(û) varies as cos²(φ), where φ is the angle between u and the filter direction, and

$$ R(\rho) = e^{-\frac{4}{B^2\ln 2}\ln^2(\rho/\rho_0)} \tag{52} $$

is the radial frequency function. Such functions are Gaussian functions on a logarithmic scale and are therefore termed lognormal functions. B is the relative bandwidth in octaves and ρ₀ is the center frequency of the filter. R(ρ) defines the frequency characteristics of the quadrature filters.

A tensor T describing local structure is obtained by a linear summation of quadrature filter magnitude responses, |q_k|, weighted by predefined tensors M_k associated with each filter (defined as the dual tensors in Eq. (44)):

$$ \mathbf{T} = \sum_{k=1}^{N(N+1)/2}\mathbf{M}_k\,|q_k|, \tag{53} $$

where |q_k| is the output magnitude from the quadrature filter k.

The tensor in Eq. (53) is real and symmetric and can thus be written as a weighted sum of outer products of its orthogonal eigenvectors:

$$ \mathbf{T} = \sum_{k=1}^{N}\lambda_k\,\hat{e}_k\hat{e}_k^T, \tag{54} $$

where the vectors ê_k describe the principal axes of the local signal spectrum (the locality is defined by the radial frequency functions of the quadrature filters).

The distribution of the eigenvalues, λ₁ ≥ λ₂ ≥ ... ≥ λ_N, describes the anisotropy of the local spectrum. If the tensor is close to rank 1, i.e., there is only one large eigenvalue (λ₁ >> λ_k, k ∈ {2, ..., N}), the spectrum is concentrated on the line in the Fourier domain defined by ê₁. Further, if the tensor is close to rank 2, the spectrum is concentrated on the plane in the Fourier domain spanned by ê₁ and ê₂.

5.5 Tensor Mapping

The control tensor C used in the adaptive scheme is based on a normalization of the tensor described in the previous section (Eq. (54)):

$$ \mathbf{C} = \sum_{k=1}^{N}\gamma_k\,\hat{e}_k\hat{e}_k^T \tag{55} $$
$$ = \frac{\lambda_1}{\lambda_1^2 + \alpha^2}\,\mathbf{T}, \tag{56} $$

where α is a term defining the trade-off between resolution and stability, similar to Eq. (33). However, here the resolution trade-off is adaptive in the same way as in the stationary Wiener filter: α = 0 gives maximum resolution, and the larger the value of α, the smoother the Wiener solution. Maximum resolution is given when λ₁ is large compared to α, and the smaller λ₁, the smoother the solution. If the "resolution parameter" α = 0, the control tensor will be

$$ \mathbf{C} = \frac{1}{\lambda_1}\,\mathbf{T} \tag{57} $$
$$ = \hat{e}_1\hat{e}_1^T + \frac{\lambda_2}{\lambda_1}\hat{e}_2\hat{e}_2^T + \ldots + \frac{\lambda_N}{\lambda_1}\hat{e}_N\hat{e}_N^T. \tag{58} $$

With this normalization, the largest eigenvalue of C is γ₁ = 1, and the resulting adaptive filter H_c becomes an allpass filter along the signal direction ê₁,

$$ H_c = H + (1 - H)\gamma_1 = 1. \tag{59} $$

If it is desired to increase the image contrast, this can be done by adaptively increasing the high-frequency content in dim parts of the image data. Assuming a normalization of the tensors T so that the largest eigenvalue is 1 globally, max(λ₁) = 1, this can be achieved by increasing the exponent of the λ₁ in the denominator:

$$ \mathbf{C} = \frac{\lambda_1}{\lambda_1^{2.5} + \alpha^2}\,\mathbf{T}. \tag{60} $$

This will increase the relative weights for low and medium-low signal components compared to the largest (λ₁ = 1). More elaborate functions for remapping of the eigenvalues can be found in [4, 29, 30].

5.6 Examples of Anisotropic Filtering in 2D and 3D

The filters used in the examples below have size 15×15 in the 2D examples, and 15×15×15 in 3D. The center frequencies of the quadrature filters differ in the examples, but the relative bandwidth is the same, B = 2. We have for simplicity approximated the stationary Wiener filter with the following lowpass filter:

$$ H(\rho) = \begin{cases}\cos^2\!\left(\dfrac{\pi\rho}{2\rho_{lp}}\right) & \text{if}\ 0 \le \rho < \rho_{lp}\\ 0 & \text{otherwise.}\end{cases} \tag{61} $$

Here ρ is the radial frequency variable, and ρ_lp is the cutoff frequency. The more noise in the image, the lower the cutoff frequency ρ_lp that should be used.

Figure 7 shows the result of 2D adaptive filtering of MR data from breast imaging. The original image is shown to the left. To the right, the result from anisotropic adaptive filtering is shown. The quadrature filters used for estimating the local structure had center frequency ω₀ = π/3, and the lowpass filter H had a cutoff frequency ρ_lp = π/4. The control tensor C was defined by Eq. (56) with α = 1% of the largest λ₁ globally.

Figure 8 shows the result of 3D adaptive filtering of MR data through the skull. The original image is shown to the left. The result from adaptive filtering is shown to the right. The quadrature filters used had center frequency ω₀ = π/2, and the lowpass filter H had a cutoff frequency ρ_lp = π/3. The control tensor C was defined by Eq. (60) with α = 1% of the largest λ₁ globally. Note that details with low contrast in the original image have higher contrast in the enhanced image.

Figure 9 shows the result of 3D (2D + time) adaptive filtering of ultrasound data of a beating heart. The left row shows images from the original time sequence. The right row shows the result after 3D filtering. The quadrature filters used had center frequency ω₀ = π/6, and the cutoff frequency of the lowpass filter H was ρ_lp = π/4. The control tensor C was defined by Eq. (60) with α = 5% of the largest λ₁ globally.

More details on implementing filters for anisotropic adaptive filtering can be found in [4], where the estimation of local structure using quadrature filters is also described in detail. An important issue that we have not discussed in this chapter is that in medical imaging we often face data with a center-to-center spacing between slices that is larger than the in-plane pixel size. Westin et al. [30] introduced an affine model of the frequency characteristic of the filters to compensate for the data sampling anisotropy. In addition to changing the frequency distribution of the filters, they also show how this affine model can provide subvoxel-shifted filters that can be used for interpolation of medical data.
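A sketch of the two radial profiles used in the examples, assuming numpy: the lognormal function R(ρ) of Eq. (52) for the quadrature filters, and the cos² lowpass H(ρ) of Eq. (61) used in place of the stationary Wiener filter.

```python
import numpy as np

def lognormal_radial(rho, rho0, B=2.0):
    """Lognormal radial function R(rho) of Eq. (52); B is the bandwidth in octaves."""
    rho = np.asarray(rho, dtype=float)
    R = np.zeros_like(rho)
    pos = rho > 0
    R[pos] = np.exp(-(4.0 / (B ** 2 * np.log(2))) * np.log(rho[pos] / rho0) ** 2)
    return R

def cos2_lowpass(rho, rho_lp):
    """Radial cos^2 lowpass H(rho) of Eq. (61)."""
    rho = np.asarray(rho, dtype=float)
    return np.where(rho < rho_lp, np.cos(np.pi * rho / (2.0 * rho_lp)) ** 2, 0.0)

# Example: radial grid of a 15x15 filter (the 2D example size), with center
# frequency pi/3 and cutoff pi/4 as in the Figure 7 setup.
u = np.fft.fftfreq(15) * 2 * np.pi
uu, vv = np.meshgrid(u, u, indexing="ij")
rho = np.hypot(uu, vv)
R = lognormal_radial(rho, rho0=np.pi / 3)
H = cos2_lowpass(rho, rho_lp=np.pi / 4)
```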

FIGURE 7 2D adaptive filtering of data from MR breast imaging. (Left) Original image. (Right) Adaptively filtered image. The adaptive filtering reduces the unstructured component of the motion-related artifacts.

FIGURE 8 3D adaptive filtering of a coronal MR data set of the head with dynamic compression of the signal. (Left) Original image. (Right) Adaptively filtered image. Note the improved contrast between brain and cerebrospinal fluid (CSF).

FIGURE 9 Spatio-temporal adaptive filtering of ultrasound data of the heart. (Left) Images from the original image sequence. (Right) Result after 3D adaptive filtering. Note the reduction of the specular noise when comparing between filtered and unfiltered image sets.

Acknowledgments

This work was supported by CIMIT and NIH grants P41-RR13218 and R01-RR11747. The authors gratefully acknowledge Dr. Raj Rangayyan, Dr. Klas Nordberg, Lars Wigström, and Dr. Juan Ruiz-Alzola for valuable comments on drafts of this chapter.

References

1. S. Haykin. Adaptive Filter Theory, 3rd ed. Prentice Hall, 1996.
2. A. Papoulis. Signal Analysis. McGraw-Hill, 1977.
3. J. S. Lim. Two-Dimensional Signal and Image Processing. Prentice Hall, 1980.
4. G. H. Granlund and H. Knutsson. Signal Processing for Computer Vision. Kluwer Academic Publishers, 1995.
5. W. F. Schreiber. Wirephoto quality improvement by unsharp masking. J. Pattern Recognition 2, 117-121 (1970).
6. A. Papoulis. Probability, Random Variables, and Stochastic Processes. McGraw-Hill, 1965.
7. C. W. Helstrom. Image restoration by the methods of least squares. J. Opt. Soc. Amer. 57(3), 297-303 (1967).
8. W. K. Pratt. Digital Image Processing. New York: Wiley, 1978.
9. B. R. Hunt. The application of constrained least squares estimation to image restoration by digital computer. IEEE Trans. Comput. C-22, 805-812 (1973).
10. G. L. Anderson and A. N. Netravali. Image restoration based on a subjective criteria. IEEE Trans. Sys. Man. Cybern. SMC-6, 845-853 (1976).
11. V. K. Ingle and J. W. Woods. Multiple model recursive estimation of images. In Proc. IEEE Conf. Acoust., Speech, Signal Processing, pp. 642-645 (1979).

12. J. F. Abramatic and L. M. Silverman. Nonlinear restoration of noisy images. IEEE Transactions on Pattern Analysis and Machine Intelligence 4(2), 141-149 (1982).
13. M. Nagao and T. Matsuyama. Edge preserving smoothing. Computer Graphics and Image Processing 9, 394-407 (1979).
14. J. S. Lee. Digital image enhancement and noise filtering by local statistics. IEEE Transactions on Pattern Analysis and Machine Intelligence 2, 165-168 (1980).
15. J. S. Lim. Image restoration by short space spectral subtraction. IEEE Trans. Acoust. Speech Sig. Proc. 28, 191-197 (1980).
16. H. Knutsson, R. Wilson, and G. H. Granlund. Anisotropic non-stationary image estimation and its applications, part I: Restoration of noisy images. IEEE Transactions on Communications COM-31(3), 388-397 (1983).
17. D. T. Kuan, A. A. Sawchuck, T. C. Strand, and P. Chavel. Adaptive noise smoothing filter for images with signal-dependent noise. IEEE Transactions on Pattern Analysis and Machine Intelligence 7, 165-177 (1985).
18. A. C. Bovik, T. S. Huang, and D. C. Munson, Jr. Edge-sensitive image restoration using order-constraint least squares method. IEEE Trans. Acoust. Speech Sig. Proc. ASSP-33, 1253-1263 (1985).
19. K. Conradsen and G. Nilsson. Data dependent filters for edge enhancement of Landsat images. Computer Vision, Graphics, and Image Processing 38, 101-121 (1987).
20. H. Soltanianzadeh, J. P. Windham, and A. E. Yagle. A multidimensional nonlinear edge-preserving filter for magnetic resonance image restoration. IEEE Transactions on Medical Imaging 4(2), 147-161 (1995).
21. X. You and G. Crebbin. A robust adaptive estimator for filtering noise in images. IEEE Transactions on Image Processing 4(5), 693-699 (1995).
22. J. A. S. Centeno and V. Haertel. An adaptive image enhancement algorithm. Pattern Recognition 30(7), 1183-1189 (1997).
23. B. Fischl and E. Schwartz. Adaptive nonlocal filtering: a fast alternative to anisotropic diffusion for image filtering. IEEE Transactions on Pattern Analysis and Machine Intelligence 21(1), 42-48 (1999).
24. G. E. Backus and F. Gilbert. The resolving power of gross earth data. Geophysical Journal of the Royal Astronomical Society 16, 169-205 (1968).
25. L. Guang and R. K. Ward. Restoration of randomly blurred images by the Wiener filter. IEEE Trans. Acoustics, Speech, and Signal Processing 37(4), 589-592 (1989).
26. W. T. Freeman and E. H. Adelson. The design and use of steerable filters. IEEE Transactions on Pattern Analysis and Machine Intelligence PAMI-13(9), 891-906 (1991).
27. H. Knutsson. Representing local structure using tensors. In The 6th Scandinavian Conference on Image Analysis, pp. 244-251, Oulu, Finland, June 1989.
28. H. Knutsson and G. H. Granlund. Fourier domain design of line and edge detectors. In Proceedings of the 5th International Conference on Pattern Recognition, Miami, FL, December 1980.
29. C.-F. Westin, S. Warfield, A. Bhalerao, L. Mui, J. Richolt, and R. Kikinis. Tensor controlled local structure enhancement of CT images for bone segmentation. In MICCAI'98, First Int. Conf. on Medical Image Computing and Computer-Assisted Intervention, Lecture Notes in Computer Science 1496, pp. 1205-1212, Springer Verlag, 1998.
30. C.-F. Westin, J. Richolt, V. Moharir, and R. Kikinis. Affine adaptive filtering of CT data. Medical Image Analysis 4(2), 1-21 (2000).
