
Wavelet Transform

The wavelet transform corresponds to the decomposition of a square-integrable function s(x) ∈ L²(R) into a family of scaled and translated functions ψ_{k,l}(t):

$$\psi_{k,l}(t) = \frac{1}{\sqrt{k}}\,\psi\!\left(\frac{t-l}{k}\right)$$
The function ψ(x) is called the wavelet function and shows band-pass behavior. The wavelet coefficients d_{k,l} are derived as follows:

$$d_{k,l} = \frac{1}{\sqrt{k}} \int_{-\infty}^{\infty} s(x)\, \psi^{*}\!\left(\frac{x-l}{k}\right) dx$$

where k ∈ R⁺, l ∈ R, and * denotes the complex conjugate.
The discrete wavelet transform (DWT) represents a 1-D signal s(t) in terms of shifted versions of a lowpass scaling function φ(t) and shifted and dilated versions of a prototype bandpass wavelet function ψ(t).

For special choices of φ(t) and ψ(t), the functions

$$\phi_{j,k}(t) = 2^{j/2}\,\phi(2^{j}t - k), \qquad \psi_{j,k}(t) = 2^{j/2}\,\psi(2^{j}t - k)$$

for j, k ∈ Z form an orthonormal basis, and we have the representation

$$s(t) = \sum_{k} u_{j_0,k}\,\phi_{j_0,k}(t) + \sum_{j=j_0}^{\infty}\sum_{k} w_{j,k}\,\psi_{j,k}(t)$$

where

$$u_{j,k} = \int s(t)\,\phi_{j,k}^{*}(t)\,dt \qquad \text{and} \qquad w_{j,k} = \int s(t)\,\psi_{j,k}^{*}(t)\,dt$$
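As an illustration (not part of the original slides), this analysis/synthesis pair can be sketched with the PyWavelets package; the wavelet name 'db4', the test signal, and the decomposition depth are arbitrary choices made for the example.

```python
# Minimal sketch of the DWT expansion and its inverse using PyWavelets.
# Assumptions: PyWavelets ('pywt') is installed; 'db4' and level=3 are
# arbitrary choices, not prescribed by the slides.
import numpy as np
import pywt

t = np.linspace(0, 1, 512, endpoint=False)
s = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)

# Analysis: scaling coefficients u_{j0,k} plus wavelet coefficients w_{j,k}
coeffs = pywt.wavedec(s, 'db4', level=3)
cA3, cD3, cD2, cD1 = coeffs

# Synthesis: the expansion over the orthonormal basis reproduces s(t)
s_hat = pywt.waverec(coeffs, 'db4')
print(np.allclose(s, s_hat[:len(s)]))   # True up to numerical precision
```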

Relook at the F.T. expressions:

$$F(u) = \int_{-\infty}^{\infty} f(x)\, e^{-j2\pi u x}\, dx$$

$$f(x) = \int_{-\infty}^{\infty} F(u)\, e^{j2\pi u x}\, du$$

Thus f(x) is represented here as a linear combination of the basis functions exp(j2πux).
The wavelet transform, on the other hand, represents f(x) (or f(t)) as a linear combination of

$$\psi_{k,l}(t) = 2^{k/2}\,\psi(2^{k}t - l)$$

where ψ(t) is called the mother wavelet.

The parameters k and l are integers which generate the basis functions as dilated and shifted variations of the mother wavelet.

The parameter k plays the role of frequency and l plays the role of time. Hence, by varying k and l we obtain different frequencies at different times (or positions in space), hence the term multi-channel, multiresolution approach.
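The following numpy-only sketch makes the dilation/translation concrete; the choice of the Haar function as the mother wavelet and the sample grid are illustrative assumptions, not taken from the slides.

```python
# Sketch: dilated and shifted basis functions psi_{k,l}(t) = 2^{k/2} psi(2^k t - l)
# built from a Haar mother wavelet (an assumed example choice).
import numpy as np

def haar(t):
    """Haar mother wavelet: +1 on [0, 0.5), -1 on [0.5, 1), 0 elsewhere."""
    return np.where((t >= 0) & (t < 0.5), 1.0,
                    np.where((t >= 0.5) & (t < 1.0), -1.0, 0.0))

def psi_kl(t, k, l):
    """Basis function 2^{k/2} * psi(2^k t - l)."""
    return 2.0 ** (k / 2) * haar(2.0 ** k * t - l)

t = np.linspace(-1, 3, 2000)
mother      = psi_kl(t, k=0, l=0)   # the mother wavelet itself, on [0, 1)
narrow_late = psi_kl(t, k=2, l=5)   # compressed copy, supported on [1.25, 1.5)
wide_early  = psi_kl(t, k=-1, l=0)  # stretched copy, supported on [0, 2)
```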

Compare in the discrete case:

The F.T.:

$$f(x) = \frac{1}{N}\sum_{u=0}^{N-1} F(u)\, e^{j2\pi u x / N}, \qquad x = 0, 1, \ldots, N-1$$

The DWT:

$$f(t) = \sum_{k}\sum_{l} X_{DWT}(k,l)\,\left[\, 2^{k/2}\,\psi(2^{k}t - l) \,\right]$$

where the corresponding continuous-parameter (CWT) analysis is

$$X_{CWT}(k,l) = \frac{1}{\sqrt{k}} \int x(t)\,\psi^{*}\!\left(\frac{t-l}{k}\right) dt$$
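A CWT of this form can be sketched with PyWavelets as below; the Morlet wavelet, the scale range, the sampling rate and the test signal are assumptions made purely for illustration.

```python
# Sketch of a continuous wavelet transform with PyWavelets.
import numpy as np
import pywt

dt = 1.0 / 1000.0                        # sampling period (assumed 1 kHz)
t = np.arange(0, 1, dt)
x = np.cos(2 * np.pi * 50 * t)           # example signal

scales = np.arange(1, 128)
coefs, freqs = pywt.cwt(x, scales, 'morl', sampling_period=dt)
# coefs[i, n] ~ X_CWT(scale_i, n*dt); freqs holds the pseudo-frequency (Hz)
# associated with each scale.
print(coefs.shape, freqs[:3])
```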

DWT - Discrete Wavelet Transform:

Forward:

$$X_{DWT}(k,l) = a^{-k/2} \int x(t)\, h(a^{-k}t - lT)\, dt$$

and Inverse:

$$x(t) = \sum_{k}\sum_{l} X_{DWT}(k,l)\,\left[\, a^{-k/2}\, f(a^{-k}t - lT) \,\right]$$

Take T = 1 and time continuous.

Analysis filters:

$$h_k(t) = a^{-k/2}\, h(a^{-k}t)$$

Synthesis filters:

$$f_k(t) = a^{-k/2}\, f(a^{-k}t)$$

The functions h(t) and f(t) are derived by dilation of a single filter. Thus the basis functions are dilated (t → a⁻ᵏt) and shifted (by integer multiples l of the scaled sampling period) versions of

$$f_{k,l}(t) = a^{-k/2}\, f(a^{-k}t - lT)$$

Synthesis filters for perfect reconstruction:

$$f_k(t) = h_k^{*}(t)$$
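In the dyadic case (a = 2) these dilated filters are usually realised by iterating a single discrete prototype filter through the noble identities; the sketch below is an illustrative addition (db4 taps from PyWavelets stand in as the prototype h), not something stated on the slides.

```python
# Sketch: equivalent dilated lowpass filters for a = 2, obtained by
# convolving the prototype with upsampled copies of itself
# (H(z), H(z)H(z^2), H(z)H(z^2)H(z^4), ...).
import numpy as np
import pywt

h = np.asarray(pywt.Wavelet('db4').dec_lo)   # prototype lowpass filter

def upsample2(f):
    """Insert a zero between consecutive taps (upsampling by 2)."""
    out = np.zeros(2 * len(f) - 1)
    out[::2] = f
    return out

h_level1 = h                                               # H(z)
h_level2 = np.convolve(h, upsample2(h))                    # H(z) * H(z^2)
h_level3 = np.convolve(h_level2, upsample2(upsample2(h)))  # ... * H(z^4)
print(len(h_level1), len(h_level2), len(h_level3))         # 8, 22, 50
```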

Visualize the pseudo-frequency corresponding to a scale. Assume a center frequency F_c of the wavelet and use the following relationship:

$$F_a = \frac{F_c}{a \cdot \Delta}$$

where a is the scale, Δ is the sampling period, F_c is the center frequency of the wavelet in Hz, and F_a is the pseudo-frequency corresponding to the scale a, in Hz.
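This relationship can be evaluated with PyWavelets as sketched below; the Morlet wavelet and the sampling period are assumptions for the example, and `pywt.scale2frequency` returns F_c/a in cycles per sample, so dividing by the sampling period gives F_a in Hz.

```python
# Sketch: pseudo-frequency F_a = F_c / (a * Delta) for a range of scales.
import numpy as np
import pywt

delta = 1.0 / 1000.0                    # sampling period Delta (assumed 1 kHz)
scales = np.array([1, 2, 4, 8, 16], dtype=float)

Fc = pywt.central_frequency('morl')     # centre frequency in cycles/sample
Fa = Fc / (scales * delta)              # pseudo-frequency in Hz

# Equivalent helper provided by PyWavelets:
Fa_check = pywt.scale2frequency('morl', scales) / delta
print(np.allclose(Fa, Fa_check))        # True
```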

The highpass and lowpass filters are not independent of each other; they are related by the following expression:

$$g[L - 1 - n] = (-1)^{n}\, h[n]$$

where L is the filter length (in number of taps).
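A numpy sketch of this alternating-flip relation is given below; using PyWavelets' db4 lowpass taps as the example h[n] is an assumption made only to have a concrete L-tap filter.

```python
# Sketch: build the highpass filter g[n] from the lowpass h[n] via
# g[L-1-n] = (-1)^n h[n].
import numpy as np
import pywt

h = np.asarray(pywt.Wavelet('db4').dec_lo)   # example lowpass filter, L = 8
L = len(h)

g = np.empty(L)
for n in range(L):
    g[L - 1 - n] = (-1) ** n * h[n]

print(np.isclose(g.sum(), 0.0))          # highpass: taps sum to ~0
print(np.isclose(h.sum(), np.sqrt(2)))   # orthonormal lowpass: taps sum to sqrt(2)
```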

QMF bank and typical magnitude responses


[Figure: Two-channel QMF bank. The analysis bank H0(z), H1(z) is followed by decimators (↓2); expanders (↑2) feed the synthesis bank G0(z), G1(z), which produces the reconstruction x̂(t). The typical magnitude responses of H0(z) and H1(z) split the band at ω = π/2.]
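A single DWT level implements exactly this two-channel analysis/synthesis structure; a minimal perfect-reconstruction check with PyWavelets follows (the wavelet 'db2' and the random test signal are assumptions for the sketch).

```python
# Sketch: one two-channel QMF stage (analysis + synthesis) via PyWavelets.
import numpy as np
import pywt

x = np.random.default_rng(0).standard_normal(256)

# Analysis bank + decimators: lowpass (approximation) and highpass (detail)
cA, cD = pywt.dwt(x, 'db2')

# Expanders + synthesis bank: perfect reconstruction of the input
x_hat = pywt.idwt(cA, cD, 'db2')
print(np.allclose(x, x_hat[:len(x)]))    # True up to numerical precision
```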

M-channel (M-band) QMF bank


[Figure: M-channel maximally decimated filter bank. The analysis bank H0(z), H1(z), ..., HM-1(z) is followed by decimators; expanders feed the synthesis bank G0(z), G1(z), ..., GM-1(z), which produces the reconstruction x̂(t).]

The DWT analyzes the signal at different frequency bands with different resolutions by decomposing the signal into coarse approximation and detail information.

The DWT employs two sets of functions, called scaling functions and wavelet functions, which are associated with lowpass and highpass filters, respectively.

The decomposition of the signal into different frequency bands is obtained simply by successive highpass and lowpass filtering of the time-domain signal.

The original signal x[n] is first passed through a half-band highpass filter g[n] and a lowpass filter h[n].

After the filtering, half of the samples can be eliminated (according to Nyquist's rule), since the signal now has a highest frequency of f_max/2 instead of f_max.

The signal can therefore be sub-sampled by 2, simply by discarding every other sample. This constitutes one level of decomposition and can be expressed mathematically as follows:

$$y_{high}[k] = \sum_{n} x[n]\, g[2k - n], \qquad y_{low}[k] = \sum_{n} x[n]\, h[2k - n]$$
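This filter-then-downsample step can be sketched directly in numpy; PyWavelets' db2 half-band filters serve as example h[n] and g[n], and no boundary extension is applied, an assumption that makes the result differ from library DWT routines only near the edges.

```python
# Sketch of one decomposition level: convolve with the half-band filters,
# then keep every other sample.
import numpy as np
import pywt

x = np.cos(2 * np.pi * 0.05 * np.arange(128))   # example signal
w = pywt.Wavelet('db2')
h = np.asarray(w.dec_lo)      # half-band lowpass  h[n]
g = np.asarray(w.dec_hi)      # half-band highpass g[n]

# y[k] = sum_n x[n] f[2k - n]  ==  (x * f) downsampled by 2
y_low  = np.convolve(x, h)[::2]
y_high = np.convolve(x, g)[::2]
print(y_low.shape, y_high.shape)
```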

[Figure: Block diagram of the methodology of 1-D DWT.]

[Figure: Frequency responses (bandwidths) of the different output channels of the wavelet filter bank, for a = 2 and three or more levels of decomposition; the band edges fall at ω = π/4, π/2 and π.]

[Figure: Frequency response of the 2-channel Daubechies 8-tap orthogonal wavelet filters (low-pass and high-pass).]
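These magnitude responses can be reproduced with the short sketch below; db4 is the 8-tap Daubechies filter pair in PyWavelets, and the FFT length is an arbitrary choice for a smooth frequency grid.

```python
# Sketch: magnitude responses of the Daubechies 8-tap (db4) analysis filters.
import numpy as np
import pywt

w = pywt.Wavelet('db4')                       # 8-tap orthogonal filter pair
nfft = 512
omega = np.linspace(0, np.pi, nfft // 2 + 1)  # frequency axis, 0..pi

H_lo = np.abs(np.fft.rfft(w.dec_lo, nfft))    # |H0(e^jw)|, low-pass
H_hi = np.abs(np.fft.rfft(w.dec_hi, nfft))    # |H1(e^jw)|, high-pass

# The pair crosses over at omega = pi/2 and is power complementary:
print(omega[np.argmin(np.abs(H_lo - H_hi))])  # ~ pi/2
print(np.allclose(H_lo**2 + H_hi**2, 2.0))    # |H0|^2 + |H1|^2 = 2
```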

[Figure: Frequency response of a 3-channel orthogonal wavelet filter bank (Channel I, Channel II, Channel III).]

[Figure: Frequency response of a 4-channel orthogonal wavelet filter bank (Channels I-IV).]

Two-level maximally decimated filter bank

[Figure: Two-level maximally decimated filter bank built from the analysis filters H0(z), H1(z) with decimators (↓2), and expanders (↑2) with the synthesis filters G0(z), G1(z), which together reconstruct x(t).]
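A two-level cascade can be sketched with single-level DWT calls as below; splitting only the lowpass branch (the usual octave-band wavelet tree), the wavelet 'db2', and the test signal are assumptions made for the example.

```python
# Sketch: two-level tree-structured (maximally decimated) decomposition,
# splitting the lowpass branch at each level, followed by reconstruction.
import numpy as np
import pywt

x = np.random.default_rng(1).standard_normal(254)

cA1, cD1 = pywt.dwt(x, 'db2')        # level 1: approximation + detail
cA2, cD2 = pywt.dwt(cA1, 'db2')      # level 2: split the approximation again

cA1_hat = pywt.idwt(cA2, cD2, 'db2')        # undo level 2
x_hat   = pywt.idwt(cA1_hat, cD1, 'db2')    # undo level 1
print(np.allclose(x, x_hat))                # True: perfect reconstruction
```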

Illustrations to demonstrate the difference between FT, STFT and WT:

$$x(t) = \cos(2\pi\,10t) + \cos(2\pi\,25t) + \cos(2\pi\,50t) + \cos(2\pi\,100t)$$

Note that the FT tells us which frequency components (spectral components) exist in the signal, nothing more, nothing less. When time localization of the spectral components is needed, a transform giving the TIME-FREQUENCY REPRESENTATION of the signal is required.
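A small numpy sketch of this point: the magnitude spectrum of the four-tone signal above shows peaks at 10, 25, 50 and 100 Hz but says nothing about when each tone occurs (the sampling rate and duration are assumptions for the example).

```python
# Sketch: magnitude spectrum of the stationary four-tone test signal.
import numpy as np

fs = 1000.0                              # assumed sampling rate (Hz)
t = np.arange(0, 1.0, 1.0 / fs)
x = (np.cos(2 * np.pi * 10 * t) + np.cos(2 * np.pi * 25 * t)
     + np.cos(2 * np.pi * 50 * t) + np.cos(2 * np.pi * 100 * t))

X = np.abs(np.fft.rfft(x)) / len(x)
f = np.fft.rfftfreq(len(x), d=1.0 / fs)

# The four spectral peaks, with no time information attached:
print(np.sort(f[np.argsort(X)[-4:]]))    # [ 10.  25.  50. 100.]
```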

What is the Wavelet Transform and how does it solve the problem?

View the WT as a plot on a 3-D graph, where time is one axis, frequency the second and amplitude the third. This shows us which frequencies f exist at which times t.
There is an issue, called the "uncertainty principle", which states that we cannot know exactly what frequency exists at what time instant; we can only know what frequency bands exist over what time intervals.

The uncertainty principle, originally found and formulated by Heisenberg, states that the momentum and the position of a moving particle cannot be known simultaneously. This applies to our subject as follows: the frequency and time information of a signal at some certain point in the time-frequency plane cannot be known.
In other words, we cannot know what spectral component exists at any given time instant. The best we can do is to investigate what spectral components exist over any given interval of time.
This is a problem of resolution, and it is the main reason why researchers have switched from the STFT to the WT.
The STFT gives a fixed resolution at all times, whereas the WT gives a variable (or suitable) resolution, as follows: higher frequencies are better resolved in time, and lower frequencies are better resolved in frequency.
This means that a certain high-frequency component can be located better in time (with less relative error) than a low-frequency component. On the contrary, a low-frequency component can be located better in frequency compared to a high-frequency component.

$$STFT_{x}(t', f) = \int \left[\, x(t)\, w^{*}(t - t') \,\right] e^{-j2\pi f t}\, dt$$

with the Gaussian window function

$$w(t) = \exp\!\left(-\frac{a\,t^{2}}{2}\right)$$
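The STFT above can be sketched in numpy by sliding the Gaussian window and taking an FFT of each windowed segment; the test signal, the sampling rate, the window parameter a and the hop size are assumptions for the example. Every frame uses the same window, so the time-frequency resolution is fixed, which is exactly the limitation the WT removes.

```python
# Sketch: STFT with a Gaussian window w(t) = exp(-a t^2 / 2).
import numpy as np

fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
x = np.cos(2 * np.pi * 25 * t)                # example signal

seg, hop, a = 256, 64, 0.005                  # window length, hop, width parameter
n = np.arange(seg)
w = np.exp(-a * (n - seg // 2) ** 2 / 2)      # sampled Gaussian window

frames = [x[i:i + seg] * w for i in range(0, len(x) - seg + 1, hop)]
stft = np.array([np.fft.rfft(frame) for frame in frames])
print(stft.shape)     # (window positions t', frequency bins f)
```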

[Figure: STFT magnitude plots computed with a narrow window, a broader window and a still larger window, illustrating the fixed time-frequency trade-off set by the window width.]

[Figure: Time and frequency resolutions. Time-frequency tiles for the Fourier basis (frequency only), the STFT/Gabor basis (uniform tiles in time and frequency) and the wavelet basis, whose tiles double in time width (T, 2T, 4T) as the frequency band halves (ω0/2, ω0/4).]

Two-dimensional Wavelet Transform

[Figure: Separable 2-D filter bank. The image is filtered along one dimension by a LPF and an HPF; each output is then filtered along the other dimension by a LPF and an HPF and decimated, producing the four subbands LL, LH, HL and HH.]

[Figure: Level I wavelet decomposition of an image.]

[Figure: Level II wavelet decomposition of an image.]
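One level of this separable 2-D decomposition can be sketched with PyWavelets; the 'haar' wavelet and the random array standing in for the image are assumptions for the example.

```python
# Sketch: Level I and Level II of a separable 2-D DWT.
import numpy as np
import pywt

image = np.random.default_rng(2).standard_normal((128, 128))

# Level I: LL approximation plus three detail subbands
LL, (cH, cV, cD) = pywt.dwt2(image, 'haar')
# cH, cV, cD are the horizontal, vertical and diagonal detail subbands
# (the LH/HL/HH bands of the figure; the exact labelling convention varies).

# Level II: decompose the LL band again
LL2, (cH2, cV2, cD2) = pywt.dwt2(LL, 'haar')
print(LL.shape, LL2.shape)    # (64, 64) (32, 32)
```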

References:
P. P. Vaidyanathan, Multirate Systems and Filter Banks, Prentice Hall, 1993.

Wavelet-based analysis of texture images

The problem of shape from 3-D textures

[Figure: examples of 2-D textures and 3-D textures.]

[Figure: a real-world 3-D texture image.]

REFERENCES
1. M. Clerc and S. Mallat, "The Texture Gradient Equation for Recovering Shape from Texture," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 24, No. 4, pp. 536-549, April 2002.
2. J. Garding, "Surface Orientation and Curvature from Differential Texture Distortion," Proceedings of the IEEE Conference on Computer Vision (ICCV 95), 1995, pp. 733-739.
3. J. S. Kwon, H. K. Hong and J. S. Choi, "Obtaining a 3-D orientation of Projective textures using a Morphological Method," Pattern Recognition, Vol. 29, No. 5, pp. 725-732, 1996.
4. T. Leung and J. Malik, "On Perpendicular Textures, or: Why do we see more flowers in the distance?," Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR 97), San Juan, Puerto Rico, 1997, pp. 807-813.
5. J. Malik and R. Rosenholtz, "Computing Local Surface Orientation and Shape from Texture for Curved Surfaces," International Journal of Computer Vision, Vol. 23, No. 2, pp. 149-168, 1997.
6. E. Ribeiro and E. R. Hancock, "Shape from Periodic Texture using the Eigenvectors of Local Affine Distortion," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 23, No. 12, pp. 1459-1465, Dec. 2001.
7. B. J. Super and A. C. Bovik, "Planar surface orientation from texture spatial frequencies," Pattern Recognition, Vol. 28, No. 5, pp. 729-743, 1995.
8. Sukhendu Das and Thomas Greiner, "Wavelet based separable analysis of texture images for extracting orientation of planar surfaces," Proceedings of the Second IASTED International Conference on Visualization, Imaging and Image Processing (IASTED-VIIP), Malaga, Spain, September 9-12, 2002, pp. 607-612.
9. Thomas Greiner and Sukhendu Das, "Recovering Orientation of a textured planar surface using wavelet transform," Indian Conference on Computer Vision, Graphics and Image Processing (ICVGIP '02), Space Applications Centre (SAC-ISRO), Ahmedabad, India, December 16-18, 2002, pp. 254-259.
