Chapter 5 Algorithms For Optimum Linear Filters

Electrical and Computer Engineering Master's Program (Communication Engineering), Addis Ababa University. Course: Linear Systems

Uploaded by Sisay

Statistical Digital Signal Processing
Algorithms and Structures for Optimum Linear Filters
Introduction
• The design and application of optimum filters
involves
– Determination of the optimum set of coefficients,
– Evaluation of the cost function to determine
whether the obtained parameters satisfy the design
requirements, and
– The implementation of the optimum filter.
• There are several important reasons to study the normal
equations in greater detail and to develop efficient,
special-purpose algorithms for their solution.
– The throughput requirements of several real-time applications can be
met only by algorithms that exploit the special structure of the
correlation matrix.
– We can develop order-recursive algorithms that help us choose the
correct filter order or stop before numerical problems arise.
– Some algorithms lead to intermediate sets of parameters that have physical
meaning, provide easy tests, or are useful in special applications.
– Sometimes there is a link between the algorithm for the solution of the
normal equations and the structure for implementation.
Order-recursive algorithms
• Fixed-order algorithms for solving the normal
equations require the order of the estimator to be
known in advance.
• When the order of the estimator becomes a design
variable, fixed-order algorithms are not effective.
– If the order changes, the optimum coefficients must be
recomputed from scratch.
• We would like to arrange the computations so
that the results for order m, that is, cm(n) or
ŷm(n), can be used to compute the estimates for
order m + 1, that is, cm+1(n) or ŷm+1(n).
– The resulting procedures are called order-recursive
algorithms or order-updating relations.
• Similarly, procedures that compute cm(n + 1) from
cm(n) or ŷm(n + 1) from ŷm(n) are called time-
recursive algorithms or time-updating relations.
Matrix Partitioning and Optimum Nesting

• If the order of the estimator increases from m to
m + 1, the input data vector is augmented
with one additional observation xm+1.
– The data vector can be partitioned in terms of its first m
components (upper partitioning) or its last m components
(lower partitioning).
– Correspondingly, Rm+1 contains an m×m submatrix as its
leading (first) block or as its trailing (last) block.
• Since Rm appears as the leading submatrix of Rm+1,
knowledge of the order-(m + 1) quantities provides
everything needed at order m.
• This is known as the optimum nesting property
and is instrumental in the development of order-
recursive algorithms.
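A sketch of the upper partitioning these bullets refer to (the equation images did not survive conversion; the symbols rm^b and ρm are assumed notation, reconstructed from standard treatments):

```latex
\mathbf{x}_{m+1} = \begin{bmatrix} \mathbf{x}_m \\ x_{m+1} \end{bmatrix},
\qquad
\mathbf{R}_{m+1} = \begin{bmatrix}
\mathbf{R}_m & \mathbf{r}_m^{b} \\
\mathbf{r}_m^{bH} & \rho_m
\end{bmatrix},
\qquad
\mathbf{r}_m^{b} \triangleq E\{\mathbf{x}_m x_{m+1}^{*}\},\quad
\rho_m \triangleq E\{|x_{m+1}|^{2}\}
```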
• The inverse of the order-(m + 1) autocorrelation matrix
is given as the following.

Where:

Alternatively:
• Note that:
– The inverse of Rm+1 is obtained directly from the
inverse of Rm.
– The vector bm determines the MMSE estimate of the
observation xm+1 from the data vector xm.
– The inverse matrix, however, does not have the
optimum nesting property.
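A hedged reconstruction of the inversion-by-partitioning result described above (standard matrix-inversion-by-partitioning applied to the upper partitioning; αm denotes the backward prediction MMSE):

```latex
\mathbf{R}_{m+1}^{-1} =
\begin{bmatrix} \mathbf{R}_m^{-1} & \mathbf{0} \\ \mathbf{0}^{H} & 0 \end{bmatrix}
+ \frac{1}{\alpha_m}
\begin{bmatrix} \mathbf{b}_m \\ 1 \end{bmatrix}
\begin{bmatrix} \mathbf{b}_m^{H} & 1 \end{bmatrix},
\qquad
\mathbf{b}_m = -\mathbf{R}_m^{-1}\mathbf{r}_m^{b},\quad
\alpha_m = \rho_m + \mathbf{r}_m^{bH}\mathbf{b}_m
```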
Levinson Recursion for the Optimum
Estimator
• Solving the order-(m + 1) normal equations

Where
• Note that
– Even though the equation is order-recursive, the
parameter cm+1 does not have the optimum nesting
property.

– If bm is known, cm+1 can be calculated.


– However, the calculation of bm requires the inversion
of Rm, so the computational savings are minimal.
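A reconstruction of the Levinson order recursion for the estimator in the notation of the preceding partitioning (the symbols km^c and βm^c are assumed, not taken from the slides; the scalar dm+1 = E{xm+1 y*} is the last component of the order-(m + 1) cross-correlation vector):

```latex
\mathbf{c}_{m+1} =
\begin{bmatrix} \mathbf{c}_m \\ 0 \end{bmatrix}
+ k_m^{c} \begin{bmatrix} \mathbf{b}_m \\ 1 \end{bmatrix},
\qquad
k_m^{c} = \frac{\beta_m^{c}}{\alpha_m},\quad
\beta_m^{c} = d_{m+1} + \mathbf{b}_m^{H}\mathbf{d}_m
```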
Order-Recursive Computation of the LDLH
Decomposition
• The order-(m + 1) autocorrelation matrix Rm+1 can be written as

Where

• Note that both matrices have the optimum nesting
property.
• From the LDLH decomposition, the linear MMSE normal equations give

• Since Lm is lower triangular, km has the optimum
nesting property.

• However, since LmH is not lower triangular, cm does
not satisfy the optimum nesting property.
• The MMSE also has the optimum nesting property
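A reconstruction of the decomposition and its order-recursive structure (Lm unit lower triangular, Dm = diag(ξ1, …, ξm); the equation images were lost, so this follows the standard development):

```latex
\mathbf{R}_m = \mathbf{L}_m \mathbf{D}_m \mathbf{L}_m^{H},\qquad
\mathbf{L}_m \mathbf{D}_m \mathbf{k}_m = \mathbf{d}_m
\;\Rightarrow\;
\mathbf{c}_m = \mathbf{L}_m^{-H}\mathbf{k}_m,\qquad
P_m = P_y - \mathbf{k}_m^{H}\mathbf{D}_m\mathbf{k}_m
    = P_{m-1} - \xi_m |k_m|^{2}
```

The last equality makes the nesting of the MMSE explicit: each order update subtracts one nonnegative term.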
Order-Recursive Computation of the
Optimum Estimate
• The computation of the optimum linear
estimate using a linear combiner requires m
multiplications and m − 1 additions.
– To compute the estimate for 1 ≤ m ≤ M, we need
M(M + 1)/2 operations.
• From LDLH decomposition,

• Define a new vector wm, called the innovations vector, as


• Then the estimate is given as

• Since both kmH and wm satisfy the optimum
nesting property, the estimate also has the optimum
nesting property.
• Therefore, the order-(m + 1) estimate is obtained from
the order-m estimate by adding a single correction term.
• Note that:
– The correlation of wm is

– Therefore the components of wm are uncorrelated.


– The transformation from xm to wm removes all the
redundant correlation among components of x.
– Therefore each wi adds new information or
innovation.
– The estimation equation shows that the
improvement in the estimate when an additional
observation is included is proportional to the
innovation wm+1 contained in xm+1.
– Therefore, the transformation Lm^-1 acts as a decorrelator.
– kmH acts as a linear combiner.
– The LDLH decomposition can be seen as the matrix
equivalent of spectral factorization.
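A minimal numerical sketch of this section, assuming a hypothetical correlation matrix R, cross-correlation vector d, and data vector x (none of these numbers come from the slides; real-valued case, so LDLH reduces to LDLT): the decomposition yields the innovations w = Lm^-1 x and the nested order-recursive estimates ŷm.

```python
# Hedged sketch: order-recursive linear MMSE estimation via the LDL^T
# decomposition. All numbers below are hypothetical illustrations.

def ldl(R):
    """Return unit-lower-triangular L and diagonal D with R = L D L^T."""
    n = len(R)
    L = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    D = [0.0] * n
    for j in range(n):
        D[j] = R[j][j] - sum(L[j][p] ** 2 * D[p] for p in range(j))
        for i in range(j + 1, n):
            L[i][j] = (R[i][j] - sum(L[i][p] * L[j][p] * D[p]
                                     for p in range(j))) / D[j]
    return L, D

def forward_solve(L, b):
    """Solve L y = b for a unit lower triangular L."""
    y = []
    for i in range(len(b)):
        y.append(b[i] - sum(L[i][p] * y[p] for p in range(i)))
    return y

R = [[4.0, 2.0, 1.0],        # hypothetical correlation matrix
     [2.0, 3.0, 0.5],
     [1.0, 0.5, 2.0]]
d = [1.0, 2.0, 0.5]          # hypothetical cross-correlation vector
x = [0.3, -1.2, 0.7]         # hypothetical observed data vector

L, D = ldl(R)
w = forward_solve(L, x)      # innovations: w = L^{-1} x (decorrelated data)
k = [ki / Di for ki, Di in zip(forward_solve(L, d), D)]  # solves L D k = d
# Nested estimates: the order-(m+1) estimate reuses all order-m terms.
y_hat = [sum(k[i] * w[i] for i in range(m + 1)) for m in range(len(x))]
```

Note how each entry of `y_hat` adds exactly one new term `k[m] * w[m]` to the previous one, which is the order-updating relation in code form.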
Gram-Schmidt Orthogonalization
• The Gram-Schmidt procedure produces the
innovations {w1,w2,...,wm} by orthogonalizing
the original set {x1,x2,...,xm}.
• By using Gram-Schmidt orthogonalization, it is
possible to obtain mutually uncorrelated
innovations from x as

• The autocorrelation is therefore,
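A reconstruction of the orthogonalization steps and the resulting diagonal autocorrelation (the equation images were lost; the projection weights shown are the standard MMSE coefficients):

```latex
w_1 = x_1,\qquad
w_i = x_i - \sum_{j=1}^{i-1} \frac{E\{x_i w_j^{*}\}}{E\{|w_j|^{2}\}}\, w_j
\quad (i = 2,\dots,m),
\qquad
E\{\mathbf{w}_m \mathbf{w}_m^{H}\} = \mathbf{D}_m
```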


ORDER-RECURSIVE ALGORITHMS FOR
OPTIMUM FIR FILTERS
• The key difference between a linear combiner and an
FIR filter is the nature of the input data vector.
• The input data vector for FIR filters consists of
consecutive samples from the same discrete-time
stochastic process.
• Taking the shift invariance of the input data vector into account,
• The correlation matrix Rm+1(n) can be shown to be

• Note that
• If the optimum order-m FIR filter coefficients are known at
time n, the order-(m + 1) coefficients can be calculated as

• This is called the Levinson order recursion.
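A reconstruction of this Levinson order recursion for the FIR case (the equation image was lost; bm(n) is the BLP coefficient vector and km^c(n) the corresponding gain, as in the linear-combiner development):

```latex
\mathbf{c}_{m+1}(n) =
\begin{bmatrix} \mathbf{c}_m(n) \\ 0 \end{bmatrix}
+ k_m^{c}(n) \begin{bmatrix} \mathbf{b}_m(n) \\ 1 \end{bmatrix}
```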


• By substitution,
• For this order recursion to be useful, we need
an order recursion for the BLP bm(n).
• This is possible if bm(n) has the optimum nesting
property.
• The right-hand-side vectors are not nested if we use
the upper partitioning.
• If we use lower-upper partitioning

• By using lower-upper partitioning of Rm+1


• By substitution

• Similarly, am(n) does not have the optimum nesting property.


• Order recursion for FLP

• Clearly, am does not have the optimum nesting
property.
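A reconstruction of the coupled FLP/BLP order recursions referred to above (the equation images were lost; km^f(n) and km^b(n) denote the forward and backward gains, assumed notation):

```latex
\mathbf{a}_{m+1}(n) =
\begin{bmatrix} \mathbf{a}_m(n) \\ 0 \end{bmatrix}
+ k_m^{f}(n) \begin{bmatrix} \mathbf{b}_m(n-1) \\ 1 \end{bmatrix},
\qquad
\mathbf{b}_{m+1}(n) =
\begin{bmatrix} 0 \\ \mathbf{b}_m(n-1) \end{bmatrix}
+ k_m^{b}(n) \begin{bmatrix} 1 \\ \mathbf{a}_m(n) \end{bmatrix}
```

Although neither am(n) nor bm(n) is nested on its own, the pair can be updated jointly, which is what makes the recursion work.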
Simplification for Stationary Stochastic
Processes
• When x(n) and y(n) are jointly wide-sense stationary
(WSS), the optimum estimators are time-invariant
and we have the following simplifications:
– All quantities are independent of n, so no time recursion
is necessary.
– bm = Jam*, where J is the exchange (reversal) matrix.
• This is due to the Toeplitz structure of the
autocorrelation matrix.
• Therefore, Rm+1 can be partitioned as

• It can be shown that


• Where

• The optimum coefficients are
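A reconstruction of the stationary-case relations (equation images lost; rm ≜ [r(1), …, r(m)]^T is assumed notation for the autocorrelation lags):

```latex
\mathbf{R}_{m+1} =
\begin{bmatrix}
\mathbf{R}_m & \mathbf{J}\mathbf{r}_m \\
\mathbf{r}_m^{H}\mathbf{J} & r(0)
\end{bmatrix},
\qquad
\mathbf{b}_m = \mathbf{J}\mathbf{a}_m^{*},
\qquad
\mathbf{c}_{m+1} =
\begin{bmatrix} \mathbf{c}_m \\ 0 \end{bmatrix}
+ k_m^{c} \begin{bmatrix} \mathbf{J}\mathbf{a}_m^{*} \\ 1 \end{bmatrix}
```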


Levinson-Durbin Algorithm
• For a stationary random process, the Toeplitz structure
of the autocorrelation matrix can be used to derive
efficient order-recursive algorithms.
• Suppose that cm is known

and we wish to determine


• Rm+1 and dm+1 can be partitioned as follows.
• By utilizing the Toeplitz structure of Rm,

• To avoid the use of lower-right-corner partitioning,
the FLP recursion can be used to obtain am.
• This leads to the Levinson recursion
• The Levinson recursion consists of two parts:
– a set of recursions that compute the FLP or BLP
coefficients am or bm, and
– a set of recursions that compute the optimum filter
from am or bm, needed only if the filter coefficients c
are required.
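A runnable sketch of the Levinson-Durbin recursion for the forward predictor (real-valued case; the autocorrelation sequence in the usage example is hypothetical, not from the slides):

```python
def levinson_durbin(r, order):
    """Levinson-Durbin recursion, real-valued case.

    r : autocorrelation lags r[0], ..., r[order]
    Returns (a, E, refl): error-filter coefficients a (so the forward
    prediction error is x(n) + a[0] x(n-1) + ... + a[M-1] x(n-M)),
    the final prediction MMSE E, and the reflection coefficients.
    """
    a, refl = [], []
    E = r[0]                         # order-0 prediction error power
    for m in range(1, order + 1):
        # reflection coefficient for the order update m-1 -> m
        km = -(r[m] + sum(a[i] * r[m - 1 - i] for i in range(m - 1))) / E
        # Levinson order update of the predictor coefficients
        a = [a[i] + km * a[m - 2 - i] for i in range(m - 1)] + [km]
        E *= 1.0 - km * km           # MMSE shrinks with each order update
        refl.append(km)
    return a, E, refl

# Hypothetical AR(1)-like autocorrelation r[k] = 0.5**k:
a, E, refl = levinson_durbin([1.0, 0.5, 0.25], 2)
# a -> [-0.5, 0.0] and E -> 0.75: only the first reflection
# coefficient is nonzero, consistent with a first-order process.
```

This is the first part of the Levinson recursion only (the predictor); the filter-coefficient recursion would be layered on top when c is needed.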
LATTICE STRUCTURES FOR OPTIMUM FIR
FILTERS
• To compute the FLP error and BLP error

• Using direct-form filter structure
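A reconstruction of the direct-form error computations this bullet refers to (equation images lost; sign conventions follow the predictor definitions assumed earlier):

```latex
e_m^{f}(n) = x(n) + \mathbf{a}_m^{H}(n)\,\mathbf{x}_m(n-1),\qquad
e_m^{b}(n) = x(n-m) + \mathbf{b}_m^{H}(n)\,\mathbf{x}_m(n)
```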


• Since am and bm do not have the optimum
nesting property, we cannot obtain order-
recursive direct-form structures for the
computation of the prediction errors.
• By partitioning x,
• FLP errors are

• BLP errors are


• These equations can be computed for m=0,1,…,M-1
given initial conditions
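A reconstruction of the lattice recursions and their initial conditions (equation images lost; stationary case assumed, with a single reflection coefficient km+1 serving both branches of each stage):

```latex
e_{m+1}^{f}(n) = e_{m}^{f}(n) + k_{m+1}^{*}\, e_{m}^{b}(n-1),\qquad
e_{m+1}^{b}(n) = e_{m}^{b}(n-1) + k_{m+1}\, e_{m}^{f}(n),\qquad
e_{0}^{f}(n) = e_{0}^{b}(n) = x(n)
```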

• Implementation
• Given that

• The optimum filtering error can be computed
from the BLP error.
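The lattice computation above can be sketched in code (real-valued, stationary case with fixed reflection coefficients; the input samples and coefficient in the usage example are hypothetical):

```python
def lattice_prediction_errors(x, refl):
    """All-zero lattice producing the order-M FLP and BLP errors.

    x    : input samples x(0), x(1), ...
    refl : reflection coefficients k_1, ..., k_M (real-valued)
    Returns (ef, eb): forward and backward prediction errors per sample.
    """
    eb_delay = [0.0] * len(refl)     # stores e_m^b(n-1) for each stage m
    ef_out, eb_out = [], []
    for xn in x:
        ef = eb = xn                 # stage 0: e_0^f(n) = e_0^b(n) = x(n)
        for m, km in enumerate(refl):
            ef_next = ef + km * eb_delay[m]   # e_{m+1}^f(n)
            eb_next = eb_delay[m] + km * ef   # e_{m+1}^b(n)
            eb_delay[m] = eb                  # delay e_m^b(n) for time n+1
            ef, eb = ef_next, eb_next
        ef_out.append(ef)
        eb_out.append(eb)
    return ef_out, eb_out

# One stage: e_1^f(n) = x(n) + k_1 x(n-1), e_1^b(n) = x(n-1) + k_1 x(n)
ef, eb = lattice_prediction_errors([1.0, 2.0, 3.0], [-0.5])
# ef -> [1.0, 1.5, 2.0], eb -> [-0.5, 0.0, 0.5]
```

Each stage only needs its own reflection coefficient and a one-sample delay, which is why adding a stage extends the filter order without recomputing the earlier stages.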
