Chapter 5
Algorithms and Structures for Optimum Linear Filters
Introduction
• The design and application of optimum filters involves:
– Determination of the optimum set of coefficients,
– Evaluation of the cost function to determine whether the obtained parameters satisfy the design requirements, and
– Implementation of the optimum filter.
• There are several important reasons to study the normal
equations in greater detail in order to develop efficient,
special-purpose algorithms for their solution.
– The throughput requirements of several real-time applications can be met only by algorithms that exploit the special structure of the correlation matrix.
– We can develop order-recursive algorithms that help us to choose the correct filter order or to stop before numerical problems arise.
– Some algorithms lead to intermediate sets of parameters that have physical
meaning, provide easy tests, or are useful in special applications.
– Sometimes there is a link between the algorithm for the solution of the
normal equations and the structure for implementation.
Order-Recursive Algorithms
• Fixed-order algorithms for solving the normal equations require the order of the estimator to be known in advance.
• When the order of the estimator becomes a design variable, fixed-order algorithms are not effective.
– If the order changes, the optimum coefficients have to be recomputed from scratch.
• We would like to arrange the computations so that the results for order m, that is, cm(n) or ŷm(n), can be used to compute the estimates for order m + 1, that is, cm+1(n) or ŷm+1(n).
– The resulting procedures are called order-recursive algorithms or order-updating relations (a baseline fixed-order sketch is given below for contrast).
• Similarly, procedures that compute cm(n + 1) from cm(n) or ŷm(n + 1) from ŷm(n) are called time-recursive algorithms or time-updating relations.
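As a concrete baseline, the sketch below (not from the text; the autocorrelation sequence r and cross-correlation vector d are hypothetical example data) solves the normal equations Rm cm = dm from scratch for every order m. This is the repeated work that order-updating relations are designed to avoid, and it also shows that cm is not simply the first m elements of cm+1.

```python
import numpy as np

# Fixed-order baseline: re-solve the normal equations R_m c_m = d_m
# from scratch for every order m = 1, ..., M (roughly O(m^3) work each time).
rng = np.random.default_rng(0)

M = 6
r = 0.9 ** np.arange(M)        # hypothetical autocorrelation lags r[0..M-1]
d = rng.standard_normal(M)     # hypothetical cross-correlation values

coeffs = {}
for m in range(1, M + 1):
    R_m = np.array([[r[abs(i - j)] for j in range(m)] for i in range(m)])
    coeffs[m] = np.linalg.solve(R_m, d[:m])   # no reuse of the order-(m-1) result

# The optimum coefficients do not nest: c_3 differs from the first 3 entries of c_4.
print(coeffs[3])
print(coeffs[4][:3])
```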
Matrix Partitioning and Optimum Nesting
• Since the data vector has the optimum nesting property,
$$\mathbf{x}_{m+1}(n) = \begin{bmatrix}\mathbf{x}_m(n) \\ x_{m+1}(n)\end{bmatrix}$$
the autocorrelation matrix can be partitioned as
$$\mathbf{R}_{m+1} = \begin{bmatrix}\mathbf{R}_m & \mathbf{r}_m^b \\ \mathbf{r}_m^{bH} & \rho_{m+1}\end{bmatrix}$$
Where $\mathbf{r}_m^b = E\{\mathbf{x}_m(n)\,x_{m+1}^*(n)\}$ and $\rho_{m+1} = E\{|x_{m+1}(n)|^2\}$; the first m×m submatrix of Rm+1 is Rm.
Alternatively, partitioning with respect to the first component of the data vector,
$$\mathbf{R}_{m+1} = \begin{bmatrix}\rho_1 & \mathbf{r}_m^{fH} \\ \mathbf{r}_m^f & \bar{\mathbf{R}}_m\end{bmatrix}$$
whose last m×m submatrix is R̄m. Applying the matrix inversion by partitioning lemma to the first form,
$$\mathbf{R}_{m+1}^{-1} = \begin{bmatrix}\mathbf{R}_m^{-1} & \mathbf{0} \\ \mathbf{0}^H & 0\end{bmatrix} + \frac{1}{\alpha_m}\begin{bmatrix}\mathbf{b}_m \\ 1\end{bmatrix}\begin{bmatrix}\mathbf{b}_m^H & 1\end{bmatrix}$$
where
$$\mathbf{b}_m = -\mathbf{R}_m^{-1}\mathbf{r}_m^b, \qquad \alpha_m = \rho_{m+1} + \mathbf{r}_m^{bH}\mathbf{b}_m$$
• Note that:
– The inverse of the (m+1)×(m+1) autocorrelation matrix Rm+1 is obtained directly from the inverse of Rm.
– The vector bm is determined by the MMSE estimation of the observation xm+1(n) from the data vector xm(n) (backward linear prediction), and αm is the corresponding MMSE.
– The inverse matrix does not have the optimum nesting property: the inverse of Rm is not a submatrix of the inverse of Rm+1.
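The partitioned-inverse relation above is easy to verify numerically. The sketch below (my own check, with an arbitrary symmetric positive definite matrix standing in for Rm+1) builds the inverse of Rm+1 from the inverse of Rm together with bm and αm, and compares it with a direct inversion.

```python
import numpy as np

# Numerical check of:
#   inv(R_{m+1}) = [[inv(R_m), 0], [0, 0]] + (1/alpha_m) * [b_m; 1] [b_m; 1]^T
# with b_m = -inv(R_m) r_m^b and alpha_m = rho_{m+1} + r_m^b . b_m  (real data).
rng = np.random.default_rng(1)

m = 4
A = rng.standard_normal((m + 1, m + 1))
R_next = A @ A.T + (m + 1) * np.eye(m + 1)   # stand-in for R_{m+1} (symmetric p.d.)

R_m = R_next[:m, :m]      # first m x m submatrix (optimum nesting)
r_b = R_next[:m, m]       # r_m^b: correlation between x_m and x_{m+1}
rho = R_next[m, m]        # rho_{m+1}: power of x_{m+1}

R_m_inv = np.linalg.inv(R_m)
b_m = -R_m_inv @ r_b                  # backward estimator coefficients
alpha_m = rho + r_b @ b_m             # corresponding MMSE (> 0 for p.d. R)

u = np.append(b_m, 1.0).reshape(-1, 1)
R_next_inv = np.block([[R_m_inv, np.zeros((m, 1))],
                       [np.zeros((1, m)), np.zeros((1, 1))]]) + (u @ u.T) / alpha_m

print(np.allclose(R_next_inv, np.linalg.inv(R_next)))   # True
```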
Levinson Recursion for the Optimum Estimator
• Solving the order-(m+1) normal equations Rm+1 cm+1 = dm+1 with the partitioned inverse of Rm+1 gives
$$\mathbf{c}_{m+1} = \begin{bmatrix}\mathbf{c}_m \\ 0\end{bmatrix} + k_m\begin{bmatrix}\mathbf{b}_m \\ 1\end{bmatrix}$$
Where the cross-correlation vector has the optimum nesting property,
$$\mathbf{d}_{m+1} = \begin{bmatrix}\mathbf{d}_m \\ d_{m+1}\end{bmatrix}, \qquad \mathbf{c}_m = \mathbf{R}_m^{-1}\mathbf{d}_m, \qquad k_m = \frac{\beta_m}{\alpha_m}, \qquad \beta_m = d_{m+1} + \mathbf{b}_m^H\mathbf{d}_m$$
• Note that
– Even though the relation is order-recursive, the parameter vector cm+1 does not have the optimum nesting property: its first m elements equal cm + km bm, not cm.
– The step from order m to order m+1 uses only quantities already available at order m (cm, bm, αm) together with the new cross-correlation component dm+1.
• If the optimum FIR filter coefficients of order m are known at time n, the coefficients of order m+1 at the same time instant can be calculated as
$$\mathbf{c}_{m+1}(n) = \begin{bmatrix}\mathbf{c}_m(n) \\ 0\end{bmatrix} + k_m(n)\begin{bmatrix}\mathbf{b}_m(n) \\ 1\end{bmatrix}$$
• Implementation: given that
$$\hat{y}_m(n) = \mathbf{c}_m^H(n)\,\mathbf{x}_m(n)$$
substituting the order recursion for cm+1(n) yields an order recursion for the estimate itself,
$$\hat{y}_{m+1}(n) = \hat{y}_m(n) + k_m^*(n)\,e_m^b(n), \qquad e_m^b(n) = x_{m+1}(n) + \mathbf{b}_m^H(n)\,\mathbf{x}_m(n)$$
so the order-m results are reused to obtain the order-(m+1) estimate.
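Putting the pieces together, the sketch below (assuming real-valued data and my own variable names) implements the order recursion for the optimum estimator with a general correlation matrix: at each step it forms bm and αm from the stored inverse of Rm, computes km, and updates cm+1 = [cm; 0] + km [bm; 1] along with the partitioned inverse. When the correlation matrix is Toeplitz, the classical Levinson and Levinson-Durbin algorithms also update bm recursively and never store the inverse explicitly; that refinement is not used here.

```python
import numpy as np

def levinson_estimator(R, d):
    """Order-recursive solution of R_m c_m = d_m for m = 1, ..., M (real data)."""
    M = len(d)
    R_inv = np.array([[1.0 / R[0, 0]]])      # inv(R_1)
    c = np.array([d[0] / R[0, 0]])           # c_1
    solutions = [c.copy()]
    for m in range(1, M):
        r_b, rho, d_new = R[:m, m], R[m, m], d[m]
        b = -R_inv @ r_b                     # backward estimator b_m
        alpha = rho + r_b @ b                # backward MMSE alpha_m
        k = (d_new + b @ d[:m]) / alpha      # gain k_m = beta_m / alpha_m
        c = np.append(c + k * b, k)          # c_{m+1} = [c_m; 0] + k_m [b_m; 1]
        # carry the inverse along via the matrix-inversion-by-partitioning step
        u = np.append(b, 1.0).reshape(-1, 1)
        R_inv = np.block([[R_inv, np.zeros((m, 1))],
                          [np.zeros((1, m)), np.zeros((1, 1))]]) + (u @ u.T) / alpha
        solutions.append(c.copy())
    return solutions

# Check against a direct solve on hypothetical example data.
rng = np.random.default_rng(2)
A = rng.standard_normal((5, 5))
R = A @ A.T + 5 * np.eye(5)
d = rng.standard_normal(5)
print(np.allclose(levinson_estimator(R, d)[-1], np.linalg.solve(R, d)))  # True
```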