
Applications of Chromatic Derivatives

Aleks Ignjatović

ignjat@cse.unsw.edu.au

School of Computer Science and Engineering


University of New South Wales
Sydney, Australia
There are none.
The sad story of my life: how the whole thing started

Figure: Left: block diagram of a PWM amplifier (input signal, error, PWM extrapolator, switch, DC, LC output filtering with inductor and capacitor, speaker); Right: waveform prediction.


Towards local signal representation

- Let f ∈ BL(π), i.e., f ∈ L² with $\hat f(\omega)$ supported on [−π, π].

————————————————————————————

Shannon's expansion (Whittaker–Kotelnikov–Nyquist–Shannon):

$f(t) = \sum_{n=-\infty}^{\infty} f(n)\, \frac{\sin \pi(t-n)}{\pi(t-n)}$

- global in nature – requires samples f(n) for all n;
- fundamental to signal processing;
- poorly represents local signal behavior.

————————————————————————————

Taylor's expansion:

$f(t) = \sum_{n=0}^{\infty} f^{(n)}(0)\, \frac{t^n}{n!}$

- local in nature – requires $f^{(n)}(t)$ at a single instant t = 0;
- very little use in signal processing – why?
Problems with Taylor’s expansion of BL(π) signals

1. Numerical evaluation of high-order derivatives of a noisy sampled signal is infeasible.

2. Truncations of Shannon's expansion of an f ∈ BL(π):

   – belong to BL(π);
   – converge to f both uniformly and in L²;
   – if A is a filter, then
     $A[f](t) = \sum_{n=-\infty}^{\infty} f(n)\, A[\mathrm{sinc}](t-n).$   (1)

   In comparison, truncations of Taylor's expansion of an f ∈ BL(π) have none of these important properties.

Can we fix all of these problems???


Numerical differentiation of band limited signals
Let f ∈ BL(π); then

$\frac{f^{(n)}(t)}{\pi^n} = \frac{1}{2\pi}\int_{-\pi}^{\pi} i^n \left(\frac{\omega}{\pi}\right)^{n} \hat f(\omega)\, e^{i\omega t}\, d\omega.$

Figure: $(\omega/\pi)^n$ for n = 15–18.

- derivatives of high order obliterate the spectrum;
- transfer functions of the (normalized) derivatives cluster together and are nearly indistinguishable;
- can we find a better base for the space of linear differential operators? An orthogonal base?
Orthogonal base for the space of linear diff. operators
- Start with normalized and re-scaled Legendre polynomials:

  $\frac{1}{2\pi}\int_{-\pi}^{\pi} P_n^L(\omega)\, P_m^L(\omega)\, d\omega = \delta(m-n).$

- Obtain operator polynomials by replacing $\omega^k$ with $i^k\, d^k/dt^k$:

  $K_t^n = (-i)^n\, P_n^L\!\left(i\,\frac{d}{dt}\right)$

- Definition of $K^n$ chosen so that

  $K_t^n[e^{i\omega t}] = i^n\, P_n^L(\omega)\, e^{i\omega t}.$

- Thus, for f ∈ BL(π),

  $K^n[f](t) = \frac{1}{2\pi}\int_{-\pi}^{\pi} i^n\, P_n^L(\omega)\, \hat f(\omega)\, e^{i\omega t}\, d\omega.$
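As an illustrative aside, the last formula can be evaluated by straightforward quadrature. The sketch below (Python; the helper name, the grid size and the test signal are arbitrary choices) takes f = sinc, so that $\hat f(\omega) \equiv 1$ on [−π, π], and compares the quadrature values with the closed form $K^n[\mathrm{sinc}](t) = (-1)^n\sqrt{2n+1}\, j_n(\pi t)$ that appears on the chromatic-expansion slide further below.

import numpy as np
from scipy.special import eval_legendre, spherical_jn

def chromatic_derivative(n, f_hat, t, num=4001):
    # K^n[f](t) = (1/2pi) * integral over [-pi, pi] of i^n P_n^L(w) f_hat(w) e^{iwt} dw,
    # evaluated with a trapezoidal rule; P_n^L(w) = sqrt(2n+1) * P_n(w/pi).
    w = np.linspace(-np.pi, np.pi, num)
    PnL = np.sqrt(2*n + 1) * eval_legendre(n, w / np.pi)
    integrand = (1j**n) * PnL * f_hat(w) * np.exp(1j * w * t)
    weights = np.full(num, w[1] - w[0])
    weights[[0, -1]] *= 0.5
    return np.sum(weights * integrand).real / (2*np.pi)

f_hat = lambda w: np.ones_like(w)        # spectrum of sinc on [-pi, pi]
t = 0.7
for n in range(6):
    print(n, chromatic_derivative(n, f_hat, t),
          (-1)**n * np.sqrt(2*n + 1) * spherical_jn(n, np.pi * t))   # the two columns agree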
Why are chromatic derivatives a better base?
- Compare the graphs of the transfer functions of $\frac{1}{\pi^n}\frac{d^n}{dt^n}$, i.e., $(\omega/\pi)^n$ (first graph), and of $K^n$, i.e., $P_n^L(\omega)$ (second graph).

- Transfer functions of $K^n$ form a sequence of well separated comb filters which preserve spectral features of the signal; thus we call them the chromatic derivatives.

- Third graph: transfer function of the ideal filter $K^{15}$ (red) vs. transfer function of a transversal filter (blue) (128 taps, 2× oversampling).
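To put a number on the clustering claim, here is a small illustrative sketch (orders and grid chosen arbitrarily): each transfer function is normalized to unit energy on [−π, π], and the L² distance between same-parity neighbours is computed. The normalized ordinary derivatives are nearly identical, while the chromatic ones, being orthonormal, sit about √2 apart.

import numpy as np
from scipy.special import eval_legendre

w = np.linspace(-np.pi, np.pi, 4001)
dw = w[1] - w[0]

def unit(v):
    # normalize so that (1/2pi) * int |v|^2 dw = 1
    return v / np.sqrt(np.sum(v**2) * dw / (2*np.pi))

def dist(a, b):
    return np.sqrt(np.sum((a - b)**2) * dw / (2*np.pi))

for n in (15, 17):
    d_tf = unit((w/np.pi)**n), unit((w/np.pi)**(n + 2))        # normalized ordinary derivatives
    k_tf = (np.sqrt(2*n + 1) * eval_legendre(n, w/np.pi),
            np.sqrt(2*n + 5) * eval_legendre(n + 2, w/np.pi))  # chromatic transfer functions
    print(n, dist(*d_tf), dist(*k_tf))   # roughly 0.06 vs. 1.41 (= sqrt(2))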
Local representation of the scalar product in BL(π)
Proposition: Assume that f, g ∈ BL(π); then the sums on the left-hand side of the following equations do not depend on the choice of the instant t, and

$\sum_{n=0}^{\infty} K^n[f](t)^2 = \int_{-\infty}^{\infty} f(t)^2\, dt = \|f\|^2$

$\sum_{n=0}^{\infty} K^n[f](t)\, K^n[g](t) = \int_{-\infty}^{\infty} f(t)\, g(t)\, dt = \langle f, g\rangle$

$\sum_{n=0}^{\infty} K^n[f](t)\, K_t^n[g(u-t)] = \int_{-\infty}^{\infty} f(t)\, g(u-t)\, dt = (f * g)(u)$

- These are the local equivalents of the usual, "globally defined" norm, scalar product and convolution!
- Aim: "maximally localized" signal processing, suitable for transient analysis.
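A quick numerical check of the first identity, assuming f = sinc and using the closed form $K^n[\mathrm{sinc}](t) = (-1)^n\sqrt{2n+1}\, j_n(\pi t)$ from the next slide; since ‖sinc‖² = 1, the partial sums should be close to 1 at every instant t.

import numpy as np
from scipy.special import spherical_jn

def partial_local_norm(t, N=80):
    # sum_{n<=N} K^n[sinc](t)^2, with K^n[sinc](t) = (-1)^n sqrt(2n+1) j_n(pi t)
    n = np.arange(N + 1)
    Kn = (-1.0)**n * np.sqrt(2*n + 1) * spherical_jn(n, np.pi * t)
    return np.sum(Kn**2)

for t in (0.0, 2.3, -7.1):
    print(t, partial_local_norm(t))    # all close to ||sinc||^2 = 1, independently of t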
Fixing Taylor’s Expansion: Chromatic Expansion
Proposition: Let $\mathrm{sinc}(t) = \frac{\sin(\pi t)}{\pi t}$ and let f(t) be any analytic function. Then

$f(t) = \sum_{n=0}^{\infty} (-1)^n K^n[f](0)\, K^n[\mathrm{sinc}](t) = \sum_{n=0}^{\infty} K^n[f](0)\, \sqrt{2n+1}\, j_n(\pi t)$

$j_n$ – the spherical Bessel functions, solutions of $x^2 y'' + 2x y' + [x^2 - n(n+1)]\, y = 0$.

- The truncations of the series belong to BL(π).
- If f ∈ BL(π), the series converges to f(t) both uniformly and in L².
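An illustrative check of the proposition for a concrete signal: the sketch below assumes f(t) = sinc(t − 1/2), whose Fourier transform is $e^{-i\omega/2}$ on [−π, π], obtains the coefficients $K^n[f](0)$ by quadrature of the spectral formula from the earlier slide, and rebuilds f(t) from data taken at t = 0 only.

import numpy as np
from scipy.special import eval_legendre, spherical_jn

a = 0.5
f = lambda t: np.sinc(t - a)                      # numpy's sinc is sin(pi x)/(pi x)
f_hat = lambda w: np.exp(-1j * w * a)             # its Fourier transform on [-pi, pi]

w = np.linspace(-np.pi, np.pi, 4001)
weights = np.full(w.size, w[1] - w[0])
weights[[0, -1]] *= 0.5                           # trapezoidal rule weights

def K_at_zero(n):
    # K^n[f](0) via the spectral formula (1/2pi) int i^n P_n^L(w) f_hat(w) dw
    PnL = np.sqrt(2*n + 1) * eval_legendre(n, w / np.pi)
    return np.sum(weights * (1j**n) * PnL * f_hat(w)).real / (2*np.pi)

def chromatic_approx(t, N=25):
    # sum_{n<=N} K^n[f](0) sqrt(2n+1) j_n(pi t)
    return sum(K_at_zero(n) * np.sqrt(2*n + 1) * spherical_jn(n, np.pi * t)
               for n in range(N + 1))

for t in (0.3, 1.7, 4.0):
    print(t, chromatic_approx(t), f(t))           # the expansion reproduces f(t)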
Chromatic approximation versus Taylor’s approximation

Figure: red: the signal; blue: the chromatic approximation of order 15; green: Taylor's approximation of order 15.

- $f^{(k)}(0) = \frac{d^k}{dt^k}\Big[\sum_{m=0}^{n} (-1)^m K^m[f](0)\, K^m[\mathrm{sinc}](t)\Big]_{t=0}$ for k ≤ n.
- Chromatic approximations are local approximations.


Approximation error behavior

Figure: the error bound E(t).

$\Big| f(t) - \sum_{m=0}^{n} (-1)^m K^m[f](0)\, K^m[\mathrm{sinc}](t) \Big| \le \|f\|_2\, E(t) \qquad (2)$

$E(t) = \sqrt{1 - \sum_{m=0}^{n} K^m[\mathrm{sinc}](t)^2}$
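The envelope E(t) is easy to tabulate; the sketch below (order n = 15 chosen to match the earlier figure) uses $K^m[\mathrm{sinc}](t) = (-1)^m\sqrt{2m+1}\, j_m(\pi t)$.

import numpy as np
from scipy.special import spherical_jn

def E(t, n=15):
    # E(t) = sqrt(1 - sum_{m<=n} (2m+1) j_m(pi t)^2); clip guards against tiny negative round-off
    m = np.arange(n + 1)
    s = np.sum((2*m + 1) * spherical_jn(m, np.pi * t)**2)
    return np.sqrt(np.clip(1.0 - s, 0.0, None))

for t in (0.0, 2.0, 5.0, 10.0):
    print(t, E(t))     # small near the expansion point t = 0, approaching 1 far away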
Chromatic expansion vs. Shannon’s expansion
How is Shannon's expansion

$f(t) = \sum_{n=-\infty}^{\infty} f(n)\, \frac{\sin \pi(t-n)}{\pi(t-n)}$

related to the chromatic expansion

$f(t) = \sum_{n=0}^{\infty} K^n[f](0)\, \sqrt{2n+1}\, j_n(\pi t)\ ?$

- The transformation $\{f(n)\}_{n\in\mathbb{Z}} \Leftrightarrow \{K^n[f](0)\}_{n\in\mathbb{N}}$ is given by a unitary operator defined by the infinite matrix $\left[\sqrt{2k+1}\, j_k(n\pi) : k \in \mathbb{N},\ n \in \mathbb{Z}\right]$:

$f(n) = \sum_{k=0}^{\infty} K^k[f](0)\, \sqrt{2k+1}\, j_k(n\pi);$

$K^k[f](0) = \sum_{n=-\infty}^{\infty} f(n)\, \sqrt{2k+1}\, j_k(n\pi).$
- In practice one CANNOT evaluate $K^k[f](0)$ from Shannon-rate samples via

  $K^k[f](0) \approx \sum_{n=-N}^{N} f(n)\, \sqrt{2k+1}\, j_k(n\pi),$

  because $\sqrt{2n+1}\, j_n(\pi t)$ decays very slowly, so we would need a huge N.

Figure: $\sqrt{2n+1}\, j_n(\pi t)$ for n = 0, n = 7, n = 14.
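An illustrative look at the change-of-basis matrix $A[k, n] = \sqrt{2k+1}\, j_k(n\pi)$ (the index ranges are arbitrary, and only n ≥ 0 is shown): its columns decay super-exponentially in the order k once k exceeds πn, whereas its rows decay only like 1/n in the sample index, which is exactly the slow decay referred to above.

import numpy as np
from scipy.special import spherical_jn

n = np.arange(0, 201)                        # sample indices n >= 0
A = np.array([np.sqrt(2*k + 1) * spherical_jn(k, np.pi * n) for k in range(40)])

col = np.abs(A[:, 3])                        # column for the sample n = 3
print(col[[5, 10, 15, 20, 25]])              # rapid decay in the order k once k > 3*pi

row = np.abs(A[7, :])                        # row for the order k = 7
print(row[25], row[50], row[100])            # roughly halves each time n doubles (~1/n decay)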
Chromatic derivatives are non-redundant!

- This is good news, because it means that the chromatic derivatives are not redundant with respect to the Nyquist-rate samples;
- They provide additional information which:
  - can either be in a form more convenient for some signal processing applications,
  - or can be used in addition to the standard Nyquist-rate methods, making them more powerful!
General families of chromatic derivatives
- Given a family of orthonormal polynomials $P_n(\omega)$ we can always define differential operators

  $K_t^n = (-i)^n\, P_n\!\left(i\,\frac{d}{dt}\right)$

Question:

What are the families of orthogonal polynomials such that for the corresponding differential operators $K^n$ and some associated function m(t) we have

$f(t) = \sum_{n=0}^{\infty} (-1)^n K^n[f](u)\, K^n[m](t-u)$

for important classes of functions, and when is the convergence uniform?
Examples:
Legendre Polynomials/Spherical Bessel functions

- For the (normalized) Legendre polynomials

  $\frac{1}{2\pi}\int_{-\pi}^{\pi} P_n^L(\omega)\, P_m^L(\omega)\, d\omega = \delta(m-n)$

  and for $m(t) = \frac{\sin(\pi t)}{\pi t}$ we have $K^n[m](t) = (-1)^n \sqrt{2n+1}\, j_n(\pi t)$ and

  $f(t) = \sum_{n=0}^{\infty} K^n[f](0)\, \sqrt{2n+1}\, j_n(\pi t)$

  holds for all analytic functions;
- The convergence is uniform for functions in BL(π).


Examples:
Chebyshev polynomials / Bessel functions
- For the (normalized) Chebyshev polynomials of the first kind:

  $\int_{-\pi}^{\pi} \frac{P_n^T(\omega)\, P_m^T(\omega)}{\pi^2 \sqrt{1 - \left(\frac{\omega}{\pi}\right)^2}}\, d\omega = \delta(n-m),$

  for $m(t) = J_0(\pi t)$ we have $K^n[m](t) = (-1)^n \sqrt{2}\, J_n(\pi t)$ for n ≥ 1, and

  $f(t) = f(0)\, J_0(\pi t) + \sqrt{2}\, \sum_{n=1}^{\infty} K^n[f](0)\, J_n(\pi t)$

  – the Neumann series – converges for all analytic functions;
- Convergence is uniform for band limited functions which satisfy

  $\int_{-\pi}^{\pi} \sqrt{1 - \left(\frac{\omega}{\pi}\right)^2}\; |\hat f(\omega)|^2\, d\omega < \infty$
Examples:
Hermite polynomials/Gaussian monomials
- For the (normalized) Hermite polynomials

  $\int_{-\infty}^{\infty} P_n^H(\omega)\, P_m^H(\omega)\, \frac{e^{-\omega^2}}{\sqrt{\pi}}\, d\omega = \delta(n-m)$

  and $m(t) = e^{-t^2/4}$ we have $K^n[m](t) = (-1)^n\, \frac{t^n}{\sqrt{2^n\, n!}}\, e^{-t^2/4}$;
- the chromatic expansion converges for analytic functions s.t.

  $\limsup_{n\to\infty} \frac{|f^{(n)}(z)|^{1/n}}{\sqrt{n}} < \infty$

- it converges uniformly for all analytic functions s.t.

  $\int_{-\infty}^{\infty} |\hat f(\omega)|^2\, e^{\omega^2}\, d\omega < \infty$
Examples: the hyperbolic family

If $L_n(\omega)$ satisfy

$\frac{1}{2}\int_{-\infty}^{\infty} L_n(\omega)\, L_m(\omega)\, \mathrm{sech}\!\left(\frac{\pi\omega}{2}\right) d\omega = \delta(m-n)$

and $m(z) = \mathrm{sech}(z)$, then $K^n[m](z) = (-1)^n\, \mathrm{sech}(z)\tanh^n(z)$ and

$f(z) = \sum_{n=0}^{\infty} K^n[f](0)\, \mathrm{sech}(z)\tanh^n(z)$

converges uniformly on the disc |z| < π/2 for functions analytic on this disc, whose Fourier transform satisfies

$\int_{-\infty}^{\infty} |\hat f(\omega)|^2 \cosh(\omega)\, d\omega < \infty$
General families of chromatic derivatives
Definition: A family of polynomials $P_n(\omega)$ which is orthonormal with respect to a non-decreasing bounded moment distribution function a(ω):

$\int_{-\infty}^{\infty} P_n(\omega)\, P_m(\omega)\, da(\omega) = \delta(m-n)$

is chromatic if the moments $\mu_n$ of a(ω),

$\mu_n = \int_{-\infty}^{\infty} \omega^n\, da(\omega),$

satisfy

$\rho = \limsup_{n\to\infty} \frac{\mu_n^{1/n}}{n} < \infty.$

Lemma: $P_n(\omega)$ are chromatic if and only if for every 0 ≤ α < ρ,

$\int_{-\infty}^{\infty} e^{\alpha|\omega|}\, da(\omega) < \infty.$
General families of chromatic derivatives

Theorem: Let $P_n(\omega)$ be a chromatic family of polynomials orthonormal with respect to a(ω), and let

$m(z) = \int_{-\infty}^{\infty} e^{i\omega z}\, da(\omega).$

Then m(z) is analytic on the strip $S_{\rho/2} = \{z : |\mathrm{Im}(z)| < \rho/2\}$.

Definition: $L^2_{a(\omega)}$ is the space of functions φ(ω) satisfying

$\int_{-\infty}^{\infty} |\varphi(\omega)|^2\, da(\omega) < \infty.$
General families of chromatic derivatives
Theorem: If $P_n(\omega)$ are a chromatic family of polynomials orthonormal with respect to a(ω), then they are a complete base of the space $L^2_{a(\omega)}$.

Definition: $\Lambda^2$ is the space of functions f(t) analytic on $S_{\rho/2}$ such that for the chromatic derivatives $K^n$ which correspond to $P_n(\omega)$ we have

$\sum_{n=0}^{\infty} |K^n[f](0)|^2 < \infty.$

Theorem: A function f(z) is in $\Lambda^2$ if and only if there exists a function $\varphi_f(\omega)$ such that

$f(z) = \int_{-\infty}^{\infty} \varphi_f(\omega)\, e^{i\omega z}\, da(\omega),$

in which case

$\varphi_f(\omega) = \sum_{n=0}^{\infty} K^n[f](0)\, P_n(\omega).$
General families of chromatic derivatives

Theorem: If $f(z) \in \Lambda^2$, then

$f(z) = \sum_{n=0}^{\infty} (-1)^n K^n[f](0)\, K^n[m](z)$

with the series converging uniformly on the strips $S_{\rho/2-\epsilon}$.

How about the local (non-uniform) convergence of the chromatic series??

For example, in the case of the Chebyshev polynomials $T_n(\omega)$ and the Bessel functions of the first kind $J_n(\omega)$, we know that the chromatic series is just the Neumann series, and that the above equality holds for every analytic function f(z)!
Weakly bounded families

Theorem: A family of polynomials is orthonormal with respect to a moment distribution function a(ω) with all odd moments $\mu_{2n+1} = 0$ if and only if there exist $\gamma_n > 0$ such that

$P_{n+1}(\omega) = \frac{1}{\gamma_n}\,\omega\, P_n(\omega) - \frac{\gamma_{n-1}}{\gamma_n}\, P_{n-1}(\omega).$

Definition: Such a family of polynomials $P_n(\omega)$ is:

1. bounded if for some M and all n we have $\frac{1}{M} \le \gamma_n \le M$;
2. weakly bounded if for some 0 ≤ p < 1 we have $\frac{1}{M} < \gamma_n < M\, n^p$ and $\frac{\gamma_n}{\gamma_{n+1}} < M$.

- Bounded families are also weakly bounded, with p = 0.


Examples:

- Bounded families (p = 0):
  - Legendre polynomials: $\gamma_n = \frac{\pi(n+1)}{\sqrt{4(n+1)^2 - 1}} \to \frac{\pi}{2}$
  - Chebyshev polynomials: $\gamma_0 = \frac{\pi}{\sqrt{2}}$ and $\gamma_{n+1} = \frac{\pi}{2}$
- Weakly bounded family (p = 1/2):
  - Hermite polynomials: $\gamma_n = \sqrt{(n+1)/2}$;
- Non-weakly bounded family (p = 1):
  - Hyperbolic family: $\gamma_n = n + 1$;
- This shows that if we want m(z) to be entire, then the bound p < 1 is sharp.
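As an illustrative check, the sketch below runs the three-term recursion of the previous slide with the Legendre coefficients γ_n listed above and confirms that it reproduces the normalized re-scaled Legendre polynomials $P_n^L$ (the grid and the number of polynomials are arbitrary).

import numpy as np
from scipy.special import eval_legendre

def gamma(n):
    # Legendre case: gamma_n = pi(n+1)/sqrt(4(n+1)^2 - 1)
    return np.pi * (n + 1) / np.sqrt(4.0 * (n + 1)**2 - 1.0)

def recursion_family(w, N):
    # P_{n+1}(w) = (w P_n(w) - gamma_{n-1} P_{n-1}(w)) / gamma_n,  with P_0 = 1, P_1 = w / gamma_0
    P = [np.ones_like(w), w / gamma(0)]
    for n in range(1, N):
        P.append((w * P[n] - gamma(n - 1) * P[n - 1]) / gamma(n))
    return P

w = np.linspace(-np.pi, np.pi, 5)
for n, Pn in enumerate(recursion_family(w, 6)):
    PnL = np.sqrt(2*n + 1) * eval_legendre(n, w / np.pi)   # direct normalized re-scaled Legendre
    print(n, np.max(np.abs(Pn - PnL)))                     # agreement up to rounding error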
Lemma: Every weakly bounded family of orthonormal
polynomials is also chromatic.

Theorem: Let {Pn (ω)}n∈N be a weakly bounded family and


let f (z) be an entire function. If

f (n) (0) 1/n

lim =0
n→∞ n!1−p

then for every z ∈ C



X
f (z) = (−1)j Kj [f ](0) Kj [m](z).
j=0

The convergence is uniform on every disc of finite radius.

Corollary: If M is bounded then the chromatic expansion of


every entire function f (z) point-wise converges to f (z) for all z.
- It turns out that many of the classical formulas, such as

  $e^{i\omega t} = \sum_{n=0}^{\infty} i^n\, T_n(\omega)\, J_n(t)$

  $J_0(t+u) = J_0(u)\,J_0(t) + 2\sum_{n=1}^{\infty} (-1)^n J_n(u)\,J_n(t)$

  $J_0(t)^2 + 2\sum_{n=1}^{\infty} J_n(t)^2 = 1$

  $J_0(z) + 2\sum_{n=1}^{\infty} J_{2n}(z) = 1$

  are special cases of chromatic expansions valid for all weakly bounded families of polynomials and their associated m(z):

  $e^{i\omega t} = \sum_{n=0}^{\infty} i^n\, P_n(\omega)\, K^n[m](t)$

  $m(t+u) = \sum_{n=0}^{\infty} (-1)^n K^n[m](u)\, K^n[m](t)$

  $\sum_{n=0}^{\infty} K^n[m](t)^2 = 1$

  $m(z) + \sum_{n=1}^{\infty} \left(\prod_{k=1}^{n} \frac{\gamma_{2k-2}}{\gamma_{2k-1}}\right) K^{2n}[m](z) = 1$
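The classical Bessel identities above are easy to confirm numerically; the arguments in this illustrative check are arbitrary.

import numpy as np
from scipy.special import jv

t, u = 2.7, 1.3
n = np.arange(1, 80)

print(jv(0, t)**2 + 2*np.sum(jv(n, t)**2))                          # = 1
print(jv(0, t) + 2*np.sum(jv(2*n, t)))                              # = 1
print(jv(0, t + u),
      jv(0, u)*jv(0, t) + 2*np.sum((-1)**n * jv(n, u)*jv(n, t)))    # addition formula: both equal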
Trigonometric functions
- Trigonometric functions do not belong to the spaces $\Lambda^2$:

  $\|e^{i\omega t}\|_{\Lambda}^2 = \sum_{n=0}^{\infty} |K^n[e^{i\omega t}]|^2 = \sum_{n=0}^{\infty} P_n(\omega)^2 \to \infty$

Definition: Assume M is weakly bounded. We denote by C the vector space of analytic functions such that the sequence

$\nu_n^f(t) = \frac{1}{(n+1)^{1-p}}\sum_{k=0}^{n} K^k[f](t)^2$

converges uniformly on every finite interval.

Definition: Let $C_0 \subset C$ consist of those f(t) such that

$\lim_{n\to\infty} \frac{1}{(n+1)^{1-p}}\sum_{k=0}^{n} K^k[f](t)^2 = 0.$

We define $C_2 = C/C_0$.
Theorem: Let f, g ∈ C and

$\sigma_n^{fg}(t) = \frac{1}{(n+1)^{1-p}}\sum_{k=0}^{n} K^k[f](t)\, K^k[g](t);$

then the sequence $\{\sigma_n^{fg}(t)\}_{n\in\mathbb{N}}$ converges to a constant function.

Definition: For f, g ∈ C we define

$\langle f, g\rangle = \lim_{n\to\infty}\frac{1}{(n+1)^{1-p}}\sum_{k=0}^{n} K^k[f](t)\, K^k[g](t)$

- Do the trigonometric functions belong to $C_2$?

  $\frac{1}{(n+1)^{1-p}}\sum_{k=0}^{n} |K^k[e^{i\omega t}]|^2 = \frac{1}{(n+1)^{1-p}}\sum_{k=0}^{n} P_k(\omega)^2$
- Chebyshev polynomials (p = 0): if 0 < ω < π, then

  $\|e^{i\omega t}\|^2 = \lim_{n\to\infty}\frac{1}{n+1}\sum_{k=0}^{n} P_k^T(\omega)^2 = 1$

- for all 0 < σ, ω < π, σ ≠ ω,

  $\langle e^{i\sigma t}, e^{i\omega t}\rangle = \lim_{n\to\infty}\frac{1}{n+1}\sum_{k=0}^{n} P_k^T(\sigma)\, P_k^T(\omega) = 0$

- Hermite polynomials (p = 1/2): for all ω, σ > 0, ω ≠ σ,

  $\|e^{i\omega t}\|^2 = \lim_{n\to\infty}\frac{1}{\sqrt{n+1}}\sum_{k=0}^{n} P_k^H(\omega)^2 = \sqrt{\frac{2}{\pi}}\; e^{\omega^2},$

  $\langle e^{i\sigma t}, e^{i\omega t}\rangle = \lim_{n\to\infty}\frac{1}{\sqrt{n+1}}\sum_{k=0}^{n} P_k^H(\sigma)\, P_k^H(\omega) = 0$

Thus, in this space every two pure harmonic oscillations with distinct positive frequencies are mutually orthogonal!
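A small illustrative check of the Chebyshev case (p = 0), using $P_0^T = 1$ and $P_k^T(\omega) = \sqrt{2}\cos(k \arccos(\omega/\pi))$ for k ≥ 1; the frequencies and the truncation order are arbitrary.

import numpy as np

def cesaro(sigma, omega, n=200_000):
    # (1/(n+1)) * sum_{k<=n} P_k^T(sigma) P_k^T(omega) for the normalized Chebyshev family
    k = np.arange(1, n + 1)
    terms = 2.0 * np.cos(k * np.arccos(sigma / np.pi)) * np.cos(k * np.arccos(omega / np.pi))
    return (1.0 + terms.sum()) / (n + 1)

print(cesaro(1.0, 1.0))   # approx 1: the squared norm of e^{i omega t} for 0 < omega < pi
print(cesaro(1.0, 2.0))   # approx 0: distinct frequencies are orthogonal in this averaged sense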
Conjecture: Assume that for some 0 ≤ p < 1 the recursion coefficients $\gamma_n$ satisfy

$0 < \lim_{n\to\infty} \frac{\gamma_n}{n^p} < \infty.$

Then for the corresponding family of orthogonal polynomials we have

$0 < \lim_{n\to\infty} \frac{1}{(n+1)^{1-p}}\sum_{k=0}^{n} P_k(\omega)^2 < \infty$

for all ω in the support sp(a) of a(ω).

Numerical experiments indicate that this is true...

It turns out that the special case with p = 0 is a previously well known, still open problem (P. Nevai).
Application: signal interpolation

Figure: pieces f1(t), f2(t), f3(t) of band limited signals joined over alternating segments (N, T, N, T, N, T, N), together with the resulting signal.

Given pieces of band limited signals, join them so that the out-of-band energy is minimal.

We use chromatic expansions to ensure that the resulting signal is N times continuously differentiable. Then

$|\hat f(\omega)| \le \frac{\big|\widehat{K^n[f]}(\omega)\big|}{|P_n(\omega)|} \le \frac{M}{|P_n(\omega)|}$
Extrapolation filter

Figure: the extrapolation filter, with zoomed-in views of its two tails.
Application: frequency estimation
Idea: A signal is a sum of at most N shifted and damped sine
waves iff it is a solution to a homogeneous linear differential
equation with constant coefficients of order at most 2N .

A rough sketch of the frequency estimation algorithm:


- Choose the chromatic derivatives which are orthogonal with respect to the power spectral density of the noise:
  - take polynomials $P_n(\omega)$ such that

    $\frac{1}{2\pi}\int_{-\pi}^{\pi} P_n(\omega)\, P_m(\omega)\, S(\omega)\, d\omega = \delta(m-n)$

  - let $K^n$ be the chromatic derivatives corresponding to the polynomials $P_n(\omega)$, i.e., let

    $K^n = (-i)^n\, P_n(-i\, d/dt).$

  Then, assuming $E[\nu(n)^2] = \rho^2$, we have

  $E\{K^n[\nu](n)\, K^m[\nu](n)\} = \delta(m-n)\,\rho^2,$

  so we can apply the standard SVD or ED methods.
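A minimal sketch of the first step only: it constructs polynomials orthonormal with respect to a noise power spectral density S(ω) by Gram–Schmidt on the monomials. The particular S(ω) below is a made-up example, and this is just one of several ways to obtain the $P_n$.

import numpy as np

w = np.linspace(-np.pi, np.pi, 20001)
S = 1.0 + 0.5 * np.cos(w)                       # hypothetical (non-white) noise PSD: an assumption
dw = w[1] - w[0]

def inner(p, q):
    # weighted inner product (1/2pi) * int P(w) Q(w) S(w) dw for numpy coefficient arrays
    return np.sum(np.polyval(p, w) * np.polyval(q, w) * S) * dw / (2.0 * np.pi)

# Gram-Schmidt on the monomials 1, w, w^2, ... with respect to the S-weighted inner product
polys = []
for n in range(7):
    p = np.zeros(n + 1)
    p[0] = 1.0                                  # the monomial w^n (coefficients, highest degree first)
    for q in polys:
        p[-len(q):] -= inner(p, q) * q          # remove the component along each lower-degree P_m
    polys.append(p / np.sqrt(inner(p, p)))      # normalize

G = np.array([[inner(p, q) for q in polys] for p in polys])
print(np.abs(G - np.eye(len(polys))).max())     # close to 0: the P_n are orthonormal w.r.t. S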

Figure: the chromatic derivatives K0[f], K1[f], K2[f], K3[f], K4[f] of the signal over the sample indices 1–12.

error, Cadzow's method: 0.0025;
error, Cadzow's method + CD method: 0.0018
(SNR = −10 dB; 10 000 runs)

What if we allow time varying coefficients? We can easily detect chirps, etc. In fact, transients can be classified according to what type of differential equation they satisfy!

CONJECTURE:

Classification via the minimal degree linear differential equation satisfied by a transient can play the role which the spectrum plays for the "steady state" signals!!
