Gauss-Markov Theorem and Wiener Filtering

Module by: Clayton Scott, Robert Nowak.

Let $x$ and $y$ be jointly Gaussian distributed:
$$\begin{pmatrix} x \\ y \end{pmatrix} \sim \mathcal{N}\left( \begin{pmatrix} m_x \\ m_y \end{pmatrix}, \begin{pmatrix} R_{xx} & R_{xy} \\ R_{yx} & R_{yy} \end{pmatrix} \right)$$
Then the conditional distribution of $y$ given $x$ is
$$y \mid x \sim \mathcal{N}\left( m_y + R_{yx} R_{xx}^{-1}(x - m_x),\ Q \right)$$
where $Q = R_{yy} - R_{yx} R_{xx}^{-1} R_{xy}$.

We know that the conditional mean $\hat{y} = m_y + R_{yx} R_{xx}^{-1}(x - m_x)$ is the best estimate of $y$ given $x$ in a (mean) squared error sense.

Example 1

Let $x = y + W$, where $y \sim \mathcal{N}(0, R_{yy})$, $W \sim \mathcal{N}(0, R_{WW})$, and $y$ and $W$ are independent. Then
$$\begin{pmatrix} x \\ y \end{pmatrix} \sim \mathcal{N}\left( \begin{pmatrix} 0 \\ 0 \end{pmatrix}, \begin{pmatrix} R_{yy} + R_{WW} & R_{yy} \\ R_{yy} & R_{yy} \end{pmatrix} \right)$$
$$y \mid x \sim \mathcal{N}\left( R_{yx} R_{xx}^{-1} x,\ R_{yy} - R_{yx} R_{xx}^{-1} R_{xy} \right)$$
$$\hat{y} = R_{yy} (R_{yy} + R_{WW})^{-1} x = H x$$
where $H$ is the Wiener filter, the minimum MSE estimator of $y$ given $x$. Here $R_{yx} = R_{yy}$ and $R_{xx} = R_{yy} + R_{WW}$.
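To make Example 1 concrete, here is a minimal NumPy sketch of the vector Wiener filter $H = R_{yy}(R_{yy} + R_{WW})^{-1}$. The particular covariances (a Gaussian-kernel $R_{yy}$ and white $R_{WW}$) are illustrative assumptions, not values from the text.

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_trials = 4, 100_000

# Illustrative covariances (assumed, not from the text): a smooth signal
# covariance R_yy and white observation noise R_WW.
t = np.arange(n)
R_yy = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2)
R_WW = 0.5 * np.eye(n)

# Wiener filter H = R_yy (R_yy + R_WW)^{-1}
H = R_yy @ np.linalg.inv(R_yy + R_WW)

# Draw y ~ N(0, R_yy) and W ~ N(0, R_WW) independently; observe x = y + W.
y = rng.multivariate_normal(np.zeros(n), R_yy, size=n_trials)
W = rng.multivariate_normal(np.zeros(n), R_WW, size=n_trials)
x = y + W

y_hat = x @ H.T  # y_hat = H x for each sample

print("MSE of Wiener estimate:", np.mean(np.sum((y - y_hat) ** 2, axis=1)))
print("MSE of raw observation:", np.mean(np.sum((y - x) ** 2, axis=1)))
```

The Wiener estimate should report a noticeably smaller MSE than using the noisy observation $x$ directly.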

Direct Optimization

With $x = y + W$, we seek a linear estimate $\hat{y} = G x$, where
$$H = \arg\min_G E\left[ (y - Gx)^T (y - Gx) \right]$$
and $E\left[ (y - Gx)^T (y - Gx) \right]$ is the MSE.

$$\mathrm{MSE} = E\left[ (y - Gx)^T (y - Gx) \right] = E\left[ \operatorname{tr}\left( (y - Gx)(y - Gx)^T \right) \right] = \operatorname{tr}\left( E\left[ (y - Gx)(y - Gx)^T \right] \right) \quad (1)$$
Minimizing MSE is equivalent to minimizing
$$\varepsilon^2 = E\left[ (y - Gx)(y - Gx)^T \right] = R_{yy} - G R_{xy} - R_{yx} G^T + G R_{xx} G^T \quad (2)$$
Taking the derivative with respect to $G$,
$$\frac{\partial \varepsilon^2}{\partial G} = -2 R_{yx} + 2 G R_{xx} = 0$$
This implies
$$H = R_{yx} R_{xx}^{-1} = R_{yy} (R_{yy} + R_{WW})^{-1} \quad (3)$$
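Equation 3 can be sanity-checked numerically: the sample-based least-squares minimizer of $E\left[ \|y - Gx\|^2 \right]$ should approach the closed form $H$. A sketch with the same illustrative (assumed) covariances as above:

```python
import numpy as np

rng = np.random.default_rng(1)
n, n_trials = 4, 200_000

# Illustrative covariances (assumed, not from the text).
t = np.arange(n)
R_yy = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2)
R_WW = 0.5 * np.eye(n)

y = rng.multivariate_normal(np.zeros(n), R_yy, size=n_trials)
x = y + rng.multivariate_normal(np.zeros(n), R_WW, size=n_trials)

# Empirical minimizer of E||y - Gx||^2: G = (E[y x^T])(E[x x^T])^{-1},
# estimated from samples (a least-squares fit of y on x).
R_yx_hat = y.T @ x / n_trials
R_xx_hat = x.T @ x / n_trials
G_star = R_yx_hat @ np.linalg.inv(R_xx_hat)

# Closed form from Equation 3: H = R_yy (R_yy + R_WW)^{-1}.
H = R_yy @ np.linalg.inv(R_yy + R_WW)
print("max |G* - H|:", np.abs(G_star - H).max())  # small; shrinks as n_trials grows
```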

Figure 1 (geoint.png): Geometrical interpretation. $y = \hat{y} + (y - \hat{y})$, where $y - \hat{y}$ is the error. The error and the estimate are statistically orthogonal to each other.

Orthogonality Condition

The optimal Wiener filter $H = R_{yy} (R_{yy} + R_{WW})^{-1}$ satisfies the following condition:
$$E\left[ (y - \hat{y})\, \hat{y}^T \right] = 0$$

$$E\left[ (y - \hat{y})\, \hat{y}^T \right] = E\left[ y x^T H^T - H x x^T H^T \right] = R_{yx} H^T - H R_{xx} H^T = R_{yy} H^T - H (R_{yy} + R_{WW}) H^T = R_{yy} (R_{yy} + R_{WW})^{-1} R_{yy} - R_{yy} (R_{yy} + R_{WW})^{-1} (R_{yy} + R_{WW}) (R_{yy} + R_{WW})^{-1} R_{yy} = 0 \quad (4)$$
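Equation 4 is easy to verify numerically with the same illustrative covariances: the sketch below evaluates $R_{yx} H^T - H R_{xx} H^T$ directly and confirms it vanishes to floating-point precision.

```python
import numpy as np

# Illustrative covariances (assumed, not from the text).
n = 4
t = np.arange(n)
R_yy = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2)
R_WW = 0.5 * np.eye(n)
R_xx = R_yy + R_WW          # covariance of x = y + W
H = R_yy @ np.linalg.inv(R_xx)

# E[(y - Hx)(Hx)^T] = R_yx H^T - H R_xx H^T, with R_yx = R_yy here.
orth = R_yy @ H.T - H @ R_xx @ H.T
print(np.abs(orth).max())   # ~1e-16: zero up to floating-point error
```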

The Classical Wiener Filter

Figure 2 (block1.png): $y(t)$ is a stationary random signal, $w(t)$ is stationary random noise, and the filter is $H$.
We want to find the $H(\omega)$ that minimizes the MSE
$$\varepsilon^2 = E\left[ \left( \hat{y}(t) - y(t) \right)^2 \right] = R_{\hat{y}\hat{y}}(0) - 2 R_{y\hat{y}}(0) + R_{yy}(0) \quad (5)$$
where $R_{yy}(\tau) = E[y(t)\, y(t+\tau)]$. We can express the MSE in the frequency domain by noting that $R_{yy}(0) = \frac{1}{2\pi} \int S_{yy}(\omega)\, d\omega$, where $S_{yy}(\omega)$ is the power spectrum of $y(t)$.

Recall:

$R_{yy}(\tau)$ and $S_{yy}(\omega)$ are Fourier transform pairs:
$$S_{yy}(\omega) = \int_{-\infty}^{\infty} R_{yy}(\tau)\, e^{-i\omega\tau}\, d\tau$$
$$R_{yy}(\tau) = \frac{1}{2\pi} \int_{-\infty}^{\infty} S_{yy}(\omega)\, e^{i\omega\tau}\, d\omega$$
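The transform pair above is continuous-time; as a discrete illustration, here is a short NumPy sketch of the DFT analogue: the FFT of a symmetric autocorrelation sequence $R(\tau) = a^{|\tau|}$ (an assumed example) gives a real, nonnegative power spectrum, and the inverse FFT recovers $R$.

```python
import numpy as np

# Discrete sketch of the Fourier pair R(tau) <-> S(w).
a, N = 0.9, 256
tau = np.arange(N)
R = a ** np.minimum(tau, N - tau)   # circularly symmetric autocorrelation

S = np.fft.fft(R).real              # power spectrum of a symmetric R: real
R_back = np.fft.ifft(S).real        # inverse transform recovers R

print(np.allclose(R, R_back))       # True
print(S.min() >= 0)                 # True: a valid (nonnegative) power spectrum
```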

Random Signal Response of Linear Systems

Figure 3 (block2.png): $x(t)$ is a stationary random process and $H$ is a linear time-invariant system.
$$\hat{y}(t) = \int h(u)\, x(t - u)\, du$$
$$E[\hat{y}(t)] = \int h(u)\, E[x(t - u)]\, du = m_x \int h(u)\, du \quad (6)$$
since $E[x(t - u)]$ is a constant. In particular, if $x(t)$ is zero-mean then so is $\hat{y}(t)$.

Autocorrelation of Output Process

$$R_{\hat{y}\hat{y}}(\tau) = E[\hat{y}(t)\, \hat{y}(t+\tau)] = E\left[ \int h(s)\, x(t-s)\, ds \int h(u)\, x(t+\tau-u)\, du \right] = \iint E[x(t-s)\, x(t+\tau-u)]\, h(s)\, h(u)\, ds\, du = \iint R_{xx}(\tau + s - u)\, h(s)\, h(u)\, ds\, du = R_{xx}(\tau) * h(\tau) * h(-\tau) \quad (7)$$
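One easy check of Equation 7: for white-noise input, $R_{xx}(\tau) = \delta(\tau)$, so the output autocorrelation reduces to the deterministic autocorrelation of the impulse response, $R_{\hat{y}\hat{y}}(\tau) = \sum_s h(s)\, h(s+\tau)$. A sketch with an assumed 3-tap filter:

```python
import numpy as np

rng = np.random.default_rng(3)
h = np.array([0.5, 1.0, 0.5])            # assumed example impulse response
x = rng.standard_normal(1_000_000)       # white noise, unit variance
y = np.convolve(x, h, mode="full")[: len(x)]

# Sample autocorrelation of the output at lags 0..4
lags = np.arange(5)
R_meas = np.array([np.mean(y[: len(y) - k] * y[k:]) for k in lags])

# Theory: R(tau) = sum_s h[s] h[s + tau]  (zero beyond the filter length)
R_theory = np.array([np.sum(h[: len(h) - k] * h[k:]) if k < len(h) else 0.0
                     for k in lags])
print(np.round(R_meas, 3))               # ~ [1.5, 1.0, 0.25, 0.0, 0.0]
print(R_theory)
```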

Power Spectrum

$$S_{\hat{y}\hat{y}}(\omega) = S_{xx}(\omega)\, \overline{H(\omega)}\, H(\omega) = S_{xx}(\omega)\, |H(\omega)|^2 \quad (8)$$

Cross-Correlation and Cross-Spectrum

$$R_{\hat{y}x}(\tau) = E[\hat{y}(t)\, x(t+\tau)] = E\left[ \int h(s)\, x(t-s)\, ds\ x(t+\tau) \right] = \int h(s)\, R_{xx}(\tau + s)\, ds = R_{xx}(\tau) * h(-\tau) \quad (9)$$
This implies
$$S_{\hat{y}x}(\omega) = S_{xx}(\omega)\, \overline{H(\omega)}$$
$$R_{\hat{y}\hat{y}}(0) = \frac{1}{2\pi} \int |H(\omega)|^2\, S_{xx}(\omega)\, d\omega$$
$$R_{y\hat{y}}(0) = \frac{1}{2\pi} \int \overline{H(\omega)}\, S_{yx}(\omega)\, d\omega$$
$$S_{xx}(\omega) = S_{yy}(\omega) + S_{ww}(\omega)$$
$$S_{yx}(\omega) = S_{yy}(\omega)$$
since $y(t)$ and $w(t)$ are independent. Thus, the expression for the MSE becomes
$$\varepsilon^2 = \frac{1}{2\pi} \int |H(\omega)|^2 \left( S_{yy}(\omega) + S_{ww}(\omega) \right) - 2\, \overline{H(\omega)}\, S_{yy}(\omega) + S_{yy}(\omega)\, d\omega$$
$\varepsilon^2$ is minimized by minimizing the integrand at each frequency $\omega$. This implies
$$H(\omega) \left( S_{yy}(\omega) + S_{ww}(\omega) \right) = S_{yy}(\omega)$$
$$H(\omega) = \frac{S_{yy}(\omega)}{S_{yy}(\omega) + S_{ww}(\omega)}$$
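Here is a sketch of this classical filter applied in the frequency domain. The AR(1) signal model and unit-variance white noise are assumed example choices (not from the text), with the known spectra evaluated on the DFT grid.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 4096
a, sigma_w = 0.95, 1.0

# Generate the signal as AR(1): y[t] = a*y[t-1] + e[t] (assumed example model)
e = rng.standard_normal(N)
y = np.zeros(N)
for t in range(1, N):
    y[t] = a * y[t - 1] + e[t]
x = y + sigma_w * rng.standard_normal(N)   # noisy observation x = y + w

# Known power spectra on the DFT grid
w = 2 * np.pi * np.fft.fftfreq(N)
S_yy = 1.0 / np.abs(1 - a * np.exp(-1j * w)) ** 2
S_ww = sigma_w ** 2 * np.ones(N)

H = S_yy / (S_yy + S_ww)                   # Wiener filter H(w)
y_hat = np.fft.ifft(H * np.fft.fft(x)).real

print("noisy MSE:   ", np.mean((x - y) ** 2))
print("filtered MSE:", np.mean((y_hat - y) ** 2))
```

The filter passes frequencies where the signal dominates ($S_{yy} \gg S_{ww}$, so $H \approx 1$) and suppresses those where the noise dominates ($H \approx 0$), which is why the filtered MSE comes out well below the noisy MSE.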

Example 2

Figure 4 (syy.png): the signal power spectrum $S_{yy}(\omega)$.
Figure 5 (sww.png): the noise power spectrum $S_{ww}(\omega)$.
Figure 6 (h.png): the resulting Wiener filter $H(\omega)$.

Comparison

Signal Vector (discrete-time)

$$H = R_{yy} (R_{yy} + R_{ww})^{-1}$$
Here $H$ is a matrix, $R_{yy}$ is the signal "power", and $R_{ww}$ is the noise "power".

Classical (continuous-time)

$$H(\omega) = \frac{S_{yy}(\omega)}{S_{yy}(\omega) + S_{ww}(\omega)} = S_{yy}(\omega) \left( S_{yy}(\omega) + S_{ww}(\omega) \right)^{-1} \quad (10)$$
In both cases, the Wiener filter is the ratio of the signal power to the sum of the signal and noise power.
