Solutions to Homework 1

Problem 1 Part (a):
x[n] = ∑_{k=−∞}^{∞} h1[k] v[n − k]
y[n] = ∑_{k=−∞}^{∞} h2[k] w[n − k]
Since the inputs v[n] and w[n] are jointly WSS, from the results in sec. 3.4 in the text, it
follows that x[n] is WSS and y[n] is WSS. To prove that x[n] and y[n] are jointly WSS, we
only need to examine the cross-correlation sequence.
rxy[n + m, n] = E[x[n + m] y∗[n]] = E[(∑_k h1[k] v[n + m − k])(∑_l h2[l] w[n − l])∗]
= ∑_k ∑_l h1[k] h2∗[l] E[v[n + m − k] w∗[n − l]] = ∑_k ∑_l h1[k] h2∗[l] rvw[m + l − k]
= ∑_l h2∗[l] q[m + l] = q[m] ∗ h2∗[−m],
where q[m] = ∑_k h1[k] rvw[m − k] = h1[m] ∗ rvw[m]. Substituting this expression for q[m]
above, we have
rxy [n + m, n] = rvw [m] ∗ h1 [m] ∗ h∗2 [−m].
This indicates the cross correlation is only a function of the time difference m and hence
x[n] and y[n] are jointly WSS with
rxy [m] = rvw [m] ∗ h1 [m] ∗ h∗2 [−m].
Taking the Fourier transform of the cross correlation results in the cross power spectrum.
Rxy (ejω ) = Rvw (ejω )H1 (ejω )H2∗ (ejω ).
Part (b): Rxy(ejω) is not always positive for all ω. Consider H2(z) = −H1(z) and
v[n] = w[n]. Then
Rxy (ejω ) = −Rvv (ejω )|H1 (ejω )|2 ≤ 0, ∀ω.
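As a numerical sanity check of this counterexample (not part of the original solution), one can evaluate the cross spectrum on a frequency grid; the FIR taps below are illustrative, not taken from the problem.

```python
import numpy as np

# Counterexample check: H2(z) = -H1(z), v[n] = w[n] white with Rvv = 1.
# Then Rxy(e^jw) = -Rvv(e^jw) |H1(e^jw)|^2 <= 0 at every frequency.
h1 = np.array([1.0, -0.5, 0.25])   # arbitrary illustrative taps
N = 512
H1 = np.fft.fft(h1, N)             # H1(e^jw) on an N-point frequency grid
Rvv = np.ones(N)                   # white input: flat power spectrum
Rxy = -Rvv * np.abs(H1) ** 2       # cross power spectrum for h2 = -h1
print((Rxy <= 0).all())            # non-positive everywhere
```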
Problem 2: r[0] = 1, r[1] = r[−1] = α, r[2] = r[−2] = .25, and r[m] = 0, |m| ≥ 3. The
positive definiteness of the autocorrelation sequence is guaranteed by the positivity of the
power spectrum.
R(ejω) = ∑_{m=−∞}^{∞} r[m] e^{−jωm} = ∑_{m=−2}^{2} r[m] e^{−jωm} = 1 + 2α cos ω + .5 cos 2ω
= .5 + 2α cos ω + cos²ω (using cos 2ω = 2cos²ω − 1)
= (.5 − α²) + (α + cos ω)² = T1 + T2
Since |r[1]| ≤ r[0], one can readily conclude that |α| ≤ r[0] = 1. Returning to the general
expression, the second term T2 satisfies T2 ≥ 0. The smallest value T2 can take is zero,
when ω is chosen such that cos ω = −α (possible since |α| ≤ 1). On the other hand, T1 is a
constant. When T2 = 0, for the power spectrum to be non-negative we need T1 ≥ 0. Hence
we need .5 − α² ≥ 0, or |α| ≤ √.5.
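The bound can be checked numerically (a sanity check added here, not part of the original solution) by minimizing the power spectrum over a dense frequency grid:

```python
import numpy as np

# Power spectrum R(e^jw) = 1 + 2*alpha*cos(w) + 0.5*cos(2w).
# It should stay non-negative exactly when |alpha| <= sqrt(0.5).
w = np.linspace(0, 2 * np.pi, 100001)

def spec_min(alpha):
    """Minimum of the power spectrum over the frequency grid."""
    return (1 + 2 * alpha * np.cos(w) + 0.5 * np.cos(2 * w)).min()

print(abs(spec_min(np.sqrt(0.5))) < 1e-6)  # boundary: minimum is ~0
print(spec_min(0.5) >= 0)                  # inside the bound: non-negative
print(spec_min(0.8) < 0)                   # outside: spectrum goes negative
```

At the boundary α = √.5 the minimum is attained where cos ω = −α, exactly as the completed-square form predicts.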
Problem 3: For the given matrix

      ⎡  4  −2   1 ⎤
R =   ⎢ −2   4  −2 ⎥
      ⎣  1  −2   4 ⎦
to be a valid correlation matrix, we need only check symmetry and positive semi-definiteness.
This is a real matrix, so symmetry means R = RT, where T denotes transpose, or at the
element level rij = rji, 1 ≤ i, j ≤ 3. Symmetry is evident by inspection.
Now to prove positive semi-definiteness, we use the following property of symmetric matrices.
Since it is a symmetric matrix, it admits an eigendecomposition with orthonormal eigenvectors and real eigenvalues, i.e. R = QΛQH , where Q is the matrix with the orthonormal
set of eigenvectors, i.e. QQH = QH Q = I. Λ is the diagonal matrix with real eigenvalues
(make sure you review this property). Positive semi-definiteness requires xH Rx ≥ 0, ∀x.
xH Rx = xH QΛQH x = yH Λy = ∑_{k=1}^{M} λk |yk|²,
where y = QH x and yk are the elements of vector y. For this problem M = 3. This suggests
that if the eigenvalues are real and greater than or equal to zero, R is positive semi-definite.
Using Matlab, the eigenvalues of R are 7.37, 3.0 and 1.63. They are all ≥ 0, and so the
given R is positive semi-definite.
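The same eigenvalue check is easy to reproduce in Python (a sketch equivalent to the Matlab computation cited above):

```python
import numpy as np

# Check symmetry and positive semi-definiteness of R via its eigenvalues.
R = np.array([[4.0, -2.0, 1.0],
              [-2.0, 4.0, -2.0],
              [1.0, -2.0, 4.0]])
print(np.allclose(R, R.T))          # symmetric, as seen by inspection
eigvals = np.linalg.eigvalsh(R)     # real eigenvalues, ascending order
print(np.round(eigvals, 2))         # [1.63 3.   7.37]
print((eigvals >= 0).all())         # all non-negative => PSD
```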
Part (b): For this we will need a square root of matrix R. Matrix L is a square root
of R if R = LLH. There are many square roots of R. For instance, if L1 is one square
root, then L1U, where U is a unitary matrix, i.e. UUH = UHU = I, is also a square
root. One choice is to construct L from the eigendecomposition. In particular, L = QΛ^{1/2},
where Λ^{1/2} is a diagonal matrix with diagonal entries equal to the positive square roots of
the eigenvalues λi.
If W is a random vector with mean zero and correlation matrix I, then a random vector
X with correlation matrix R can be generated simply as X = LW.
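This construction can be sketched numerically (an illustrative check added here, with Gaussian W as one convenient choice of a zero-mean, identity-correlation vector):

```python
import numpy as np

rng = np.random.default_rng(0)
R = np.array([[4.0, -2.0, 1.0],
              [-2.0, 4.0, -2.0],
              [1.0, -2.0, 4.0]])

# Square root L = Q Lambda^{1/2} from the eigendecomposition of R.
lam, Q = np.linalg.eigh(R)
L = Q @ np.diag(np.sqrt(lam))
print(np.allclose(L @ L.T, R))      # L L^H reproduces R

# Generate X = L W with W zero mean, correlation I; the sample
# correlation of X approaches R as the number of draws grows.
W = rng.standard_normal((3, 200000))
X = L @ W
R_hat = X @ X.T / W.shape[1]
print(np.allclose(R_hat, R, atol=0.1))  # sample correlation close to R
```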
Problem 4: Since R is a correlation matrix, it admits an eigendecomposition
R = QΛQH ,
where Q is a unitary matrix that contains the eigenvectors, i.e. QQH = QH Q = I, and Λ is
a diagonal matrix containing the eigenvalues λi along the diagonal.
We will use the following property of determinants: det AB = det BA, as long as the
matrix products AB and BA are defined.
det R = det(QΛQH) = det(ΛQH Q) = det(ΛI) = det Λ = ∏_{i=1}^{M} λi.
Problem 5: Since the filter H(z) = ∑_{k=0}^{M} h[k] z^{−k} is FIR, its impulse response
satisfies h[n] = 0 for n < 0 and n > M. Computing the output autocorrelation,
ry[m] = E[y[n] y∗[n − m]] = E[∑_{k=0}^{M} ∑_{l=0}^{M} h[k] h∗[l] x[n − k] x∗[n − m − l]]
= ∑_{k=0}^{M} ∑_{l=0}^{M} h[k] h∗[l] E[x[n − k] x∗[n − m − l]] = ∑_{k=0}^{M} ∑_{l=0}^{M} h[k] h∗[l] rx[m + l − k]
Since the input x[n] is zero mean white, rx[m] = σ²δ[m]. Hence
ry[m] = ∑_{k=0}^{M} ∑_{l=0}^{M} h[k] h∗[l] σ² δ[m + l − k] = σ² ∑_{l=0}^{M} h[m + l] h∗[l].    (1)
Since ry[m] is conjugate symmetric (ry[−m] = ry∗[m]), it is sufficient to consider only m ≥ 0.
Since h[n] = 0 for n > M, from eq. (1), ry[m] is zero for m > M, and hence has finite
support, i.e. ry[m] = 0, |m| > M.
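The finite-support result of eq. (1) can be sketched numerically; the taps and σ² below are illustrative, and `np.correlate` computes exactly the deterministic autocorrelation ∑_l h[m + l] h∗[l] for real h:

```python
import numpy as np

# For white input with variance sigma^2, eq. (1) gives
# ry[m] = sigma^2 * sum_l h[m+l] h*[l]: sigma^2 times the deterministic
# autocorrelation of h, which vanishes for |m| > M.
sigma2 = 2.0
h = np.array([1.0, 0.5, -0.25, 0.125])          # illustrative FIR taps, M = 3
M = len(h) - 1
ry = sigma2 * np.correlate(h, h, mode="full")   # lags m = -M, ..., M
print(len(ry) == 2 * M + 1)                     # support is exactly |m| <= M
print(np.allclose(ry, ry[::-1]))                # ry[m] = ry[-m] for real h
print(np.isclose(ry[M], sigma2 * np.sum(h**2))) # lag 0: sigma^2 * energy of h
```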