EE 179, Lecture 23, Handout #39
Random Signals
◮ A random signal is one of an ensemble of possible signals, discrete time (time series) or continuous time, such as white noise.
◮ A random process (or stochastic process) is an infinite indexed collection of random variables { X(t) : t ∈ T }, defined over a common probability space.
◮ The index parameter t is typically time, but can also be a spatial dimension.
◮ Random processes are used to model random experiments that evolve in time:
  ◮ Received sequence/waveform at the output of a communication channel
  ◮ Packet arrival times at a node in a communication network
  ◮ Thermal noise in a resistor
  ◮ Scores of an NBA team in consecutive games
  ◮ Daily price of a stock
  ◮ Winnings or losses of a gambler
Two Ways to View a Random Process
A random process can be viewed as a function X(t, ω) of two variables, time t ∈ T and the outcome of the underlying random experiment ω ∈ Ω.
◮ For fixed t, X(t, ω) is a random variable over Ω.
◮ For fixed ω, X(t, ω) is a deterministic function of t, called a sample function.
[Figure: three sample functions X(t, ω_1), X(t, ω_2), X(t, ω_3) plotted versus t; the vertical slices at times t_1 and t_2 are the random variables X(t_1, ω) and X(t_2, ω).]
Discrete-Time Random Process Example
Let Z ∼ U[0, 1], and define the discrete-time process X_n = Z^n for n ≥ 1.
Sample paths:

[Figure: sample paths x_n versus n = 1, 2, . . . , 7 for Z = 1/2 (values 1/2, 1/4, 1/8, 1/16, . . .), Z = 1/4 (values 1/4, 1/16, 1/64, . . .), and Z = 0 (all zeros).]
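As a quick numerical illustration (not part of the original handout), the following NumPy sketch draws a few values of Z ∼ U[0, 1] and prints the resulting sample paths X_n = Z^n; the seed and path length are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 8                       # number of time indices n = 1, ..., N
n = np.arange(1, N + 1)

# Each draw of Z ~ U[0, 1] fixes an entire sample path X_n = Z^n.
for Z in rng.uniform(0.0, 1.0, size=3):
    x = Z ** n
    print(f"Z = {Z:.3f}:", np.array2string(x, precision=4))
```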
Continuous-Time Random Process Example
Sinusoidal signal with random phase:
X(t) = α cos(ωt + Θ), t ≥ 0, where Θ ∼ U[0, 2π] and α and ω are constants.
Sample functions x(t):

[Figure: sample functions of amplitude α for Θ = 0, Θ = π/4, and Θ = π/2, plotted over 0 ≤ t ≤ 2π/ω.]
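A similar sketch, not from the handout and assuming the arbitrary values α = 1 and ω = 2π, draws Θ ∼ U[0, 2π) and evaluates one sample function per draw:

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, omega = 1.0, 2 * np.pi          # assumed amplitude and angular frequency
t = np.linspace(0.0, 2.0, 9)           # coarse time grid for display

# Each draw of Theta ~ U[0, 2*pi) fixes the whole deterministic sample function.
for theta in rng.uniform(0.0, 2 * np.pi, size=3):
    x = alpha * np.cos(omega * t + theta)
    print(f"Theta = {theta:.2f} rad:", np.array2string(x, precision=3))
```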
Characterization of Random Process
Some random processes can be described analytically. E.g., x ( t ) = A cos( ω c t + Θ) where Θ is uniformly distributed in the range [0 , 2 π ) . Sample functions are sinusoids with random phase.
In general, a random process is described by the joint CDFs of n random variables of the process, for all n.
F_{X(t_1) X(t_2) ··· X(t_n)}(x_1, x_2, . . . , x_n) = P{ X(t_1) ≤ x_1, X(t_2) ≤ x_2, . . . , X(t_n) ≤ x_n }
Kolmogorov showed that if these CDFs are consistent for all n, then the random process is well defined.
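As an optional check of a second-order CDF (not in the handout), the sketch below estimates F_{X(t_1) X(t_2)}(x_1, x_2) for the random phase cosine by Monte Carlo over Θ; the times, evaluation point, and parameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
alpha, omega = 1.0, 2 * np.pi          # assumed amplitude and angular frequency
t1, t2 = 0.1, 0.35                     # two fixed observation times
x1, x2 = 0.5, 0.0                      # point at which the joint CDF is evaluated

# Monte Carlo estimate of P{X(t1) <= x1, X(t2) <= x2} over the random phase.
theta = rng.uniform(0.0, 2 * np.pi, size=100_000)
X1 = alpha * np.cos(omega * t1 + theta)
X2 = alpha * np.cos(omega * t2 + theta)
F = np.mean((X1 <= x1) & (X2 <= x2))
print(f"F(x1={x1}, x2={x2}) ≈ {F:.3f}")
```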
Ensemble with Finite Number of Sample Functions
Shown below are sample functions of a binary polar random process.
Later we will calculate the frequency content of this process.
Mean and Autocorrelation
The mean of a random process is determined by the first order PDF.

E( X(t) ) = ∫_{−∞}^{∞} x p_X(x; t) dx
The autocorrelation is determined by the second order PDF.

R_X(t_1, t_2) = E( X(t_1) X(t_2) ) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x_1 x_2 p_X(x_1, x_2; t_1, t_2) dx_1 dx_2
The autocorrelation function gives information about the frequency content of the random process.
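The sketch below (my own check, not from the handout) estimates the mean and autocorrelation of the earlier discrete-time example X_n = Z^n by ensemble averaging; for Z ∼ U[0, 1] the exact values are E(X_n) = 1/(n+1) and R_X(n, m) = E(Z^{n+m}) = 1/(n+m+1).

```python
import numpy as np

rng = np.random.default_rng(4)
M = 500_000                              # ensemble size
Z = rng.uniform(0.0, 1.0, size=M)        # one Z per realization

n, m = 2, 3
Xn, Xm = Z**n, Z**m

print("E[X_n]    ≈", round(Xn.mean(), 4), "  exact:", round(1 / (n + 1), 4))
print("R_X(n, m) ≈", round((Xn * Xm).mean(), 4), "  exact:", round(1 / (n + m + 1), 4))
```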
Autocorrelation Examples
Strong Sense Stationary
A random process is strictly stationary (strong-sense stationary) if time shifts do not change probabilities. For all n, τ, x_1, . . . , x_n,

P{ X(t_1) ≤ x_1, . . . , X(t_n) ≤ x_n } = P{ X(t_1 + τ) ≤ x_1, . . . , X(t_n + τ) ≤ x_n }
In particular, the first order pdf is the same for every t .
E( X(t_1) ) = ∫_{−∞}^{∞} x p_X(x; t_1) dx = ∫_{−∞}^{∞} x p_X(x; t_2) dx = E( X(t_2) )
The autocorrelation function of an SSS random process depends only on the difference t_2 − t_1:

R_X(t_1, t_2) = E( X(t_1) X(t_2) ) = E( X(t_1 + τ) X(t_2 + τ) )

We write the autocorrelation as a function of the delay:

R_X(τ) = R_X(t_2 − t_1)
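A small empirical illustration, not in the handout: the first-order statistics of X_n = Z^n change with n (so that process is not stationary), while those of the random phase cosine do not change with t.

```python
import numpy as np

rng = np.random.default_rng(6)
M = 300_000

# First-order statistics of X_n = Z^n depend on n -> the process is not stationary.
Z = rng.uniform(0.0, 1.0, size=M)
print("E[X_1], E[X_4] ≈", round((Z**1).mean(), 3), round((Z**4).mean(), 3))

# First-order statistics of the random phase cosine (assumed carrier 2*pi rad/s)
# do not depend on t.
theta = rng.uniform(0.0, 2 * np.pi, size=M)
for t in (0.0, 0.3):
    X = np.cos(2 * np.pi * t + theta)
    print(f"t = {t}: mean ≈ {X.mean():.3f}, P(X <= 0.5) ≈ {(X <= 0.5).mean():.3f}")
```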
Wide-Sense (Weakly) Stationary
A random process is wide-sense stationary (WSS) if its mean and autocorrelation are time invariant:
E( X(t) ) = constant

R_X(t_1, t_2) = R_X(t_2 − t_1) = E( X(t_1) X(t_2) )
The power of a WSS random process is also time invariant.
E( X(t)² ) = E( X(t) X(t) ) = R_X(0)
Important facts about autocorrelation:
◮ The maximum value of |R_X(τ)| occurs at τ = 0.
◮ If R_X(τ_0) = R_X(0) for some τ_0 ≠ 0, then X(t) is periodic with period τ_0, and conversely.
◮ The PSD of a WSS random process is S_X(f) = F{ R_X(τ) }.
◮ The total power of a WSS random process is ∫_{−∞}^{∞} S_X(f) df = 2 ∫_0^{∞} S_X(f) df.
For complex-valued random processes, the autocorrelation is defined with a conjugate: R_X(τ) = E( X(t + τ) X*(t) ).
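A numerical sanity check of the total-power relation, not part of the handout, using the rectangular low-pass PSD from the next slide with arbitrary values N_0 = 2 and B = 5:

```python
import numpy as np

# Check total power = integral of S_X(f) df = 2 * integral over f > 0 = R_X(0),
# for the rectangular low-pass PSD S_X(f) = N0/2 on |f| <= B.
N0, B = 2.0, 5.0                      # assumed (hypothetical) noise level and bandwidth
f = np.linspace(-B, B, 200_001)
S = np.full_like(f, N0 / 2)
df = f[1] - f[0]

print("integral of S_X(f) df ≈", round(np.sum(S) * df, 3))         # ≈ N0 * B
print("2 * integral over f>0 ≈", round(2 * np.sum(S[f >= 0]) * df, 3))
print("R_X(0) = N0 * B       =", N0 * B)
```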
PSD of Low-Pass White Noise
White noise with PSD N_0/2 is passed through an ideal low-pass filter of bandwidth B.
S_X(f) = (N_0/2) Π( f / (2B) )  ⇒  R_X(τ) = N_0 B sinc(2πBτ)
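A sketch, not in the handout, that checks R_X(τ) = N_0 B sinc(2πBτ) by directly integrating the rectangular PSD; N_0 and B are arbitrary, and np.sinc is the normalized sinc, so the slide's sinc(2πBτ) with sinc(x) = sin(x)/x corresponds to np.sinc(2Bτ).

```python
import numpy as np

N0, B = 2.0, 5.0                          # assumed noise level and bandwidth
tau = np.linspace(-1.0, 1.0, 9)

# R_X(tau) = integral_{-B}^{B} (N0/2) exp(j 2 pi f tau) df, evaluated numerically.
f = np.linspace(-B, B, 100_001)
df = f[1] - f[0]
R_num = np.array([((N0 / 2) * np.exp(2j * np.pi * f * t)).sum() * df for t in tau]).real

# Closed form from the slide: N0 * B * sinc(2*pi*B*tau) with sinc(x) = sin(x)/x,
# which equals N0 * B * np.sinc(2*B*tau) in NumPy's normalized-sinc convention.
R_form = N0 * B * np.sinc(2 * B * tau)

print(np.allclose(R_num, R_form, atol=1e-3))   # True
```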
Sample Functions of Low-Pass White Noise
[Figure: four sample functions of low-pass white noise plotted over 0 ≤ t ≤ 5, with amplitudes on the order of ±0.1 to ±0.5.]
Random Phase Cosine
Let X(t) = A cos(ω_c t + Θ), where Θ is uniformly distributed on [0, 2π).
Once Θ is chosen, the signal realization is known.
The random phase process is wide-sense stationary.
PSD of Random Phase Cosine
The random phase cosine process is WSS.
E( X(t) ) = E( A cos(ω_c t + Θ) ) = ∫_0^{2π} A cos(ω_c t + θ) (1/2π) dθ = 0

R_X(t_1, t_2) = E( A cos(ω_c t_1 + Θ) · A cos(ω_c t_2 + Θ) )
             = (1/2) A² E( cos(ω_c(t_2 − t_1)) + cos(ω_c(t_2 + t_1) + 2Θ) )
             = (1/2) A² cos(ω_c(t_2 − t_1))
The mean is constant and the autocorrelation depends only on t_2 − t_1, so the process is WSS. Taking the Fourier transform of R_X(τ) = (1/2)A² cos(ω_c τ) gives the PSD S_X(f) = (A²/4)[ δ(f − f_c) + δ(f + f_c) ], where ω_c = 2πf_c.
The random phase cosine process is SSS. Exercise for the reader.
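An ensemble-average check of the autocorrelation result above (not from the handout); A, ω_c, and the observation times are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(5)
A, wc = 1.0, 2 * np.pi                    # assumed amplitude and carrier frequency
t1, t2 = 0.3, 0.75
M = 400_000                               # ensemble size

theta = rng.uniform(0.0, 2 * np.pi, size=M)
R_hat = np.mean(A * np.cos(wc * t1 + theta) * A * np.cos(wc * t2 + theta))
R_theory = 0.5 * A**2 * np.cos(wc * (t2 - t1))

print("ensemble estimate:     ", round(R_hat, 4))
print("(A^2/2)cos(wc(t2-t1)): ", round(R_theory, 4))
```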
Random Binary Process
The random binary process, as defined so far, is not stationary because the signal can change only at specific times, namely multiples of T_b.
A standard trick to make the process stationary is to shift by a random phase. In other words, let time t = 0 be random.
The random waveforms can be written in terms of the phase shift:
X(t) = Σ_n a_n p(t − nT_b − α),   α ∈ [0, T_b] uniform
We can use this formula to find the autocorrelation.
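A sketch, not in the handout, that generates one realization of X(t) with a rectangular pulse p(t) of width T_b; the seed, sampling rate, and number of bits are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(7)
Tb, fs, n_bits = 1.0, 50.0, 12             # assumed bit duration, sampling rate, bit count

a = rng.choice([-1.0, 1.0], size=n_bits)   # i.i.d. equiprobable +/-1 symbols
alpha = rng.uniform(0.0, Tb)               # random timing offset, uniform on [0, Tb]

# X(t) = sum_n a_n p(t - n*Tb - alpha) with a rectangular pulse p of width Tb.
t = np.arange(0.0, n_bits * Tb, 1.0 / fs)
idx = np.floor((t - alpha) / Tb).astype(int)
x = np.where((idx >= 0) & (idx < n_bits), a[np.clip(idx, 0, n_bits - 1)], 0.0)

print("alpha =", round(alpha, 3))
print("x(t) sampled every Tb/2:", x[:: int(fs * Tb // 2)])
```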
Random Binary Process (cont.)
If t_2 > t_1 + T_b, then X(t_1) and X(t_2) are independent:

R_X(t_1, t_2) = E( X(t_1) X(t_2) ) = E( X(t_1) ) E( X(t_2) ) = 0 · 0 = 0
If |τ| = |t_2 − t_1| < T_b, then the pulses overlap, and the overlap decreases as τ → ±T_b. As shown in the figure,

R_X(τ) = Λ(τ / T_b)  ⇒  S_X(f) = T_b sinc²(πT_b f)
As expected, most of the power of the binary process is contained within 1/T_b Hz.
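The triangular autocorrelation can be checked by ensemble averaging over both the symbols a_n and the offset α; the sketch below (not from the handout, with arbitrary ensemble size and observation time) compares the estimate with Λ(τ/T_b) = 1 − |τ|/T_b for |τ| ≤ T_b.

```python
import numpy as np

rng = np.random.default_rng(8)
Tb, n_bits, M = 1.0, 16, 200_000          # bit duration, bits per realization, ensemble size
t1 = 5.0                                  # fixed observation time away from the edges

a = rng.choice([-1.0, 1.0], size=(M, n_bits))   # i.i.d. equiprobable +/-1 symbols
alpha = rng.uniform(0.0, Tb, size=M)            # random offset, one per realization
rows = np.arange(M)

def X(t):
    # X(t) = sum_n a_n p(t - n*Tb - alpha) with a rectangular pulse of width Tb.
    idx = np.floor((t - alpha) / Tb).astype(int)
    return a[rows, np.clip(idx, 0, n_bits - 1)]

for tau in (0.0, 0.25, 0.5, 0.75, 1.0, 1.5):
    R_hat = np.mean(X(t1) * X(t1 + tau))
    R_theory = max(0.0, 1.0 - abs(tau) / Tb)    # triangle Lambda(tau / Tb)
    print(f"tau = {tau:4.2f}: estimate {R_hat:6.3f}   theory {R_theory:5.3f}")
```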