13.42 Design Principles for Ocean Vehicles
Prof. A.H. Techet
Spring 2005

1. Random Processes

A random variable, $x(\zeta)$, can be defined from a random event, $\zeta$, by assigning values $x_i$ to each possible outcome, $A_i$, of the event. Next define a random process, $x(\zeta, t)$, a function of both the event and time, by assigning to each outcome of a random event, $\zeta$, a function in time, $x_1(t)$, chosen from a set of functions, $x_i(t)$:

$$
\begin{pmatrix}
A_1 & \rightarrow & p_1 & \rightarrow & x_1(t) \\
A_2 & \rightarrow & p_2 & \rightarrow & x_2(t) \\
\vdots & & \vdots & & \vdots \\
A_n & \rightarrow & p_n & \rightarrow & x_n(t)
\end{pmatrix}
\tag{6}
$$

This "menu" of functions, $x_i(t)$, is called the ensemble (set) of the random process and may contain infinitely many $x_i(t)$, which can be functions of many independent variables.

EXAMPLE: Roll the dice: the outcome is $A_i$, where $i = 1:6$ is the number on the face of the die, and choose some function

$$x_i(t) = t^i \tag{7}$$

to be the random process.

1.1. Averages of a Random Process

Since a random process is a function of time we can find the averages over some period of time, $T$, or over a series of events. The calculation of the average and variance in time is different from the calculation of the statistics, or expectations, discussed previously.

TIME AVERAGE (Temporal Mean)

$$M^t\{x_i(t)\} = \lim_{T\to\infty} \frac{1}{T}\int_0^T x_i(t)\,dt = \overline{x}^{\,t} \tag{8}$$

TIME VARIANCE (Temporal Variance)

$$V^t\{x_i(t)\} = \lim_{T\to\infty} \frac{1}{T}\int_0^T \left[x_i(t) - M^t\{x_i(t)\}\right]^2 dt \tag{9}$$

TEMPORAL CROSS/AUTO-CORRELATION

This gives us the "correlation", or similarity, between the signal and its time-shifted version:

$$R_i^t(\tau) = \lim_{T\to\infty} \frac{1}{T}\int_0^T \left[x_i(t) - M^t\{x_i(t)\}\right]\left[x_i(t+\tau) - M^t\{x_i(t+\tau)\}\right] dt \tag{10}$$

• $\tau$ is the correlation variable (time shift).
• The correlation normalized by its zero-lag value, $|R_i^t(\tau)/R_i^t(0)|$, is between 0 and 1.
• If $R_i^t$ is large (i.e. $R_i^t(\tau)/R_i^t(0) \to 1$) then $x_i(t)$ and $x_i(t+\tau)$ are "similar". For example, a sinusoidal function is similar to itself delayed by one or more periods.
• If $R_i^t$ is small then $x_i(t)$ and $x_i(t+\tau)$ are not similar; for example, white noise gives $R_i^t(\tau) = 0$ for $\tau \neq 0$.

EXPECTED VALUE:

$$\mu_x(t_1) = E\{x(t_1)\} = \int_{-\infty}^{\infty} x\, f(x, t_1)\, dx \tag{11}$$

STATISTICAL VARIANCE:

$$\sigma_x^2(t_1) = E\left\{[x(t_1) - \mu_x(t_1)]^2\right\} = \int_{-\infty}^{\infty} (x - \mu_x)^2 f(x, t_1)\, dx \tag{12}$$

AUTO-CORRELATION:

$$R_{xx}(t_1, t_2) = E\left\{\left[x(t_1,\zeta) - E\{x(t_1,\zeta)\}\right]\left[x(t_2,\zeta) - E\{x(t_2,\zeta)\}\right]\right\} \tag{13}$$

which, for a zero-mean process, reduces to $R_{xx}(t_1, t_2) = E\{x(t_1,\zeta)\,x(t_2,\zeta)\}$.

Example: Roll the dice: $k = 1:6$. Assign to the event $A_k$ the random process function

$$x_k(t) = a\cos(k\omega_o t) \tag{14}$$

Evaluate the time statistics:

MEAN:
$$M^t\{x_k(t)\} = \lim_{T\to\infty}\frac{1}{T}\int_0^T a\cos(k\omega_o t)\,dt = 0$$

VARIANCE:
$$V^t\{x_k(t)\} = \lim_{T\to\infty}\frac{1}{T}\int_0^T a^2\cos^2(k\omega_o t)\,dt = \frac{a^2}{2}$$

CORRELATION:
$$R^t\{x_k(t)\} = \lim_{T\to\infty}\frac{1}{T}\int_0^T a^2\cos(k\omega_o t)\cos(k\omega_o (t+\tau))\,dt = \frac{a^2}{2}\cos(k\omega_o\tau)$$

Looking at the correlation function we see that if $k\omega_o\tau = \pi/2$ then the correlation is zero: for this example it is the same as taking the correlation of a sine with a cosine, since cosine is simply the sine function phase-shifted by $\pi/2$, and cosine and sine are not correlated.

Now if we look at the STATISTICS of the random process, for some time $t = t_o$,

$$x_k(\zeta, t_o) = a\cos(k\omega_o t_o) = y_k(\zeta) \tag{15}$$

where $k$ is the random variable ($k = 1, 2, 3, 4, 5, 6$) and each event has probability $p_k = 1/6$.

EXPECTED VALUE:
$$E\{y(\zeta)\} = \sum_{k=1}^{6} p_k x_k = \sum_{k=1}^{6} \tfrac{1}{6}\, a\cos(k\omega_o t_o)$$

VARIANCE:
$$V\{y(\zeta)\} = \sum_{k=1}^{6} \tfrac{1}{6}\, a^2\cos^2(k\omega_o t_o)$$

CORRELATION:
$$R_{yy}(t_o,\tau) = E\{y_k(t_o,\zeta)\, y_k(t_o+\tau,\zeta)\}$$

STATISTICS ≠ TIME AVERAGES

In general the expected value does not match the time-averaged value of a function; i.e. the statistics are time dependent, whereas the time averages are time independent.
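The mismatch between time averages and ensemble statistics in the dice example above can be checked numerically. The following Python sketch is not part of the original notes: the values $a = 1$, $\omega_o = 2\pi$ rad/s, the outcome $k = 3$, the averaging window $T = 200$ s, and the sample time $t_o = 0.35$ s are all illustrative assumptions. It estimates the temporal mean and variance of one realization $x_k(t) = a\cos(k\omega_o t)$, and the ensemble mean and mean-square over the six equally likely outcomes at a fixed time.

```python
import numpy as np

# Minimal numerical sketch of the dice example (eqs. 14-15).
# Illustrative values (assumed, not from the notes): a = 1, omega_o = 2*pi rad/s.
a = 1.0
omega_o = 2.0 * np.pi

# --- Time statistics of a single realization x_k(t) = a*cos(k*omega_o*t) ---
k = 3                                      # one particular outcome of the die
T, dt = 200.0, 1.0e-3                      # long (finite) averaging window
t = np.arange(0.0, T, dt)
x = a * np.cos(k * omega_o * t)

time_mean = np.mean(x)                     # -> approximately 0
time_var = np.mean((x - time_mean) ** 2)   # -> approximately a**2 / 2

# --- Ensemble statistics at a fixed time t0 (the face k = 1..6 is random) ---
t0 = 0.35                                  # arbitrary sample time
p_k = 1.0 / 6.0                            # probability of each face
x_k = a * np.cos(np.arange(1, 7) * omega_o * t0)
ens_mean = np.sum(p_k * x_k)               # generally nonzero, depends on t0
ens_mean_sq = np.sum(p_k * x_k ** 2)

print(f"time mean     ~ {time_mean:+.4f},  time variance        ~ {time_var:.4f}")
print(f"ensemble mean ~ {ens_mean:+.4f},  ensemble mean-square ~ {ens_mean_sq:.4f}")
```

For a long window the temporal mean tends to 0 and the temporal variance to $a^2/2$ for every outcome $k$, while the ensemble averages at a fixed $t_o$ are generally nonzero and change with $t_o$, which is exactly the statement STATISTICS ≠ TIME AVERAGES.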
2. Stationary Random Processes

A stationary random process is a random process, $X(\zeta, t)$, whose statistics (expected values) are independent of time. For a stationary random process:

$$\mu_x(t_1) = E\{x(t_1,\zeta)\} \neq f(t)$$
$$V(t) = \sigma_x^2(t_1) = E\left\{[x(t_1) - \mu_x(t_1)]^2\right\} = \sigma_x^2 \neq f(t)$$
$$R_{xx}(t,\tau) = R_{xx}(\tau) \neq f(t)$$
$$V(t) = R_{xx}(t, 0) = V \neq f(t)$$

The statistics, or expectations, of a stationary random process are NOT necessarily equal to the time averages. However, a stationary random process whose statistics ARE equal to its time averages is said to be ERGODIC.

EXAMPLE: Take some random process defined by $y(t,\zeta)$:

$$y(t,\zeta) = a\cos(\omega_o t + \theta(\zeta)) \tag{16}$$
$$y_i(t) = a\cos(\omega_o t + \theta_i) \tag{17}$$

where $\theta(\zeta)$ is a random variable which lies within the interval 0 to $2\pi$, with a constant, uniform PDF such that

$$f_\theta(\theta) = \begin{cases} 1/2\pi; & 0 \leq \theta \leq 2\pi \\ 0; & \text{else} \end{cases} \tag{18}$$

STATISTICAL AVERAGE: the statistical mean is not a function of time.

$$E\{y(t_o,\zeta)\} = \int_0^{2\pi} \frac{1}{2\pi}\, a\cos(\omega_o t_o + \theta)\, d\theta = 0 \tag{19}$$

STATISTICAL VARIANCE: the variance is also independent of time.

$$V(t_o) = R(\tau = 0) = \frac{a^2}{2} \tag{20}$$

STATISTICAL CORRELATION: the correlation is not a function of $t_o$; it depends only on the time shift $\tau$.

$$E\{y(t_o,\zeta)\, y(t_o+\tau,\zeta)\} = R(t_o,\tau) = \int_0^{2\pi} \frac{1}{2\pi}\, a^2\cos(\omega_o t_o + \theta)\cos(\omega_o [t_o+\tau] + \theta)\, d\theta = \frac{1}{2}a^2\cos\omega_o\tau \tag{21}$$

Since the statistics are independent of time, this is a stationary process!

Let's next look at the temporal averages for this random process:

MEAN (TIME AVERAGE):

$$M^t\{y(t,\zeta_i)\} = \lim_{T\to\infty}\frac{1}{T}\int_0^T a\cos(\omega_o t + \theta_i)\, dt = \lim_{T\to\infty}\frac{1}{T}\frac{a}{\omega_o}\left[\sin(\omega_o T + \theta_i) - \sin\theta_i\right] = 0 \tag{22}$$

TIME VARIANCE:

$$V^t = R^t(0) = \frac{a^2}{2} \tag{23}$$

CORRELATION:

$$R^t(\tau) = \lim_{T\to\infty}\frac{1}{T}\int_0^T a^2\cos(\omega_o t + \theta_i)\cos(\omega_o [t+\tau] + \theta_i)\, dt = \frac{1}{2}a^2\cos\omega_o\tau \tag{24}$$

STATISTICS = TIME AVERAGES

Therefore the process is considered to be an ERGODIC random process!

N.B.: This particular random process will be the building block for simulating water waves.

3. Ergodic Random Processes

Given the random process $y(t,\zeta)$, it is simplest to assume that its expected value is zero. If the expected value equals some constant, $E\{x(t,\zeta)\} = x_o$, where $x_o \neq 0$, then we can simply adjust the random process so that the expected value is indeed zero: $y(\zeta, t) = x(t,\zeta) - x_o$. The autocorrelation function $R(\tau)$ is then

$$R(\tau) = E\{y(t,\zeta)\, y(t+\tau,\zeta)\} = R^t(\tau) = \lim_{T\to\infty}\frac{1}{T}\int_0^T y_i(t)\, y_i(t+\tau)\, dt \tag{25}$$

with CORRELATION PROPERTIES:

1. $R(0) = \text{variance} = \sigma^2 = (\text{RMS})^2 \geq 0$
2. $R(\tau) = R(-\tau)$
3. $R(0) \geq |R(\tau)|$

EXAMPLE: Consider the following random process that is a summation of cosines of different frequencies, similar to water waves:

$$y(\zeta, t) = \sum_{n=1}^{N} a_n \cos(\omega_n t + \psi_n(\zeta)) \tag{26}$$

where $\psi_n(\zeta)$ are all independent random variables in $[0, 2\pi]$ with a uniform PDF. This random process is stationary and ergodic with an expected value of zero. The autocorrelation $R(\tau)$ is

$$R(\tau) = \sum_{n=1}^{N} \frac{a_n^2}{2} \cos(\omega_n \tau) \tag{27}$$
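As a numerical check of Eqs. (26) and (27), the Python sketch below builds one realization of the sum-of-cosines process with independent uniform phases and compares its time-averaged autocorrelation at a few lags against the closed-form result. It is not part of the original notes; the amplitudes $a_n$, frequencies $\omega_n$, record length, and lags are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Minimal sketch of the sum-of-cosines process of eq. (26).  The amplitudes,
# frequencies, record length, and lags below are illustrative assumptions,
# not values taken from the notes.
a_n = np.array([1.0, 0.6, 0.3])          # component amplitudes
w_n = np.array([0.5, 1.3, 2.7])          # component frequencies [rad/s]

dt = 0.01
t = np.arange(0.0, 2000.0, dt)           # one long record for time averaging

# One realization: independent phases psi_n, uniform on [0, 2*pi)
psi = rng.uniform(0.0, 2.0 * np.pi, size=a_n.size)
y = np.sum(a_n[:, None] * np.cos(w_n[:, None] * t[None, :] + psi[:, None]), axis=0)

def time_autocorr(y, lag):
    """Time-average estimate of R(tau) at an integer-sample lag."""
    if lag == 0:
        return np.mean(y * y)
    return np.mean(y[:-lag] * y[lag:])

for tau in (0.0, 1.0, 2.5):
    lag = int(round(tau / dt))
    R_time = time_autocorr(y, lag)
    R_theory = np.sum(a_n ** 2 / 2.0 * np.cos(w_n * tau))   # eq. (27)
    print(f"tau = {tau:4.1f}:  R_time ~ {R_time:+.4f},  R_theory = {R_theory:+.4f}")
```

Because the process is ergodic, the time average over a single sufficiently long record converges to the ensemble result regardless of which phases $\psi_n$ happen to be drawn.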