RANDOM PROCESSES

Couple of random processes

Definitions

Let us consider an experiment whose result is represented by a couple of random processes X(t) and Y(t) (for example, the components of the seismic motion at the base of two piers of a viaduct, the wind velocities recorded by two anemometers, or the dynamic response of a 2-D.O.F. system). The couple of processes X(t) and Y(t) is also called a 2-variate random process.

Let us consider the values x^(j)(t_1) and y^(j)(t_2) (j = 1, 2, ...) assumed by the sample functions x^(j)(t) and y^(j)(t) of X(t) and Y(t) for t = t_1 and t = t_2 (Fig. 1). The sets of these values constitute the couple of random variables X_1 = X(t_1) and Y_2 = Y(t_2). They are characterised by the joint density function of the second order p_XY(x_1, t_1; y_2, t_2). From this it is immediate to derive the marginal density functions of the first order of X(t_1) and Y(t_2):

p_X(x_1, t_1) = \int_{-\infty}^{+\infty} p_{XY}(x_1, t_1; y_2, t_2)\, dy_2    (1)

p_Y(y_2, t_2) = \int_{-\infty}^{+\infty} p_{XY}(x_1, t_1; y_2, t_2)\, dx_1    (2)

Fig. 1

From the marginal density functions of the first order, it is immediate to derive the statistical averages of the first order already defined for each random process.

Statistical averages of the second order

The statistical averages of the second order are the joint indexes of X_1 = X(t_1) and Y_2 = Y(t_2); they can be derived from the joint density function of the second order p_XY(x_1, t_1; y_2, t_2). In particular, the cross-correlation function of X(t) and Y(t) is defined as:

R_{XY}(t_1, t_2) = E[X(t_1)\, Y(t_2)] = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} x_1 y_2\, p_{XY}(x_1, t_1; y_2, t_2)\, dx_1\, dy_2    (3)

The cross-covariance function is defined as:

C_{XY}(t_1, t_2) = E\{[X(t_1) - \mu_X(t_1)][Y(t_2) - \mu_Y(t_2)]\} = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} [x_1 - \mu_X(t_1)][y_2 - \mu_Y(t_2)]\, p_{XY}(x_1, t_1; y_2, t_2)\, dx_1\, dy_2    (4)

It follows that:

C_{XY}(t_1, t_2) = R_{XY}(t_1, t_2) - \mu_X(t_1)\, \mu_Y(t_2)    (5)

The normalised cross-covariance function is defined as:

\rho_{XY}(t_1, t_2) = \frac{C_{XY}(t_1, t_2)}{\sigma_X(t_1)\, \sigma_Y(t_2)}    (6)

The prefix "cross-" indicates that the random variables X(t_1) and Y(t_2) are extracted from different random processes (although associated with the same experiment). Eqs. (3), (4) and (6) imply the following properties:

R_{XY}(t_1, t_2) = R_{YX}(t_2, t_1);   C_{XY}(t_1, t_2) = C_{YX}(t_2, t_1);   \rho_{XY}(t_1, t_2) = \rho_{YX}(t_2, t_1)    (7)

Two random processes X(t) and Y(t) are defined as not correlated if:

R_{XY}(t_1, t_2) = \mu_X(t_1)\, \mu_Y(t_2)   \forall t_1, t_2 \in R    (8)

C_{XY}(t_1, t_2) = 0   \forall t_1, t_2 \in R    (9)

Two random processes X(t) and Y(t) are defined as orthogonal if:

R_{XY}(t_1, t_2) = 0   \forall t_1, t_2 \in R    (10)

Stationary processes

A couple of random processes is defined as weakly stationary if the density functions of the first order and the joint density function of the second order are invariant under any translation of the origin of the time axis:

p_X(x_1, t_1) = p_X(x_1, t_1 + \theta)    (11a)

p_Y(y_2, t_2) = p_Y(y_2, t_2 + \theta)    (11b)

p_{XY}(x_1, t_1; y_2, t_2) = p_{XY}(x_1, t_1 + \theta; y_2, t_2 + \theta)    (11c)

Assigning \theta = -t_1, it is immediate to show that Eq. (11) implies the following properties: (a) the density functions of the first order are independent of time; (b) the joint density function of the second order depends only on the time interval \tau = t_2 - t_1. Thus, the statistical averages of the first order are independent of time, while the statistical averages of the second order depend only on the time lag \tau = t_2 - t_1:

R_{XY}(t_1, t_2) = R_{XY}(\tau);   C_{XY}(t_1, t_2) = C_{XY}(\tau);   \rho_{XY}(t_1, t_2) = \rho_{XY}(\tau)
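In practice these second-order averages are estimated from sampled records. The following minimal Python sketch (not part of the original notes) estimates the cross-correlation, cross-covariance and normalised cross-covariance of two stationary records at a given lag, under the assumption that the records are sampled simultaneously at a constant time step; the function name estimate_cross_covariance and the synthetic signals are purely illustrative.

import numpy as np

def estimate_cross_covariance(x, y, lag):
    """Estimate R_XY(tau), C_XY(tau), rho_XY(tau) at an integer lag
    (tau = lag * dt) from two stationary, simultaneously sampled records."""
    n = len(x)
    if lag >= 0:
        x_seg, y_seg = x[:n - lag], y[lag:]        # pairs (x(t), y(t + tau))
    else:
        x_seg, y_seg = x[-lag:], y[:n + lag]
    r_xy = np.mean(x_seg * y_seg)                  # cross-correlation at lag tau (Eq. (12) below)
    c_xy = r_xy - np.mean(x) * np.mean(y)          # cross-covariance, via Eq. (5)
    rho_xy = c_xy / (np.std(x) * np.std(y))        # normalised cross-covariance, Eq. (6)
    return r_xy, c_xy, rho_xy

# Illustrative records: y is a delayed, noisy copy of x
rng = np.random.default_rng(0)
x = rng.standard_normal(10000)
y = np.roll(x, 5) + 0.3 * rng.standard_normal(10000)
print(estimate_cross_covariance(x, y, lag=5))      # strong correlation at lag 5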
The cross-correlation function of two (weakly) stationary processes:

R_{XY}(\tau) = E[X(t)\, Y(t + \tau)]    (12)

has several noteworthy properties (Fig. 2):

1) Setting \tau = 0 in Eq. (12):

R_{XY}(0) = E[X(t)\, Y(t)]    (13)

2) R_{XY}(\tau) = C_{XY}(\tau) + \mu_X \mu_Y = \rho_{XY}(\tau)\, \sigma_X \sigma_Y + \mu_X \mu_Y. Thus, since |\rho_{XY}(\tau)| \le 1, it follows that:

\mu_X \mu_Y - \sigma_X \sigma_Y \le R_{XY}(\tau) \le \mu_X \mu_Y + \sigma_X \sigma_Y   \forall \tau \in R    (14)

3) For \tau tending to infinity, the couple of random variables X(t), Y(t + \tau) tends to become not correlated (\rho_{XY}(\tau) \to 0). Thus:

\lim_{\tau \to \infty} R_{XY}(\tau) = \mu_X \mu_Y    (15)

4) Setting t' = t + \tau, Eq. (12) becomes R_{XY}(\tau) = E[X(t' - \tau)\, Y(t')]. Thus it results that:

R_{XY}(\tau) = R_{YX}(-\tau)    (16)

Fig. 2

The cross-covariance function of two (weakly) stationary processes:

C_{XY}(\tau) = E\{[X(t) - \mu_X][Y(t + \tau) - \mu_Y]\}    (17)

has properties analogous to those of the cross-correlation function (Fig. 3):

C_{XY}(0) = E\{[X(t) - \mu_X][Y(t) - \mu_Y]\}    (18)

-\sigma_X \sigma_Y \le C_{XY}(\tau) \le \sigma_X \sigma_Y   \forall \tau \in R    (19)

\lim_{\tau \to \infty} C_{XY}(\tau) = 0    (20)

C_{XY}(\tau) = C_{YX}(-\tau)    (21)

Fig. 3

If X(t) and Y(t) are not correlated then, due to Eq. (9):

C_{XY}(\tau) = 0   \forall \tau \in R    (22)

If X(t) and Y(t) are orthogonal then, due to Eq. (10):

R_{XY}(\tau) = 0   \forall \tau \in R    (23)

Bi-variate normal process

Two stationary random processes X(t) and Y(t) have a bi-variate normal distribution if their joint density function of the second order is given by the relationship:

p_{XY}(x, y; \tau) = \frac{1}{2\pi \sigma_X \sigma_Y \sqrt{1 - \rho_{XY}^2(\tau)}} \exp\left\{ -\frac{1}{2\left[1 - \rho_{XY}^2(\tau)\right]} \left[ \frac{(x - \mu_X)^2}{\sigma_X^2} - \frac{2\rho_{XY}(\tau)(x - \mu_X)(y - \mu_Y)}{\sigma_X \sigma_Y} + \frac{(y - \mu_Y)^2}{\sigma_Y^2} \right] \right\}    (24)

where \mu_X and \mu_Y are the means of X(t) and Y(t), \sigma_X^2 and \sigma_Y^2 are the variances, and \rho_{XY}(\tau) is the normalised cross-covariance function.

Cross-power spectral density

Let us consider a couple of stationary random processes with zero mean; in this case the cross-correlation function R_XY(\tau) coincides with the cross-covariance function C_XY(\tau). The cross-power spectral density, or more simply the cross-power spectrum, S_XY(\omega) of X(t) and Y(t) is defined as:

S_{XY}(\omega) = \frac{1}{2\pi} \int_{-\infty}^{+\infty} C_{XY}(\tau)\, e^{-i\omega\tau}\, d\tau    (25)

Apart from the factor 1/2\pi, it coincides with the Fourier transform of the cross-covariance function C_XY(\tau). Thus, the cross-covariance function is the inverse Fourier transform (apart from the factor 2\pi) of the cross-power spectral density S_XY(\omega):

C_{XY}(\tau) = \int_{-\infty}^{+\infty} S_{XY}(\omega)\, e^{i\omega\tau}\, d\omega    (26)

Eqs. (25) and (26) are also known as the Wiener-Khintchine equations. S_XY(\omega) exists if C_XY(\tau) is absolutely integrable:

\int_{-\infty}^{+\infty} |C_{XY}(\tau)|\, d\tau < \infty

Since C_XY(\tau) is in general a non-symmetric function of \tau, S_XY(\omega) is in general a complex function. As such it can be expressed as:

S_{XY}(\omega) = S^C_{XY}(\omega) + i\, S^Q_{XY}(\omega)    (27)

where S^C_{XY}(\omega) = Re[S_{XY}(\omega)] is referred to as the co-spectrum and S^Q_{XY}(\omega) = Im[S_{XY}(\omega)] is referred to as the quad-spectrum.

Let us rewrite C_XY(\tau) as:

C_{XY}(\tau) = \frac{1}{2}\left[C_{XY}(\tau) + C_{XY}(-\tau)\right] + \frac{1}{2}\left[C_{XY}(\tau) - C_{XY}(-\tau)\right]    (28)

where the two terms in brackets are, respectively, symmetric and anti-symmetric functions of \tau. Executing the Fourier transform of both terms, it follows that:

S^C_{XY}(\omega) = \frac{1}{4\pi} \int_{-\infty}^{+\infty} \left[C_{XY}(\tau) + C_{XY}(-\tau)\right] e^{-i\omega\tau}\, d\tau    (29)

i\, S^Q_{XY}(\omega) = \frac{1}{4\pi} \int_{-\infty}^{+\infty} \left[C_{XY}(\tau) - C_{XY}(-\tau)\right] e^{-i\omega\tau}\, d\tau    (30)

S^C_{XY}(\omega) = \frac{1}{4\pi} \int_{-\infty}^{+\infty} \left[C_{XY}(\tau) + C_{XY}(-\tau)\right] \cos(\omega\tau)\, d\tau    (31)

S^Q_{XY}(\omega) = -\frac{1}{4\pi} \int_{-\infty}^{+\infty} \left[C_{XY}(\tau) - C_{XY}(-\tau)\right] \sin(\omega\tau)\, d\tau    (32)

Eqs. (29) and (30) show that S^C_{XY}(\omega) is, apart from the factor 1/4\pi, the Fourier transform of the symmetric part of C_XY(\tau), while i S^Q_{XY}(\omega) is the Fourier transform of the anti-symmetric part of C_XY(\tau). Thus, S_XY(\omega) is real if C_XY(\tau) is symmetric. Eqs. (31) and (32) lead to the relationships:

S^C_{XY}(-\omega) = S^C_{XY}(\omega)    (33)

S^Q_{XY}(-\omega) = -S^Q_{XY}(\omega)    (34)

So, S^C_{XY}(\omega) and S^Q_{XY}(\omega) are, respectively, symmetric and anti-symmetric functions of \omega. Dealing with S_YX(\omega) in an analogous way, it results that:

S^C_{XY}(\omega) = S^C_{YX}(\omega)    (35)

S^Q_{XY}(\omega) = -S^Q_{YX}(\omega)    (36)

from which it follows that:

S_{XY}(\omega) = S^*_{YX}(\omega)    (37)

where S^* is the complex conjugate of S.

Coherence, correlation and dependence

The coherence function of two stationary processes X(t) and Y(t) is defined as:

\gamma_{XY}(\omega) = \frac{S_{XY}(\omega)}{\sqrt{S_{XX}(\omega)\, S_{YY}(\omega)}}    (38)

The real and the imaginary parts of the coherence function are referred to as, respectively, the co-coherence and the quad-coherence:

\gamma^C_{XY}(\omega) = Re[\gamma_{XY}(\omega)] = \frac{S^C_{XY}(\omega)}{\sqrt{S_{XX}(\omega)\, S_{YY}(\omega)}}    (39)

\gamma^Q_{XY}(\omega) = Im[\gamma_{XY}(\omega)] = \frac{S^Q_{XY}(\omega)}{\sqrt{S_{XX}(\omega)\, S_{YY}(\omega)}}    (40)

Thus:

\gamma_{XY}(\omega) = \gamma^C_{XY}(\omega) + i\, \gamma^Q_{XY}(\omega)    (41)

Once the coherence function has been introduced, the cross-power spectral density assumes the form:

S_{XY}(\omega) = \sqrt{S_{XX}(\omega)\, S_{YY}(\omega)}\; \gamma_{XY}(\omega)    (42)

which shows that the cross-power spectral density is known once the power spectral densities of each process and the coherence function are known.
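As an illustration of Eqs. (25)-(42), the following Python sketch (an illustrative example, not part of the original notes) estimates the cross-power spectrum and the complex coherence of two sampled records with scipy. Note that scipy.signal.welch and scipy.signal.csd return one-sided estimates as functions of the frequency f in Hz, whereas the equations above use the two-sided convention in the circular frequency omega with the factor 1/2pi; the sign of the estimated quad-spectrum also depends on the library's conjugation convention. The sampling rate and the synthetic records are arbitrary assumptions.

import numpy as np
from scipy import signal

# Two illustrative zero-mean records: y is a noisy, delayed copy of x,
# sampled at an assumed rate fs.
fs = 100.0
rng = np.random.default_rng(1)
x = rng.standard_normal(2**14)
y = np.roll(x, 10) + 0.5 * rng.standard_normal(2**14)

# One-sided spectral estimates obtained by Welch averaging.
f, s_xx = signal.welch(x, fs=fs, nperseg=1024)
_, s_yy = signal.welch(y, fs=fs, nperseg=1024)
_, s_xy = signal.csd(x, y, fs=fs, nperseg=1024)   # complex cross-spectrum

# Complex coherence, Eq. (38); its real and imaginary parts are the
# co-coherence and quad-coherence, Eqs. (39)-(40).
gamma = s_xy / np.sqrt(s_xx * s_yy)
print(np.abs(gamma).max() <= 1.0 + 1e-9)          # coherence modulus never exceeds 1 (cf. Eq. (43) below)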
The coherence function may be interpreted as the frequency-domain counterpart of the normalised cross-correlation function of the processes X(t) and Y(t). It can be proved that:

0 \le |\gamma_{XY}(\omega)| \le 1    (43)

In particular, when:

|\gamma_{XY}(\omega)| = 1    (44)

the processes X(t) and Y(t) are referred to as perfectly correlated at the circular frequency \omega. If such a condition is satisfied for any \omega, the two processes are referred to as totally or identically coherent. In this case:

|S_{XY}(\omega)| = \sqrt{S_{XX}(\omega)\, S_{YY}(\omega)}    (45)

Instead, when:

\gamma_{XY}(\omega) = 0    (46)

the two processes X(t) and Y(t) are referred to as not correlated at the circular frequency \omega. If such a condition is satisfied for any \omega, the two processes are referred to as totally or identically incoherent. In this case:

S_{XY}(\omega) = 0    (47)

and, consequently, C_XY(\tau) = 0.

Two stationary processes X(t) and Y(t) are defined as statistically independent if:

p_{XY}(x_1, t_1; y_2, t_2) = p_X(x_1, t_1)\, p_Y(y_2, t_2)    (48)

for any t_1 and t_2. It is easy to show that two independent random processes are also not correlated. The converse is generally not true; however, if X(t) and Y(t) are normal processes, then non-correlation implies independence.

Vector of random processes

Definitions

Let us consider an experiment whose result is represented by a vector of n random processes X(t) = [X_1(t) X_2(t) ... X_n(t)]^T (for example, the components of the seismic motion at the base of n piers of a viaduct, the wind velocities recorded by n anemometers, or the dynamic response of an n-D.O.F. system). The vector X(t) is also called an n-variate random process.

Let us consider the values x_i^(j)(t_i) (i = 1, 2, ..., n; j = 1, 2, ...) assumed by the sample functions x_i^(j)(t) of X_i(t) for t = t_i. The sets of these values constitute the n random variables X_i = X_i(t_i) (i = 1, 2, ..., n). They are characterised by the joint density function of order n, p_{X_1 X_2 ... X_n}(x_1, t_1; x_2, t_2; ...; x_n, t_n) = p_X(x, t), where x = [x_1 x_2 ... x_n]^T is the vector of the state variables and t = [t_1 t_2 ... t_n]^T is the vector of the times at which the random variables are extracted.

If X(t) is an n-variate normal random process, its complete probabilistic definition calls for the knowledge of only the joint density functions of the second order of all the possible couples of random variables X_i(t_i), X_j(t_j) (i, j = 1, 2, ..., n): p_{X_i X_j}(x_i, t_i; x_j, t_j), i, j = 1, ..., n and t_i, t_j \in R.

Stationary processes

If X(t) is a (weakly) stationary random process, the joint density functions of the second order of all the possible couples of random variables X_i(t_i), X_j(t_j) (i, j = 1, 2, ..., n) do not depend on the instants t_i, t_j but only on the time lag \tau = t_j - t_i: p_{X_i X_j}(x_i, t_i; x_j, t_j) = p_{X_i X_j}(x_i, x_j; \tau), i, j = 1, ..., n and \tau \in R.

Let us define as the mean vector \mu_X of the stationary process X(t) the vector whose components \mu_{X_i} are the means of the processes X_i(t) (i = 1, ..., n):

\mu_X = E[X(t)] = [\mu_{X_1}\; \mu_{X_2}\; ...\; \mu_{X_n}]^T    (49)

Let us define as the correlation matrix and the covariance matrix of X(t), respectively, the matrices whose (i, j)-th terms are the functions R_{X_i X_j}(\tau) and C_{X_i X_j}(\tau):

R_X(\tau) = E[X(t)\, X^T(t + \tau)] = \begin{bmatrix} R_{X_1 X_1}(\tau) & R_{X_1 X_2}(\tau) & \cdots & R_{X_1 X_n}(\tau) \\ R_{X_2 X_1}(\tau) & R_{X_2 X_2}(\tau) & \cdots & R_{X_2 X_n}(\tau) \\ \vdots & \vdots & \ddots & \vdots \\ R_{X_n X_1}(\tau) & R_{X_n X_2}(\tau) & \cdots & R_{X_n X_n}(\tau) \end{bmatrix}    (50)

C_X(\tau) = E\{[X(t) - \mu_X][X(t + \tau) - \mu_X]^T\} = \begin{bmatrix} C_{X_1 X_1}(\tau) & C_{X_1 X_2}(\tau) & \cdots & C_{X_1 X_n}(\tau) \\ C_{X_2 X_1}(\tau) & C_{X_2 X_2}(\tau) & \cdots & C_{X_2 X_n}(\tau) \\ \vdots & \vdots & \ddots & \vdots \\ C_{X_n X_1}(\tau) & C_{X_n X_2}(\tau) & \cdots & C_{X_n X_n}(\tau) \end{bmatrix}    (51)

The on-diagonal terms are, respectively, the auto-correlation functions and the auto-covariance functions of each process; the off-diagonal terms are, respectively, the cross-correlation functions and the cross-covariance functions of all the possible couples of the processes.
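The correlation and covariance matrices can likewise be estimated from n simultaneously sampled records. The sketch below (illustrative only, with an arbitrary synthetic 3-variate record) estimates C_X(tau) of Eq. (51) at an integer lag; the helper name covariance_matrix is not from the notes.

import numpy as np

def covariance_matrix(records, lag):
    """Estimate C_X(tau) of Eq. (51) at an integer lag from an (n, N) array
    of simultaneously sampled stationary records (tau = lag * dt)."""
    x = records - records.mean(axis=1, keepdims=True)    # subtract the mean vector, Eq. (49)
    n, N = x.shape
    if lag >= 0:
        a, b = x[:, :N - lag], x[:, lag:]                # X(t) and X(t + tau)
    else:
        a, b = x[:, -lag:], x[:, :N + lag]
    return a @ b.T / a.shape[1]                          # (i, j) term estimates C_{Xi Xj}(tau)

# Illustrative 3-variate record
rng = np.random.default_rng(2)
records = rng.standard_normal((3, 20000))
c0 = covariance_matrix(records, lag=0)
print(np.allclose(c0, c0.T))                             # C_X(0) is symmetric (cf. Eq. (56) below)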
Eqs. (50) and (51) imply:

R_X(\tau) = C_X(\tau) + \mu_X \mu_X^T    (52)

Due to Eqs. (16) and (21):

R_X(\tau) = R_X^T(-\tau)    (53)

C_X(\tau) = C_X^T(-\tau)    (54)

Thus, the matrices R_X(\tau) and C_X(\tau) are symmetric for \tau = 0:

R_X(0) = R_X^T(0)    (55)

C_X(0) = C_X^T(0)    (56)

It can be shown that R_X(\tau) and C_X(\tau) are semi-positive definite for any value of \tau (f_R = u^T R_X(\tau)\, u \ge 0, f_C = v^T C_X(\tau)\, v \ge 0 for any u, v \ne 0).

Let us define as the power spectral density matrix of X(t) the matrix whose (i, j)-th term is the function S_{X_i X_j}(\omega):

S_X(\omega) = \frac{1}{2\pi} \int_{-\infty}^{+\infty} C_X(\tau)\, e^{-i\omega\tau}\, d\tau = \begin{bmatrix} S_{X_1 X_1}(\omega) & S_{X_1 X_2}(\omega) & \cdots & S_{X_1 X_n}(\omega) \\ S_{X_2 X_1}(\omega) & S_{X_2 X_2}(\omega) & \cdots & S_{X_2 X_n}(\omega) \\ \vdots & \vdots & \ddots & \vdots \\ S_{X_n X_1}(\omega) & S_{X_n X_2}(\omega) & \cdots & S_{X_n X_n}(\omega) \end{bmatrix}    (57)

C_X(\tau) = \int_{-\infty}^{+\infty} S_X(\omega)\, e^{i\omega\tau}\, d\omega = \begin{bmatrix} C_{X_1 X_1}(\tau) & C_{X_1 X_2}(\tau) & \cdots & C_{X_1 X_n}(\tau) \\ C_{X_2 X_1}(\tau) & C_{X_2 X_2}(\tau) & \cdots & C_{X_2 X_n}(\tau) \\ \vdots & \vdots & \ddots & \vdots \\ C_{X_n X_1}(\tau) & C_{X_n X_2}(\tau) & \cdots & C_{X_n X_n}(\tau) \end{bmatrix}    (58)

The on-diagonal terms of the spectral matrix are the power spectral densities of each process; the off-diagonal terms are the cross-power spectral densities of all the possible couples of the processes. Due to Eq. (37), S_X(\omega) is a Hermitian matrix. It can be proved that it is also semi-positive definite for any value of \omega.

From Eq. (58) it results that:

C_X(0) = \int_{-\infty}^{+\infty} S_X(\omega)\, d\omega = \begin{bmatrix} \sigma_{X_1}^2 & C_{X_1 X_2}(0) & \cdots & C_{X_1 X_n}(0) \\ C_{X_2 X_1}(0) & \sigma_{X_2}^2 & \cdots & C_{X_2 X_n}(0) \\ \vdots & \vdots & \ddots & \vdots \\ C_{X_n X_1}(0) & C_{X_n X_2}(0) & \cdots & \sigma_{X_n}^2 \end{bmatrix}    (59)

The on-diagonal terms are the variances of each random process; the off-diagonal terms are the cross-covariances (i.e. the cross-correlations, for zero-mean processes) of all the possible couples of processes at the time lag \tau = 0.

n-variate normal process

A stationary random process X(t) has an n-variate normal distribution if all the couples of random processes that constitute X(t) have a joint density function of order 2 given by Eq. (24).

A particular linear transformation

Let Y(t) be an n-variate process linked with the n-variate process X(t) by the relationship:

Y(t) = A X(t)

where A is a constant (deterministic) matrix. It can be shown that:

\mu_Y = A \mu_X

C_Y(\tau) = A C_X(\tau) A^T

R_Y(\tau) = A R_X(\tau) A^T

S_Y(\omega) = A S_X(\omega) A^T
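These relations can be checked numerically on sampled records, for example at zero time lag. In the minimal sketch below, the matrix A and the synthetic 3-variate record are arbitrary illustrative assumptions, not data from the notes.

import numpy as np

rng = np.random.default_rng(3)

# Illustrative 3-variate, zero-mean stationary record (n rows, N samples)
n, N = 3, 50000
x = rng.standard_normal((n, N))

# An arbitrary constant (deterministic) transformation matrix
A = np.array([[1.0, 0.5,  0.0],
              [0.0, 2.0, -1.0],
              [1.0, 0.0,  1.0]])

y = A @ x                               # Y(t) = A X(t), applied sample by sample

# Zero-lag covariance matrices estimated from the records
c_x = np.cov(x)                         # estimate of C_X(0)
c_y = np.cov(y)                         # estimate of C_Y(0)

# Check C_Y(0) ~= A C_X(0) A^T within sampling error
print(np.allclose(c_y, A @ c_x @ A.T, atol=0.1))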