
Midterm #1 – Practice Exam

Question #1: Let $X_1$ and $X_2$ be independent and identically distributed random variables with probability density function $f_X(x) = \begin{cases} e^{-x} & \text{if } x > 0 \\ 0 & \text{if } x \leq 0 \end{cases}$, so that they come from the distribution $X \sim EXP(1)$. Compute the joint density function of $U = X_1$ and $V = X_1 X_2$.

• Since $X_1$ and $X_2$ are independent, their joint density function is given by $f_{X_1 X_2}(x_1, x_2) = f_{X_1}(x_1) f_{X_2}(x_2) = e^{-x_1} e^{-x_2} = e^{-(x_1 + x_2)}$ whenever $x_1 > 0$, $x_2 > 0$. We have the transformation $u = x_1$ and $v = x_1 x_2$, so $x_1 = u$ and $x_2 = v/x_1 = v/u$, and we can compute the Jacobian as
$$J = \det \begin{bmatrix} \frac{\partial x_1}{\partial u} & \frac{\partial x_1}{\partial v} \\ \frac{\partial x_2}{\partial u} & \frac{\partial x_2}{\partial v} \end{bmatrix} = \det \begin{bmatrix} 1 & 0 \\ -\frac{v}{u^2} & \frac{1}{u} \end{bmatrix} = \frac{1}{u}.$$
This implies that the joint density is $f_{UV}(u,v) = f_{X_1 X_2}\left(u, \frac{v}{u}\right)|J| = \frac{1}{u} e^{-\left(u + \frac{v}{u}\right)}$ whenever $u > 0$ and $v > 0$.
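Though not part of the original solution, the derived density can be sanity-checked numerically: the sketch below compares a Monte Carlo estimate of $P(U \leq 1, V \leq 1)$ against the same probability computed by integrating $f_{UV}$; the cutoff $(1,1)$ and the sample sizes are arbitrary choices.

```python
import math
import random

random.seed(0)

# Monte Carlo estimate of P(U <= 1, V <= 1) for U = X1, V = X1*X2
# with X1, X2 iid EXP(1).
N = 200_000
hits = 0
for _ in range(N):
    x1 = random.expovariate(1.0)
    x2 = random.expovariate(1.0)
    if x1 <= 1 and x1 * x2 <= 1:
        hits += 1
mc = hits / N

# Same probability from the derived density f_UV(u,v) = (1/u) e^{-(u + v/u)}:
# integrating out v over (0,1) gives e^{-u} (1 - e^{-1/u}); midpoint rule in u.
m = 2000
quad = sum(math.exp(-(k + 0.5) / m) * (1 - math.exp(-m / (k + 0.5)))
           for k in range(m)) / m

print(round(mc, 3), round(quad, 3))
```

The two estimates should agree to roughly two decimal places at this sample size.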

Question #2: Let $X$ be a standard normal random variable. Compute the Moment Generating Function (MGF) of $Y = |X|$ using the standard normal distribution function $\Phi(z)$.

• For a continuous random variable $X$ with density $f_X(x)$, the moment generating function is given by $M_X(t) = E(e^{tX}) = \int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx$. Since $X \sim N(0,1)$, we know that its density function is $f_X(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2}$. We can therefore compute
$$M_{|X|}(t) = E(e^{t|X|}) = \int_{-\infty}^{\infty} e^{t|x|} \frac{1}{\sqrt{2\pi}} e^{-x^2/2}\,dx = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{t|x| - x^2/2}\,dx = \frac{2}{\sqrt{2\pi}} \int_0^{\infty} e^{tx - x^2/2}\,dx = \frac{2}{\sqrt{2\pi}} \int_0^{\infty} e^{\frac{2tx - x^2}{2}}\,dx = \frac{2}{\sqrt{2\pi}} \int_0^{\infty} e^{\frac{t^2 - (x-t)^2}{2}}\,dx = \frac{2}{\sqrt{2\pi}} \int_0^{\infty} e^{t^2/2} e^{-\frac{(x-t)^2}{2}}\,dx.$$
Separating out the $t$ factor and making the substitution $u = x - t$ with $du = dx$ then yields
$$2e^{t^2/2} \int_{-t}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-u^2/2}\,du = 2e^{t^2/2}[\Phi(\infty) - \Phi(-t)] = 2e^{t^2/2}[1 - \Phi(-t)] = 2e^{t^2/2}\Phi(t).$$
Thus, the Moment Generating Function of $Y = |X|$ is given by $M_{|X|}(t) = 2\Phi(t)e^{t^2/2}$.
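As a quick numerical check (not in the original solution), the closed form $2\Phi(t)e^{t^2/2}$ can be compared against a direct Monte Carlo estimate of $E(e^{t|X|})$ at one arbitrary point, here $t = 0.5$; $\Phi$ is computed from `math.erf`.

```python
import math
import random

random.seed(1)

def Phi(z):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

t = 0.5  # arbitrary evaluation point
# Monte Carlo estimate of M_{|X|}(t) = E(e^{t|X|}) for X ~ N(0,1)
N = 200_000
mc = sum(math.exp(t * abs(random.gauss(0.0, 1.0))) for _ in range(N)) / N

# Closed form derived above: 2 * Phi(t) * e^{t^2/2}
closed = 2.0 * Phi(t) * math.exp(t * t / 2.0)
print(round(mc, 2), round(closed, 2))
```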

Question #3: Let $X_1$ and $X_2$ be independent random variables with respective probability density functions $f_{X_1}(x) = \begin{cases} e^{-x} & \text{if } x > 0 \\ 0 & \text{if } x \leq 0 \end{cases}$ and $f_{X_2}(x) = \begin{cases} \frac{1}{2} e^{-(x-1)/2} & \text{if } x > 1 \\ 0 & \text{if } x \leq 1 \end{cases}$. Calculate the probability density function of the transformed random variable $Y = X_1 + X_2$.

• Since $X_1$ and $X_2$ are independent random variables, their joint density is $f_{X_1 X_2}(x_1, x_2) = f_{X_1}(x_1) f_{X_2}(x_2) = \frac{1}{2} e^{-x_1} e^{-(x_2 - 1)/2} = \frac{1}{2} e^{-(2x_1 + x_2 - 1)/2}$ if $x_1 > 0$ and $x_2 > 1$. We have $y = x_1 + x_2$ and generate $z = x_1$, so $x_1 = z$ and $x_2 = y - x_1 = y - z$, and we can compute the Jacobian as
$$J = \det \begin{bmatrix} \frac{\partial x_1}{\partial y} & \frac{\partial x_1}{\partial z} \\ \frac{\partial x_2}{\partial y} & \frac{\partial x_2}{\partial z} \end{bmatrix} = \det \begin{bmatrix} 0 & 1 \\ 1 & -1 \end{bmatrix} = -1.$$
This implies that the joint density is $f_{YZ}(y,z) = f_{X_1 X_2}(z, y-z)|J| = \frac{1}{2} e^{-(2z + y - z - 1)/2} = \frac{1}{2} e^{-(z + y - 1)/2}$ whenever $z > 0$ and $y - z > 1$, i.e. $0 < z < y - 1$. We then integrate out $z$ to find the PDF:
$$f_Y(y) = \int_{-\infty}^{\infty} f_{YZ}(y,z)\,dz = \frac{1}{2} e^{-(y-1)/2} \int_0^{y-1} e^{-z/2}\,dz = \frac{1}{2} e^{-(y-1)/2} \left[-2e^{-z/2}\right]_0^{y-1} = e^{-(y-1)/2}\left(1 - e^{-(y-1)/2}\right) = e^{\frac{1-y}{2}}\left(1 - e^{\frac{1-y}{2}}\right).$$
Thus, we find that $f_Y(y) = e^{\frac{1-y}{2}}\left(1 - e^{\frac{1-y}{2}}\right)$ whenever $y > 1$ and zero otherwise (since $X_1 > 0$ and $X_2 > 1$ force $Y > 1$).
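A simulation sketch (not part of the original solution) can confirm the answer: integrating the derived density gives $F_Y(y) = (1 - e^{-(y-1)/2})^2$ for $y > 1$, so $P(Y \leq 3) = (1 - e^{-1})^2$, which we compare against simulated sums; the threshold $3$ is an arbitrary choice.

```python
import math
import random

random.seed(2)

# X1 ~ EXP(1); X2 - 1 ~ EXP(rate 1/2), since f_{X2}(x) = (1/2) e^{-(x-1)/2} for x > 1.
N = 200_000
hits = 0
for _ in range(N):
    y = random.expovariate(1.0) + 1.0 + random.expovariate(0.5)
    if y <= 3.0:
        hits += 1
mc = hits / N

# From the derived density: P(Y <= 3) = (1 - e^{-1})^2
closed = (1.0 - math.exp(-1.0)) ** 2
print(round(mc, 3), round(closed, 3))
```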

Question #4: Let $X_1, \ldots, X_n$ be independent and identically distributed random variables with cumulative distribution function $F_X(x) = \begin{cases} 1 - \frac{1}{x} & \text{if } x > 1 \\ 0 & \text{if } x \leq 1 \end{cases}$. Find the limiting distribution of the transformed first order statistic given by $X_{1:n}^n$.

• We can calculate that
$$F_{X_{1:n}^n}(x) = P(X_{1:n}^n \leq x) = P\left(X_{1:n} \leq x^{1/n}\right) = 1 - P\left(X_{1:n} > x^{1/n}\right) = 1 - P\left(X_i > x^{1/n}\right)^n = 1 - \left[1 - F_X\left(x^{1/n}\right)\right]^n = 1 - \left[1 - \left(1 - \frac{1}{x^{1/n}}\right)\right]^n = 1 - \frac{1}{x}$$
when $x > 1$. Since there is no dependence on $n$, we may simply compute the desired limiting distribution as
$$\lim_{n \to \infty} F_{X_{1:n}^n}(x) = \begin{cases} \lim_{n \to \infty} \left(1 - \frac{1}{x}\right) & \text{if } x > 1 \\ \lim_{n \to \infty} (0) & \text{if } x \leq 1 \end{cases} = \begin{cases} 1 - \frac{1}{x} & \text{if } x > 1 \\ 0 & \text{if } x \leq 1 \end{cases} = F_X(x).$$

Question #5: Let $X_1, \ldots, X_n$ be independent and identically distributed random variables with density function $f_X(x) = \begin{cases} 1 & \text{if } x \in (0,1) \\ 0 & \text{if } x \notin (0,1) \end{cases}$. That is, they are sampled from a random variable distributed as $X \sim UNIF(0,1)$. Approximate $P\left(\sum_{i=1}^{90} X_i \leq 77\right)$ using $Z \sim N(0,1)$.

• Since $X \sim UNIF(0,1)$, we know that $E(X) = \frac{0+1}{2} = \frac{1}{2}$ and $Var(X) = \frac{(1-0)^2}{12} = \frac{1}{12}$. Then if we let $Y = \sum_{i=1}^{90} X_i$, we have $E(Y) = E\left(\sum_{i=1}^{90} X_i\right) = \sum_{i=1}^{90} E(X_i) = \sum_{i=1}^{90} \frac{1}{2} = \frac{90}{2} = 45$ and, by independence, $Var(Y) = Var\left(\sum_{i=1}^{90} X_i\right) = \sum_{i=1}^{90} Var(X_i) = \frac{90}{12} = 7.5$, so that $sd(Y) = \sqrt{7.5}$. Thus
$$P\left(\sum_{i=1}^{90} X_i \leq 77\right) = P(Y \leq 77) = P\left(\frac{Y - E(Y)}{sd(Y)} \leq \frac{77 - E(Y)}{sd(Y)}\right) = P\left(\frac{Y - 45}{\sqrt{7.5}} \leq \frac{77 - 45}{\sqrt{7.5}}\right) \approx P\left(Z \leq \frac{32}{\sqrt{7.5}}\right) = P(Z \leq 11.68) = \Phi(11.68) \approx 1,$$
where $Z \sim N(0,1)$.
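At the threshold $77$ the approximation is indistinguishable from $1$ (the sum is at most $90$ and $77$ lies $11.68$ standard deviations above the mean), so to actually see the normal approximation at work, the sketch below also checks a less extreme, purely illustrative threshold of $48$ (a hypothetical choice, not from the original problem).

```python
import math
import random

random.seed(4)

def Phi(z):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mu, sd = 45.0, math.sqrt(7.5)
z77 = Phi((77 - mu) / sd)
print(round(z77, 6))  # essentially 1

# Hypothetical, less extreme threshold where the CLT approximation is testable:
N = 50_000
hits = sum(1 for _ in range(N)
           if sum(random.random() for _ in range(90)) <= 48.0)
mc = hits / N
approx = Phi((48 - mu) / sd)
print(round(mc, 3), round(approx, 3))
```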

Question #6: If $X \sim EXP(1)$ such that $f_X(x) = \begin{cases} e^{-x} & \text{if } x > 0 \\ 0 & \text{if } x \leq 0 \end{cases}$, what transformation of $X$ will yield a random variable $Y \sim UNIF(0,1)$? What about a random variable $Z \sim UNIF(-1,1)$?

• Let $Y = 1 - e^{-X}$, so we can compute $F_Y(y) = P(Y \leq y) = P(1 - e^{-X} \leq y) = P(X \leq -\ln(1-y)) = F_X(-\ln(1-y))$, and then by differentiating we obtain $f_Y(y) = \frac{d}{dy}[F_X(-\ln(1-y))] = f_X(-\ln(1-y)) \cdot \frac{1}{1-y} = \frac{1-y}{1-y} = 1$ for $y \in (0,1)$, so that $Y \sim UNIF(0,1)$.

• Since we need to end up with $f_Z(z) = \frac{1}{2}$ on $(-1,1)$ to conclude that $Z \sim UNIF(-1,1)$, it is clear that we must make the transformation $Z = 1 - 2e^{-X}$. We can therefore compute $F_Z(z) = P(Z \leq z) = P(1 - 2e^{-X} \leq z) = P\left(X \leq -\ln\left(\frac{1-z}{2}\right)\right) = F_X\left(-\ln\left(\frac{1-z}{2}\right)\right)$, so after differentiating we have $f_Z(z) = \frac{d}{dz}\left[F_X\left(-\ln\left(\frac{1-z}{2}\right)\right)\right] = f_X\left(-\ln\left(\frac{1-z}{2}\right)\right) \cdot \frac{1}{1-z} = \frac{1-z}{2} \cdot \frac{1}{1-z} = \frac{1}{2}$ for $z \in (-1,1)$, so that we conclude $Z \sim UNIF(-1,1)$.
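Both transformations are easy to verify empirically; the sketch below (an addition, not part of the original solution) checks uniformity of $Y$ and $Z$ at two arbitrary points, $P(Y \leq 0.3) = 0.3$ and $P(Z \leq 0) = 0.5$.

```python
import math
import random

random.seed(5)

# Apply the two transformations to EXP(1) draws and check uniformity.
N = 200_000
y_le = 0
z_le = 0
for _ in range(N):
    x = random.expovariate(1.0)
    if 1.0 - math.exp(-x) <= 0.3:        # Y = 1 - e^{-X}, should be UNIF(0,1)
        y_le += 1
    if 1.0 - 2.0 * math.exp(-x) <= 0.0:  # Z = 1 - 2e^{-X}, should be UNIF(-1,1)
        z_le += 1

print(round(y_le / N, 3), round(z_le / N, 3))  # targets: 0.3 and 0.5
```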

Question #7: Does convergence in probability imply convergence in distribution? Does convergence in distribution imply convergence in probability? Give a counterexample if not.

• We have that convergence in probability implies convergence in distribution (CP ⇒ CD), but CD ⇏ CP. As a counterexample, consider $Y_n = -X$ where $X \sim N(0,1)$. Then $Y_n \sim N(0,1)$, so $Y_n \to_d X$. However, we have $P(|Y_n - X| > \epsilon) = P(|-X - X| > \epsilon) = P(2|X| > \epsilon) = g(\epsilon) > 0$. This expression won't converge to zero as $n \to \infty$ since there is no dependence on $n$ in the probability expression. This implies that there is no convergence in probability, that is $Y_n \not\to_p X$.
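The fixed, positive value of $g(\epsilon) = P(2|X| > \epsilon) = 2(1 - \Phi(\epsilon/2))$ can be checked numerically (an illustrative addition, with $\epsilon = 0.5$ an arbitrary choice); since it does not depend on $n$, it clearly cannot shrink to zero.

```python
import math
import random

random.seed(6)

def Phi(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# For Y_n = -X, P(|Y_n - X| > eps) = P(2|X| > eps) is a fixed positive
# number g(eps), independent of n, so it cannot converge to zero.
eps = 0.5
N = 200_000
mc = sum(1 for _ in range(N)
         if abs(2.0 * random.gauss(0.0, 1.0)) > eps) / N
g = 2.0 * (1.0 - Phi(eps / 2.0))
print(round(mc, 3), round(g, 3))
```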

Question #8: State the Law of Large Numbers (LLN), including its assumptions. Then state the Central Limit Theorem (CLT), including its assumptions.

• Law of Large Numbers: If $X_1, \ldots, X_n$ are independent and identically distributed with finite mean $\mu$ and variance $\sigma^2$, then the sample mean $\bar{X} \to_p \mu$.

• Central Limit Theorem: If $X_1, \ldots, X_n$ are independent and identically distributed with finite mean $\mu$ and variance $\sigma^2$, then $\frac{\left[\sum_{i=1}^{n} X_i\right] - n\mu}{\sigma\sqrt{n}} \to_d N(0,1)$ and $\frac{\bar{X} - \mu}{\sigma/\sqrt{n}} \to_d N(0,1)$.
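A minimal LLN illustration (an addition, not part of the original statement): sample means of $EXP(1)$ draws, whose mean is $\mu = 1$, settle toward $1$ as $n$ grows; the particular sample sizes are arbitrary.

```python
import random

random.seed(7)

# LLN illustration: sample means of EXP(1) draws (mu = 1) drift toward mu
# as n grows; the sample sizes chosen here are arbitrary.
means = {}
for n in (100, 10_000, 1_000_000):
    means[n] = sum(random.expovariate(1.0) for _ in range(n)) / n
print({n: round(m, 3) for n, m in means.items()})
```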

Question #10: Let $X_1$ and $X_2$ be independent random variables with Cumulative Distribution Function (CDF) given by $F_X(x) = 1 - e^{-x^2/2}$ for $x > 0$. Find the joint probability density function $f_{Y_1 Y_2}(y_1, y_2)$ of the random variables $Y_1 = \sqrt{X_1 + X_2}$ and $Y_2 = \frac{X_1}{X_1 + X_2}$.

• We first find the density $f_X(x) = \frac{d}{dx} F_X(x) = -e^{-x^2/2}(-x) = x e^{-x^2/2}$ for $x > 0$. Then since $X_1$ and $X_2$ are independent random variables, their joint density function is $f_{X_1 X_2}(x_1, x_2) = f_{X_1}(x_1) f_{X_2}(x_2) = \left[x_1 e^{-x_1^2/2}\right]\left[x_2 e^{-x_2^2/2}\right] = x_1 x_2 e^{-(x_1^2 + x_2^2)/2}$ for $x_1 > 0$, $x_2 > 0$.

We then have that $y_1 = \sqrt{x_1 + x_2}$ and $y_2 = \frac{x_1}{x_1 + x_2}$, so that $x_1 = y_2(x_1 + x_2) = y_2 y_1^2$ and $x_2 = y_1^2 - x_1 = y_1^2 - y_2 y_1^2$. These computations allow us to compute the Jacobian as
$$J = \det \begin{bmatrix} \frac{\partial x_1}{\partial y_1} & \frac{\partial x_1}{\partial y_2} \\ \frac{\partial x_2}{\partial y_1} & \frac{\partial x_2}{\partial y_2} \end{bmatrix} = \det \begin{bmatrix} 2y_1 y_2 & y_1^2 \\ 2y_1 - 2y_1 y_2 & -y_1^2 \end{bmatrix} = -2y_1^3 y_2 - 2y_1^3 + 2y_1^3 y_2 = -2y_1^3,$$
so $|J| = 2y_1^3$. This implies that the joint density function is
$$f_{Y_1 Y_2}(y_1, y_2) = f_{X_1 X_2}\left(y_2 y_1^2,\; y_1^2 - y_2 y_1^2\right)|J| = \left(y_2 y_1^2\right)\left(y_1^2 - y_2 y_1^2\right) e^{-\left[\left(y_2 y_1^2\right)^2 + \left(y_1^2 - y_2 y_1^2\right)^2\right]/2} \cdot 2y_1^3$$
whenever $y_1 > 0$ and $0 < y_2 < 1$ (equivalently, $y_2 y_1^2 > 0$ and $y_1^2 - y_2 y_1^2 > 0$). This work also reveals that $Y_1$ and $Y_2$ are not independent, since their joint probability density function $f_{Y_1 Y_2}(y_1, y_2)$ cannot be factored into two functions of $y_1$ and $y_2$ alone.
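A numerical cross-check (an addition, not part of the original solution): simulate $(Y_1, Y_2)$ directly from Rayleigh-distributed $X_i$ (via inverse-CDF sampling, $X = \sqrt{-2\ln U}$) and compare $P(Y_1 \leq 1, Y_2 \leq 0.5)$ against a midpoint-rule integral of the derived joint density; the box bounds are arbitrary.

```python
import math
import random

random.seed(8)

# F_X(x) = 1 - e^{-x^2/2}  =>  X = sqrt(-2 ln(1-U)) with U ~ UNIF(0,1).
N = 200_000
hits = 0
for _ in range(N):
    x1 = math.sqrt(-2.0 * math.log(1.0 - random.random()))
    x2 = math.sqrt(-2.0 * math.log(1.0 - random.random()))
    if math.sqrt(x1 + x2) <= 1.0 and x1 / (x1 + x2) <= 0.5:
        hits += 1
mc = hits / N

# Same probability by a midpoint-rule double integral of the derived density.
def f(y1, y2):
    a = y2 * y1 ** 2   # x1
    b = y1 ** 2 - a    # x2
    return 2.0 * y1 ** 3 * a * b * math.exp(-(a * a + b * b) / 2.0)

m = 300
quad = 0.0
for i in range(m):
    y1 = (i + 0.5) / m            # y1 in (0, 1)
    for j in range(m):
        y2 = 0.5 * (j + 0.5) / m  # y2 in (0, 0.5)
        quad += f(y1, y2)
quad *= (1.0 / m) * (0.5 / m)
print(round(mc, 3), round(quad, 3))
```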

Question #11: Let $X_1, X_2, X_3$ be independent random variables and suppose that we know $X_1 \sim N(2,4)$, $X_2 \sim N(3,9)$, and $X_1 + X_2 + X_3 \sim N(4,16)$. What is the distribution of $X_3$?

• We use the Moment Generating Function (MGF) technique to note that since the three random variables are independent, we have $M_{X_1 + X_2 + X_3}(t) = M_{X_1}(t) M_{X_2}(t) M_{X_3}(t)$, so
$$M_{X_3}(t) = \frac{M_{X_1 + X_2 + X_3}(t)}{M_{X_1}(t) M_{X_2}(t)} = \frac{e^{4t + 16t^2/2}}{\left(e^{2t + 4t^2/2}\right)\left(e^{3t + 9t^2/2}\right)} = \frac{e^{4t + 16t^2/2}}{e^{5t + 13t^2/2}} = e^{-t + 3t^2/2}.$$
This reveals that the random variable $X_3 \sim N(-1,3)$, since $M_A(t) = e^{\mu t + \sigma^2 t^2/2}$ when $A \sim N(\mu, \sigma^2)$.
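The answer can be verified in the forward direction (an addition, not part of the original solution): if $X_3 \sim N(-1,3)$ is simulated independently alongside $X_1$ and $X_2$, the sum should have mean $4$ and variance $16$.

```python
import random
import statistics

random.seed(9)

# If X3 ~ N(-1, 3) as derived, then X1 + X2 + X3 should match N(4, 16).
# Note random.gauss takes a standard deviation, not a variance.
N = 200_000
sums = [random.gauss(2.0, 2.0)            # X1 ~ N(2, 4)
        + random.gauss(3.0, 3.0)          # X2 ~ N(3, 9)
        + random.gauss(-1.0, 3 ** 0.5)    # X3 ~ N(-1, 3)
        for _ in range(N)]
m = statistics.fmean(sums)
v = statistics.pvariance(sums)
print(round(m, 2), round(v, 2))  # should be near 4 and 16
```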

Question #12: Let $X_1, \ldots, X_5$ be independent $N(\mu, \sigma^2)$ random variables. Give examples of random variables $Y_1, \ldots, Y_4$ defined in terms of the random variables $X_1, \ldots, X_5$ such that we have $Y_1 \sim N(0,1)$, $Y_2 \sim \chi^2(3)$, $Y_3 \sim F(1,3)$, and $Y_4 \sim t(3)$.

• We have that $Y_1 = \frac{X_1 - \mu}{\sigma} \sim N(0,1)$ and $Y_2 = \sum_{i=2}^{4} \left(\frac{X_i - \mu}{\sigma}\right)^2 \sim \chi^2(3)$; building $Y_2$ from $X_2, X_3, X_4$ (rather than reusing $X_1$) keeps it independent of $Y_1$, which the $F$ and $t$ constructions below require. Then
$$Y_3 = \frac{Y_1^2/1}{Y_2/3} = \frac{\left(\frac{X_1 - \mu}{\sigma}\right)^2 / 1}{\left[\sum_{i=2}^{4} \left(\frac{X_i - \mu}{\sigma}\right)^2\right] / 3} \sim F(1,3) \quad \text{and} \quad Y_4 = \frac{Y_1}{\sqrt{Y_2/3}} = \frac{\frac{X_1 - \mu}{\sigma}}{\sqrt{\left[\sum_{i=2}^{4} \left(\frac{X_i - \mu}{\sigma}\right)^2\right] / 3}} \sim t(3).$$
These random variables are constructed using the facts that if $X \sim N(\mu, \sigma^2)$, then $Z = \frac{X - \mu}{\sigma} \sim N(0,1)$; that $Y = Z^2 \sim \chi^2(1)$, so $U = \sum_{i=1}^{n} Y_i \sim \chi^2(n)$ for independent $Y_i$; that $V = \frac{U/n}{S/k} \sim F(n,k)$ where $S \sim \chi^2(k)$ is independent of $U$; and that $T = \frac{Z}{\sqrt{U/n}} \sim t(n)$ when $Z$ and $U$ are independent.
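The constructions can be spot-checked by simulation (an addition, not part of the original solution; the values $\mu = 5$, $\sigma = 2$ are arbitrary): $Y_2$ should have the $\chi^2(3)$ mean of $3$, and $Y_4$, being $t(3)$, should be symmetric about zero.

```python
import random
import statistics

random.seed(10)

mu, sigma = 5.0, 2.0  # arbitrary choice of the common N(mu, sigma^2)

N = 100_000
y2s, y4_pos = [], 0
for _ in range(N):
    z = [(random.gauss(mu, sigma) - mu) / sigma for _ in range(5)]
    y1 = z[0]
    y2 = z[1] ** 2 + z[2] ** 2 + z[3] ** 2  # uses X2, X3, X4 only
    y4 = y1 / (y2 / 3.0) ** 0.5
    y2s.append(y2)
    y4_pos += y4 > 0

y2_mean = statistics.fmean(y2s)
print(round(y2_mean, 2), round(y4_pos / N, 2))
# chi^2(3) has mean 3; t(3) is symmetric about 0
```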

Question #13: If $X_1$ and $X_2$ are independent $EXP(1)$ random variables such that their densities are $f(x) = e^{-x} 1\{x > 0\}$, then find the joint density of $U = \frac{X_1}{X_2}$ and $V = X_1 + X_2$.

• By independence, $f_{X_1 X_2}(x_1, x_2) = f_{X_1}(x_1) f_{X_2}(x_2) = \left[e^{-x_1}\right]\left[e^{-x_2}\right] = e^{-(x_1 + x_2)}$ if $x_1 > 0$ and $x_2 > 0$. We have $u = \frac{x_1}{x_2}$ and $v = x_1 + x_2$, so $x_1 = u x_2 = u(v - x_1)$, which implies that $x_1 + u x_1 = uv$, so $x_1 = \frac{uv}{1+u}$. Then $x_2 = v - x_1 = v - \frac{uv}{1+u} = \frac{v + uv - uv}{1+u} = \frac{v}{1+u}$. This allows us to compute the Jacobian as
$$J = \det \begin{bmatrix} \frac{\partial x_1}{\partial u} & \frac{\partial x_1}{\partial v} \\ \frac{\partial x_2}{\partial u} & \frac{\partial x_2}{\partial v} \end{bmatrix} = \det \begin{bmatrix} \frac{v(1+u) - uv}{(1+u)^2} & \frac{u}{1+u} \\ \frac{-v}{(1+u)^2} & \frac{1}{1+u} \end{bmatrix} = \frac{v(1+u) - uv}{(1+u)^2} \cdot \frac{1}{1+u} + \frac{u}{1+u} \cdot \frac{v}{(1+u)^2} = \frac{v}{(1+u)^3} + \frac{uv}{(1+u)^3} = \frac{v(1+u)}{(1+u)^3} = \frac{v}{(1+u)^2}.$$
Thus, the joint density is $f_{UV}(u,v) = f_{X_1 X_2}\left(\frac{uv}{1+u}, \frac{v}{1+u}\right)|J| = \frac{v}{(1+u)^2} e^{-\left(\frac{uv}{1+u} + \frac{v}{1+u}\right)} = \frac{v}{(1+u)^2} e^{-v}$ whenever we have that $u > 0$, $v > 0$, and zero otherwise.
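Worth noting (an observation and check added here, not in the original solution): the answer factors as $f_{UV}(u,v) = \frac{1}{(1+u)^2} \cdot v e^{-v}$ on a product support, so $U$ and $V$ are independent with $P(U \leq 1) = \int_0^1 \frac{du}{(1+u)^2} = \frac{1}{2}$ and $V \sim GAM(2,1)$ with $P(V \leq 1) = 1 - 2e^{-1}$; both are easy to confirm by simulation.

```python
import math
import random

random.seed(11)

# Check the two marginals implied by f_UV(u,v) = [1/(1+u)^2] * [v e^{-v}].
N = 200_000
u_le = v_le = 0
for _ in range(N):
    x1 = random.expovariate(1.0)
    x2 = random.expovariate(1.0)
    u_le += (x1 / x2) <= 1.0
    v_le += (x1 + x2) <= 1.0

closed_v = 1.0 - 2.0 * math.exp(-1.0)  # Gamma(2,1) CDF at 1
print(round(u_le / N, 3), round(v_le / N, 3))
print(0.5, round(closed_v, 3))
```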

Question #14: If $X_1, X_2, X_3$ are a random sample of size $n = 3$ from the $UNIF(0,1)$ distribution, then a) find the joint density of the order statistics $Y_1, Y_2, Y_3$; b) find the marginal densities of $Y_1$ and $Y_3$; and c) find the mean of the sample range $R = Y_3 - Y_1$.

a) We have $f_{Y_1 Y_2 Y_3}(y_1, y_2, y_3) = 3! \prod_{i=1}^{3} f_X(y_i) = 6 \prod_{i=1}^{3} 1 = 6$ if $0 < y_1 < y_2 < y_3 < 1$.

b) We have $f_{Y_1}(y_1) = 3 f_X(y_1)[1 - F_X(y_1)]^2 = 3(1 - y_1)^2$ if $0 < y_1 < 1$, and $f_{Y_3}(y_3) = 3 f_X(y_3)[F_X(y_3)]^2 = 3y_3^2$ whenever $0 < y_3 < 1$.

c) We have $E(R) = E(Y_3) - E(Y_1)$, so we compute $E(Y_3) = \int_0^1 y_3 \cdot 3y_3^2 \, dy_3 = \frac{3}{4}\left[y_3^4\right]_0^1 = \frac{3}{4}$ and $E(Y_1) = \int_0^1 y_1 \cdot 3(1 - y_1)^2 \, dy_1 = 3 \int_0^1 \left(y_1 - 2y_1^2 + y_1^3\right) dy_1 = 3\left[\frac{y_1^2}{2} - \frac{2y_1^3}{3} + \frac{y_1^4}{4}\right]_0^1 = \frac{1}{4}$.

Then, we have that $E(R) = E(Y_3) - E(Y_1) = \frac{3}{4} - \frac{1}{4} = \frac{1}{2}$. Alternatively, we could have found $f_{Y_1 Y_3}(y_1, y_3)$ and computed $E(R)$ as a double integral.
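The value $E(R) = \frac{1}{2}$ is easy to confirm by direct simulation of the range of three uniform draws (a sanity check added here, not part of the original solution):

```python
import random

random.seed(12)

# E(R) = E(Y3) - E(Y1) = 3/4 - 1/4 = 1/2 for the range of three UNIF(0,1) draws.
N = 200_000
total = 0.0
for _ in range(N):
    xs = [random.random() for _ in range(3)]
    total += max(xs) - min(xs)
mean_range = total / N
print(round(mean_range, 3))  # should be near 0.5
```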
