CS433 Modeling and Simulation

Lecture 04: Statistical Models

Dr. Anis Koubâa

Al-Imam Mohammad Ibn Saud University

27 Oct 2008

http://10.2.230.10:4040/akoubaa/cs433/

Goals for Today

Understand the difference between discrete and continuous random variables

Review of the most common statistical models

Understand how to determine the empirical distribution from a statistical sample.


Topics

 Discrete Random Variable

 Continuous Random Variable

 Discrete Probability Distributions

 Binomial Distribution

 Bernoulli Distribution

 Discrete Poisson Distribution

 Continuous Probability Distribution

 Uniform

 Exponential

 Normal

 Weibull

 Lognormal

 Empirical Distributions

Discrete and Continuous Random Variables


Discrete Random Variables

X is a discrete random variable if the number of possible values of X (the sample space) is finite or countably infinite.

Example: Consider jobs arriving at a job shop.

 Let X be the number of jobs arriving each week at the job shop.

R_X = range space of X (the set of possible values of X) = {0, 1, 2, ...}

p(x_i) = P(X = x_i) = probability that the random variable takes the value x_i

The collection of pairs [x_i, p(x_i)], i = 1, 2, ..., is called the probability distribution of X, and p(x_i) is called the probability mass function (PMF) of X.

Characteristics of the PMF: p(x_i), i = 1, 2, ... must satisfy:

1. p(x_i) ≥ 0, for all i

2. Σ_i p(x_i) = 1

Continuous Random Variables

X is a continuous random variable if its range space R_X is an interval or a collection of intervals.

The probability that X lies in the interval [a, b] is given by:

P(a ≤ X ≤ b) = ∫_a^b f(x) dx

where f(x) is the probability density function (PDF).

Characteristics of the PDF: f(x) must satisfy:

1. f(x) ≥ 0, for all x in R_X

2. ∫_{R_X} f(x) dx = 1

3. f(x) = 0, if x is not in R_X

Properties:

1. P(X = x_0) = 0, because ∫_{x_0}^{x_0} f(x) dx = 0

2. P(a ≤ X ≤ b) = P(a < X ≤ b) = P(a ≤ X < b) = P(a < X < b)

Discrete versus Continuous Random Variables

Discrete Random Variable

Sample space is finite or countably infinite, e.g. {0, 1, 2, 3}

Probability Mass Function (PMF):
1. p(x_i) ≥ 0, for all i
2. Σ_i p(x_i) = 1

Cumulative Distribution Function (CDF): F(x) = P(X ≤ x) = Σ_{x_i ≤ x} p(x_i)

Continuous Random Variable

Sample space is an interval or collection of intervals, e.g. [0, 1], [2.1, 5.3]

Probability Density Function (PDF):
1. f(x) ≥ 0, for all x in R_X
2. ∫_{R_X} f(x) dx = 1
3. f(x) = 0, if x is not in R_X

Cumulative Distribution Function (CDF): F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt, so P(a ≤ X ≤ b) = ∫_a^b f(x) dx

Five Minutes Break

You are free to discuss with your classmates about the previous slides, or to refresh a bit, or to ask questions.

Administrative issues

Groups Formation

Choose a “class coordinator”


Expectation

The expected value (the mean) of X is denoted by E(X).

If X is discrete: E(X) = Σ_{all i} x_i p(x_i)

If X is continuous: E(X) = ∫_{−∞}^{+∞} x f(x) dx

It is a measure of the central tendency.

The variance of X is denoted by V(X) or var(X) or σ².

Definition: V(X) = E[(X − E(X))²]

Also: V(X) = E(X²) − (E(X))²

It is a measure of the spread or variation of the possible values of X around the mean.

The standard deviation of X is denoted by σ.

Definition: σ is the square root of V(X).

It is expressed in the same units as the mean.

Example: Continuous Random Variables

Example: modeling the lifetime of a device.

Time is a continuous random variable.

Random time is typically modeled with an exponential distribution.

We assume that the average lifetime of the device is 2 years:

f(x) = (1/2) e^{−x/2}, x ≥ 0
f(x) = 0, otherwise

The probability that the device's life is between 2 and 3 years is:

P(2 ≤ X ≤ 3) = (1/2) ∫_2^3 e^{−x/2} dx ≈ 0.145
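This probability can be checked numerically. A minimal sketch (using only Python's standard library; not part of the original lecture) evaluates the exponential CDF at the interval endpoints:

```python
import math

# Exponential lifetime with mean 2 years => rate lambda = 1/2
lam = 0.5

def exp_cdf(x, lam):
    """CDF of the exponential distribution: F(x) = 1 - e^(-lambda*x)."""
    return 1.0 - math.exp(-lam * x) if x >= 0 else 0.0

# P(2 <= X <= 3) = F(3) - F(2)
p = exp_cdf(3, lam) - exp_cdf(2, lam)
print(round(p, 3))  # ≈ 0.145
```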

Example: Continuous Random Variables

Cumulative Distribution Function: the device has the CDF:

F(x) = (1/2) ∫_0^x e^{−t/2} dt = 1 − e^{−x/2}, x ≥ 0

The probability that the device lasts for less than 2 years:

P(0 ≤ X ≤ 2) = F(2) − F(0) = F(2) = 1 − e^{−1} ≈ 0.632

The probability that it lasts between 2 and 3 years:

P(2 ≤ X ≤ 3) = F(3) − F(2) = (1 − e^{−3/2}) − (1 − e^{−1}) ≈ 0.145

Example: Continuous Random Variables

Expected Value and Variance

Example: the mean life of the previous device is:

E(X) = (1/2) ∫_0^∞ x e^{−x/2} dx = [−x e^{−x/2}]_0^∞ + ∫_0^∞ e^{−x/2} dx = 2

To compute the variance of X, we first compute E(X²):

E(X²) = (1/2) ∫_0^∞ x² e^{−x/2} dx = [−x² e^{−x/2}]_0^∞ + 2 ∫_0^∞ x e^{−x/2} dx = 8

Hence, the variance and standard deviation of the device's life are:

V(X) = E(X²) − (E(X))² = 8 − 2² = 4

σ = √V(X) = 2
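These two moments can be sanity-checked by simulation. A sketch (not part of the lecture) drawing exponential samples with mean 2 and comparing the sample mean and variance against E(X) = 2 and V(X) = 4:

```python
import random

random.seed(1)

# Monte Carlo check of E(X) = 2 and V(X) = 4 for an exponential
# lifetime with mean 2 (rate lambda = 1/2)
n = 200_000
samples = [random.expovariate(0.5) for _ in range(n)]

mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n

print(mean, var)  # close to 2 and 4
```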

Discrete Probability Distributions

Bernoulli Trials

Binomial Distribution

Geometric Distribution

Poisson Distribution

Poisson Process


Discrete Distributions

Discrete random variables are used to describe random phenomena in which only integer values can occur.

In this section, we will learn about:

Bernoulli trials and Bernoulli distribution

Binomial distribution

Geometric and negative binomial distribution

Poisson distribution


Modeling of Random Events with Two-States

Bernoulli Trials

Binomial Distribution


Bernoulli Trials

In the theory of probability and statistics, a Bernoulli trial is an experiment whose outcome is random and can be either of two possible outcomes, "success" and "failure".

In practice it refers to a single experiment which can have one of two possible outcomes. These events can be phrased as "yes" or "no" questions:

Did the coin land heads?

Was the newborn child a girl?

Were a person's eyes green?

Bernoulli Distribution

Consider an experiment consisting of n trials, each of which can be a success or a failure.

Let X_j = 1 if the j-th experiment is a success, and X_j = 0 if the j-th experiment is a failure.

The Bernoulli distribution (one trial):

p(x_j) = p, if x_j = 1
p(x_j) = 1 − p = q, if x_j = 0
p(x_j) = 0, otherwise

for j = 1, 2, ..., n.

Expected Value: E(X_j) = p

Variance: V(X_j) = σ² = p(1 − p)

Bernoulli process:

It is the sequence of n Bernoulli trials where the trials are independent:

p(x_1, x_2, ..., x_n) = p(x_1) p(x_2) ... p(x_n)

Binomial Distribution

A binomial random variable is the number of successes in a series of n trials.

Example: the number of 'heads' occurring when a coin is tossed 50 times.

A discrete random variable X is said to follow a Binomial distribution with parameters n and p, written X ~ Bi(n,p) or X ~ B(n,p), if it has the probability distribution:

P(X = k) = C(n, k) p^k (1 − p)^{n−k}

where

k = 0, 1, 2, ..., n
n = 1, 2, 3, ...
p = success probability; 0 < p < 1

and

C(n, k) = n! / (k! (n − k)!)

Expected Value: E(X) = n p

Variance: V(X) = σ² = n p (1 − p)
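The PMF above translates directly into code. A short sketch (standard library only; the coin-toss numbers echo the example above):

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ B(n, p): C(n, k) p^k (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# Example: number of heads in 50 fair coin tosses
n, p = 50, 0.5
p25 = binomial_pmf(25, n, p)   # probability of exactly 25 heads
mean = n * p                   # E(X) = np = 25
var = n * p * (1 - p)          # V(X) = np(1-p) = 12.5
print(p25, mean, var)
```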

Binomial Distribution

The trials must meet the following requirements:

 the total number of trials is fixed in advance;

 there are just two outcomes of each trial; success and failure ;

 the outcomes of all the trials are statistically independent ;

 all the trials have the same probability of success.


Binomial Distribution

The number of successes in n Bernoulli trials, X, has a binomial distribution:

P(X = k) = C(n, k) p^k q^{n−k}, k = 0, 1, 2, ..., n
P(X = k) = 0, otherwise

C(n, k) is the number of outcomes having the required number of successes and failures; p^k q^{n−k} is the probability that a particular sequence has k successes and (n − k) failures.

The formula can be understood as follows: we want k successes (p^k) and n − k failures ((1 − p)^{n−k}). However, the k successes can occur anywhere among the n trials, and there are C(n, k) different ways of distributing k successes in a sequence of n trials.

End of Part 01

Administrative issues

Groups Formation

Choose a “class coordinator”


Modeling of Discrete Random Time

Geometric Distribution


Geometric Distribution

The Geometric Distribution represents the number X of Bernoulli trials needed to achieve the FIRST SUCCESS.

It is used to represent the random time until a first transition occurs.

PMF: P(X = k) = q^{k−1} p = (1 − p)^{k−1} p, k = 1, 2, ...
P(X = k) = 0, otherwise

CDF: F(k) = P(X ≤ k) = 1 − (1 − p)^k

Expected Value: E(X) = 1/p

Variance: V(X) = σ² = q/p² = (1 − p)/p²
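A quick simulation sketch (not from the slides) that counts trials until the first success and checks E(X) = 1/p:

```python
import random

random.seed(0)

def geometric_trial(p):
    """Count Bernoulli trials until the first success (inclusive)."""
    k = 1
    while random.random() >= p:  # failure: try again
        k += 1
    return k

p = 0.25
n = 100_000
mean = sum(geometric_trial(p) for _ in range(n)) / n
print(mean)  # close to 1/p = 4
```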

Negative Binomial Distribution


Negative Binomial Distribution

The negative binomial distribution is a discrete probability distribution that describes an experiment consisting of a sequence of independent trials, subject to several constraints: each trial results in success or failure, the probability of success for each trial, p, is constant across the experiment, and the experiment continues until a fixed number of successes has been achieved.

The number of Bernoulli trials, X, until the r-th success.

If X has a negative binomial distribution with parameters p and r, then:

P(X = k) = C(k − 1, r − 1) p^r (1 − p)^{k−r}, k = r, r + 1, r + 2, ...
P(X = k) = 0, otherwise

Expected Value: E(X) = r/p

Variance: V(X) = σ² = r(1 − p)/p²

Modeling of Random Number of

Arrivals/Events

Poisson Distribution

Poisson Process


Poisson Distribution

The Poisson distribution is a discrete probability distribution that expresses the probability of a number of events occurring in a fixed period of time if these events occur with a known average rate and independently of the time since the last event.

A Poisson random variable represents the count of the number of events that occur in a certain time interval or spatial area.

Examples:

The number of cars passing a fixed point in a 5-minute interval.

The number of calls received by a switchboard during a given period of time.

The number of messages coming to a router in a given period of time.

Discrete Poisson Distribution

A discrete random variable X is said to follow a Poisson distribution with parameter λ, written X ~ Po(λ), if it has the probability distribution:

PMF: P(X = k) = (λ^k / k!) exp(−λ), k = 0, 1, 2, ...

The PMF represents the probability that there are k arrivals in a certain period of time.

λ > 0 is called the arrival rate.

Discrete Poisson Distribution

The Poisson distribution describes many random processes quite well and is mathematically quite simple.

The Poisson distribution with parameter λ is characterized by:

PMF: P(X = k) = (λ^k / k!) exp(−λ), for k = 0, 1, 2, ...
P(X = k) = 0, otherwise

CDF: F(k) = Σ_{i=0}^{k} (λ^i / i!) exp(−λ)

Expected value: E(X) = λ

Variance: V(X) = λ

Discrete Poisson Distribution

The following requirements must be met in the Poisson Distribution:

 the length of the observation period is fixed in advance;

 the events occur at a constant average rate;

 the numbers of events occurring in disjoint intervals are statistically independent.


Example: Poisson Distribution

The number of cars that enter a parking lot follows a Poisson distribution with a mean rate equal to λ = 20 cars/hour.

The probability of having exactly 15 cars entering the parking lot in one hour:

p(15) = (20^15 / 15!) exp(−20) ≈ 0.051649

or equivalently: p(15) = F(15) − F(14) ≈ 0.051649

The probability of having more than 3 cars entering the parking lot in one hour:

P(X > 3) = 1 − P(X ≤ 3) = 1 − F(3) = 1 − [p(0) + p(1) + p(2) + p(3)] ≈ 0.9999967

USE EXCEL/MATLAB
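Besides Excel or MATLAB, these numbers can be reproduced with a few lines of code. A sketch (standard library only):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(X = k) for X ~ Po(lam)."""
    return lam**k / factorial(k) * exp(-lam)

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Po(lam)."""
    return sum(poisson_pmf(i, lam) for i in range(k + 1))

lam = 20  # cars/hour
print(poisson_pmf(15, lam))     # ≈ 0.051649
print(1 - poisson_cdf(3, lam))  # ≈ 0.9999967
```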

Example: Poisson Distribution

Probability Mass Function, Poisson (λ = 20 cars/hour):

p(k) = (20^k / k!) exp(−20)

Cumulative Distribution Function, Poisson (λ = 20 cars/hour):

F(k) = Σ_{i=0}^{k} (20^i / i!) exp(−20)

Five Minutes Break

You are free to discuss with your classmates about the previous slides, or to refresh a bit, or to ask questions.

Administrative issues

Groups Formation


Modeling of Random Number of

Arrivals/Events

Poisson Distribution

Poisson Process


Poisson Process

Wikipedia: A Poisson process, named after the French mathematician Siméon-Denis Poisson (1781–1840), is a stochastic process in which events (e.g. arrivals) occur continuously and independently of one another.

Formal Definition: The Poisson process is a counting process {N(t), t ≥ 0} where N(t) is the number of events that have occurred up to time t, i.e. in the interval [0, t].

Fact: The number of events between time a and time b is given by N(b) − N(a) and has a Poisson distribution.

The Poisson process is a continuous-time process: time is continuous.

Its discrete-time counterpart is the Bernoulli process.

 The Bernoulli process is a discrete-time stochastic process consisting of a sequence of independent random variables taking values over two symbols.

Examples of using Poisson Process

The number of web page requests arriving at a server may be characterized by a Poisson process except for unusual circumstances such as coordinated denial of service attacks.

The number of telephone calls arriving at a switchboard, or at an automatic phone-switching system, may be characterized by a

Poisson process.

The number of raindrops falling over a wide spatial area may be characterized by a spatial Poisson process.

The arrival of "customers" is commonly modelled as a Poisson process in the study of simple queueing systems.

The execution of trades on a stock exchange, as viewed on a tick by tick basis, is a Poisson process.


(Homogenous) Poisson Process

The homogeneous Poisson process is characterized by a CONSTANT rate parameter λ, also known as the intensity, such that the number of events in a time interval (t, t + τ] follows a Poisson distribution with mean λτ.

{N(t), t ≥ 0} is a Poisson process with mean rate λ if, for t ≥ 0 and n = 0, 1, 2, ...:

PMF: P(N(t) = n) = ((λt)^n / n!) exp(−λt)

P(N(t) = n) describes the number of events in the time interval [0, t].

The mean and the variance are equal: E[N(t)] = V[N(t)] = λt

(Homogenous) Poisson Process

Properties of the Poisson process:

Arrivals occur one at a time (not simultaneously).

{N(t), t ≥ 0} has stationary increments: the number of arrivals between time s and time t depends only on the length t − s, and the number of arrivals in (s, t] is also Poisson-distributed, with mean λ(t − s).

{N(t), t ≥ 0} has independent increments: the numbers of arrivals in disjoint intervals are independent.

Inter-Arrival Times of a Poisson Process

Inter-arrival time: the time between two consecutive arrivals.

The inter-arrival times of a Poisson process are random. What is their distribution?

Consider the inter-arrival times (A_1, A_2, ...) of a Poisson process, where A_i is the elapsed time between arrival i and arrival i+1.

The first arrival occurring after time t MEANS that there are no arrivals in the interval [0, t]. As a consequence:

P(A_1 > t) = P(N(t) = 0) = exp(−λt)

P(A_1 ≤ t) = 1 − exp(−λt), which is the CDF of the exponential distribution.

The inter-arrival times of a Poisson process are exponentially distributed and independent, with mean 1/λ.
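This result suggests a simple way to simulate a Poisson process: sum exponential inter-arrival times. A sketch (not from the slides; λ and the horizon are illustrative):

```python
import random

random.seed(2)

def poisson_arrival_times(lam, horizon):
    """Generate arrival times in [0, horizon] by summing
    exponential inter-arrival times with mean 1/lam."""
    times, t = [], 0.0
    while True:
        t += random.expovariate(lam)  # exponential inter-arrival time
        if t > horizon:
            return times
        times.append(t)

lam, horizon = 20.0, 1.0  # 20 arrivals/hour over 1 hour
counts = [len(poisson_arrival_times(lam, horizon)) for _ in range(20_000)]
mean = sum(counts) / len(counts)
print(mean)  # close to lam * horizon = 20
```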

Splitting and Pooling

Splitting:

A Poisson process can be split into two Poisson processes: the first with probability p and the second with probability 1 − p.

If N(t) ~ Po(λt) and each event is independently assigned to the first stream with probability p, then N_1(t) and N_2(t) are both Poisson processes, with rates λp and λ(1 − p):

N_1(t) ~ Po(λpt) and N_2(t) ~ Po(λ(1 − p)t)

Pooling:

The sum of two independent Poisson processes is a Poisson process:

If N_1(t) ~ Po(λ_1 t) and N_2(t) ~ Po(λ_2 t), then N_1(t) + N_2(t) ~ Po((λ_1 + λ_2)t)

Modeling of Random Number of

Arrivals/Events

Poisson Distribution

Non Homogenous Poisson

Process


Non Homogenous (Non-stationary) Poisson Process (NSPP)

The non-homogeneous Poisson process is characterized by a VARIABLE rate parameter λ(t), the arrival rate at time t. In general, the rate parameter may change over time.

The stationary-increments property is not satisfied: the distribution of the number of arrivals in an interval depends on where the interval lies, not only on its length.

The expected number of events (e.g. arrivals) between time s and time t is:

λ(s, t) = ∫_s^t λ(u) du

Example: Non-stationary Poisson Process (NSPP)

The number of cars that cross the intersection of King Fahd Road and Al-Ourouba Road is distributed according to a non-homogeneous Poisson process with a rate λ(t) defined as follows:

λ(t) = 80 cars/mn if 8 am ≤ t < 9 am
λ(t) = 60 cars/mn if 9 am ≤ t < 11 am
λ(t) = 50 cars/mn if 11 am ≤ t < 3 pm
λ(t) = 70 cars/mn if 3 pm ≤ t ≤ 5 pm

Let us consider the time 8 am as t = 0.

Q1. Compute the average number of car arrivals by 11:30.

Q2. Determine the equation that gives the probability of having exactly 10000 car arrivals between 12 pm and 4 pm.

Q3. What is the distribution and the average (in seconds) of the inter-arrival time of two cars between 8 am and 9 am?

Example: Non-stationary Poisson Process (NSPP)

Q1. Compute the average number of car arrivals by 11:30:

λ(8:00, 11:30) = ∫_{8:00}^{11:30} λ(u) du
= ∫_{8:00}^{9:00} λ(u) du + ∫_{9:00}^{11:00} λ(u) du + ∫_{11:00}^{11:30} λ(u) du
= 80 cars/mn × 60 mn + 60 cars/mn × 120 mn + 50 cars/mn × 30 mn
= 4800 + 7200 + 1500 = 13500 cars

Q2. The number of arrivals N(12:00, 16:00) follows a Poisson distribution. Between 12 pm and 4 pm, the average number of cars is:

λ(12:00, 16:00) = 50 × 180 + 70 × 60 = 13200 cars

Thus,

P(N(12:00, 16:00) = 10000) = (13200^10000 / 10000!) exp(−13200)

Q3. What is the distribution and the average (in seconds) of the inter-arrival time of two cars between 8 am and 9 am? (Homework)
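The piecewise integration in Q1 and Q2 can be sketched in code (time is measured in minutes after 8 am; the segment table mirrors the rate function above):

```python
# Piecewise-constant rate lambda(t) of the example:
# (start_minute, end_minute, cars/mn), with t = 0 at 8 am
segments = [(0, 60, 80), (60, 180, 60), (180, 420, 50), (420, 540, 70)]

def expected_arrivals(s, t):
    """Integral of lambda(u) du over [s, t] for the piecewise rate."""
    total = 0
    for a, b, rate in segments:
        overlap = max(0, min(t, b) - max(s, a))
        total += rate * overlap
    return total

print(expected_arrivals(0, 210))    # 8:00 to 11:30 -> 13500 cars
print(expected_arrivals(240, 480))  # 12 pm to 4 pm -> 13200 cars
```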

Two Minutes Break

You are free to discuss with your classmates about the previous slides, or to refresh a bit, or to ask questions.

Administrative issues

Groups Formation


Continuous Probability Distributions

Uniform Distribution

Exponential Distribution

Normal (Gaussian) Distribution

Weibull Distribution

Lognormal Distribution


Continuous Distributions

Continuous random variables can be used to describe random phenomena in which the variable can take on any value in some interval.

In this section, the distributions studied are:

Uniform

Exponential

Normal

Weibull

Lognormal


Uniform Distribution


Continuous Uniform Distribution

The continuous uniform distribution is a family of probability distributions such that, for each member of the family, all intervals of the same length on the distribution's support are equally probable.

A random variable X is uniformly distributed on the interval [a, b], written X ~ U(a, b), if its PDF and CDF are:

PDF: f(x) = 1/(b − a), a ≤ x ≤ b
f(x) = 0, otherwise

CDF: F(x) = 0, x < a
F(x) = (x − a)/(b − a), a ≤ x < b
F(x) = 1, x ≥ b

Expected value: E(X) = (a + b)/2

Variance: V(X) = (b − a)²/12

Uniform Distribution

Properties:

P(x_1 < X < x_2) is proportional to the length of the interval:

P(x_1 < X < x_2) = (x_2 − x_1)/(b − a)

Special case: the standard uniform distribution U(0,1).

Very useful for random number generators in simulators.
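One common way U(0,1) draws feed a simulator is the inverse-transform method. A sketch (not from the slides) that turns uniform draws into exponential variates via x = −ln(1 − u)/λ, inverting F(x) = 1 − e^{−λx}:

```python
import math
import random

random.seed(3)

def exponential_from_uniform(lam):
    """Inverse-transform: map u ~ U(0,1) to an Exp(lam) draw."""
    u = random.random()              # standard uniform draw
    return -math.log(1.0 - u) / lam  # F^{-1}(u)

lam = 0.5
n = 100_000
mean = sum(exponential_from_uniform(lam) for _ in range(n)) / n
print(mean)  # close to 1/lam = 2
```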

Exponential Distribution

Modeling Random Time


Exponential Distribution

The exponential distribution describes the times between events in a Poisson process, in which events occur continuously and independently at a constant average rate.

A random variable X is exponentially distributed with rate parameter λ > 0 (mean μ = 1/λ) if its PDF and CDF are:

PDF: f(x) = λ exp(−λx) = (1/μ) exp(−x/μ), x ≥ 0
f(x) = 0, otherwise

CDF: F(x) = ∫_0^x λ e^{−λt} dt = 1 − e^{−λx} = 1 − e^{−x/μ}, x ≥ 0
F(x) = 0, x < 0

Expected value: E(X) = 1/λ = μ

Variance: V(X) = 1/λ² = μ²

Exponential Distribution

Example with μ = 20:

f(x) = (1/20) exp(−x/20), x ≥ 0; f(x) = 0, otherwise

F(x) = 1 − exp(−x/20), x ≥ 0; F(x) = 0, x < 0

Exponential Distribution

The memoryless property: in probability theory, memorylessness is a property of certain probability distributions, the exponential distributions and the geometric distributions, wherein any derived probability from a set of random samples is distinct and has no information (i.e. "memory") of earlier samples.

Formally, the memoryless property is, for all s and t greater than or equal to 0:

P(X > s + t | X > t) = P(X > s)

This means that future events do not depend on past events, but only on the present.

The fact that Pr(X > 40 | X > 30) = Pr(X > 10) does not mean that the events X > 40 and X > 30 are independent; i.e. it does not mean that Pr(X > 40 | X > 30) = Pr(X > 40).

Exponential Distribution

The memoryless property can be read as "the probability that you will wait more than s + t minutes, given that you have already been waiting t minutes, is equal to the probability that you will wait more than s minutes."

In other words, "the probability that you will wait s more minutes, given that you have already been waiting t minutes, is the same as the probability that you would wait more than s minutes from the beginning."

P(X > s + t | X > t) = P(X > s)

Example: Exponential Distribution

The time needed to repair the engine of a car is exponentially distributed with a mean time equal to 3 hours (λ = 1/3).

The probability that the car spends more than 3 hours in repair:

P(X > 3) = 1 − F(3) = 1 − (1 − exp(−3/3)) = e^{−1} ≈ 0.368

The probability that the car repair time lasts between 2 and 3 hours is:

P(2 ≤ X ≤ 3) = F(3) − F(2) = e^{−2/3} − e^{−1} ≈ 0.145

The probability that the repair lasts for another hour, given that it has already been going on for 2.5 hours:

Using the memoryless property of the exponential distribution:

P(X > 2.5 + 1 | X > 2.5) = P(X > 1) = 1 − F(1) = exp(−1/3) ≈ 0.717
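A minimal numerical check of these three probabilities (a sketch using only Python's math module; the survival function P(X > x) = e^{−x/μ} does the work):

```python
import math

# Repair time X ~ Exp(mean 3 hours)
mu = 3.0

def survival(x):
    """P(X > x) = e^(-x/mu) for the exponential distribution."""
    return math.exp(-x / mu)

p_more_3 = survival(3)                        # ≈ 0.368
p_2_to_3 = survival(2) - survival(3)          # ≈ 0.145
p_memoryless = survival(3.5) / survival(2.5)  # P(X > 3.5 | X > 2.5)
print(p_more_3, p_2_to_3, p_memoryless)
# p_memoryless equals P(X > 1) ≈ 0.717, as the memoryless property predicts
```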

Normal (Gaussian) Distribution


Normal Distribution

The Normal distribution , also called the Gaussian distribution , is an important family of continuous probability distributions, applicable in many fields.

Each member of the family may be defined by two parameters, location and scale: the mean ("average", μ) and the variance (standard deviation squared, σ²), respectively.

The importance of the normal distribution as a model of quantitative phenomena in the natural and behavioral sciences is due in part to the Central Limit Theorem.

It is usually used to model system error (e.g. channel error), the distribution of natural phenomena, height, weight, etc.


Normal or Gaussian Distribution

A continuous random variable X, taking all real values in the range (−∞, +∞), is said to follow a Normal distribution with parameters μ and σ if it has the following PDF and CDF:

PDF: f(x) = (1/(σ√(2π))) exp(−(1/2)((x − μ)/σ)²)

CDF: F(x) = (1/2)[1 + erf((x − μ)/(σ√2))]

where the error function is: erf(x) = (2/√π) ∫_0^x exp(−t²) dt

The Normal distribution is denoted X ~ N(μ, σ²).

This probability density function (PDF) is:

 a symmetrical, bell-shaped curve,

 centered at its expected value μ.

The variance is σ².

Normal distribution

Example:

The simplest case of the normal distribution, known as the Standard Normal Distribution, has expected value zero and variance one. This is written as N(0,1).

Normal Distribution

Evaluating the distribution:

Independently of μ and σ, use the standard normal distribution: Z ~ N(0,1).

Transformation of variables: let Z = (X − μ)/σ. Then:

F(x) = P(X ≤ x) = P(Z ≤ (x − μ)/σ)
= ∫_{−∞}^{(x−μ)/σ} (1/√(2π)) e^{−z²/2} dz
= Φ((x − μ)/σ)

where Φ(z) = ∫_{−∞}^{z} (1/√(2π)) e^{−t²/2} dt

Normal Distribution

Example: The time required to load an oceangoing vessel, X, is distributed as N(12, 4).

The probability that the vessel is loaded in less than 10 hours:

F(10) = Φ((10 − 12)/2) = Φ(−1) = 0.1587

Using the symmetry property, Φ(1) is the complement of Φ(−1).
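The CDF expression via erf translates directly into code. A sketch (standard library only) reproducing the vessel example:

```python
import math

def normal_cdf(x, mu, sigma):
    """F(x) = Phi((x - mu)/sigma), computed via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Vessel loading time X ~ N(12, 4), i.e. mu = 12, sigma = 2
p = normal_cdf(10, 12, 2)
print(round(p, 4))  # 0.1587
```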

Other Distributions

Weibull Distribution

Lognormal Distribution


Weibull Distribution

A random variable X has a Weibull distribution if its PDF has the form:

f(x) = (β/α) ((x − u)/α)^{β−1} exp(−((x − u)/α)^β), x ≥ u
f(x) = 0, otherwise

3 parameters:

Location parameter: u, (−∞ < u < +∞)

Scale parameter: α, α > 0

Shape parameter: β, β > 0

Example: u = 0 and α = 1.

Used to model the lifetime of objects.

When β = 1, X ~ exp(λ = 1/α).

Lognormal Distribution

A random variable X has a lognormal distribution if its PDF has the form:

f(x) = (1/(xσ√(2π))) exp(−(ln x − μ)²/(2σ²)), x > 0
f(x) = 0, otherwise

Mean: E(X) = e^{μ + σ²/2}

Variance: V(X) = e^{2μ + σ²}(e^{σ²} − 1)

(The figure shows μ = 1 and σ² = 0.5, 1, 2.)

Relationship with the normal distribution:

When Y ~ N(μ, σ²), then X = e^Y ~ lognormal(μ, σ²).

The parameters μ and σ² are not the mean and variance of the lognormal.

Used in general reliability analysis.

Empirical Distribution


Empirical Distributions

An Empirical Distribution is a distribution whose parameters are the observed values in a sample of data.

May be used when it is impossible or unnecessary to establish that a random variable has any particular parametric distribution.

Advantage : no assumption beyond the observed values in the sample.

Disadvantage : sample might not cover the entire range of possible values.


Empirical Distributions

In statistics, an empirical distribution function is a cumulative probability distribution function that concentrates probability 1/n at each of the n numbers in a sample.

Let X1, X2, …, Xn be iid random variables with CDF equal to F(x).

The empirical distribution function F_n(x), based on the sample X1, X2, …, Xn, is a step function defined by:

F_n(x) = (number of elements in the sample ≤ x) / n = (1/n) Σ_{i=1}^{n} I(X_i ≤ x)

where I(A) is the indicator of the event A:

I(X_i ≤ x) = 1 if X_i ≤ x, and 0 otherwise.

For a fixed value x, I(X_i ≤ x) is a Bernoulli random variable with parameter p = F(x); hence nF_n(x) is a binomial random variable with mean nF(x) and variance nF(x)(1 − F(x)).
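The step function F_n(x) is straightforward to implement. A sketch (the sample values are illustrative):

```python
def empirical_cdf(sample):
    """Return the step function F_n(x) = (#{X_i <= x}) / n."""
    n = len(sample)
    def F_n(x):
        return sum(1 for xi in sample if xi <= x) / n
    return F_n

sample = [2.1, 3.4, 0.7, 3.4, 5.0]
F_n = empirical_cdf(sample)
print(F_n(0.5), F_n(2.1), F_n(6.0))  # 0.0 0.4 1.0
```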

End of Chapter 4
