Bayesian Computation

Andrew Gelman
Department of Statistics and Department of Political Science
Columbia University
Class 4, 28 Sept 2011
Review of homework 4

Skills:
1. Write the joint posterior density (up to a multiplicative constant)
2. Program two-dimensional Metropolis jumps
3. Program the accept/reject rule
4. Tune the parameters of your algorithm
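Skills 2 and 3 can be sketched together in a few lines of Python. This is a toy example, not the homework solution: a random-walk Metropolis sampler in two dimensions, with a standard bivariate normal as a stand-in target and the accept/reject rule computed on the log scale to avoid underflow.

```python
import numpy as np

def log_post(theta):
    # Toy target: standard bivariate normal, up to an additive constant
    return -0.5 * np.dot(theta, theta)

def metropolis(n_iter, jump_scale, seed=0):
    rng = np.random.default_rng(seed)
    theta = np.zeros(2)            # starting point
    draws = np.empty((n_iter, 2))
    n_accept = 0
    for t in range(n_iter):
        # Two-dimensional jump: symmetric normal proposal
        proposal = theta + jump_scale * rng.standard_normal(2)
        # Accept/reject rule: log(uniform) < log density ratio
        log_r = log_post(proposal) - log_post(theta)
        if np.log(rng.uniform()) < log_r:
            theta = proposal
            n_accept += 1
        draws[t] = theta
    return draws, n_accept / n_iter

draws, accept_rate = metropolis(5000, jump_scale=2.4)
```

For a real problem, `log_post` would be replaced by the joint log posterior from step 1; the symmetric proposal is what makes the simple density-ratio acceptance rule valid.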
Optimization of Gibbs and Metropolis algorithms

- Conclusion of presentation by Wei Wang, Ph.D. student in statistics
- You can interrupt and discuss . . .
Missing-data imputation

- Presentation by Ido Rosen, M.S. student in computer science
- You can interrupt and discuss . . .
1. Write the joint posterior density (up to a multiplicative constant)

- Binomial model for #deaths given #rats
- Logistic model for Pr(death)
- Prior distribution for the logistic regression coefficients
- Discuss extensions to the model
- Steps 2, 3, 4, and 5 are straightforward
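As a sketch, the joint log posterior for such a binomial-logistic model might be coded as below. The data arrays and the independent normal priors on the two coefficients are assumptions for illustration, not necessarily the homework's setup.

```python
import numpy as np

def log_posterior(a, b, x, n, y, prior_sd=10.0):
    """Log joint posterior, up to an additive constant.

    Binomial model for #deaths y out of #rats n,
    logistic model Pr(death) = invlogit(a + b*x),
    normal(0, prior_sd) priors on a and b (an assumed choice).
    """
    eta = a + b * x
    # Binomial log likelihood with logit link:
    # y*eta - n*log(1 + exp(eta)), dropping the binomial coefficient
    log_lik = np.sum(y * eta - n * np.log1p(np.exp(eta)))
    log_prior = -0.5 * (a**2 + b**2) / prior_sd**2
    return log_lik + log_prior

# Hypothetical small dataset: dose, #rats, #deaths at each dose
x = np.array([-0.86, -0.30, -0.05, 0.73])
n = np.array([5, 5, 5, 5])
y = np.array([0, 1, 3, 5])
lp = log_posterior(0.8, 7.7, x, n, y)
```

Working on the log scale and dropping the multiplicative constant (the binomial coefficient) is exactly what "up to a multiplicative constant" buys you: the Metropolis acceptance ratio only needs density ratios.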
Tuning the algorithm

- Shape of jumping kernel
- Scale of jumping kernel
- Objective function to optimize
- Trying different tuning parameters
- Stochastic optimization
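One simple way to try different tuning parameters, as a sketch: run short pilot chains over a grid of jump scales and pick the scale whose acceptance rate is closest to a target. The target of 0.35 below is an assumption for a two-dimensional problem (the usual rule of thumb is roughly 0.44 in one dimension, falling toward 0.23 in high dimensions); the toy target density is also an assumption.

```python
import numpy as np

def pilot_accept_rate(log_post, scale, n_iter=2000, d=2, seed=1):
    """Run a short random-walk Metropolis chain; return its acceptance rate."""
    rng = np.random.default_rng(seed)
    theta = np.zeros(d)
    lp = log_post(theta)
    n_accept = 0
    for _ in range(n_iter):
        prop = theta + scale * rng.standard_normal(d)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
            n_accept += 1
    return n_accept / n_iter

toy_log_post = lambda th: -0.5 * np.dot(th, th)  # bivariate normal stand-in

# Try a grid of jump scales; keep the one nearest the target acceptance rate
scales = [0.1, 0.5, 1.0, 1.7, 2.4, 5.0]
target = 0.35                                    # assumed 2-d target rate
rates = {s: pilot_accept_rate(toy_log_post, s) for s in scales}
best = min(scales, key=lambda s: abs(rates[s] - target))
```

This grid search is the crudest version of "trying different tuning parameters"; adaptive or stochastic-optimization schemes adjust the scale on the fly instead, which is the more sophisticated route the slide points toward.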
For next week’s class

- Homework 5 due 5pm Tues
- All course material is at http://www.stat.columbia.edu/~gelman/bayescomputation

Next class:

- I present weakly informative priors