Bayesian Computation
Andrew Gelman
Department of Statistics and Department of Political Science, Columbia University
Class 3, 21 Sept 2011

Review of homework 3
- Skills:
  1. Write the joint posterior density (up to a multiplicative constant)
  2. Program one-dimensional Metropolis jumps
  3. Program the accept/reject rule
  4. Fit generalized linear models in R
  5. Display and summarize results
- And more . . .

Implementing Gibbs and Metropolis and improving their efficiency
- Presentation by Wei Wang, Ph.D. student in statistics
- You can interrupt and discuss . . .
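The homework skills listed above (log posterior, one-dimensional Metropolis jumps, the accept/reject rule, glm() in R, and summaries) fit together as in the minimal R sketch below. It is not the homework solution: the dose, #rats, and #deaths values are hypothetical placeholders, the prior is taken as flat, and the jump scales are arbitrary starting guesses.

```r
## Minimal sketch: logistic model for Pr(death), binomial likelihood,
## flat prior on (alpha, beta), one-at-a-time Metropolis updates.
## Data values are hypothetical placeholders, not the homework data.

x <- c(-0.86, -0.30, -0.05, 0.73)   # dose (hypothetical)
n <- c(5, 5, 5, 5)                  # number of rats per dose (hypothetical)
y <- c(0, 1, 3, 5)                  # number of deaths (hypothetical)

## Skill 1: log posterior up to an additive constant (flat prior => likelihood only)
log_post <- function(theta) {
  alpha <- theta[1]; beta <- theta[2]
  p <- plogis(alpha + beta * x)               # Pr(death)
  sum(dbinom(y, n, p, log = TRUE))
}

n_iter <- 5000
theta <- matrix(NA, n_iter, 2, dimnames = list(NULL, c("alpha", "beta")))
theta[1, ] <- c(0, 0)                         # arbitrary starting point
jump_sd <- c(1, 2)                            # jump scales, to be tuned

for (t in 2:n_iter) {
  theta[t, ] <- theta[t - 1, ]
  for (j in 1:2) {                            # Skill 2: one-dimensional jumps
    proposal <- theta[t, ]
    proposal[j] <- proposal[j] + rnorm(1, 0, jump_sd[j])
    log_ratio <- log_post(proposal) - log_post(theta[t, ])
    if (log(runif(1)) < log_ratio)            # Skill 3: accept/reject rule
      theta[t, ] <- proposal
  }
}

## Skill 5: display and summarize, discarding the first half as burn-in
sims <- theta[(n_iter / 2 + 1):n_iter, ]
apply(sims, 2, quantile, c(0.025, 0.5, 0.975))

## Skill 4: a glm() fit as a rough check on the posterior center
glm(cbind(y, n - y) ~ x, family = binomial(link = "logit"))
```

The glm() call at the end is only a quick sanity check that the simulations are centered near the maximum-likelihood estimates; it does not replace the convergence and model-fit checks listed later.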
1. Write the joint posterior density (up to a multiplicative constant)
- Binomial model for #deaths given #rats
- Logistic model for Pr(death)
- Prior distribution for the logistic regression coefficients
- Discuss extensions to the model
- Steps 2, 3, 4, 5 are straightforward

And more . . .
- Check convergence
- Debug program
- Check fit of model to data
- Understand model in context of data and alternative models

Optimizing the algorithm
- Scale of jumps in α and β
- Jumping distributions
- One-dimensional or two-dimensional jumps
- How to implement Gibbs here??
- Other computational strategies??
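One answer to the "one-dimensional or two-dimensional jumps" question is to replace the one-at-a-time updates with a single joint jump for (α, β) whose covariance is scaled to an estimate of the posterior covariance. The sketch below reuses log_post() from the earlier sketch; the covariance guess is hypothetical, and the 2.4²/d scaling is only a starting point to be tuned by watching the acceptance rate.

```r
## Sketch of a two-dimensional (joint) Metropolis jump, assuming log_post()
## and the data from the previous sketch are already defined.

library(MASS)                                  # for mvrnorm()

d <- 2
Sigma_guess <- diag(c(1, 4))                   # hypothetical guess at posterior covariance
jump_cov <- (2.4^2 / d) * Sigma_guess          # rough scaling, then tune

n_iter <- 5000
theta <- matrix(NA, n_iter, d)
theta[1, ] <- c(0, 0)
n_accept <- 0

for (t in 2:n_iter) {
  proposal <- theta[t - 1, ] + mvrnorm(1, rep(0, d), jump_cov)   # joint 2-d jump
  log_ratio <- log_post(proposal) - log_post(theta[t - 1, ])
  if (log(runif(1)) < log_ratio) {
    theta[t, ] <- proposal
    n_accept <- n_accept + 1
  } else {
    theta[t, ] <- theta[t - 1, ]
  }
}

n_accept / (n_iter - 1)    # acceptance rate, used to adjust jump_cov
```

If the acceptance rate is very high, the jumps are too small; if it is very low, they are too large. Adjusting jump_cov (or estimating the posterior covariance from a pilot run) is one way into the "scale of jumps in α and β" discussion above.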
For next week's class
- Homework 4 due 5pm Tues
- All course material is at http://www.stat.columbia.edu/~gelman/bayescomputation
- Next class: student presentation on missing-data imputation