Welcome to Amsterdam!

Bayesian Modeling for Cognitive Science: A WinBUGS Workshop

Contributors
- Michael Lee, http://www.socsci.uci.edu/~mdlee/
- Dora Matzke, http://dora.erbe-matzke.com/
- Ruud Wetzels, http://www.ruudwetzels.com/
- EJ Wagenmakers, http://www.ejwagenmakers.com/

Assistants
- Don van Ravenzwaaij, http://www.donvanravenzwaaij.com
- Gilles Dutilh, http://gillesdutilh.com/
- Helen Steingröver

Why We Like Bayesian Modeling
It is fun. It is cool. It is easy. It is principled. It is superior. It is useful. It is flexible.

Our Goals This Week Are…
- For you to experience some of the possibilities that WinBUGS has to offer.
- For you to get some hands-on training by trying out some programs.
- For you to work at your own pace.
- For you to get answers to questions when you get stuck.

Our Goals This Week Are NOT…
- For you to become a Bayesian graphical modeling expert in one week.
- For you to gain deep insight into the statistical foundations of Bayesian inference.
- For you to get frustrated when the programs do not work or you do not understand the materials (please ask questions).

Logistics
- You should now have the course book, information on how to get wireless access, and a USB stick. The stick contains a pdf of the book and the computer programs.
- Brief plenary lectures are at 09:30 and 14:00. All plenary lectures are in this room.
- All practicals are in the computer rooms on the next floor.
- Coffee and tea are available in the small room opposite the computer rooms.

What is Bayesian Inference? Why Be Bayesian?
- "Common sense expressed in numbers."
- "The only statistical procedure that is coherent, meaning that it avoids statements that are internally inconsistent."
- "The only good statistics."

Outline
- Bayes in a Nutshell
- The Bayesian Revolution
- This Course

Bayesian Inference in a Nutshell
In Bayesian inference, uncertainty or degree of belief is quantified by probability. Prior beliefs are updated by means of the data to yield posterior beliefs.

Bayesian Parameter Estimation: Example
We prepare for you a series of 10 factual questions of equal difficulty. You answer 9 out of 10 questions correctly. What is your latent probability θ of answering any one question correctly?

We start with a prior distribution for θ. This reflects all we know about θ prior to the experiment. Here we make a standard choice and assume that all values of θ are equally likely a priori.

We then update the prior distribution by means of the data (technically, the likelihood) to arrive at a posterior distribution. The posterior distribution is a compromise between what we knew before the experiment and what we have learned from the experiment, and it reflects all that we know about θ. For these data: mode = 0.9, 95% credible interval (0.59, 0.98).

The Bayesian Revolution
Until about 1990, Bayesian statistics could only be applied to a select subset of very simple models. Only recently has Bayesian statistics undergone a transformation: with current numerical techniques, Bayesian models are "limited only by the user's imagination."

Why Bayes is Now Popular
Markov chain Monte Carlo!
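For reference, the analytic posterior for the estimation example above can be computed directly in R: with a uniform Beta(1, 1) prior and 9 correct answers out of 10, the posterior is Beta(10, 2). The following is a minimal sketch (not taken from the course code) that reproduces the numbers quoted above.

# Analytic posterior for the estimation example: uniform Beta(1, 1) prior,
# 9 correct answers out of 10, so the posterior is Beta(1 + 9, 1 + 1) = Beta(10, 2).
a <- 10
b <- 2
(a - 1) / (a + b - 2)                    # posterior mode: 0.9
qbeta(c(0.025, 0.975), a, b)             # 95% credible interval: roughly (0.59, 0.98)
curve(dbeta(x, a, b), from = 0, to = 1,  # plot the posterior density
      xlab = "theta", ylab = "density")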
Markov Chain Monte Carlo
Instead of calculating the posterior analytically, numerical techniques such as MCMC approximate the posterior by drawing samples from it. Consider again our earlier example: with 9000 samples, the result (mode = 0.89, 95% credible interval (0.59, 0.98)) is almost identical to the analytical result.

MCMC
With MCMC, the models you can build and estimate are said to be "limited only by the user's imagination". But how do you get MCMC to work?
- Option 1: write the code yourself.
- Option 2: use WinBUGS!

Bayesian Cognitive Modeling: A Practical Course
- …is a course book under development, used at several universities.
- …is still regularly updated.
- …will eventually be published by Cambridge University Press.
- …greatly benefits from your suggestions for improvement (e.g., typos, awkward sentences, new exercises, new applications, etc.).
- …requires you to run computer code. Do not mindlessly copy-paste the code, but study it first and try to discover how it does its job.
- …did not print very well (i.e., the quality of some of the pictures is below par). You will receive a better version tomorrow!

WinBUGS: Bayesian inference Using Gibbs Sampling
You want to have this installed (plus the registration key). WinBUGS:
- knows many probability distributions (likelihoods);
- allows you to specify a model;
- allows you to specify priors;
- will then automatically run the MCMC sampling routines and produce output.

WinBUGS knows many statistical distributions (e.g., the binomial distribution, the Gaussian distribution, the Poisson distribution). These distributions form the elementary building blocks from which you may construct infinitely many models.

WinBUGS & R
WinBUGS produces MCMC samples. We want to analyze the output in a nice program, such as R or Matlab. This can be accomplished using the R package "R2WinBUGS" or the Matlab function "matbugs". The exchange works like this (a sketch of such a call from R appears at the end of this section):
R or Matlab: "Here are the data and a bunch of commands."
WinBUGS: "OK, I did what you wanted; here are the samples you asked for."

Getting Started
Work through some of the exercises in the book. Most of you will want to start with the chapter "Getting Started". Those of you who have worked with the book before can start wherever you want; note that most early chapters have been restructured and new content has been added.

Running the R Programs
The R scripts have the extension .R. You can use "File" -> "Open Script" to read them, and run them by copying and pasting the scripts into the R console.

Saving Your Work
If you want to save your work, please do this on the USB stick!

WARNING
The first chapters are mostly about simple statistical models. This lays the groundwork for the later chapters on more complicated cognitive modeling. The idea is that you have to walk before you can run.

Questions?
Feel free to ask questions when you are stuck. Answers to the exercises for the first few chapters can be found at the end of the book!

"Inside every Non-Bayesian, there is a Bayesian struggling to get out." (Dennis Lindley)
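As promised above, here is a minimal sketch of what a call from R to WinBUGS via R2WinBUGS can look like for the rate example (9 correct out of 10). The model file name, initial values, and the WinBUGS installation path are assumptions; the scripts on the USB stick are the authoritative versions.

# A minimal sketch (assumed file names and paths) of running the rate example
# from R with R2WinBUGS. WinBUGS returns MCMC samples from the posterior.
library(R2WinBUGS)

# Write a small BUGS model: uniform prior on theta, binomial likelihood.
writeLines("
model{
  theta ~ dbeta(1, 1)
  k ~ dbin(theta, n)
}", con = "rate_model.txt")

data  <- list(k = 9, n = 10)                # 9 correct answers out of 10 questions
inits <- function() list(theta = runif(1))  # random starting value for each chain

samples <- bugs(data, inits,
                parameters.to.save = "theta",
                model.file = "rate_model.txt",
                n.chains = 3, n.iter = 2000,
                bugs.directory = "c:/Program Files/WinBUGS14/")  # adjust to your install

print(samples$summary)                # posterior summaries for theta
hist(samples$sims.list$theta)         # the MCMC samples approximate the posterior

Running this launches WinBUGS, which carries out the MCMC sampling and hands the draws back to R, where you can summarize and plot them like any other data.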