
Statistics 618/SPA 696: Bayesian Statistics, Fall 2017 Seminar
Thursday, 2:30-5:20 PM, Location TBD.
 Course Description:
Principles and applications of modern statistical decision theory, with a special focus on Bayesian modeling, data analysis,
inference, and optimal decision making. Prior and posterior; comparison of Bayesian and frequentist approaches, including
minimax decision making and elementary game theory. Bayesian estimation, hypothesis testing, credible sets, and Bayesian
prediction. Introduction to Bayesian computing software and applications to diverse fields. Grading: A-F only. Prerequisite:
STAT514 or permission of instructor.
 Learning Outcomes:
At the conclusion of this course, participants will be able to: specify and estimate Bayesian multilevel
(hierarchical) models with linear and nonlinear outcomes; treat missing data in a principled and
correct manner using multiple imputation; work fluently in the R and Bugs statistical languages;
compute appropriate sample size and power calculations for Bayesian models; apply Bayesian
approaches including MCMC computation; and assess model reliability and fit in complex models.
 Prerequisite Details:
This course assumes a knowledge of basic statistics as taught in a first-year undergraduate or graduate
sequence, as well as working knowledge of R. Topics should include probability, crosstabulation, basic
statistical summaries, and linear regression in either scalar or matrix form. Exposure to basic matrix algebra and calculus is helpful.
 Course Grade:
The final grade will be based on two components: weekly attendance and participation (20%) and
exercises (80%). Exercises are due one week after assignment on the syllabus.
 Office Hours: By appointment.
 Incompletes: Due to the scheduled nature of the course, no incompletes will be given.
 Teaching Assistant: Simon Heuberger.
Office Hours: Friday, 1-3 PM, location TBD.
 Required Text: Gelman and Hill, "Data Analysis Using Regression and
Multilevel/Hierarchical Models" (Cambridge University Press, 2007). Other readings are papers that will be made
available at jstor.org or distributed by the instructor on this syllabus/webpage. Readings should be completed
before the class meeting listed on the syllabus.
 Topics (subject to minor change):
 August 28: No class meeting (APSA conference).
 September 7: Introducing Bayesian Inference.
 Background Reading. Gelman & Hill: Chapters 1-2.
Refreshing R Skills:
Starting R,
Writing Functions,
Making Plots,
Sampling.
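The R skills listed above can be refreshed with a few lines like the following (a minimal, hypothetical example; not taken from the course materials):

```r
# Minimal refresher: writing a function, sampling, summaries, and plotting.

# Writing a function: root mean squared error of a fit
rmse <- function(y, yhat) sqrt(mean((y - yhat)^2))

# Sampling: 1000 draws from a standard normal
set.seed(42)
x <- rnorm(1000)

# Basic summaries
summary(x)

# Making a plot (written to a file so the script runs non-interactively)
png("draws.png")
hist(x, main = "1000 draws from N(0,1)")
dev.off()

rmse(x, rep(0, length(x)))  # close to 1, the sd of N(0,1)
```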
 Weekly Reading. Issues in Inference:
Gill, Jeff. 1999. "The Insignificance of Null Hypothesis Significance Testing." Political Research Quarterly
52(3): 647-674. JSTOR.
Leamer, Edward E. 1983. "Let's Take the Con Out of Econometrics." The American Economic Review 73(1): 31-43. JSTOR.
King, Gary. 1986. "How Not to Lie With Statistics: Avoiding Common Mistakes in Quantitative Political Science."
American Journal of Political Science 30: 666-687.
Slides from the lecture.
 Exercises. Gelman & Hill: 2.2, 2.3, 2.4.
 September 14: Detailed Linear Model Theory Review.
 September 21: Multilevel Structures and Multilevel Linear Models.
 Reading. Gelman & Hill: Chapters 11 and 12,
Gill, Jeff and Andrew J. Womack. 2013. "The Multilevel Model Framework." In The SAGE Handbook of Multilevel Modeling,
eds. Marc A. Scott, Jeffrey S. Simonoff, and Brian D. Marx. London: SAGE Publications Ltd. Pp. 3-20. SAGE Research Methods.
Chapters 11-12 code from the lecture.
 Exercises. Gelman & Hill: 11.4, 12.2, 12.5.
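As a rough preview of the partial-pooling idea in these chapters, the sketch below shrinks each group mean toward the grand mean using the known-variance approximation of Gelman & Hill's equation 12.1. The data are simulated and the hyperparameters are treated as known for illustration; the chapters themselves fit these models with lmer().

```r
# Simulate grouped data: J groups of unequal size with varying intercepts
set.seed(1)
J <- 8
n_j <- sample(5:30, J, replace = TRUE)      # group sizes
alpha <- rnorm(J, mean = 10, sd = 2)        # true group intercepts
sigma_y <- 3                                 # within-group sd
y <- unlist(lapply(1:J, function(j) rnorm(n_j[j], alpha[j], sigma_y)))
g <- rep(1:J, n_j)

ybar_j   <- tapply(y, g, mean)  # no-pooling estimates (one mean per group)
ybar_all <- mean(y)             # complete-pooling estimate (grand mean)

# Partial pooling: weight each group mean against the grand mean by
# group size relative to the between-group variance (G&H eq. 12.1)
sigma_alpha <- 2
w <- (n_j / sigma_y^2) / (n_j / sigma_y^2 + 1 / sigma_alpha^2)
alpha_hat <- w * ybar_j + (1 - w) * ybar_all  # small groups shrink more
```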
 September 28: Multilevel Linear Models: Varying Slopes, Non-Nested Models, and Other Complexities. Running Bayesian
Regression Models.
 Reading. Gelman & Hill: Chapter 13.
Chapter 13 code from the lecture.
 Exercises. Gelman & Hill: 13.2, 13.4, 13.5.
 October 5: Multilevel Logistic Regression Models, Multilevel Bayesian GLMs.
 Reading. Gelman & Hill: Chapter 14 (skip Section 14.3), Chapter 15.
Chapter 14 code from the lecture.
 Exercises. Gelman & Hill: 14.5, 14.6, 15.1, 15.2.
 October 12: Multilevel Modeling in Bugs and R: the Basics, MCMC Theory.
 October 19: Fitting Multilevel Linear and Generalized Linear Models in Bugs and R, MCMC Coding.
 Reading. Gelman & Hill: Chapter 17.
Chapter 17 code from the lecture.
 Exercises. Gelman & Hill: rerun 16.3 using the instructions in Sections 17.2 and 17.3; also 17.5.
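A varying-intercept regression of the kind fit in Chapter 17 looks like the following in the BUGS/JAGS model language (a sketch with generic names y, x, group, and J, not the chapter's own example):

```
model {
  for (i in 1:n) {
    y[i] ~ dnorm(y.hat[i], tau.y)
    y.hat[i] <- a[group[i]] + b * x[i]
  }
  b ~ dnorm(0, 0.0001)
  tau.y <- pow(sigma.y, -2)
  sigma.y ~ dunif(0, 100)
  for (j in 1:J) {
    a[j] ~ dnorm(mu.a, tau.a)   # group intercepts drawn from a common distribution
  }
  mu.a ~ dnorm(0, 0.0001)
  tau.a <- pow(sigma.a, -2)
  sigma.a ~ dunif(0, 100)
}
```

Note that dnorm() here is parameterized by precision (the inverse variance), not by standard deviation. From R, a model file like this is passed to bugs() (R2WinBUGS) or jags.model() (rjags) together with the data list and initial values.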
 October 26: Likelihood and Bayesian Inference, Computation, MCMC Diagnostics and Customization.
 Reading. Gelman & Hill: Chapter 18.
 Exercises. Gelman & Hill: 18.1, 18.2, 18.4.
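One diagnostic covered in this material, the Gelman-Rubin R-hat statistic, can be computed by hand from multiple chains. This sketch (basic, non-split version) uses simulated "chains" rather than real sampler output:

```r
# R-hat: compare between-chain and within-chain variance; values near 1
# suggest the chains are sampling the same distribution.
rhat <- function(chains) {            # chains: iterations x chains matrix
  n <- nrow(chains)
  B <- n * var(colMeans(chains))      # between-chain variance
  W <- mean(apply(chains, 2, var))    # within-chain variance
  sqrt(((n - 1) / n * W + B / n) / W)
}

set.seed(7)
mixed <- matrix(rnorm(4000), ncol = 4)            # four well-mixed chains
stuck <- mixed + rep(c(0, 0, 0, 5), each = 1000)  # one chain off target
rhat(mixed)   # near 1: no evidence against convergence
rhat(stuck)   # well above 1.1: keep sampling or respecify
```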
 November 2: Treatment of Missing Data.
 November 9: Understanding and Summarizing the Fitted Models, Multilevel Analysis of Variance.
 November 16: Model Checking and Comparison.
 Reading. Gelman & Hill: Chapter 24,
Chapter 24 code from the lecture.
 Exercises. Gelman & Hill: 24.1, 24.4.
 November 23: Thanksgiving Holiday.
 November 30: Sample Size and Power Calculations.
 Reading. Gelman & Hill: Chapter 20,
Chapter 20 code from the lecture.
 Exercises. Gelman & Hill: 20.1, 20.2, 20.3.
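The simulation-based approach to power calculations in this chapter can be sketched as follows (a hypothetical two-group comparison using a t-test, not an example from the book):

```r
# Estimate power by simulating many fake datasets at a given sample size
# and recording how often the effect is detected at the 0.05 level.
power_sim <- function(n, effect = 0.5, sd = 1, sims = 500, alpha = 0.05) {
  rejections <- replicate(sims, {
    y0 <- rnorm(n, 0, sd)          # control group
    y1 <- rnorm(n, effect, sd)     # treatment group, true effect = 0.5 sd
    t.test(y1, y0)$p.value < alpha
  })
  mean(rejections)                 # proportion of simulations detecting the effect
}

set.seed(3)
power_sim(20)   # modest power at n = 20 per group
power_sim(64)   # roughly 0.8 for a half-sd effect
```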
 December 7: Causal Inference Using Regression on the Treatment Variable. Bayesian Causal Inference.
 Reading. Gelman & Hill: Chapter 9,
Chapter 9 code from the lecture,
code and data from the lecture example.
 Exercises. Rerun the Baldus data example in JAGS and change the model specification.


