Home Page

0. Preliminaries

1. Introduction

2. Simple Examples

3. Bayes' Theorem

4. Interpretation

5. MCMC

6. Poisson Inference

7. Normal Linear Models

8. Priors

9. Bayesian Hypothesis Testing

Sorry, I wasn't able to get this to print in 4-up, but you may be able to do it yourself.

Here is a link to the paper by Dellaportas *et al.*, mentioned in the notes. Here is the reversible jump program in R that we discussed in class.

10. Model Selection and Model Averaging

SPECIAL LECTURE on Cepheid Variable Stars

11. Missing Data

12. Hierarchical Bayes

You should be reading Chapters 1-3 of Schmitt, and Chapters 1-2 of Sivia. If you have a copy of Gelman *et al.*, you may find Chapter 1 useful reading.

Continue reading Chapters 4-6 of Schmitt and Chapter 3 of Sivia.

9/28 The Appendix of Sivia may be useful. If you have Gelman *et al.*, Chapters 10 and 11 will be relevant to MCMC.

Problem set #1, due 9/16/04

Problem set #2, due 9/30/04

Problem set #3, due 10/7/04

Problem set #4, due 10/14/04. Here is a corrected version of the MCMC Shell Program.

Problem set #5, due 11/4/04.

Problem set #6, due 11/11/04

9/7/04 Click here for the input file (Example1). Click here for a transcript of the R session.

9/9/04 Click here for a transcript of the (very brief) R session.

9/21/04 Click here for a transcript of our R session.

9/28/04 Click here for a transcript of our R session.

9/30/04 Click here for a shell MCMC program in R.

9/30/04 Click here for a transcript of our R session.

11/16 Click here and here for the code that we ran for selection models today. The first one has been corrected, and now gives correct values for the possible parameters.

A very basic discussion of the intuitive basis of Bayesian reasoning can be found at http://yudkowsky.net/bayes/bayes.html. This contains some javascript calculators to try out simple calculations.

A former student in this course, Mark Powell, has developed two web-based distance learning graduate level courses on Bayesian inference and decision theory. They can be found at http://www.if.uidaho.edu/~powell. There is an introductory module that explains how the courses are taught and what they contain (requires a browser and Real Audio). We did not discuss decision theory, and the first of these courses is on this topic. The second concentrates on MCMC and Bayesian inference, so probably duplicates much of what we've done this semester.

Tom Loredo's Bayesian Inference in the Physical Sciences (BIPS) website has a lot of useful information about Bayesian inference. Note particularly the first five items in his Bayesian Reprints page, which are very nice tutorials on practical application of Bayesian inference. He also has extensive pointers to other websites including software, reprint archives, etc.

The book by E. T. Jaynes can be found in a preliminary form here.

First Bayes is a software package that is intended to help students with the first steps in understanding Bayesian inference. It runs under Windows. It concentrates on simple, closed-form examples but may be helpful to you.

The International Society for Bayesian Analysis (ISBA) is the international Bayesian organization. It sponsors meetings and publishes a newsletter. Dues are not expensive, and for students are set at a reduced rate of $10/year.

Bayesians Worldwide contains links to the home pages of a large number of Bayesians. Many of these individuals maintain collections of their reprints. Most of the prominent Bayesians are listed.

The Bayesian Songbook contains songs that have been presented at various Bayesian meetings over the years. Just for fun. There are also links to pictures of the infamous "Cabarets" at which these songs were sung.

Carnegie Mellon University's statistics group has a library of many different statistics packages, including Bayesian packages. It can be accessed here.

Although CMU archives the R package, it's best to go to the R Project homepage, since you'll probably get the most recent version of it. Click here. R runs on Windows, Linux, UNIX and Macintosh (OS 9 or higher). The introductory tutorial for R can be found here.

The BUGS project at the University of Cambridge offers the BUGS (Bayesian inference Using Gibbs Sampling) package. It does both Gibbs and Metropolis-Hastings sampling, and the software can be downloaded here. It runs on Windows and the "classic" version runs on UNIX. There is no Mac version of BUGS. However, if you purchase Virtual PC you can run it on a Mac (at reduced speed), but Virtual PC is not free. Virtual PC is sold by Microsoft.

S Plus is not free, but there is a fairly inexpensive student package. It is sold by Mathsoft. It can also be used from UT servers; click here for information. (UT does not sell this, however.) Most of the functionality of S Plus can be found in the free R package (above), so unless you need something not available in R, you don't need to buy S Plus. Also, S Plus has some memory management problems that can cause trouble in large simulations; R does not have this problem.

Another software package that has been used successfully in MCMC simulations is Gauss, sold by Aptech.

Minitab is used by quite a few people. It can be used on UT computers, but UT cannot sell it to individuals. However, students can rent a copy by the semester. Click here.

SAS is extremely powerful and UT does sell it to students and faculty. It is reputed to be the most difficult of the popular packages to learn. There are versions for Windows and Macintosh as well as UNIX. Click here for information.

Yet another popular package is SPSS, which UT also sells to students and faculty. There are versions for Windows and Macintosh. Click here for information.

Bayesian Statistics (AST 383/M394C/CAM394C)

This is a course in Bayesian statistics. The instructor is an astronomer by profession, so the course will emphasize applications to the physical sciences; however, the material of the course will be useful for applying Bayesian inference in a wide variety of contexts. Bayesian inference is a powerful and increasingly popular statistical approach, which allows one to deal with complex problems in a conceptually simple and unified way. The recent introduction of Markov Chain Monte Carlo (MCMC) simulation methods has made possible the solution of large problems in Bayesian inference that were formerly intractable. This course will introduce the student to the basic methods and techniques of modern Bayesian inference, including parameter estimation, MCMC simulation, hypothesis testing, and model selection/model averaging in the context of practical problems.

Data Analysis: A Bayesian Tutorial (D. S. Sivia. Oxford: Clarendon Press)

Measuring Uncertainty: An Introduction to Bayesian Inference (Samuel Schmitt. Reading, MA: Addison-Wesley, to be reprinted as a course packet, obtain at Texas Union).

(Optional) Bayesian Data Analysis, Second Edition (Andrew Gelman, John B. Carlin, Hal S. Stern and Donald B. Rubin. London: Chapman and Hall)

Review of probability calculus. Interpretations of probability (e.g., frequency, degree-of-belief). Coherence. Bayes' Theorem. Joint, conditional, and marginal distributions. Independence. Prior distribution, likelihood, and posterior distribution. Bayesian estimation and inference on discrete state spaces. Likelihoods, odds, and Bayes factors. Simple and composite alternatives.
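To make the prior-to-posterior step concrete, here is a minimal sketch of Bayes' Theorem on a discrete state space, written in Python for portability (the course's own examples use R). The prevalence and test-accuracy numbers are invented for illustration and are not taken from the course notes.

```python
# A worked example of Bayes' Theorem on a discrete (two-state) space:
# a diagnostic test with 1% prevalence, 95% sensitivity, 90% specificity.
# (All numbers are hypothetical.)

prior = {"disease": 0.01, "healthy": 0.99}       # P(state)
likelihood = {"disease": 0.95, "healthy": 0.10}  # P(positive test | state)

# Bayes' Theorem: posterior is proportional to prior times likelihood,
# normalized by summing over all states (the "evidence").
unnorm = {s: prior[s] * likelihood[s] for s in prior}
evidence = sum(unnorm.values())                  # P(positive test)
posterior = {s: p / evidence for s, p in unnorm.items()}

print(posterior["disease"])  # about 0.088: mostly still a false alarm
```

The same three-step pattern (multiply prior by likelihood, sum for the evidence, normalize) underlies every discrete-state example early in the course.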

Markov Chain Monte Carlo (MCMC) simulation as a method for practical calculation of Bayesian results. The Gibbs sampler. Metropolis-Hastings sampling. Metropolis-within-Gibbs sampling. Computer tools, e.g., BUGS, S Plus, R.
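As a sketch of the core Metropolis idea (this is not the course's R shell program), here is a minimal random-walk Metropolis sampler in Python; the standard normal target and the step size are arbitrary choices for illustration.

```python
import math
import random

random.seed(1)

def log_target(x):
    # Unnormalized log-density of the target; here a standard normal.
    return -0.5 * x * x

def metropolis(n_samples, step=1.0, x0=0.0):
    # Random-walk Metropolis: propose x' = x + Normal(0, step^2),
    # accept with probability min(1, target(x') / target(x)).
    x, chain = x0, []
    for _ in range(n_samples):
        proposal = x + random.gauss(0.0, step)
        if math.log(random.random()) < log_target(proposal) - log_target(x):
            x = proposal          # accept; otherwise keep the current state
        chain.append(x)
    return chain

chain = metropolis(20000)
mean = sum(chain) / len(chain)    # should be near 0 for a N(0,1) target
```

Note that only the *ratio* of target densities is needed, so the normalizing constant of the posterior never has to be computed; this is what makes MCMC practical for the large problems mentioned above.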

Bayesian point and interval parameter estimation. Bayesian credible intervals. Comparison with frequentist parameter estimation and confidence intervals. Bayesian inference on Gaussian distributions. Maximum Likelihood estimation as a Bayesian approximation. Laplace's approximation. Bayesian inference in non-Gaussian cases, e.g., Poisson, Cauchy, and arbitrary distributions. Linear and nonlinear models. Errors-in-variables models. Selection models. Hierarchical models.
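Once you have posterior draws (from MCMC or otherwise), an equal-tailed credible interval is just a pair of quantiles. Here is a small Python sketch, with simulated normal draws standing in for real MCMC output; the posterior N(5, 2^2) is a made-up stand-in.

```python
import random

random.seed(2)
# Simulated posterior draws standing in for MCMC output: N(5, 2^2).
posterior_draws = sorted(random.gauss(5.0, 2.0) for _ in range(100000))

def central_interval(draws, level=0.95):
    # draws must be sorted; returns the equal-tailed credible interval,
    # i.e. the (1-level)/2 and (1+level)/2 sample quantiles.
    lo = draws[int(len(draws) * (1 - level) / 2)]
    hi = draws[int(len(draws) * (1 + level) / 2) - 1]
    return lo, hi

lo, hi = central_interval(posterior_draws)
# For a N(5, 2^2) posterior this is close to 5 +/- 1.96*2, i.e. (1.08, 8.92).
```

Unlike a frequentist confidence interval, this interval has a direct probability reading: given the data and model, the parameter lies in (lo, hi) with 95% posterior probability.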

Prior selection. Subjective and objective priors. Priors as a way to encode actual prior knowledge. Sensitivity of the posterior distribution to the prior. Priors for hierarchical models.
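One way to see prior sensitivity concretely is the conjugate Beta-Binomial model, where the posterior mean is an explicit compromise between prior and data. A short Python sketch with invented data:

```python
# Beta-Binomial prior sensitivity: with a Beta(a, b) prior on a success
# probability and k successes in n trials, the posterior is
# Beta(a + k, b + n - k), so the posterior mean is (a + k) / (a + b + n).
# (The data values below are made up for illustration.)

def posterior_mean(a, b, k, n):
    return (a + k) / (a + b + n)

k, n = 7, 10  # 7 successes observed in 10 trials

for a, b, label in [(1.0, 1.0, "uniform prior"),
                    (0.5, 0.5, "Jeffreys prior"),
                    (50.0, 50.0, "strong prior centered at 0.5")]:
    print(label, posterior_mean(a, b, k, n))
```

With only 10 observations the two weak priors give nearly the same posterior mean (about 0.67 and 0.68), while the strong prior drags it to about 0.52; as n grows, the data dominate all three, which is the sensitivity behavior the syllabus topic refers to.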

Bayesian hypothesis testing. Comparison with frequentist hypothesis testing. Model selection and model averaging. Reversible jump MCMC for models of variable size. Approximations, e.g., AIC, BIC. Philosophical issues, likelihood principle, and the Bayesian Ockham's Razor.
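For two *simple* hypotheses, the Bayes factor reduces to a plain likelihood ratio. A small Python illustration with invented coin-flip data (the composite case, which needs an integral over the parameter, is what the reversible jump and BIC machinery above addresses):

```python
from math import comb

# Bayes factor for two simple hypotheses about a coin:
# H0: p = 0.5 versus H1: p = 0.7, given 7 heads in 10 flips.
# (Hypothetical numbers, chosen only for illustration.)

def binom_lik(p, k, n):
    # Binomial likelihood of k successes in n trials.
    return comb(n, k) * p**k * (1 - p)**(n - k)

k, n = 7, 10
bf_10 = binom_lik(0.7, k, n) / binom_lik(0.5, k, n)  # about 2.28

# Posterior odds = Bayes factor x prior odds, so with even prior odds
# the posterior odds in favor of H1 equal the Bayes factor itself.
print(bf_10)
```

A Bayes factor near 2 is very weak evidence; note also that the binomial coefficient cancels in the ratio, so only the likelihood *shapes* at the two hypothesized values matter.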

There will be two exams, and no final. There will be substantial out-of-class
assignments. The course grade will be based 40% on the exams and 60% on
the assignments. I want students to work on the programming assignments
in groups of two or three. This not only saves me time in grading but allows
for cooperative learning and teamwork. Statistics is a cooperative discipline
where the scientist and the statistician work together as a team.

The University of Texas at Austin provides upon request appropriate academic accommodations for qualified students with disabilities. For more information, contact the Office of the Dean of Students at 471-6259, or 471-4641 (TTY).

You may send me an E-mail message right now, if you have any questions or comments about the course.

Here is a link to my Home Page. From it you can also locate other information about astronomy on the worldwide web. There is information about our department, about McDonald Observatory, and links to other astronomical pages.

This page is under construction. Stay tuned for new material.


This page was served to you by Quasar. It was last modified on 041115.

My home page is located here. My office number is RLM 16.236. Office hours are MWF 2-3 or by appointment.

All materials at this website Copyright (C) 1994-2004 by William H. Jefferys.
All Rights Reserved.