Bayesian Adaptive Research Design, Prof. Kruschke

P747 Bayesian Adaptive Research Design.
Spring 2011, Fridays 9:05am-11:50am, Room 230 of the Psychology Bldg.
(P747, registrar section 16650)

Prof. John K. Kruschke

Course description:
     One of the most exciting benefits of Bayesian data analysis is the ability to evaluate data "on the fly," as they are collected, and to decide whether to continue data collection and how to optimize the experimental treatment for the next observation. Bayesian adaptive research design can be especially helpful in clinical applications, where experimental treatments with null or detrimental effects should be discontinued as quickly as possible, and treatments with clearly beneficial effects should be disseminated as quickly as possible. The FDA has recently endorsed Bayesian adaptive design in its Guidance for the Use of Bayesian Statistics in Medical Device Clinical Trials (see Section 2.6 on the benefits of Bayesian methods).
     Bayesian adaptive design is useful for any application where efficient data collection is desired. It has recently been applied to experiments in cognitive science (e.g., memory and psychophysics). It has been extensively applied in adaptive psychometric testing, whereby the most diagnostic queries are automatically selected based on data collected so far. It has even been used in astronomy to aid efficient exploration of the skies. The statistical method can also be considered as a model for how humans, as natural experimenters in the world, efficiently and actively learn.
     The course will focus first on Bayesian adaptive design in clinical research, and then explore other applications.
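     To give a concrete flavor of what "evaluating data on the fly" involves, below is a minimal R sketch of sequential monitoring of a single treatment with a Beta-Bernoulli model. It is not taken from the Berry et al. textbook; the batch size, prior, null value, and stopping threshold are illustrative assumptions. After each batch of observations the posterior is updated by conjugacy, and data collection stops as soon as the 95% highest-density interval (HDI) excludes a null response rate of 0.5:

# Minimal illustrative sketch of Bayesian sequential monitoring (Beta-Bernoulli).
# All settings below are assumptions for illustration, not a recommended design.

hdiBeta <- function(a, b, credMass = 0.95) {
  # Width of the interval whose lower tail probability is lowTail.
  widthAt <- function(lowTail) {
    qbeta(lowTail + credMass, a, b) - qbeta(lowTail, a, b)
  }
  # The HDI is the narrowest interval containing credMass probability.
  lowTail <- optimize(widthAt, interval = c(0, 1 - credMass))$minimum
  c(qbeta(lowTail, a, b), qbeta(lowTail + credMass, a, b))
}

set.seed(1)
trueTheta <- 0.70          # "true" success probability used to simulate data
aPrior <- 1; bPrior <- 1   # uniform Beta(1,1) prior on the success probability
nPerBatch <- 10            # observations collected between interim analyses
maxN <- 200                # cap on total sample size
successes <- 0; trials <- 0

repeat {
  # Collect one more batch of data.
  y <- rbinom(nPerBatch, size = 1, prob = trueTheta)
  successes <- successes + sum(y)
  trials <- trials + nPerBatch
  # Conjugate update: posterior is Beta(aPrior + successes, bPrior + failures).
  hdi <- hdiBeta(aPrior + successes, bPrior + trials - successes)
  cat(sprintf("N = %3d   95%% HDI = [%.3f, %.3f]\n", trials, hdi[1], hdi[2]))
  # Stop as soon as the HDI excludes the null value 0.5, or at the sample cap.
  if (hdi[1] > 0.5 || hdi[2] < 0.5 || trials >= maxN) break
}

     The course readings use richer models and decision rules, but the underlying logic is the same: compute the posterior after each interim batch and apply a pre-specified rule to decide whether to stop or how to allocate the next observations.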

Required textbook:

Berry, S. M., Carlin, B. P., Lee, J. J., & Muller, P. (2011). Bayesian Adaptive Methods for Clinical Trials. Boca Raton: CRC Press. ISBN: 9781439825488. The book's companion web sites provide programs for the examples.

Other required readings will explore Bayesian adaptive methods in other (non-clinical) settings, such as experiment design in cognitive science, adaptive test design, and modeling of human active learning. Candidate readings are listed below (if you have specific readings to recommend, please let the instructor know as soon as possible).


• Cavagnaro, D. R., Myung, J. I., Pitt, M. A., & Kujala, J. V. (2010). Adaptive design optimization: A mutual information-based approach to model discrimination in cognitive science. Neural Computation, 22, 887-905.
• Kujala, J. V. & Lukka, T. J. (2006). Bayesian adaptive estimation: The next dimension. Journal of Mathematical Psychology, 50(4), 369-389.
• Loredo, T. J., & Chernoff, D. F. (2003). Bayesian adaptive exploration. In: E. D. Feigelson & G. J. Babu (Eds.), Statistical Challenges in Astronomy, Ch. 4, pp. 57-70. New York: Springer.
• van der Linden, W. J. & Pashley, P. J. (2010). Item selection and ability estimation in adaptive testing. In: W. J. van der Linden & C. A. W. Glas (Eds.), Elements of Adaptive Testing, Ch. 1, pp. 3-30. New York: Springer. DOI: 10.1007/978-0-387-85461-8_1
• "Goals, Power, and Sample Size." Chapter 13 from Kruschke, J. K. (2010). Doing Bayesian data analysis: A tutorial with R and BUGS. Essential background before getting into the Berry et al. textbook.
Kruschke, J. K. (2008). Bayesian approaches to associative learning: From passive to active learning. Learning & Behavior, 36(3), 210-226. Active learning as optimal experiment design.

Pre-requisites:
A previous course in Bayesian data analysis (such as P533), including experience with R and BUGS. We will use R and BUGS to carry out the examples presented in the required textbook (see the textbook's web site).

Course format:
This is a seminar-style course, not a lecture-style course. Students will lead presentations of material and demonstrations of methods.

Course grading:
Grades will be based on presentation quality and on participation when not presenting. Exercises or mini-projects recommended by presenters may also be assigned and will count toward the grade.