This is a companion post to my upcoming PyMCon talk, Learning Bayesian Statistics with Pokemon GO. I want to catalog resources that helped me to learn Bayesian inference for the benefit of others trying to do the same.
Do you have suggestions for this list? Feel free to email me@<this domain>.
Bayesian Data Analysis 3 by Andrew Gelman and several others is one of the gold-standard books for learning about Bayesian modeling. It is fairly mathematically involved, and I recommend using it as a reference. The book is available as a free PDF for non-commercial purposes.
Statistical Rethinking by Richard McElreath is another highly regarded book on Bayesian statistics. The conceptual explanations are among the best I’ve found anywhere. A second edition was recently released.
Probabilistic Programming & Bayesian Methods for Hackers by Cameron Davidson-Pilon is a free book in the form of a collection of Jupyter Notebooks. It focuses on practical applications of Bayesian methods, presenting examples before developing any theory, and is a pretty easy read.
Bayesian Analysis with Python by Osvaldo Martin is a more practical reference than BDA3 or Rethinking, with an emphasis on implementing ideas as you learn them.
A Conceptual Introduction to Hamiltonian Monte Carlo by Michael Betancourt is a research paper motivating HMC from first principles. It’s a dense but informative read.
PyMC3 is the main library used for MCMC in Python. Its API is relatively intuitive, though more complex models can be harder to express because it’s built upon the now-defunct Theano. The PyMC3 example notebooks are an excellent introduction to what’s possible with Bayesian modeling.
The PyMC developers also maintain a repository of PyMC3 educational resources. This mostly consists of exercises from BDA3, Statistical Rethinking, and a couple of other books that have been worked out in PyMC3.
Stan is the most widely used domain-specific language for Bayesian modeling. Its forums are packed with educational resources (look under the “modeling” tag in particular). The Stan User’s Guide is also worth reading for developing your conceptual understanding of Bayesian methods. Stan has a large, interdisciplinary community, and its resources are generally applicable to anyone working in this space.
Pyro is another Python package; it focuses on variational inference methods but also implements MCMC samplers. Pyro is built on PyTorch, which (in my opinion) is nicer to work with than Theano or TensorFlow. Pyro’s documentation is harder to get started with than PyMC3’s, and variational inference is more complex than MCMC, but both Pyro and VI are quite powerful.
NumPyro (Github) is an alternative to Pyro, powered by JAX and just-in-time compilation. While it maintains API parity with Pyro, in practice it has always felt more MCMC-focused than Pyro itself. Thanks to JAX’s JIT compilation, it is substantially faster than PyMC3.
You might also refer to Colin Carroll’s blog post A Tour of Probabilistic Programming APIs.
Bayesian Inference is Just Counting - Richard McElreath is a gentle introduction to Bayesian inference and the different way of thinking it requires (he says it covers roughly the first 6 chapters of the Statistical Rethinking book).
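McElreath’s “counting” view can be sketched with a tiny grid approximation: lay out candidate parameter values, weight each by how well it predicts the data, and normalize. This is a minimal illustration in plain Python, not code from the talk, and the counts (6 successes in 100 trials) are made up for the example.

```python
# "Bayesian inference is just counting": weight each candidate parameter
# value by how well it predicts the observed data, then normalize.
from math import comb

def grid_posterior(successes, trials, grid_size=101):
    """Posterior over a binomial proportion p on a uniform grid,
    starting from a flat (uniform) prior."""
    grid = [i / (grid_size - 1) for i in range(grid_size)]
    # Binomial likelihood of the observed counts at each grid point.
    likelihood = [
        comb(trials, successes) * p**successes * (1 - p) ** (trials - successes)
        for p in grid
    ]
    total = sum(likelihood)
    # Normalizing the weights turns them into posterior probabilities.
    return grid, [l / total for l in likelihood]

grid, posterior = grid_posterior(successes=6, trials=100)
# With a flat prior, the posterior peaks at the observed frequency, 6/100.
best = grid[posterior.index(max(posterior))]
```

Real models need smarter machinery than a grid (that’s where MCMC comes in), but the counting intuition is the same.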
Eric Ma - Beyond Two Groups: Generalized Bayesian A/B[/C/D/E…] Testing - from PyCon 2019
(And I’m sure there will be many more from PyMCon soon!)
For a general understanding of MCMC:
- A practical guide to MCMC by Justin Ellis
- Hamiltonian Monte Carlo from Scratch by Colin Carroll
- Markov Chains: why walk when you can flow? by Richard McElreath
- Prior choice recommendations from the Stan developers
- How would you explain MCMC to a layperson? on Stats StackExchange
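To make the idea behind those links concrete, here is a minimal random-walk Metropolis sampler using only the standard library. It’s a sketch for building intuition, not a production sampler, and the standard-normal target is my choice for the example.

```python
# Minimal random-walk Metropolis sampler targeting a standard normal.
import math
import random

def log_target(x):
    # Unnormalized log-density of a standard normal.
    return -0.5 * x * x

def metropolis(n_samples, step=1.0, seed=42):
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, target(proposal) / target(x)),
        # computed on the log scale for numerical stability.
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return samples

samples = metropolis(20000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
# For a standard normal target, mean should be near 0 and variance near 1.
```

Samplers like HMC (covered in the Betancourt paper and Colin Carroll’s post above) replace the blind random-walk proposal with gradient-informed trajectories, which is why they scale so much better.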
A set of Bayesian Notes by Jeffrey B. Arnold, though unmaintained, is helpful for developing some of the theory behind Bayesian inference.
The Bayes Way links to lots of other resources.
Andrew Gelman’s blog is updated several times a day. It’s a combination of posts about Bayesian thinking, research, education, and whatever else crosses his mind. Definitely worth following, if only to increase your general knowledge.
Andrew’s Spinner, a homework question posed on Twitter that turned out to be surprisingly interesting.
Why hierarchical models are awesome, tricky, and Bayesian by Thomas Wiecki
Pokemon GO shiny rates: a Bayesian perspective (I wrote this)
Hierarchical Partial Pooling for Repeated Binary Trials (aka “the baseball example”)
Bayesian methods for multilevel modeling (aka “the radon example”)
Bayesian product ranking at Wayfair - the rare industry use case! (I have no affiliation with Wayfair)
Bayesian estimation supersedes the t-test, which shows how Bayesian methods offer a more informative alternative to frequentist statistical techniques