This website contains part of the material I use for the semester-long course “Introduction to Bayesian statistics applied to life sciences” that I teach at the University of Florida. The course is geared toward students who have not been formally trained as statisticians, so it does not rely on linear algebra or advanced calculus. However, the material does require some understanding of basic calculus and distribution theory, as well as a good grasp of programming.

Note: I am still putting this website together, so some of the material is not posted yet and there may be some rough edges. If you have any suggestions or comments, feel free to contact me (drvalle at ufl dot edu).

Intro material

  1. Approach to teaching Bayes

  2. Introduction to Bayes

  3. Probabilities: a quick example on the death penalty

  4. Probability rules

  5. Bayes' theorem

  6. Quick overview of PMFs and PDFs

  7. The likelihood function

  8. Maximum likelihood estimation (MLE)
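
The likelihood and MLE entries above can be illustrated with a minimal sketch. This is not part of the course materials (and it uses Python/NumPy with made-up coin-flip data, purely for illustration): for a Bernoulli model, the MLE of the success probability is the sample proportion, and a brute-force grid search over the log-likelihood agrees with the closed-form answer.

```python
import numpy as np

# Hypothetical data: 20 coin flips, 1 = heads (13 heads, 7 tails).
flips = np.array([1, 0, 1, 1, 0, 1, 0, 1, 1, 1,
                  0, 1, 0, 1, 1, 0, 1, 1, 0, 1])
k, n = flips.sum(), flips.size

# For a Bernoulli(p) model the log-likelihood is
#   log L(p) = k*log(p) + (n - k)*log(1 - p),
# which is maximized in closed form at p_hat = k/n.
p_hat = k / n
print(p_hat)  # 0.65

# A brute-force grid search over p confirms the closed-form MLE.
grid = np.linspace(0.01, 0.99, 99)
loglik = k * np.log(grid) + (n - k) * np.log(1 - grid)
print(grid[np.argmax(loglik)])  # ≈ 0.65
```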

Basics of Bayes

  1. Conjugate likelihood-prior pairs: a basketball example

  2. An example for the normal-normal conjugate pair

  3. Are priors useful? A cancer example

  4. Monte Carlo integration

  5. Generative models and inverse modeling

  6. Full conditional distributions

  7. Gibbs sampling

  8. Metropolis-Hastings algorithm

  9. Different sources of uncertainty and the predictive distribution
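
Two of the algorithms listed above fit in a few lines of code. The sketch below (Python/NumPy for illustration; none of it comes from the course materials) first uses Monte Carlo integration to approximate P(X > 1) for X ~ N(0, 1), then runs a bare-bones random-walk Metropolis sampler targeting that same normal density.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Monte Carlo integration approximates E[g(X)] by the sample average of g
# over draws from the distribution of X. Here X ~ N(0, 1) and
# g(x) = 1{x > 1}, so the target integral is P(X > 1) ≈ 0.1587.
draws = rng.normal(loc=0.0, scale=1.0, size=100_000)
estimate = (draws > 1.0).mean()
print(estimate)  # ≈ 0.1587, with Monte Carlo error of order 1/sqrt(100000)

# A toy random-walk Metropolis sampler targeting the same N(0, 1) density.
# The log-density is -x^2/2 (up to a constant), so the log acceptance
# ratio for a proposal is (x^2 - prop^2)/2.
def metropolis(n_iter, step=1.0):
    x = 0.0
    out = np.empty(n_iter)
    for i in range(n_iter):
        prop = x + rng.normal(scale=step)
        # Accept with probability min(1, target(prop)/target(x)).
        if np.log(rng.uniform()) < 0.5 * (x**2 - prop**2):
            x = prop
        out[i] = x
    return out

chain = metropolis(20_000)
print(chain.mean(), chain.std())  # ≈ 0 and ≈ 1
```

Note that the sampler only needs the target density up to a proportionality constant, since that constant cancels in the acceptance ratio.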

Estimating population size

Robust regression model

Models with latent continuous variables

FCDs for regression parameters

Model for left censored data

Probit regression model

Mixed models

Modeling spatial and temporal correlation

Common problems in JAGS (and other MCMC algorithms)


  1. Big sum and big product notation

  2. “Proportional to” notation

  3. Difference between “dnorm” and “rnorm”

  4. Basic math facts

  5. Derivation of normal-normal conjugate pair

  6. Why can we deduce the posterior distribution just by knowing \(p(\pi|X)\) up to a proportionality constant?
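
The dnorm/rnorm distinction in item 3 carries over directly to Python, which makes for a quick self-check (scipy stands in here for R; this is an illustration, not course material): the density function is deterministic, while the sampling function returns different values on every call.

```python
from scipy.stats import norm

# dnorm(x) in R *evaluates* the normal density at x; the scipy analogue is
# norm.pdf. It is deterministic: the same x always gives the same number.
print(norm.pdf(0.0))  # 1/sqrt(2*pi) ≈ 0.3989

# rnorm(n) in R *draws* n random values from the normal; the scipy analogue
# is norm.rvs. Each call gives different numbers unless you fix the seed.
samples = norm.rvs(loc=0.0, scale=1.0, size=5, random_state=0)
print(samples)  # five random draws
```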

Proudly hosted with UpDog.