Bayesian Statistics


Course dates: to be announced


With the advance of computing power and Markov chain Monte Carlo (MCMC) algorithms, Bayesian statistics has become a powerful alternative to traditional frequentist statistics. The philosophy behind Bayesian statistics is discussed. Practical examples are studied and analysed using the (freeware) program WinBUGS. Participants will be surprised how easily they can tackle problems that are quite complicated to handle with traditional frequentist statistics.


There are two branches of statistics: frequentist statistics and Bayesian statistics. These branches involve different concepts of probability: probability as a "frequency in the long run" versus probability as a "degree of belief". The "degree of belief" interpretation gives rise to the concept of a prior distribution. Bayesians combine two sources of information: prior information, summarised in a prior distribution, and data, represented by a model and its associated likelihood. The idea of a likelihood function and maximum likelihood estimation in frequentist statistics is briefly introduced (or refreshed). The two sources of information in Bayesian statistics are combined using a theorem of Thomas Bayes. This theorem forms the basis of Bayesian statistics. Bayes himself might well have been surprised by the consequences of his paper for statistics.
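In symbols (standard notation, not taken from the course text), Bayes' theorem for a parameter theta given data y reads:

```latex
% Posterior = likelihood x prior, normalised by the marginal probability of the data.
% As a function of \theta, the denominator is a constant, hence the proportionality.
p(\theta \mid y) = \frac{p(y \mid \theta)\, p(\theta)}{p(y)}
\;\propto\; p(y \mid \theta)\, p(\theta)
```

Here $p(\theta)$ is the prior distribution, $p(y \mid \theta)$ the likelihood, and $p(\theta \mid y)$ the posterior distribution.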

We briefly introduce (and probably refresh):

  • The concept of conditional probability
  • Bayes' theorem
  • An example with haemophilia. This example, which fits into the framework of frequentist statistics as well, has the advantage that the implementation of prior information is straightforward. It offers a first impression of how prior information and data are combined.
  • Combining the prior information (prior distribution) and the information in the data (likelihood) into a posterior distribution. The posterior distribution summarises all we know about a parameter or parameters based on the prior knowledge and the data.
  • Posterior inference = deriving conclusions from the posterior distribution.
  • Informative priors, non-informative priors, improper priors, conjugate priors.
  • Posterior inference with MCMC, Gibbs sampling and the WinBUGS package.
  • Examples with different models from different areas of application.
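To give a feel for what Gibbs sampling does under the hood (in the course itself this is handled by WinBUGS), here is a minimal, self-contained sketch in Python. It is not part of the course materials, and the data, priors, and names are hypothetical choices for illustration: data y_i ~ Normal(mu, 1/tau), with a Normal prior on the mean mu and a Gamma prior on the precision tau, so that both full conditionals are conjugate and can be sampled directly.

```python
# Illustrative Gibbs sampler sketch (hypothetical example, not course material).
import random
import statistics

random.seed(1)

# Synthetic data: 200 observations with "true" mean 5 and standard deviation 2.
data = [random.gauss(5.0, 2.0) for _ in range(200)]
n = len(data)

# Weak (hypothetical) priors: mu ~ Normal(0, 100), tau ~ Gamma(0.1, rate=0.1).
mu0, var0 = 0.0, 100.0
a0, b0 = 0.1, 0.1

mu, tau = 0.0, 1.0  # arbitrary starting values
mu_draws = []
for it in range(3000):
    # Full conditional of mu given tau: Normal (conjugacy of the Normal mean).
    prec = 1.0 / var0 + n * tau
    mean = (mu0 / var0 + tau * sum(data)) / prec
    mu = random.gauss(mean, (1.0 / prec) ** 0.5)
    # Full conditional of tau given mu: Gamma (conjugacy of the precision).
    rate = b0 + 0.5 * sum((y - mu) ** 2 for y in data)
    tau = random.gammavariate(a0 + n / 2.0, 1.0 / rate)  # scale = 1/rate
    if it >= 500:  # discard the first 500 draws as burn-in
        mu_draws.append(mu)

posterior_mean_mu = statistics.mean(mu_draws)
```

Because each full conditional is a known distribution, no tuning is needed; the retained draws approximate the joint posterior, and with such a weak prior the posterior mean of mu lands close to the sample mean.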
General information
Target Group The course is aimed at PhD candidates and other academics
Group Size 24 participants
Course duration 2 days
Language of instruction English
Frequency of recurrence Once a year (Autumn)
Number of credits 0.6 ECTS
Lecturers Dr. Gerrit Gort, Dr. Bas Engel
Prior knowledge Basic Statistics
Location Wageningen University Campus
Options for accommodation Accommodation is not included in the fee of the course, but there are several possibilities in Wageningen, including B&Bs and hotels. Another option is Short Stay Wageningen. Furthermore, Airbnb offers several rooms in the area. Note that besides the restaurants in Wageningen, there are also options to have dinner at Wageningen Campus.
More information

Claudius van de Vijver (PE&RC)
Phone: +31 (0) 317 485116

Lennart Suselbeek (PE&RC)
Phone: +31 (0) 317 485426

Registration of interest

This course is currently not scheduled. However, if you register your interest in this activity below, we will inform you as soon as the course is scheduled and registration of participation opens.