- published: 23 Jun 2011
- views: 49849
In statistics, Bayesian inference is a method of inference in which Bayes' rule is used to update the probability estimate for a hypothesis as additional evidence is acquired. Bayesian updating is an important technique throughout statistics, and especially in mathematical statistics: exhibiting a Bayesian derivation for a statistical method guarantees that, at least in some cases, the method performs as well as any competing method. Bayesian updating is particularly important in the dynamic analysis of a sequence of data. Bayesian inference has found application in a range of fields, including science, engineering, medicine, and law.
In the philosophy of decision theory, Bayesian inference is closely related to discussions of subjective probability, often called "Bayesian probability." Bayesian probability provides a rational method for updating beliefs; however, Ian Hacking and Bas van Fraassen have argued that non-Bayesian updating rules are also compatible with rationality.
Bayesian inference derives the posterior probability as a consequence of two antecedents: a prior probability and a "likelihood function" obtained from a probability model for the data to be observed. Bayesian inference computes the posterior probability according to Bayes' rule: P(H | E) = P(E | H) P(H) / P(E), where P(H) is the prior probability of the hypothesis H, P(E | H) is the likelihood of the evidence E given H, P(E) is the marginal probability of the evidence, and P(H | E) is the posterior probability of H given E.
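As a minimal illustration of the sequential updating described above (a sketch written for this page, not code taken from any of the talks listed below), the following Python snippet assumes a Beta prior on an unknown coin bias and applies Bayes' rule after each flip; the conjugate Beta-Bernoulli pair keeps each update in closed form, and the data values are invented for the example.

```python
# Sequential Bayesian updating of a coin's unknown bias theta.
# A Beta(alpha, beta) prior is conjugate to the Bernoulli likelihood,
# so each observation updates the posterior in closed form.

alpha, beta = 1.0, 1.0                     # Beta(1, 1): uniform prior on theta
flips = [1, 0, 1, 1, 0, 1, 1, 1]           # 1 = heads, 0 = tails (invented data)

for i, x in enumerate(flips, start=1):
    alpha += x                             # a head adds one "success"
    beta += 1 - x                          # a tail adds one "failure"
    mean = alpha / (alpha + beta)          # posterior mean of theta
    print(f"after flip {i}: posterior is Beta({alpha:.0f}, {beta:.0f}), mean = {mean:.3f}")
```

The same posterior results whether the flips are processed one at a time or all at once, which is what makes Bayesian updating well suited to the dynamic analysis of a sequence of data.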
(ML 7.1) Bayesian inference - A simple example
21. Bayesian Statistical Inference I
24 - Bayesian inference in practice - posterior distribution: example Disease prevalence
Bayesian Inference and MCMC with Bob Carpenter
Bayesian Inference in R
Basic Inference in Bayesian Networks
The Equation that will CHANGE YOUR LIFE! (Informal Bayesian Inference for Skeptics)
Bayesian Inference 1 - Zoubin Ghahramani - MLSS 2013 Tübingen
Bayesian Inference and Data Analytics 11.04.2015
John Salvatier: Bayesian inference with PyMC 3