Bayesian Inference

Tags :: Bayesian Statistics

Inference is associated with obtaining conclusions based on evidence and reasoning. Bayesian inference is a form of statistical inference that combines probability distributions in order to obtain other distributions.

We estimate the value of a parameter \(\theta\) given that we have observed some data \(Y\):

\[ p(\theta | Y ) = \frac{p(Y|\theta)p(\theta)}{p(Y)} \]
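To make the formula concrete, here is a minimal sketch in Python (the coin-flip setup and the two candidate values of \(\theta\) are hypothetical, chosen only for illustration):

```python
# Two candidate values of theta (probability of heads) and one observed head.
prior = {0.5: 0.5, 0.8: 0.5}        # p(theta): both coins equally likely a priori
likelihood = {0.5: 0.5, 0.8: 0.8}   # p(Y = heads | theta)

# Marginal likelihood p(Y): average the likelihood over the prior.
p_y = sum(likelihood[t] * prior[t] for t in prior)

# Posterior p(theta | Y) via Bayes' theorem.
posterior = {t: likelihood[t] * prior[t] / p_y for t in prior}
print(posterior)  # {0.5: 0.3846..., 0.8: 0.6153...}
```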

The likelihood function links the observed data with the unknown parameters, while the prior represents our uncertainty about the parameters before observing the data \(Y\).

By multiplying them we get the posterior distribution, which is the joint distribution over all the parameters in the model conditioned on the observed data.
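As a standard worked example (the \(\text{Beta}(\alpha, \beta)\) prior and binomial likelihood are assumptions for illustration, not taken from the text above): with \(y\) successes in \(N\) trials,

\[ p(\theta|Y) = \frac{1}{p(Y)}\binom{N}{y}\theta^{y}(1-\theta)^{N-y}\;\frac{\theta^{\alpha-1}(1-\theta)^{\beta-1}}{B(\alpha,\beta)} \]

and collecting the powers of \(\theta\) shows that the posterior is itself a \(\text{Beta}(\alpha+y,\; \beta+N-y)\) distribution.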

The marginal likelihood \(p(Y)\) is generally not computed; since it does not depend on \(\theta\), it acts only as a normalizing constant, so it is very common to see Bayes' theorem expressed as a proportionality:

\[ p(\theta|Y) \propto p(Y|\theta)\; p(\theta) \]
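A minimal grid-approximation sketch of this proportionality (the \(\text{Beta}(2, 2)\) prior and the data, 6 heads in 9 trials, are hypothetical choices):

```python
import numpy as np
from scipy import stats

grid = np.linspace(0, 1, 200)                  # candidate values of theta
prior = stats.beta(2, 2).pdf(grid)             # p(theta) evaluated on the grid
likelihood = stats.binom(n=9, p=grid).pmf(6)   # p(Y | theta) for each theta

unnormalized = likelihood * prior              # proportional to p(theta | Y)
posterior = unnormalized / unnormalized.sum()  # normalize over the grid
```

Dividing by the sum plays the role of \(p(Y)\): it turns the product into a proper distribution over the grid without ever computing the marginal likelihood analytically.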

As the posterior is always a distribution, we can think of it as the logical consequence of combining a model with data: probabilistic statements that are guaranteed to be mathematically consistent.

However, in the real world our results are conditioned not only on the data but also on the models, and bad data or bad models can lead to nonsensical statements.