Metropolis-Hastings

Tags :: Bayesian Statistics

General algorithm

  1. Initialize
    1. pick an initial state \(x_0\)
    2. set \(t = 0\)
  2. Iterate
    1. Generate a random candidate state \(x'\) according to the proposal distribution \(g(x'|x_t)\)
    2. Calculate the acceptance probability \[A(x', x_t) = \min\left(1, \frac{P(x')}{P(x_t)}\frac{g(x_t | x')}{g(x' | x_t)}\right)\]
    3. accept or reject
      1. generate a uniform random number \(u \in [0, 1]\)
      2. if \(u \leq A(x', x_t)\), accept the new state and set \(x_{t+1} = x'\)
      3. if \(u > A(x', x_t)\), reject the new state and carry the old state forward, \(x_{t+1} = x_t\)
    4. increment; set \(t = t+1\)
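
A minimal sketch of this loop in Python, assuming the target density \(P\) is known only up to a normalizing constant and the proposal \(g\) is a symmetric Gaussian random walk (so the \(g\)-ratio cancels); the function name and arguments are illustrative:

```python
import numpy as np

def metropolis_hastings(p, x0, n_steps, proposal_scale=1.0, rng=None):
    """Random-walk Metropolis-Hastings for an unnormalized density p.

    Uses a symmetric Gaussian proposal g(x'|x_t) = N(x_t, proposal_scale^2),
    so the Hastings ratio g(x_t|x')/g(x'|x_t) equals 1 and drops out.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = x0
    samples = np.empty(n_steps)
    for t in range(n_steps):
        # 1. generate a candidate state x' ~ g(x'|x_t)
        x_prop = x + proposal_scale * rng.standard_normal()
        # 2. acceptance probability A(x', x_t) = min(1, P(x')/P(x_t))
        a = min(1.0, p(x_prop) / p(x))
        # 3. accept or reject using a uniform draw u
        if rng.uniform() <= a:
            x = x_prop          # accept: x_{t+1} = x'
        # else: reject and carry x_t forward unchanged
        samples[t] = x
    return samples

# usage: sample from an unnormalized standard normal density
draws = metropolis_hastings(lambda x: np.exp(-0.5 * x**2), x0=0.0, n_steps=5000)
print(draws.mean(), draws.std())   # roughly 0 and 1
```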

Bayesian inference

When Metropolis-Hastings is used to draw samples from the posterior distribution of a statistical model, the acceptance probability is given by \[ P_{acc}(\theta_i \rightarrow \theta^*) = \min\left(1, \frac{\mathcal{L}(y|\theta^*)P(\theta^*)}{\mathcal{L}(y|\theta_i)P(\theta_i)}\frac{Q(\theta_i | \theta^*)}{Q(\theta^* | \theta_i)}\right) \]
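
As a hedged illustration, the sketch below samples the posterior of the mean of a normal model with known variance; the prior, proposal scale, and helper names are assumptions made for the example, and the ratio is evaluated on the log scale for numerical stability (with a symmetric proposal the \(Q\)-ratio is 1):

```python
import numpy as np

def mh_posterior(y, n_steps=10_000, step=0.5, seed=0):
    """Sample the posterior of a normal mean theta (known variance 1)
    under an assumed N(0, 10^2) prior, using a symmetric Gaussian random walk Q.

    Working with log L + log P avoids underflow; the test
    log u <= [log L(y|theta*) + log P(theta*)] - [log L(y|theta_i) + log P(theta_i)]
    is equivalent to the ratio form above when Q is symmetric.
    """
    rng = np.random.default_rng(seed)

    def log_post(theta):
        log_lik = -0.5 * np.sum((y - theta) ** 2)   # N(theta, 1) likelihood
        log_prior = -0.5 * (theta / 10.0) ** 2      # N(0, 10^2) prior
        return log_lik + log_prior

    theta = 0.0
    chain = np.empty(n_steps)
    for t in range(n_steps):
        theta_star = theta + step * rng.standard_normal()   # propose theta*
        if np.log(rng.uniform()) <= log_post(theta_star) - log_post(theta):
            theta = theta_star                               # accept
        chain[t] = theta                                     # else keep theta_i
    return chain

y = np.random.default_rng(1).normal(2.0, 1.0, size=50)   # synthetic data
chain = mh_posterior(y)
print(chain[2000:].mean())   # posterior mean, close to the sample mean of y
```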

It is important to note that Metropolis-Hastings is not an optimization method: it visits states in proportion to the target density rather than searching for its maximum.
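
One way to see this: with a symmetric proposal, a move to a state with half the target density is still accepted about half the time, so the chain deliberately spends time in lower-probability regions in proportion to their mass instead of climbing to the mode. A small check of that acceptance rate (the ratio value here is purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
ratio = 0.5                                    # P(x')/P(x_t): a "downhill" proposal
accepts = rng.uniform(size=100_000) <= min(1.0, ratio)
print(accepts.mean())                          # ~0.5: downhill moves accepted half the time
```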

