We all know that exact inference in probabilistic graphical models usually incurs a very high computational cost, so approximate inference is more common in practice. Approximate methods generally fall into two classes: randomized approximations, such as MCMC (Markov Chain Monte Carlo), and deterministic approximations, such as variational inference. This article focuses on sampling techniques. Their basic idea is that directly computing or approximating the expectation of some variable is often easier than inferring the full probability distribution. The "watermelon book" (Zhou Zhihua's *Machine Learning*) gives a very intuitive example: suppose our goal is to compute the expectation of a function f(x) under a probability density p(x), i.e. E[f] = ∫ f(x) p(x) dx.
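If we can draw samples x_1, ..., x_N from p(x), the expectation is approximated by the sample average of f. A minimal sketch of this Monte Carlo estimate (the helper name `mc_expectation`, the uniform density, and the test function f(x) = x² are illustrative assumptions, not from the text):

```python
from random import random

def mc_expectation(f, sample_p, N=100000):
    # Monte Carlo estimate: E_p[f(x)] ~ (1/N) * sum of f(x_i) with x_i ~ p(x).
    return sum(f(sample_p()) for _ in range(N)) / N

# Example: for x ~ Uniform(0, 1), E[x^2] = 1/3.
estimate = mc_expectation(lambda x: x * x, random)
```

The estimate converges at rate O(1/sqrt(N)) regardless of the dimension of x, which is exactly why sampling is attractive when the distribution itself is hard to characterize.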
from math import exp
from random import random

def _discrete(N_i, x_i, WP, WL):
    # Use the model to get the energy when the pixel takes value c.
    potential = lambda c: sum([WP for z in N_i if z == c]) + \
                          (WL if x_i == c else 0.0)
    # Now we can evaluate the Gibbs distribution and find the
    # probability that the pixel y_i is white.
    Pwhite = exp(potential(1)) / sum([exp(potential(c)) for c in [0, 1]])
    return 1.0 if random() < Pwhite else 0.0
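A quick sanity check helps confirm the conditional behaves as expected: when every neighbour and the observed pixel agree on white, the sampler should return 1.0 almost surely. The weight values below are illustrative assumptions (the block repeats the sampler so it runs standalone):

```python
from math import exp
from random import random, seed

# Copy of the discrete Gibbs conditional from the text.
def _discrete(N_i, x_i, WP, WL):
    potential = lambda c: sum([WP for z in N_i if z == c]) + \
                          (WL if x_i == c else 0.0)
    Pwhite = exp(potential(1)) / sum([exp(potential(c)) for c in [0, 1]])
    return 1.0 if random() < Pwhite else 0.0

seed(0)
# Four white neighbours, a white observation, and strong (assumed) coupling:
# Pwhite = exp(10) / (exp(10) + exp(0)), which is essentially 1.
draws = [_discrete([1, 1, 1, 1], 1, WP=2.0, WL=2.0) for _ in range(1000)]
white_fraction = sum(draws) / len(draws)
```

Note that `Pwhite` is just a two-way softmax over the potentials of the values 0 and 1, so flipping the neighbourhood to all black drives the probability symmetrically toward 0.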
from random import normalvariate as normal

clamp = lambda x, low, high: low if x < low else (high if x > high else x)

def _continuous(N_i, x_i, WP, WL):
    # Calculate the parameters of the conditional normal distribution.
    variance = 0.5 / (WP * len(N_i) + WL)
    mean = 2 * variance * (WL * x_i + sum([WP * z for z in N_i]))
    stdev = variance ** 0.5
    # Draw a standard normal, shift and scale it, and clamp to [0, 1].
    # (Drawing normal(mean, stdev) and then computing mean + z*stdev would
    # apply the mean and stdev twice.)
    z = normal(0.0, 1.0)
    return clamp(mean + z * stdev, 0.0, 1.0)
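As with the discrete case, we can sanity-check the continuous conditional: with neighbours and observation all equal to 0.8 and strong coupling, the conditional mean works out to exactly 0.8, so the average of many draws should land close to it. The weights are illustrative assumptions, and the sampler is repeated so the block runs standalone:

```python
from random import normalvariate as normal, seed

clamp = lambda x, low, high: low if x < low else (high if x > high else x)

# Copy of the continuous Gibbs conditional from the text.
def _continuous(N_i, x_i, WP, WL):
    variance = 0.5 / (WP * len(N_i) + WL)
    mean = 2 * variance * (WL * x_i + sum([WP * z for z in N_i]))
    stdev = variance ** 0.5
    z = normal(0.0, 1.0)
    return clamp(mean + z * stdev, 0.0, 1.0)

seed(0)
# Two neighbours at 0.8, observation 0.8, assumed strong weights:
# variance = 0.5/30, mean = (1/30) * (8 + 16) = 0.8.
draws = [_continuous([0.8, 0.8], 0.8, WP=10.0, WL=10.0) for _ in range(10000)]
avg = sum(draws) / len(draws)
```

Strong coupling keeps the standard deviation small (about 0.13 here), so the clamp to [0, 1] rarely activates and barely biases the average; with weak coupling the truncation bias would be noticeable.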