**Definition:** Let $m$ be a [generative model](/D/gm) with model parameters $\theta$ implying the [likelihood function](/D/lf) $p(y \vert \theta, m)$. Moreover, assume a [prior distribution](/D/prior) $p(\theta \vert m)$, a resulting [posterior distribution](/D/post) $p(\theta \vert y, m)$ and an [approximate](/D/vb) [posterior distribution](/D/post) $q(\theta)$. Then, the [Variational Bayesian](/D/vb) [log model evidence](/D/lme) is the expectation of the [log-likelihood function](/D/llf) with respect to the approximate posterior, minus the [Kullback-Leibler divergence](/D/kl) between approximate posterior and prior distribution:

$$\mathrm{F}(m) = \left\langle \log p(y \vert \theta, m) \right\rangle_{q(\theta)} - \mathrm{KL}\left[ q(\theta) \,\vert\vert\, p(\theta \vert m) \right] \; .$$
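As a minimal numeric sketch of this definition (not part of the original source), the quantity above can be computed in closed form for a conjugate Gaussian model with one observation $y \sim \mathcal{N}(\theta, \sigma^2)$ and prior $\theta \sim \mathcal{N}(\mu_0, \tau_0^2)$. If $q(\theta)$ is chosen as the exact posterior, the expression equals the exact log model evidence $\log p(y \vert m)$; all variable names below are illustrative assumptions:

```python
import math

# Assumed toy model: y ~ N(theta, sigma^2), prior theta ~ N(mu0, tau0^2)
mu0, tau0 = 0.0, 1.0   # prior mean and standard deviation
sigma = 1.0            # known observation noise standard deviation
y = 2.0                # single observed data point

# Exact posterior q(theta) = N(mu_n, s_n^2) via conjugate updating
s_n2 = 1.0 / (1.0 / tau0**2 + 1.0 / sigma**2)
mu_n = s_n2 * (mu0 / tau0**2 + y / sigma**2)

# Expected log-likelihood under q(theta):
# E_q[-0.5*log(2*pi*sigma^2) - (y - theta)^2 / (2*sigma^2)]
exp_loglik = -0.5 * math.log(2 * math.pi * sigma**2) \
             - ((y - mu_n)**2 + s_n2) / (2 * sigma**2)

# KL divergence between two Gaussians: KL[ q(theta) || p(theta|m) ]
kl_q_prior = math.log(tau0 / math.sqrt(s_n2)) \
             + (s_n2 + (mu_n - mu0)**2) / (2 * tau0**2) - 0.5

# Variational Bayesian log model evidence (free energy)
F = exp_loglik - kl_q_prior

# Exact log model evidence: y ~ N(mu0, sigma^2 + tau0^2) marginally
log_evidence = -0.5 * math.log(2 * math.pi * (sigma**2 + tau0**2)) \
               - (y - mu0)**2 / (2 * (sigma**2 + tau0**2))

print(F, log_evidence)
```

Because $q(\theta)$ here is the true posterior, the KL divergence between $q(\theta)$ and $p(\theta \vert y, m)$ vanishes and the two printed values coincide; for any other $q(\theta)$, $\mathrm{F}(m)$ would be a strict lower bound on $\log p(y \vert m)$.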