Commit 10b2e8a: added 4 proofs
1 parent effba3a

4 files changed: 458 additions & 0 deletions

P/cov-ind.md (57 additions)
---
layout: proof
mathjax: true

author: "Joram Soch"
affiliation: "BCCN Berlin"
e_mail: "joram.soch@bccn-berlin.de"
date: 2020-09-03 06:05:00

title: "Covariance of independent random variables"
chapter: "General Theorems"
section: "Probability theory"
topic: "Covariance"
theorem: "Covariance under independence"

sources:
- authors: "Wikipedia"
  year: 2020
  title: "Covariance"
  in: "Wikipedia, the free encyclopedia"
  pages: "retrieved on 2020-09-03"
  url: "https://en.wikipedia.org/wiki/Covariance#Uncorrelatedness_and_independence"

proof_id: "P158"
shortcut: "cov-ind"
username: "JoramSoch"
---


**Theorem:** Let $X$ and $Y$ be [independent](/D/ind) [random variables](/D/rvar). Then, the [covariance](/D/cov) of $X$ and $Y$ is zero:

$$ \label{eq:cov-ind}
X, Y \; \text{independent} \quad \Rightarrow \quad \mathrm{Cov}(X,Y) = 0 \; .
$$


**Proof:** The [covariance can be expressed in terms of expected values](/P/cov-mean) as

$$ \label{eq:cov-mean}
\mathrm{Cov}(X,Y) = \mathrm{E}(X\,Y) - \mathrm{E}(X) \, \mathrm{E}(Y) \; .
$$

For independent random variables, [the expected value of the product is equal to the product of the expected values](/P/mean-mult):

$$ \label{eq:mean-mult}
\mathrm{E}(X\,Y) = \mathrm{E}(X) \, \mathrm{E}(Y) \; .
$$

Taking \eqref{eq:cov-mean} and \eqref{eq:mean-mult} together, we have

$$ \label{eq:cov-ind-qed}
\begin{split}
\mathrm{Cov}(X,Y) &\overset{\eqref{eq:cov-mean}}{=} \mathrm{E}(X\,Y) - \mathrm{E}(X) \, \mathrm{E}(Y) \\
&\overset{\eqref{eq:mean-mult}}{=} \mathrm{E}(X) \, \mathrm{E}(Y) - \mathrm{E}(X) \, \mathrm{E}(Y) \\
&= 0 \; .
\end{split}
$$
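The theorem can be illustrated numerically (a sketch added here, not part of the original proof): for two independently generated samples, the sample covariance converges to the true covariance, which the theorem says is exactly zero. The distributions and sample size below are arbitrary choices.

```python
# Numerical illustration: the sample covariance of two independently
# drawn random variables is close to zero.
import numpy as np

rng = np.random.default_rng(42)
N = 1_000_000
X = rng.standard_normal(N)               # X ~ N(0, 1)
Y = rng.exponential(scale=2.0, size=N)   # Y ~ Exp with mean 2, independent of X

cov_xy = np.cov(X, Y)[0, 1]
# standard error of the sample covariance is sqrt(Var(X) Var(Y) / N) = 0.002,
# so a tolerance of 0.01 is about 5 standard errors
assert abs(cov_xy) < 0.01
```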

P/mblr-lme.md (130 additions)
---
layout: proof
mathjax: true

author: "Joram Soch"
affiliation: "BCCN Berlin"
e_mail: "joram.soch@bccn-berlin.de"
date: 2020-09-03 09:23:00

title: "Log model evidence for multivariate Bayesian linear regression"
chapter: "Statistical Models"
section: "Multivariate normal data"
topic: "Multivariate Bayesian linear regression"
theorem: "Log model evidence"

sources:

proof_id: "P161"
shortcut: "mblr-lme"
username: "JoramSoch"
---


**Theorem:** Let

$$ \label{eq:GLM}
Y = X B + E, \; E \sim \mathcal{MN}(0, V, \Sigma)
$$

be a [general linear model](/D/glm) with measured $n \times v$ data matrix $Y$, known $n \times p$ design matrix $X$, known $n \times n$ [covariance structure](/D/matn) $V$ as well as unknown $p \times v$ regression coefficients $B$ and unknown $v \times v$ [noise covariance](/D/matn) $\Sigma$. Moreover, assume a [normal-Wishart prior distribution](/P/mblr-prior) over the model parameters $B$ and $T = \Sigma^{-1}$:

$$ \label{eq:GLM-NW-prior}
p(B,T) = \mathcal{MN}(B; M_0, \Lambda_0^{-1}, T^{-1}) \cdot \mathcal{W}(T; P_0^{-1}, \nu_0) \; .
$$

Then, the [log model evidence](/D/lme) for this model is

\begin{equation} \label{eq:GLM-NW-LME}
\begin{split}
\log p(Y|m) = & \frac{v}{2} \log |P| - \frac{nv}{2} \log (2 \pi) + \frac{v}{2} \log |\Lambda_0| - \frac{v}{2} \log |\Lambda_n| + \\
& \frac{\nu_0}{2} \log\left| \frac{1}{2} P_0 \right| - \frac{\nu_n}{2} \log\left| \frac{1}{2} P_n \right| + \log \Gamma_v \left( \frac{\nu_n}{2} \right) - \log \Gamma_v \left( \frac{\nu_0}{2} \right)
\end{split}
\end{equation}

where $P = V^{-1}$ is the $n \times n$ [precision matrix](/D/precmat) and the [posterior hyperparameters](/D/post) are given by

\begin{equation} \label{eq:GLM-NW-post-par}
\begin{split}
M_n &= \Lambda_n^{-1} (X^\mathrm{T} P Y + \Lambda_0 M_0) \\
\Lambda_n &= X^\mathrm{T} P X + \Lambda_0 \\
P_n &= P_0 + Y^\mathrm{T} P Y + M_0^\mathrm{T} \Lambda_0 M_0 - M_n^\mathrm{T} \Lambda_n M_n \\
\nu_n &= \nu_0 + n \; .
\end{split}
\end{equation}
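As a plausibility check (a remark added here, not part of the original statement): for $v = 1$, the multivariate gamma function $\Gamma_v$ reduces to the ordinary gamma function, and \eqref{eq:GLM-NW-LME} recovers the normal-gamma log model evidence of univariate Bayesian linear regression under the correspondence

$$
a_0 = \frac{\nu_0}{2}, \quad b_0 = \frac{1}{2} P_0, \quad a_n = \frac{\nu_n}{2}, \quad b_n = \frac{1}{2} P_n \; ,
$$

i.e. $\log p(y|m) = \frac{1}{2} \log |P| - \frac{n}{2} \log (2 \pi) + \frac{1}{2} \log |\Lambda_0| - \frac{1}{2} \log |\Lambda_n| + a_0 \log b_0 - a_n \log b_n + \log \Gamma(a_n) - \log \Gamma(a_0)$.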
**Proof:** According to the [law of marginal probability](/D/prob-marg), the [model evidence](/D/ml) for this model is:

$$ \label{eq:GLM-NW-ME-s1}
p(Y|m) = \iint p(Y|B,T) \, p(B,T) \, \mathrm{d}B \, \mathrm{d}T \; .
$$

According to the [law of conditional probability](/D/prob-cond), the integrand is equivalent to the [joint likelihood](/D/jl):

$$ \label{eq:GLM-NW-ME-s2}
p(Y|m) = \iint p(Y,B,T) \, \mathrm{d}B \, \mathrm{d}T \; .
$$

Equation \eqref{eq:GLM} implies the following [likelihood function](/D/lf)

$$ \label{eq:GLM-LF-Class}
p(Y|B,\Sigma) = \mathcal{MN}(Y; X B, V, \Sigma) = \sqrt{\frac{1}{(2 \pi)^{nv} |\Sigma|^n |V|^v}} \, \exp\left[ -\frac{1}{2} \mathrm{tr}\left( \Sigma^{-1} (Y-XB)^\mathrm{T} V^{-1} (Y-XB) \right) \right]
$$

which, for mathematical convenience, can also be parametrized as

$$ \label{eq:GLM-LF-Bayes}
p(Y|B,T) = \mathcal{MN}(Y; X B, P^{-1}, T^{-1}) = \sqrt{\frac{|T|^n |P|^v}{(2 \pi)^{nv}}} \, \exp\left[ -\frac{1}{2} \mathrm{tr}\left( T (Y-XB)^\mathrm{T} P (Y-XB) \right) \right]
$$

using the $v \times v$ [precision matrix](/D/precmat) $T = \Sigma^{-1}$ and the $n \times n$ [precision matrix](/D/precmat) $P = V^{-1}$.

<br>
When [deriving the posterior distribution](/P/mblr-post) $p(B,T|Y)$, the joint likelihood $p(Y,B,T)$ is obtained as

\begin{equation} \label{eq:GLM-NW-LME-s1}
\begin{split}
p(Y,B,T) = \; & \sqrt{\frac{|T|^n |P|^v}{(2 \pi)^{nv}}} \sqrt{\frac{|T|^p |\Lambda_0|^v}{(2 \pi)^{pv}}} \sqrt{\frac{|P_0|^{\nu_0}}{2^{\nu_0 v}}} \frac{1}{\Gamma_v \left( \frac{\nu_0}{2} \right)} \cdot |T|^{(\nu_0-v-1)/2} \exp\left[ -\frac{1}{2} \mathrm{tr}\left( P_0 T \right) \right] \cdot \\
& \exp\left[ -\frac{1}{2} \mathrm{tr}\left( T \left[ (B-M_n)^\mathrm{T} \Lambda_n (B-M_n) + (Y^\mathrm{T} P Y + M_0^\mathrm{T} \Lambda_0 M_0 - M_n^\mathrm{T} \Lambda_n M_n) \right] \right) \right] \; .
\end{split}
\end{equation}

Using the [probability density function of the matrix-normal distribution](/P/matn-pdf), we can rewrite this as

\begin{equation} \label{eq:GLM-NW-LME-s2}
\begin{split}
p(Y,B,T) = \; & \sqrt{\frac{|T|^n |P|^v}{(2 \pi)^{nv}}} \sqrt{\frac{|T|^p |\Lambda_0|^v}{(2 \pi)^{pv}}} \sqrt{\frac{(2 \pi)^{pv}}{|T|^p |\Lambda_n|^v}} \sqrt{\frac{|P_0|^{\nu_0}}{2^{\nu_0 v}}} \frac{1}{\Gamma_v \left( \frac{\nu_0}{2} \right)} \cdot |T|^{(\nu_0-v-1)/2} \exp\left[ -\frac{1}{2} \mathrm{tr}\left( P_0 T \right) \right] \cdot \\
& \mathcal{MN}(B; M_n, \Lambda_n^{-1}, T^{-1}) \cdot \exp\left[ -\frac{1}{2} \mathrm{tr}\left( T \left[ Y^\mathrm{T} P Y + M_0^\mathrm{T} \Lambda_0 M_0 - M_n^\mathrm{T} \Lambda_n M_n \right] \right) \right] \; .
\end{split}
\end{equation}

Now, $B$ can be integrated out easily, because the matrix-normal density integrates to one:

\begin{equation} \label{eq:GLM-NW-LME-s3}
\begin{split}
\int p(Y,B,T) \, \mathrm{d}B = \; & \sqrt{\frac{|T|^n |P|^v}{(2 \pi)^{nv}}} \sqrt{\frac{|\Lambda_0|^v}{|\Lambda_n|^v}} \sqrt{\frac{|P_0|^{\nu_0}}{2^{\nu_0 v}}} \frac{1}{\Gamma_v \left( \frac{\nu_0}{2} \right)} \cdot |T|^{(\nu_0-v-1)/2} \cdot \\
& \exp\left[ -\frac{1}{2} \mathrm{tr}\left( T \left[ P_0 + Y^\mathrm{T} P Y + M_0^\mathrm{T} \Lambda_0 M_0 - M_n^\mathrm{T} \Lambda_n M_n \right] \right) \right] \; .
\end{split}
\end{equation}

Using the [probability density function of the Wishart distribution](/P/wish-pdf), we can rewrite this as

$$ \label{eq:GLM-NW-LME-s4}
\int p(Y,B,T) \, \mathrm{d}B = \sqrt{\frac{|P|^v}{(2 \pi)^{nv}}} \sqrt{\frac{|\Lambda_0|^v}{|\Lambda_n|^v}} \sqrt{\frac{|P_0|^{\nu_0}}{2^{\nu_0 v}}} \sqrt{\frac{2^{\nu_n v}}{|P_n|^{\nu_n}}} \, \frac{\Gamma_v \left( \frac{\nu_n}{2} \right)}{\Gamma_v \left( \frac{\nu_0}{2} \right)} \cdot \mathcal{W}(T; P_n^{-1}, \nu_n) \; .
$$

Finally, $T$ can also be integrated out, because the Wishart density integrates to one:

$$ \label{eq:GLM-NW-LME-s5}
\iint p(Y,B,T) \, \mathrm{d}B \, \mathrm{d}T = \sqrt{\frac{|P|^v}{(2 \pi)^{nv}}} \sqrt{\frac{|\Lambda_0|^v}{|\Lambda_n|^v}} \sqrt{\frac{\left| \frac{1}{2} P_0 \right|^{\nu_0}}{\left| \frac{1}{2} P_n \right|^{\nu_n}}} \, \frac{\Gamma_v \left( \frac{\nu_n}{2} \right)}{\Gamma_v \left( \frac{\nu_0}{2} \right)} = p(Y|m) \; .
$$

Thus, the [log model evidence](/D/lme) of this model is given by

\begin{equation} \label{eq:GLM-NW-LME-s6}
\begin{split}
\log p(Y|m) = & \frac{v}{2} \log |P| - \frac{nv}{2} \log (2 \pi) + \frac{v}{2} \log |\Lambda_0| - \frac{v}{2} \log |\Lambda_n| + \\
& \frac{\nu_0}{2} \log\left| \frac{1}{2} P_0 \right| - \frac{\nu_n}{2} \log\left| \frac{1}{2} P_n \right| + \log \Gamma_v \left( \frac{\nu_n}{2} \right) - \log \Gamma_v \left( \frac{\nu_0}{2} \right) \; .
\end{split}
\end{equation}
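The closed-form result can be cross-checked numerically (a sketch added here, not part of the original proof) via the standard identity $\log p(Y|m) = \log p(Y|B,T) + \log p(B,T) - \log p(B,T|Y)$, sometimes called Chib's identity or the candidate's formula, which holds at any point $(B,T)$. All dimensions, data, and hyperparameter values below are arbitrary assumptions for illustration.

```python
# Cross-check the closed-form LME against Chib's identity, using the
# posterior hyperparameters from the companion proof P/mblr-post.
import numpy as np
from scipy.special import multigammaln
from scipy.stats import matrix_normal, wishart

rng = np.random.default_rng(1)
n, p, v = 20, 3, 2
X   = rng.standard_normal((n, p))
V   = np.eye(n)                        # known covariance structure
P   = np.linalg.inv(V)                 # precision matrix P = V^-1
M0  = np.zeros((p, v))                 # prior hyperparameters
L0  = np.eye(p)                        # Lambda_0
P0  = np.eye(v)
nu0 = v + 2
Y   = X @ rng.standard_normal((p, v)) + rng.standard_normal((n, v))

# posterior hyperparameters (eq. GLM-NW-post-par)
Ln  = X.T @ P @ X + L0
Mn  = np.linalg.solve(Ln, X.T @ P @ Y + L0 @ M0)
Pn  = P0 + Y.T @ P @ Y + M0.T @ L0 @ M0 - Mn.T @ Ln @ Mn
nun = nu0 + n

# closed-form log model evidence (eq. GLM-NW-LME)
sld = lambda A: np.linalg.slogdet(A)[1]
lme = (v/2*sld(P) - n*v/2*np.log(2*np.pi) + v/2*sld(L0) - v/2*sld(Ln)
       + nu0/2*sld(P0/2) - nun/2*sld(Pn/2)
       + multigammaln(nun/2, v) - multigammaln(nu0/2, v))

# evaluate likelihood, prior and posterior at the point (B, T) = (M_n, nu_n P_n^-1)
T  = nun * np.linalg.inv(Pn)
Si = np.linalg.inv(T)                  # Sigma = T^-1
ll    = matrix_normal(X @ Mn, V, Si).logpdf(Y)
lpri  = (matrix_normal(M0, np.linalg.inv(L0), Si).logpdf(Mn)
         + wishart(nu0, np.linalg.inv(P0)).logpdf(T))
lpost = (matrix_normal(Mn, np.linalg.inv(Ln), Si).logpdf(Mn)
         + wishart(nun, np.linalg.inv(Pn)).logpdf(T))

assert np.isclose(lme, ll + lpri - lpost)
```

Since the posterior densities come from the companion derivation, agreement between the two computations checks both proofs at once.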

P/mblr-post.md (162 additions)
---
layout: proof
mathjax: true

author: "Joram Soch"
affiliation: "BCCN Berlin"
e_mail: "joram.soch@bccn-berlin.de"
date: 2020-09-03 08:37:00

title: "Posterior distribution for multivariate Bayesian linear regression"
chapter: "Statistical Models"
section: "Multivariate normal data"
topic: "Multivariate Bayesian linear regression"
theorem: "Posterior distribution"

sources:
- authors: "Wikipedia"
  year: 2020
  title: "Bayesian multivariate linear regression"
  in: "Wikipedia, the free encyclopedia"
  pages: "retrieved on 2020-09-03"
  url: "https://en.wikipedia.org/wiki/Bayesian_multivariate_linear_regression#Posterior_distribution"

proof_id: "P160"
shortcut: "mblr-post"
username: "JoramSoch"
---


**Theorem:** Let

$$ \label{eq:GLM}
Y = X B + E, \; E \sim \mathcal{MN}(0, V, \Sigma)
$$

be a [general linear model](/D/glm) with measured $n \times v$ data matrix $Y$, known $n \times p$ design matrix $X$, known $n \times n$ [covariance structure](/D/matn) $V$ as well as unknown $p \times v$ regression coefficients $B$ and unknown $v \times v$ [noise covariance](/D/matn) $\Sigma$. Moreover, assume a [normal-Wishart prior distribution](/P/mblr-prior) over the model parameters $B$ and $T = \Sigma^{-1}$:

$$ \label{eq:GLM-NW-prior}
p(B,T) = \mathcal{MN}(B; M_0, \Lambda_0^{-1}, T^{-1}) \cdot \mathcal{W}(T; P_0^{-1}, \nu_0) \; .
$$

Then, the [posterior distribution](/D/post) is also a [normal-Wishart distribution](/D/nw)

$$ \label{eq:GLM-NW-post}
p(B,T|Y) = \mathcal{MN}(B; M_n, \Lambda_n^{-1}, T^{-1}) \cdot \mathcal{W}(T; P_n^{-1}, \nu_n)
$$

and, with the $n \times n$ [precision matrix](/D/precmat) $P = V^{-1}$, the [posterior hyperparameters](/D/post) are given by

$$ \label{eq:GLM-NW-post-par}
\begin{split}
M_n &= \Lambda_n^{-1} (X^\mathrm{T} P Y + \Lambda_0 M_0) \\
\Lambda_n &= X^\mathrm{T} P X + \Lambda_0 \\
P_n &= P_0 + Y^\mathrm{T} P Y + M_0^\mathrm{T} \Lambda_0 M_0 - M_n^\mathrm{T} \Lambda_n M_n \\
\nu_n &= \nu_0 + n \; .
\end{split}
$$
**Proof:** According to [Bayes' theorem](/P/bayes-th), the [posterior distribution](/D/post) is given by

$$ \label{eq:GLM-NW-BT}
p(B,T|Y) = \frac{p(Y|B,T) \, p(B,T)}{p(Y)} \; .
$$

Since $p(Y)$ is just a normalization factor, the [posterior is proportional](/P/post-jl) to the numerator:

$$ \label{eq:GLM-NW-post-JL}
p(B,T|Y) \propto p(Y|B,T) \, p(B,T) = p(Y,B,T) \; .
$$

Equation \eqref{eq:GLM} implies the following [likelihood function](/D/lf)

$$ \label{eq:GLM-LF-Class}
p(Y|B,\Sigma) = \mathcal{MN}(Y; X B, V, \Sigma) = \sqrt{\frac{1}{(2 \pi)^{nv} |\Sigma|^n |V|^v}} \, \exp\left[ -\frac{1}{2} \mathrm{tr}\left( \Sigma^{-1} (Y-XB)^\mathrm{T} V^{-1} (Y-XB) \right) \right]
$$

which, for mathematical convenience, can also be parametrized as

$$ \label{eq:GLM-LF-Bayes}
p(Y|B,T) = \mathcal{MN}(Y; X B, P^{-1}, T^{-1}) = \sqrt{\frac{|T|^n |P|^v}{(2 \pi)^{nv}}} \, \exp\left[ -\frac{1}{2} \mathrm{tr}\left( T (Y-XB)^\mathrm{T} P (Y-XB) \right) \right]
$$

using the $v \times v$ [precision matrix](/D/precmat) $T = \Sigma^{-1}$ and the $n \times n$ [precision matrix](/D/precmat) $P = V^{-1}$.

<br>
Combining the [likelihood function](/D/lf) \eqref{eq:GLM-LF-Bayes} with the [prior distribution](/D/prior) \eqref{eq:GLM-NW-prior}, the [joint likelihood](/D/jl) of the model is given by

$$ \label{eq:GLM-NW-JL-s1}
\begin{split}
p(Y,B,T) = \; & p(Y|B,T) \, p(B,T) \\
= \; & \sqrt{\frac{|T|^n |P|^v}{(2 \pi)^{nv}}} \, \exp\left[ -\frac{1}{2} \mathrm{tr}\left( T (Y-XB)^\mathrm{T} P (Y-XB) \right) \right] \cdot \\
& \sqrt{\frac{|T|^p |\Lambda_0|^v}{(2 \pi)^{pv}}} \, \exp\left[ -\frac{1}{2} \mathrm{tr}\left( T (B-M_0)^\mathrm{T} \Lambda_0 (B-M_0) \right) \right] \cdot \\
& \frac{1}{\Gamma_v \left( \frac{\nu_0}{2} \right)} \sqrt{\frac{|P_0|^{\nu_0}}{2^{\nu_0 v}}} |T|^{(\nu_0-v-1)/2} \exp\left[ -\frac{1}{2} \mathrm{tr}\left( P_0 T \right) \right] \; .
\end{split}
$$

Collecting identical variables gives:

$$ \label{eq:GLM-NW-JL-s2}
\begin{split}
p(Y,B,T) = \; & \sqrt{\frac{|T|^n |P|^v}{(2 \pi)^{nv}}} \sqrt{\frac{|T|^p |\Lambda_0|^v}{(2 \pi)^{pv}}} \sqrt{\frac{|P_0|^{\nu_0}}{2^{\nu_0 v}}} \frac{1}{\Gamma_v \left( \frac{\nu_0}{2} \right)} \cdot |T|^{(\nu_0-v-1)/2} \exp\left[ -\frac{1}{2} \mathrm{tr}\left( P_0 T \right) \right] \cdot \\
& \exp\left[ -\frac{1}{2} \mathrm{tr}\left( T \left[ (Y-XB)^\mathrm{T} P (Y-XB) + (B-M_0)^\mathrm{T} \Lambda_0 (B-M_0) \right] \right) \right] \; .
\end{split}
$$

Expanding the products in the exponent gives:

$$ \label{eq:GLM-NW-JL-s3}
\begin{split}
p(Y,B,T) = \; & \sqrt{\frac{|T|^n |P|^v}{(2 \pi)^{nv}}} \sqrt{\frac{|T|^p |\Lambda_0|^v}{(2 \pi)^{pv}}} \sqrt{\frac{|P_0|^{\nu_0}}{2^{\nu_0 v}}} \frac{1}{\Gamma_v \left( \frac{\nu_0}{2} \right)} \cdot |T|^{(\nu_0-v-1)/2} \exp\left[ -\frac{1}{2} \mathrm{tr}\left( P_0 T \right) \right] \cdot \\
& \exp\left[ -\frac{1}{2} \mathrm{tr}\left( T \left[ Y^\mathrm{T} P Y - Y^\mathrm{T} P X B - B^\mathrm{T} X^\mathrm{T} P Y + B^\mathrm{T} X^\mathrm{T} P X B + \right. \right. \right. \\
& \hphantom{\exp\left[ -\frac{1}{2} \mathrm{tr}\left( T \left[ \right. \right. \right. \!\!\!} \; \left. \left. \left. B^\mathrm{T} \Lambda_0 B - B^\mathrm{T} \Lambda_0 M_0 - M_0^\mathrm{T} \Lambda_0 B + M_0^\mathrm{T} \Lambda_0 M_0 \right] \right) \right] \; .
\end{split}
$$

Completing the square over $B$, we finally have

$$ \label{eq:GLM-NW-JL-s4}
\begin{split}
p(Y,B,T) = \; & \sqrt{\frac{|T|^n |P|^v}{(2 \pi)^{nv}}} \sqrt{\frac{|T|^p |\Lambda_0|^v}{(2 \pi)^{pv}}} \sqrt{\frac{|P_0|^{\nu_0}}{2^{\nu_0 v}}} \frac{1}{\Gamma_v \left( \frac{\nu_0}{2} \right)} \cdot |T|^{(\nu_0-v-1)/2} \exp\left[ -\frac{1}{2} \mathrm{tr}\left( P_0 T \right) \right] \cdot \\
& \exp\left[ -\frac{1}{2} \mathrm{tr}\left( T \left[ (B-M_n)^\mathrm{T} \Lambda_n (B-M_n) + (Y^\mathrm{T} P Y + M_0^\mathrm{T} \Lambda_0 M_0 - M_n^\mathrm{T} \Lambda_n M_n) \right] \right) \right]
\end{split}
$$

with the [posterior hyperparameters](/D/post)

$$ \label{eq:GLM-NW-post-B-par}
\begin{split}
M_n &= \Lambda_n^{-1} (X^\mathrm{T} P Y + \Lambda_0 M_0) \\
\Lambda_n &= X^\mathrm{T} P X + \Lambda_0 \; .
\end{split}
$$

Ergo, the joint likelihood is proportional to

$$ \label{eq:GLM-NW-JL-s5}
p(Y,B,T) \propto |T|^{p/2} \cdot \exp\left[ -\frac{1}{2} \mathrm{tr}\left( T \left[ (B-M_n)^\mathrm{T} \Lambda_n (B-M_n) \right] \right) \right] \cdot |T|^{(\nu_n-v-1)/2} \cdot \exp\left[ -\frac{1}{2} \mathrm{tr}\left( P_n T \right) \right]
$$

with the [posterior hyperparameters](/D/post)

$$ \label{eq:GLM-NW-post-T-par}
\begin{split}
P_n &= P_0 + Y^\mathrm{T} P Y + M_0^\mathrm{T} \Lambda_0 M_0 - M_n^\mathrm{T} \Lambda_n M_n \\
\nu_n &= \nu_0 + n \; .
\end{split}
$$

From the first two factors in \eqref{eq:GLM-NW-JL-s5}, we can isolate the posterior distribution over $B$ given $T$:

$$ \label{eq:GLM-NW-post-B}
p(B|T,Y) = \mathcal{MN}(B; M_n, \Lambda_n^{-1}, T^{-1}) \; .
$$

From the remaining factors, we can isolate the posterior distribution over $T$:

$$ \label{eq:GLM-NW-post-T}
p(T|Y) = \mathcal{W}(T; P_n^{-1}, \nu_n) \; .
$$

Together, \eqref{eq:GLM-NW-post-B} and \eqref{eq:GLM-NW-post-T} constitute the [joint](/D/prob-joint) [posterior distribution](/D/post) of $B$ and $T$.
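The update equations can be sketched in code (a minimal illustration under assumed dimensions and data, not part of the original proof). As a sanity check, for a nearly flat prior ($\Lambda_0 \to 0$, $M_0 = 0$) the posterior mean $M_n$ should approach the generalized least squares estimate $(X^\mathrm{T} P X)^{-1} X^\mathrm{T} P Y$:

```python
# Normal-Wishart posterior update for the multivariate GLM
# (eq. GLM-NW-post-par); dimensions and data are arbitrary choices.
import numpy as np

def posterior_hyperparameters(Y, X, P, M0, L0, P0, nu0):
    """Return (M_n, Lambda_n, P_n, nu_n) given data and prior."""
    Ln = X.T @ P @ X + L0                          # Lambda_n
    Mn = np.linalg.solve(Ln, X.T @ P @ Y + L0 @ M0)
    Pn = P0 + Y.T @ P @ Y + M0.T @ L0 @ M0 - Mn.T @ Ln @ Mn
    return Mn, Ln, Pn, nu0 + Y.shape[0]            # nu_n = nu_0 + n

rng = np.random.default_rng(0)
n, p, v = 50, 3, 2
X = rng.standard_normal((n, p))
Y = X @ rng.standard_normal((p, v)) + rng.standard_normal((n, v))
P = np.eye(n)                                      # V = I, so P = V^-1 = I

# nearly flat prior: Lambda_0 ~ 0, M_0 = 0
Mn, Ln, Pn, nun = posterior_hyperparameters(
    Y, X, P, np.zeros((p, v)), 1e-8 * np.eye(p), np.eye(v), v + 1)
B_gls = np.linalg.solve(X.T @ P @ X, X.T @ P @ Y)

assert np.allclose(Mn, B_gls, atol=1e-6)           # M_n -> GLS estimate
assert np.all(np.linalg.eigvalsh(Pn) > 0)          # P_n stays positive definite
```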
