Commit cd0917b

added 5 proofs

1 parent e0c18ab commit cd0917b

5 files changed: 345 additions & 0 deletions

P/betabin-cdf.md

Lines changed: 89 additions & 0 deletions

---
layout: proof
mathjax: true

author: "Joram Soch"
affiliation: "BCCN Berlin"
e_mail: "joram.soch@bccn-berlin.de"
date: 2022-10-22 05:28:00

title: "Cumulative distribution function of the beta-binomial distribution"
chapter: "Probability Distributions"
section: "Univariate discrete distributions"
topic: "Beta-binomial distribution"
theorem: "Cumulative distribution function"

sources:

proof_id: "P366"
shortcut: "betabin-cdf"
username: "JoramSoch"
---

**Theorem:** Let $X$ be a [random variable](/D/rvar) following a [beta-binomial distribution](/D/betabin):

$$ \label{eq:betabin}
X \sim \mathrm{BetBin}(n,\alpha,\beta) \; .
$$

Then, the [cumulative distribution function](/D/cdf) of $X$ is

$$ \label{eq:betabin-cdf}
F_X(x) = \frac{1}{\mathrm{B}(\alpha,\beta)} \cdot \frac{\Gamma(n+1)}{\Gamma(\alpha+\beta+n)} \cdot \sum_{i=0}^{x} \frac{\Gamma(\alpha+i) \cdot \Gamma(\beta+n-i)}{\Gamma(i+1) \cdot \Gamma(n-i+1)}
$$

where $\mathrm{B}(x,y)$ is the beta function and $\Gamma(x)$ is the gamma function.

**Proof:** The [cumulative distribution function](/D/cdf) is defined as

$$ \label{eq:cdf}
F_X(x) = \mathrm{Pr}(X \leq x)
$$

which, for a [discrete random variable](/D/rvar-disc), evaluates to

$$ \label{eq:cdf-disc}
F_X(x) = \sum_{i=-\infty}^{x} f_X(i) \; .
$$

With the [probability mass function of the beta-binomial distribution](/P/betabin-pmf), this becomes

$$ \label{eq:betabin-cdf-s1}
F_X(x) = \sum_{i=0}^{x} {n \choose i} \cdot \frac{\mathrm{B}(\alpha+i,\beta+n-i)}{\mathrm{B}(\alpha,\beta)} \; .
$$

Using the expression of binomial coefficients in terms of factorials

$$ \label{eq:bincoeff-facts}
{n \choose k} = \frac{n!}{k! \, (n-k)!} \; ,
$$

the relationship between factorials and the gamma function

$$ \label{eq:facts-gamfct}
n! = \Gamma(n+1)
$$

and the link between the gamma function and the beta function

$$ \label{eq:betafct-gamfct}
\mathrm{B}(\alpha,\beta) = \frac{\Gamma(\alpha) \, \Gamma(\beta)}{\Gamma(\alpha+\beta)} \; ,
$$

equation \eqref{eq:betabin-cdf-s1} can be further developed as follows:

$$ \label{eq:betabin-cdf-s2}
\begin{split}
F_X(x) &\overset{\eqref{eq:bincoeff-facts}}{=} \frac{1}{\mathrm{B}(\alpha,\beta)} \cdot \sum_{i=0}^{x} \frac{n!}{i! \, (n-i)!} \cdot \mathrm{B}(\alpha+i,\beta+n-i) \\
&\overset{\eqref{eq:betafct-gamfct}}{=} \frac{1}{\mathrm{B}(\alpha,\beta)} \cdot \sum_{i=0}^{x} \frac{n!}{i! \, (n-i)!} \cdot \frac{\Gamma(\alpha+i) \cdot \Gamma(\beta+n-i)}{\Gamma(\alpha+\beta+n)} \\
&= \frac{1}{\mathrm{B}(\alpha,\beta)} \cdot \frac{n!}{\Gamma(\alpha+\beta+n)} \cdot \sum_{i=0}^{x} \frac{\Gamma(\alpha+i) \cdot \Gamma(\beta+n-i)}{i! \, (n-i)!} \\
&\overset{\eqref{eq:facts-gamfct}}{=} \frac{1}{\mathrm{B}(\alpha,\beta)} \cdot \frac{\Gamma(n+1)}{\Gamma(\alpha+\beta+n)} \cdot \sum_{i=0}^{x} \frac{\Gamma(\alpha+i) \cdot \Gamma(\beta+n-i)}{\Gamma(i+1) \cdot \Gamma(n-i+1)} \; .
\end{split}
$$

This completes the proof.
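As a sketch outside the proof itself, the derived gamma-function form of the CDF can be checked numerically against `scipy.stats.betabinom`; the parameter values below are arbitrary illustrations, not part of the theorem.

```python
# Sanity check of the derived CDF: evaluate the gamma-function expression
# (in log space, via gammaln/betaln, for numerical stability) and compare
# against scipy's reference implementation of the beta-binomial CDF.
import numpy as np
from scipy.special import gammaln, betaln
from scipy.stats import betabinom

def betabin_cdf(x, n, a, b):
    """CDF of BetBin(n, a, b) via the gamma-function expression."""
    i = np.arange(0, x + 1)
    log_terms = (gammaln(a + i) + gammaln(b + n - i)
                 - gammaln(i + 1) - gammaln(n - i + 1))
    log_const = -betaln(a, b) + gammaln(n + 1) - gammaln(a + b + n)
    return float(np.sum(np.exp(log_const + log_terms)))

n, a, b = 10, 2.5, 1.5   # illustrative parameters
for x in range(n + 1):
    assert np.isclose(betabin_cdf(x, n, a, b), betabinom.cdf(x, n, a, b))
```

At $x = n$ the sum covers the whole support, so the expression evaluates to one, as a CDF must.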

P/betabin-pmf.md

Lines changed: 79 additions & 0 deletions

---
layout: proof
mathjax: true

author: "Joram Soch"
affiliation: "BCCN Berlin"
e_mail: "joram.soch@bccn-berlin.de"
date: 2022-10-20 08:56:00

title: "Probability mass function of the beta-binomial distribution"
chapter: "Probability Distributions"
section: "Univariate discrete distributions"
topic: "Beta-binomial distribution"
theorem: "Probability mass function"

sources:
- authors: "Wikipedia"
  year: 2022
  title: "Beta-binomial distribution"
  in: "Wikipedia, the free encyclopedia"
  pages: "retrieved on 2022-10-20"
  url: "https://en.wikipedia.org/wiki/Beta-binomial_distribution#As_a_compound_distribution"

proof_id: "P364"
shortcut: "betabin-pmf"
username: "JoramSoch"
---

**Theorem:** Let $X$ be a [random variable](/D/rvar) following a [beta-binomial distribution](/D/betabin):

$$ \label{eq:betabin}
X \sim \mathrm{BetBin}(n,\alpha,\beta) \; .
$$

Then, the [probability mass function](/D/pmf) of $X$ is

$$ \label{eq:betabin-pmf}
f_X(x) = {n \choose x} \cdot \frac{\mathrm{B}(\alpha+x,\beta+n-x)}{\mathrm{B}(\alpha,\beta)}
$$

where $\mathrm{B}(x,y)$ is the beta function.

**Proof:** A [beta-binomial random variable](/D/betabin) is defined as a [binomial variate](/D/bin) whose success probability follows a [beta distribution](/D/beta):

$$ \label{eq:betabin-bin-beta}
\begin{split}
X \mid p &\sim \mathrm{Bin}(n, p) \\
p &\sim \mathrm{Bet}(\alpha, \beta) \; .
\end{split}
$$

Thus, we can combine the [law of marginal probability](/D/prob-marg) and the [law of conditional probability](/D/prob-cond) to derive the [probability](/D/prob) of $X$ as

$$ \label{eq:betabin-pmf-s1}
\begin{split}
p(x) &= \int_\mathcal{P} \mathrm{p}(x,p) \, \mathrm{d}p \\
&= \int_\mathcal{P} \mathrm{p}(x \vert p) \, \mathrm{p}(p) \, \mathrm{d}p \; .
\end{split}
$$

Now, we can plug in the [probability mass function of the binomial distribution](/P/bin-pmf) and the [probability density function of the beta distribution](/P/beta-pdf) to get

$$ \label{eq:betabin-pmf-s2}
\begin{split}
p(x) &= \int_0^1 {n \choose x} \, p^x \, (1-p)^{n-x} \cdot \frac{1}{\mathrm{B}(\alpha, \beta)} \, p^{\alpha-1} \, (1-p)^{\beta-1} \, \mathrm{d}p \\
&= {n \choose x} \cdot \frac{1}{\mathrm{B}(\alpha, \beta)} \, \int_0^1 p^{\alpha+x-1} \, (1-p)^{\beta+n-x-1} \, \mathrm{d}p \\
&= {n \choose x} \cdot \frac{\mathrm{B}(\alpha+x,\beta+n-x)}{\mathrm{B}(\alpha, \beta)} \, \int_0^1 \frac{1}{\mathrm{B}(\alpha+x,\beta+n-x)} \, p^{\alpha+x-1} \, (1-p)^{\beta+n-x-1} \, \mathrm{d}p \; .
\end{split}
$$

Finally, we recognize that the integrand is equal to the [probability density function of a beta distribution](/P/beta-pdf) and, [because probability density integrates to one](/D/pdf), we have

$$ \label{eq:betabin-pmf-qed}
p(x) = {n \choose x} \cdot \frac{\mathrm{B}(\alpha+x,\beta+n-x)}{\mathrm{B}(\alpha,\beta)} = f_X(x) \; .
$$

This completes the proof.
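The marginalization step can also be verified numerically, as a sketch: integrate the binomial pmf against the beta pdf with `scipy.integrate.quad` and compare with the closed-form result (here taken from `scipy.stats.betabinom`). The parameter values are arbitrary illustrations.

```python
# Check p(x) = integral of Bin(x|n,p) * Bet(p|alpha,beta) dp against the
# closed-form beta-binomial pmf, for every point of the support.
import numpy as np
from scipy.integrate import quad
from scipy.stats import binom, beta, betabinom

n, a, b = 8, 3.0, 2.0   # illustrative parameters
for x in range(n + 1):
    integral, _ = quad(lambda p: binom.pmf(x, n, p) * beta.pdf(p, a, b), 0, 1)
    assert np.isclose(integral, betabinom.pmf(x, n, a, b))
```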

P/betabin-pmfitogf.md

Lines changed: 75 additions & 0 deletions

---
layout: proof
mathjax: true

author: "Joram Soch"
affiliation: "BCCN Berlin"
e_mail: "joram.soch@bccn-berlin.de"
date: 2022-10-20 08:56:00

title: "Expression of the probability mass function of the beta-binomial distribution using only the gamma function"
chapter: "Probability Distributions"
section: "Univariate discrete distributions"
topic: "Beta-binomial distribution"
theorem: "Probability mass function in terms of gamma function"

sources:
- authors: "Wikipedia"
  year: 2022
  title: "Beta-binomial distribution"
  in: "Wikipedia, the free encyclopedia"
  pages: "retrieved on 2022-10-20"
  url: "https://en.wikipedia.org/wiki/Beta-binomial_distribution#As_a_compound_distribution"

proof_id: "P365"
shortcut: "betabin-pmfitogf"
username: "JoramSoch"
---

**Theorem:** Let $X$ be a [random variable](/D/rvar) following a [beta-binomial distribution](/D/betabin):

$$ \label{eq:betabin}
X \sim \mathrm{BetBin}(n,\alpha,\beta) \; .
$$

Then, the [probability mass function](/D/pmf) of $X$ can be expressed as

$$ \label{eq:betabin-pmfitogf}
f_X(x) = \frac{\Gamma(n+1)}{\Gamma(x+1) \, \Gamma(n-x+1)} \cdot \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha) \, \Gamma(\beta)} \cdot \frac{\Gamma(\alpha+x) \, \Gamma(\beta+n-x)}{\Gamma(\alpha+\beta+n)}
$$

where $\Gamma(x)$ is the gamma function.

**Proof:** The [probability mass function of the beta-binomial distribution](/P/betabin-pmf) is given by

$$ \label{eq:betabin-pmf}
f_X(x) = {n \choose x} \cdot \frac{\mathrm{B}(\alpha+x,\beta+n-x)}{\mathrm{B}(\alpha,\beta)} \; .
$$

Note that the binomial coefficient can be expressed in terms of factorials

$$ \label{eq:bincoeff-facts}
{n \choose x} = \frac{n!}{x! \, (n-x)!} \; ,
$$

that factorials are related to the gamma function via $n! = \Gamma(n+1)$

$$ \label{eq:facts-gamfct}
\frac{n!}{x! \, (n-x)!} = \frac{\Gamma(n+1)}{\Gamma(x+1) \, \Gamma(n-x+1)}
$$

and that the beta function is related to the gamma function via

$$ \label{eq:betafct-gamfct}
\mathrm{B}(\alpha,\beta) = \frac{\Gamma(\alpha) \, \Gamma(\beta)}{\Gamma(\alpha+\beta)} \; .
$$

Applying \eqref{eq:bincoeff-facts}, \eqref{eq:facts-gamfct} and \eqref{eq:betafct-gamfct} to \eqref{eq:betabin-pmf}, we get

$$ \label{eq:betabin-pmfitogf-qed}
f_X(x) = \frac{\Gamma(n+1)}{\Gamma(x+1) \, \Gamma(n-x+1)} \cdot \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha) \, \Gamma(\beta)} \cdot \frac{\Gamma(\alpha+x) \, \Gamma(\beta+n-x)}{\Gamma(\alpha+\beta+n)} \; .
$$

This completes the proof.
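The equivalence of the beta-function form and the gamma-only form can be sketched numerically with `scipy.special`; the example parameters below are arbitrary (for large $n$, $\alpha$, $\beta$ one would work with `gammaln` instead to avoid overflow).

```python
# Evaluate both pmf expressions across the whole support and confirm
# they agree elementwise and sum to one.
import numpy as np
from scipy.special import comb, beta as B, gamma as G

n, a, b = 12, 1.8, 4.2   # illustrative parameters
x = np.arange(n + 1)
f_beta = comb(n, x) * B(a + x, b + n - x) / B(a, b)
f_gamma = (G(n + 1) / (G(x + 1) * G(n - x + 1))
           * G(a + b) / (G(a) * G(b))
           * G(a + x) * G(b + n - x) / G(a + b + n))
assert np.allclose(f_beta, f_gamma)
```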

P/fe-der.md

Lines changed: 55 additions & 0 deletions

---
layout: proof
mathjax: true

author: "Joram Soch"
affiliation: "BCCN Berlin"
e_mail: "joram.soch@bccn-berlin.de"
date: 2022-10-20 10:47:00

title: "Derivation of the family evidence"
chapter: "Model Selection"
section: "Bayesian model selection"
topic: "Family evidence"
theorem: "Derivation"

sources:

proof_id: "P368"
shortcut: "fe-der"
username: "JoramSoch"
---

**Theorem:** Let $f$ be a family of $M$ [generative models](/D/gm) $m_1, \ldots, m_M$ with [model evidences](/D/me) $p(y \vert m_1), \ldots, p(y \vert m_M)$. Then, the [family evidence](/D/fe) can be expressed in terms of the model evidences as

$$ \label{eq:FE-marg}
\mathrm{FE}(f) = \sum_{i=1}^M p(y|m_i) \, p(m_i|f)
$$

where $p(m_i \vert f)$ are the within-[family](/D/lfe) [prior](/D/prior) [model](/D/gm) [probabilities](/D/prob).

**Proof:** This is a consequence of the [law of marginal probability](/D/prob-marg) for [discrete variables](/D/rvar-disc)

$$ \label{eq:prob-marg}
p(y|f) = \sum_{i=1}^M p(y,m_i|f)
$$

and the [law of conditional probability](/D/prob-cond), according to which

$$ \label{eq:prob-cond}
p(y,m_i|f) = p(y|m_i,f) \, p(m_i|f) \; .
$$

Since models are [nested within model families](/D/fe), such that $m_i \wedge f \leftrightarrow m_i$, we have the following equality of probabilities:

$$ \label{eq:prob-equal}
p(y|m_i,f) = p(y|m_i \wedge f) = p(y|m_i) \; .
$$

Plugging \eqref{eq:prob-cond} into \eqref{eq:prob-marg} and applying \eqref{eq:prob-equal}, we obtain:

$$ \label{eq:FE-marg-qed}
\mathrm{FE}(f) = p(y|f) = \sum_{i=1}^M p(y|m_i) \, p(m_i|f) \; .
$$
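As a toy illustration of this identity (all numbers hypothetical, not from the source): with three model evidences and within-family prior probabilities that sum to one, the family evidence is simply their weighted sum.

```python
# Family evidence as a prior-weighted sum of model evidences.
import numpy as np

model_evidences = np.array([0.02, 0.05, 0.01])  # p(y|m_i), hypothetical
prior_in_family = np.array([0.5, 0.3, 0.2])     # p(m_i|f), sums to one
family_evidence = float(np.sum(model_evidences * prior_in_family))
# 0.02*0.5 + 0.05*0.3 + 0.01*0.2 = 0.027
```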

P/me-der.md

Lines changed: 47 additions & 0 deletions

---
layout: proof
mathjax: true

author: "Joram Soch"
affiliation: "BCCN Berlin"
e_mail: "joram.soch@bccn-berlin.de"
date: 2022-10-20 10:11:00

title: "Derivation of the model evidence"
chapter: "Model Selection"
section: "Bayesian model selection"
topic: "Model evidence"
theorem: "Derivation"

sources:

proof_id: "P367"
shortcut: "me-der"
username: "JoramSoch"
---

**Theorem:** Let $p(y \vert \theta,m)$ be a [likelihood function](/D/lf) of a [generative model](/D/gm) $m$ for making inferences on model parameters $\theta$ given measured data $y$. Moreover, let $p(\theta \vert m)$ be a [prior distribution](/D/prior) on model parameters $\theta$ in the parameter space $\Theta$. Then, the [model evidence](/D/me) (ME) can be expressed in terms of [likelihood](/D/lf) and [prior](/D/prior) as

$$ \label{eq:ME-marg}
\mathrm{ME}(m) = \int_{\Theta} p(y|\theta,m) \, p(\theta|m) \, \mathrm{d}\theta \; .
$$

**Proof:** This is a consequence of the [law of marginal probability](/D/prob-marg) for [continuous variables](/D/rvar-cont)

$$ \label{eq:prob-marg}
p(y|m) = \int_{\Theta} p(y,\theta|m) \, \mathrm{d}\theta
$$

and the [law of conditional probability](/D/prob-cond), according to which

$$ \label{eq:prob-cond}
p(y,\theta|m) = p(y|\theta,m) \, p(\theta|m) \; .
$$

Plugging \eqref{eq:prob-cond} into \eqref{eq:prob-marg}, we obtain:

$$ \label{eq:ME-marg-qed}
\mathrm{ME}(m) = p(y|m) = \int_{\Theta} p(y|\theta,m) \, p(\theta|m) \, \mathrm{d}\theta \; .
$$
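The evidence integral can be sketched numerically for a hypothetical example model not discussed in the source: a single Gaussian observation $y \sim \mathcal{N}(\theta, \sigma^2)$ with Gaussian prior $\theta \sim \mathcal{N}(\mu_0, \tau^2)$, whose model evidence has the known closed form $\mathcal{N}(y; \mu_0, \sigma^2 + \tau^2)$.

```python
# Compute ME(m) = integral of likelihood * prior by quadrature and
# compare against the closed-form Gaussian marginal.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

y, s, mu0, t = 1.3, 0.8, 0.0, 1.5   # hypothetical data and prior
me, _ = quad(lambda th: norm.pdf(y, th, s) * norm.pdf(th, mu0, t),
             -np.inf, np.inf)
assert np.isclose(me, norm.pdf(y, mu0, np.sqrt(s**2 + t**2)))
```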
