D/ind-cond.md (1 addition, 1 deletion)
@@ -39,7 +39,7 @@ $$
where $p(x_1, \ldots, x_n \vert y)$ are the [joint (conditional) probabilities](/D/prob-joint) of $X_1, \ldots, X_n$ given $Y$ and $p(x_i)$ are the [marginal (conditional) probabilities](/D/prob-marg) of $X_i$ given $Y$.
<br>
- 2) A set of [random variables](/D/rvar) $X_1, \ldots, X_n$ with possible values $\mathcal{X}_1, \ldots, \mathcal{X}_n$ is called conditionally independent given the random variable $Y$ with possible values $\mathcal{Y}$, if
+ 2) A set of [continuous random variables](/D/rvar-cont) $X_1, \ldots, X_n$ with possible values $\mathcal{X}_1, \ldots, \mathcal{X}_n$ is called conditionally independent given the random variable $Y$ with possible values $\mathcal{Y}$, if
D/qf.md (1 addition, 1 deletion)
@@ -27,7 +27,7 @@ username: "JoramSoch"
---
- **Definition:** Let $X$ be a [random variable](/D/rvar) with the [cumulative distribution function](/D/cdf) (CDF) $F_X(x)$. Then, the function $Q_X(p): [0,1] \to \mathbb{R}$ which is the inverse CDF is the quantile function (QF) of $X$. More precisly, the QF is the function that, for a given quantile $p \in [0,1]$, returns the smallest $x$ for which $F_X(x) = p$:
+ **Definition:** Let $X$ be a [random variable](/D/rvar) with the [cumulative distribution function](/D/cdf) (CDF) $F_X(x)$. Then, the function $Q_X(p): [0,1] \to \mathbb{R}$ which is the inverse CDF is the quantile function (QF) of $X$. More precisely, the QF is the function that, for a given quantile $p \in [0,1]$, returns the smallest $x$ for which $F_X(x) = p$:
$$ \label{eq:qf}
Q_X(p) = \min \left\lbrace x \in \mathbb{R} \, \vert \, F_X(x) = p \right\rbrace \; .
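
To illustrate the definition whose typo is corrected here: $Q_X(p)$ returns the smallest $x$ at which the CDF reaches $p$. A minimal numerical sketch in Python; the grid-search helper and the standard normal example are illustrative assumptions, not part of the file. For a continuous, strictly increasing CDF, the smallest $x$ with $F_X(x) \geq p$ coincides with the $x$ solving $F_X(x) = p$.

```python
import numpy as np
from scipy.stats import norm

def quantile_function(cdf, p, grid):
    """Approximate Q_X(p) = min{x | F_X(x) = p} by returning
    the smallest grid point x with cdf(x) >= p."""
    x_reaching_p = grid[cdf(grid) >= p]
    return x_reaching_p[0]

grid = np.linspace(-10.0, 10.0, 100001)
print(quantile_function(norm.cdf, 0.975, grid))  # approx. 1.96
print(norm.ppf(0.975))                           # SciPy's exact inverse CDF
```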
D/rvar.md (1 addition, 1 deletion)
@@ -33,4 +33,4 @@ username: "JoramSoch"
* formally, as a [measurable function](/D/meas-fct) $X$ defined on a [probability space](/D/prob-spc) $(\Omega, \mathcal{F}, P)$ that maps from a sample space $\Omega$ to the real numbers $\mathbb{R}$ using an event space $\mathcal{F}$ and a [probability function](/D/pmf) $P$;
- * more broadly, as any random quantity $X$ such as a [random scalar](/D/rvar), a [random vector](/D/rvec) or a [random matrix](/D/rmat).
+ * more broadly, as any random quantity $X$ such as a [random event](/D/reve), a [random scalar](/D/rvar), a [random vector](/D/rvec) or a [random matrix](/D/rmat).
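
To make the formal bullet above concrete: a random variable is just a function from outcomes to real numbers, with its distribution inherited from $P$. A toy Python sketch; the fair-die example is an illustrative assumption, not part of the file.

```python
from fractions import Fraction

# Sample space Omega and a uniform probability function P for a fair die
Omega = [1, 2, 3, 4, 5, 6]
P = {w: Fraction(1, 6) for w in Omega}

# A random variable X: Omega -> R, here the indicator of an even roll
def X(w):
    return 1.0 if w % 2 == 0 else 0.0

# The distribution of X is inherited from P on the sample space
prob_X_equals_1 = sum(P[w] for w in Omega if X(w) == 1.0)
print(prob_X_equals_1)  # 1/2
```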
P/gam-kl.md (3 additions, 3 deletions)
@@ -26,12 +26,12 @@ username: "JoramSoch"
---
- **Theorem:** Let $x$ be a [random variable](/D/rvar). Assume two [gamma distributions](/D/gam) $P$ and $Q$ specifying the probability distribution of $x$ as
+ **Theorem:** Let $X$ be a [random variable](/D/rvar). Assume two [gamma distributions](/D/gam) $P$ and $Q$ specifying the probability distribution of $X$ as
and the complexity penalty is the [Kullback-Leibler divergence](/D/kl) of [posterior](/D/post) from [prior](/D/prior)
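
The theorem edited above concerns the Kullback-Leibler divergence between two gamma distributions; the closed-form result itself is not visible in this excerpt. A Python sketch that assumes the standard shape-rate form of that divergence and checks it by numerical integration; the parameter values are arbitrary examples.

```python
import numpy as np
from scipy.special import gammaln, digamma
from scipy.stats import gamma
from scipy.integrate import quad

def gam_kl(a_p, b_p, a_q, b_q):
    """KL(P || Q) for P = Gam(a_p, b_p) and Q = Gam(a_q, b_q),
    in the shape/rate parameterization (assumed standard form)."""
    return (a_q * np.log(b_p / b_q)
            - gammaln(a_p) + gammaln(a_q)
            + (a_p - a_q) * digamma(a_p)
            + (b_q - b_p) * a_p / b_p)

a_p, b_p, a_q, b_q = 3.0, 2.0, 5.0, 1.0
P = gamma(a_p, scale=1.0 / b_p)  # SciPy uses shape/scale, so scale = 1/rate
Q = gamma(a_q, scale=1.0 / b_q)

# check against the defining integral of p(x) * log(p(x)/q(x))
kl_num, _ = quad(lambda x: P.pdf(x) * (P.logpdf(x) - Q.logpdf(x)), 0, np.inf)
print(gam_kl(a_p, b_p, a_q, b_q), kl_num)  # the two values should agree closely
```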
@@ -54,7 +54,7 @@ $$ \label{eq:Com}
$$
- **Proof:** We consider Bayesian inference on data $y$ using model $m$ with parameters $\theta$. Then, [Bayes' theorem](/P/bayes-th) makes a statement about the posterior distribution, i.e. the probability of parameters, given the data and the model:
+ **Proof:** We consider Bayesian inference on [data](/D/data) $y$ using [model](/D/gm) $m$ with parameters $\theta$. Then, [Bayes' theorem](/P/bayes-th) makes a statement about the [posterior distribution](/D/post), i.e. the probability of parameters, given the data and the model:
By definition, the left-hand side is the log model evidence and the terms on the right-hand side correspond to the posterior expectation of the log-likelihood function and the Kullback-Leibler divergence of posterior from prior
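
For orientation, the decomposition this proof line refers to is the standard identity for the log model evidence; the notation below is assumed rather than quoted from the file:

$$
\log p(y \vert m) = \left\langle \log p(y \vert \theta, m) \right\rangle_{p(\theta \vert y, m)} - \mathrm{KL} \left[ p(\theta \vert y, m) \, \big\vert\big\vert \, p(\theta \vert m) \right]
$$

where the first term is the posterior expectation of the log-likelihood (the accuracy) and the second is the Kullback-Leibler divergence of posterior from prior (the complexity penalty mentioned above).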