Commit 4056b2f

committed
corrected some pages
Several small mistakes/errors were corrected in several proofs/definitions.
1 parent e84890c commit 4056b2f

6 files changed

Lines changed: 22 additions & 18 deletions

D/cdf.md

Lines changed: 6 additions & 2 deletions

@@ -16,7 +16,7 @@ definition: "Cumulative distribution function"
 sources:
 - authors: "Wikipedia"
   year: 2020
-  title: "Probability density function"
+  title: "Cumulative distribution function"
   in: "Wikipedia, the free encyclopedia"
   pages: "retrieved on 2020-02-17"
   url: "https://en.wikipedia.org/wiki/Cumulative_distribution_function#Definition"
@@ -27,7 +27,11 @@ username: "JoramSoch"
 ---


-**Definition:**
+**Definition:** The cumulative distribution function (CDF) of a [random variable](/D/rvar) $X$ at a given value $x$ is defined as the [probability](/D/prob) that $X$ is smaller than $x$:
+
+$$ \label{eq:cdf}
+F_X(x) = \mathrm{Pr}(X \leq x) \; .
+$$

 1) Let $X$ be a discrete [random variable](/D/rvar) with possible outcomes $\mathcal{X}$ and the [probability mass function](/D/pmf) $f_X(x)$. Then, the function $F_X(x): \mathbb{R} \to [0,1]$ with
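The corrected definition can be sanity-checked numerically. This snippet is not part of the commit; the fair-die PMF and all names are illustrative only.

```python
from fractions import Fraction

# Quick check of F_X(x) = Pr(X <= x), using a fair six-sided die
# as the discrete example (illustrative, not from the repository).
pmf = {x: Fraction(1, 6) for x in range(1, 7)}   # f_X(x) = 1/6 for x = 1..6

def cdf(x):
    """F_X(x) = Pr(X <= x): sum the PMF over all outcomes not exceeding x."""
    return sum((p for v, p in pmf.items() if v <= x), Fraction(0))

# F_X is 0 below the support, 1/2 at the median, 1 at the upper end,
# and non-decreasing throughout -- exactly what a CDF must satisfy.
```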

D/mom.md

Lines changed: 1 addition & 1 deletion

@@ -30,5 +30,5 @@ username: "JoramSoch"
 **Definition:** Let $X$ be a [random variable](/D/rvar) and let $n$ be a positive integer. Then, the $n$-th moment of $X$, also called ($n$-th) "raw moment" or "crude moment", is defined as the [expected value](/D/mean) of the $n$-th power of $X$:

 $$ \label{eq:mom}
-\mu_n = \mathrm{E}[X^n] \; .
+\mu_n' = \mathrm{E}[X^n] \; .
 $$
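The renamed raw moment $\mu_n' = \mathrm{E}[X^n]$ can be illustrated with a small computation (not part of the commit; the fair-die PMF and function names are made up for this sketch).

```python
from fractions import Fraction

# mu'_n = E[X^n] for a fair six-sided die (illustrative example).
pmf = {x: Fraction(1, 6) for x in range((1), 7)}

def raw_moment(n):
    """mu'_n = E[X^n] = sum_x x^n * f_X(x)."""
    return sum(p * x**n for x, p in pmf.items())

# The first raw moment is the mean: E[X] = 7/2 for a fair die.
```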

P/bma-lme.md

Lines changed: 1 addition & 1 deletion

@@ -37,7 +37,7 @@ $$
 where $p(\theta \vert m_i,y)$ is the posterior distribution over $\theta$ obtained using $m_i$.


-**Proof:** According to the [law of marginal probability](/D/prob-marg), the probability of the shared parameters $\theta$ conditional on the measured data $y$ [can be obtained](/D/bma-der) by marginalizing over the discrete variable model $m$:
+**Proof:** According to the [law of marginal probability](/D/prob-marg), the probability of the shared parameters $\theta$ conditional on the measured data $y$ [can be obtained](/P/bma-der) by marginalizing over the discrete variable model $m$:

 $$ \label{eq:BMA-PMP}
 p(\theta|y) = \sum_{i=1}^{M} p(\theta|m_i,y) \cdot p(m_i|y) \; ,
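The marginalization in eq. (BMA-PMP) can be sketched as a finite mixture: model-specific posteriors weighted by posterior model probabilities. This toy example is not part of the commit; all numbers are invented for illustration.

```python
# Bayesian model averaging over a 3-point theta grid and two models:
# p(theta|y) = sum_i p(theta|m_i,y) * p(m_i|y). Numbers are made up.
post_m = [0.7, 0.3]          # p(m_i|y) for models m_1, m_2; sums to 1
post_theta = [
    [0.2, 0.5, 0.3],         # p(theta|m_1,y) over the theta grid
    [0.1, 0.1, 0.8],         # p(theta|m_2,y)
]

bma = [sum(pm * pt[k] for pm, pt in zip(post_m, post_theta))
       for k in range(3)]
# The mixture is again a proper probability distribution over theta.
```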

P/ent-nonneg.md

Lines changed: 6 additions & 6 deletions

@@ -36,23 +36,23 @@ $$
 **Proof:** The [entropy of a discrete random variable](/D/ent) is defined as

 $$ \label{eq:ent}
-\mathrm{H}(X) = - \sum_{i=1}^{k} p(x_i) \cdot \log_b p(x_i)
+\mathrm{H}(X) = - \sum_{x \in \mathcal{X}} p(x) \cdot \log_b p(x)
 $$

 The minus sign can be moved into the sum:

 $$ \label{eq:ent-dev}
-\mathrm{H}(X) = \sum_{i=1}^{k} \left[ p(x_i) \cdot \left( - \log_b p(x_i) \right) \right]
+\mathrm{H}(X) = \sum_{x \in \mathcal{X}} \left[ p(x) \cdot \left( - \log_b p(x) \right) \right]
 $$

 Because the co-domain of [probability mass functions](/D/pmf) is $[0,1]$, we can deduce:

 $$ \label{eq:nonneg}
 \begin{array}{rcccl}
-0 &\leq &p(x_i) &\leq &1 \\
--\infty &\leq &\log_b p(x_i) &\leq &0 \\
-0 &\leq &-\log_b p(x_i) &\leq &+\infty \\
-0 &\leq &p(x_i) \cdot \left(-\log_b p(x_i)\right) &\leq &+\infty \; .
+0 &\leq &p(x) &\leq &1 \\
+-\infty &\leq &\log_b p(x) &\leq &0 \\
+0 &\leq &-\log_b p(x) &\leq &+\infty \\
+0 &\leq &p(x) \cdot \left(-\log_b p(x)\right) &\leq &+\infty \; .
 \end{array}
 $$
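The corrected proof's key observation, that every summand $p(x) \cdot (-\log_b p(x))$ is nonnegative, can be checked numerically. This snippet is not part of the commit; the example distributions are illustrative.

```python
import math

# H(X) >= 0 because each term p(x) * (-log_b p(x)) is nonnegative.
def entropy(pmf, b=2):
    """H(X) = -sum_{x in X} p(x) * log_b p(x); zero-probability terms vanish."""
    return -sum(p * math.log(p, b) for p in pmf if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]   # maximum entropy on 4 outcomes: 2 bits
degenerate = [1.0, 0.0, 0.0, 0.0]    # all mass on one outcome: 0 bits
```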

P/gibbs-ineq.md

Lines changed: 6 additions & 6 deletions

@@ -43,32 +43,32 @@ $$
 Let $I$ be the set of all $x$ for which $p(x)$ is non-zero. Then, proving \eqref{eq:Gibbs-ineq} requires showing that

 $$ \label{eq:Gibbs-ineq-s1}
-\sum_{x \in I} p(x) \, \frac{\ln p(x)}{\ln q(x)} \geq 0 \; .
+\sum_{x \in I} p(x) \, \ln \frac{p(x)}{q(x)} \geq 0 \; .
 $$

 Because $\ln x \leq x - 1$, i.e. $-\ln x \geq 1 - x$, for all $x > 0$, with equality only if $x = 1$, we can say about the left-hand side that

 $$ \label{eq:Gibbs-ineq-s2}
 \begin{split}
-\sum_{x \in I} p(x) \, \frac{\ln p(x)}{\ln q(x)} &\geq \sum_{x \in I} p(x) \left( 1 - \frac{q(x)}{p(x)} \right) \\
+\sum_{x \in I} p(x) \, \ln \frac{p(x)}{q(x)} &\geq \sum_{x \in I} p(x) \left( 1 - \frac{q(x)}{p(x)} \right) \\
 &= \sum_{x \in I} p(x) - \sum_{x \in I} q(x) \; .
 \end{split}
 $$

-Finally, since $p(x)$ and $q(x)$ are [probability mass functions](/D/pmf), it holds that $0 \leq p(x),q(x) \leq 1$ and we also have
+Finally, since $p(x)$ and $q(x)$ are [probability mass functions](/D/pmf), we have

 $$ \label{eq:p-q-pmf}
 \begin{split}
-\sum_{x \in I} p(x) &= 1 \\
-\sum_{x \in I} q(x) &\leq 1 \; ,
+0 \leq p(x) \leq 1, \quad \sum_{x \in I} p(x) &= 1 \quad \text{and} \\
+0 \leq q(x) \leq 1, \quad \sum_{x \in I} q(x) &\leq 1 \; ,
 \end{split}
 $$

 such that it follows from \eqref{eq:Gibbs-ineq-s2} that

 $$ \label{eq:Gibbs-ineq-s3}
 \begin{split}
-\sum_{x \in I} p(x) \, \frac{\ln p(x)}{\ln q(x)} &\geq \sum_{x \in I} p(x) - \sum_{x \in I} q(x) \\
+\sum_{x \in I} p(x) \, \ln \frac{p(x)}{q(x)} &\geq \sum_{x \in I} p(x) - \sum_{x \in I} q(x) \\
 &= 1 - \sum_{x \in I} q(x) \geq 0 \; .
 \end{split}
 $$
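The corrected left-hand side $\sum_{x \in I} p(x) \ln \frac{p(x)}{q(x)}$ can be spot-checked numerically: it is strictly positive for distinct distributions and exactly zero when $p = q$. This snippet is not part of the commit; the distributions are made up.

```python
import math

# Gibbs' inequality: sum_{x in I} p(x) * ln(p(x)/q(x)) >= 0, equality iff p = q.
def gibbs_lhs(p, q):
    """Sum over the support I of p (terms with p(x) = 0 are dropped)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.2, 0.3, 0.5]
# gibbs_lhs(p, q) is strictly positive; gibbs_lhs(p, p) is exactly zero.
```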

P/kl-nonneg.md

Lines changed: 2 additions & 2 deletions

@@ -39,13 +39,13 @@ with $\mathrm{KL}[P \vert \vert Q] = 0$, if and only if $P = Q$.
 **Proof:** The discrete [Kullback-Leibler divergence](/D/kl) is defined as

 $$ \label{eq:KL}
-\mathrm{KL}[P||Q] = \sum_{x \in \mathcal{X}} p(x) \cdot \log \frac{p(x)}{q(x)} \, \mathrm{d}x
+\mathrm{KL}[P||Q] = \sum_{x \in \mathcal{X}} p(x) \cdot \log \frac{p(x)}{q(x)}
 $$

 which can be reformulated into

 $$ \label{eq:KL-dev}
-\mathrm{KL}[P||Q] = \sum_{x \in \mathcal{X}} p(x) \cdot \log p(x) - \sum_{x \in \mathcal{X}} p(x) \cdot \log q(x) \, \mathrm{d}x \; .
+\mathrm{KL}[P||Q] = \sum_{x \in \mathcal{X}} p(x) \cdot \log p(x) - \sum_{x \in \mathcal{X}} p(x) \cdot \log q(x) \; .
 $$

 [Gibbs' inequality](/P/gibbs-ineq) states that the [entropy](/D/ent) of a probability distribution is always less than or equal to the [cross-entropy](/D/ent-cross) with another probability distribution – with equality only if the distributions are identical –,
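With the spurious $\mathrm{d}x$ removed, eq. (KL) and its reformulation eq. (KL-dev) should agree exactly, and both should be nonnegative. A quick check (not part of the commit; distributions are illustrative):

```python
import math

def kl(p, q):
    """KL[P||Q] = sum_x p(x) * log(p(x)/q(x)) over the support of p."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def kl_split(p, q):
    """The reformulated version: sum p log p - sum p log q."""
    return (sum(pi * math.log(pi) for pi in p if pi > 0)
            - sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0))

p = [0.4, 0.4, 0.2]
q = [0.3, 0.3, 0.4]
# kl(p, q) matches kl_split(p, q) up to rounding; kl(p, p) is zero.
```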
