
Commit a7453ca

Merge pull request #118 from StatProofBook/master
update to master
2 parents f846dc2 + f057011 commit a7453ca

15 files changed

Lines changed: 36 additions & 32 deletions

D/iass.md

Lines changed: 2 additions & 2 deletions
@@ -33,7 +33,7 @@ $$ \label{eq:anova}
 y_{ijk} = \mu + \alpha_i + \beta_j + \gamma_{ij} + \varepsilon_{ijk}, \; \varepsilon_{ijk} \overset{\mathrm{i.i.d.}}{\sim} \mathcal{N}(0, \sigma^2) \; .
 $$

-Then, the interaction sum of squares is defined as the [explained sum of squares] (ESS) for each interaction, i.e. as the sum of squared deviations of the average for each cell from the average across all observations, controlling for the [treatment sums of squares](/D/trss) of the corresponding factors:
+Then, the interaction sum of squares is defined as the [explained sum of squares](/D/ess) (ESS) for each interaction, i.e. as the sum of squared deviations of the average for each cell from the average across all observations, controlling for the [treatment sums of squares](/D/trss) of the corresponding factors:

 $$ \label{eq:iass}
 \begin{split}
@@ -42,4 +42,4 @@ $$ \label{eq:iass}
 \end{split}
 $$

-Here, $\bar{y} _{i j \bullet}$ is the mean for the $(i,j)$-th cell (out of $a \times b$ cells), computed from $n_{ij}$ values $y_{ijk}$, $\bar{y} _{i \bullet \bullet}$ and $\bar{y} _{\bullet j \bullet}$ are the level means for the two factors and and $\bar{y} _{\bullet \bullet \bullet}$ is the mean across all values $y_{ijk}$.
+Here, $\bar{y}\_{i j \bullet}$ is the mean for the $(i,j)$-th cell (out of $a \times b$ cells), computed from $n\_{ij}$ values $y\_{ijk}$, $\bar{y}\_{i \bullet \bullet}$ and $\bar{y}\_{\bullet j \bullet}$ are the level means for the two factors, and $\bar{y}\_{\bullet \bullet \bullet}$ is the mean across all values $y\_{ijk}$.
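Since eq. \eqref{eq:iass} itself falls outside the hunk, here is a minimal numeric sketch of the quantity the changed sentence describes, assuming the standard balanced two-way form $\mathrm{SS}_{A \times B} = \sum_{i,j} n_{ij} (\bar{y}_{ij\bullet} - \bar{y}_{i\bullet\bullet} - \bar{y}_{\bullet j\bullet} + \bar{y}_{\bullet\bullet\bullet})^2$; the data are made up:

```python
# Hypothetical 2x2 cell data (factor A x factor B), n_ij = 2 observations per cell.
cells = {
    (0, 0): [1.0, 2.0], (0, 1): [3.0, 4.0],
    (1, 0): [2.0, 3.0], (1, 1): [6.0, 7.0],
}

def mean(vals):
    return sum(vals) / len(vals)

grand = mean([y for ys in cells.values() for y in ys])   # mean over all y_ijk

def factor_mean(axis, level):
    # level mean: y_bar_{i..} for axis 0, y_bar_{.j.} for axis 1
    return mean([y for key, ys in cells.items() if key[axis] == level for y in ys])

# deviation of each cell mean from the grand mean, with both main effects removed
ss_int = sum(
    len(ys) * (mean(ys) - factor_mean(0, i) - factor_mean(1, j) + grand) ** 2
    for (i, j), ys in cells.items()
)
print(ss_int)   # 2.0 for this toy data
```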

D/trss.md

Lines changed: 2 additions & 2 deletions
@@ -33,10 +33,10 @@ $$ \label{eq:anova}
 y_{ij} = \mu + \delta_i + \varepsilon_{ij}, \; \varepsilon_{ij} \overset{\mathrm{i.i.d.}}{\sim} \mathcal{N}(0, \sigma^2) \; .
 $$

-Then, the treatment sum of squares is defined as the [explained sum of squares] (ESS) for each main effect, i.e. as the sum of squared deviations of the average for each level of the factor, from the average across all observations:
+Then, the treatment sum of squares is defined as the [explained sum of squares](/D/ess) (ESS) for each main effect, i.e. as the sum of squared deviations of the average for each level of the factor from the average across all observations:

 $$ \label{eq:trss}
 \mathrm{SS}_\mathrm{treat} = \sum_{i=1}^{k} \sum_{j=1}^{n_i} (\bar{y}_i - \bar{y})^2 \; .
 $$

-Here, $\bar{y}_i$ is the mean for the $i$-th level of the factor (out of $k$ levels), computed from $n_i$ values $y_{ij}$, and $\bar{y}$ is the mean across all values $y_{ij}$.
+Here, $\bar{y}\_i$ is the mean for the $i$-th level of the factor (out of $k$ levels), computed from $n_i$ values $y\_{ij}$, and $\bar{y}$ is the mean across all values $y\_{ij}$.
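The definition in eq. \eqref{eq:trss} is easy to check numerically; a minimal pure-Python sketch on made-up data:

```python
# Toy one-factor data: k = 3 levels with n_i = 3, 2, 4 observations.
groups = [[1.0, 2.0, 3.0], [4.0, 6.0], [5.0, 5.0, 5.0, 5.0]]

n = sum(len(g) for g in groups)
grand = sum(y for g in groups for y in g) / n   # y_bar, mean over all values

# eq. (trss): the inner sum over j contributes n_i identical terms,
# so SS_treat = sum_i n_i * (y_bar_i - y_bar)^2
ss_treat = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
print(ss_treat)   # 18.0 for this data
```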

I/PbA.md

Lines changed: 4 additions & 3 deletions
@@ -8,7 +8,7 @@ title: "Proof by Author"

 - [Covariance matrix of the multinomial distribution](/P/mult-cov)

-### JoramSoch (393 proofs)
+### JoramSoch (394 proofs)

 - [Accuracy and complexity for the univariate Gaussian](/P/ug-anc)
 - [Accuracy and complexity for the univariate Gaussian with known variance](/P/ugkv-anc)
@@ -142,6 +142,7 @@ title: "Proof by Author"
 - [Kullback-Leibler divergence for the binomial distribution](/P/bin-kl)
 - [Kullback-Leibler divergence for the continuous uniform distribution](/P/cuni-kl)
 - [Kullback-Leibler divergence for the Dirichlet distribution](/P/dir-kl)
+- [Kullback-Leibler divergence for the discrete uniform distribution](/P/duni-kl)
 - [Kullback-Leibler divergence for the gamma distribution](/P/gam-kl)
 - [Kullback-Leibler divergence for the matrix-normal distribution](/P/matn-kl)
 - [Kullback-Leibler divergence for the multivariate normal distribution](/P/mvn-kl)
@@ -433,8 +434,8 @@ title: "Proof by Author"
 - [Encompassing prior method for computing Bayes factors](/P/bf-ep)
 - [Mean of the ex-Gaussian distribution](/P/exg-mean)
 - [Mean of the Wald distribution](/P/wald-mean)
-- [Method of moments for ex-Gaussian distributed data](/P/exg-mome)
-- [Method of moments for Wald distributed data](/P/wald-mome)
+- [Method of moments for ex-Gaussian-distributed data](/P/exg-mome)
+- [Method of moments for Wald-distributed data](/P/wald-mome)
 - [Moment-generating function of the ex-Gaussian distribution](/P/exg-mgf)
 - [Moment-generating function of the exponential distribution](/P/exp-mgf)
 - [Moment-generating function of the Wald distribution](/P/wald-mgf)

I/PbN.md

Lines changed: 3 additions & 2 deletions
@@ -428,5 +428,6 @@ title: "Proof by Number"
 | P420 | bin-kl | [Kullback-Leibler divergence for the binomial distribution](/P/bin-kl) | JoramSoch | 2023-10-20 |
 | P421 | wald-skew | [Skewness of the Wald distribution](/P/wald-skew) | tomfaulkenberry | 2023-10-24 |
 | P422 | cuni-kl | [Kullback-Leibler divergence for the continuous uniform distribution](/P/cuni-kl) | JoramSoch | 2023-10-27 |
-| P423 | wald-mome | [Method of moments for Wald distributed data](/P/wald-mome) | tomfaulkenberry | 2023-10-30 |
-| P424 | exg-mome | [Method of moments for ex-Gaussian distributed data](/P/exg-mome) | tomfaulkenberry | 2023-10-30 |
+| P423 | wald-mome | [Method of moments for Wald-distributed data](/P/wald-mome) | tomfaulkenberry | 2023-10-30 |
+| P424 | exg-mome | [Method of moments for ex-Gaussian-distributed data](/P/exg-mome) | tomfaulkenberry | 2023-10-30 |
+| P425 | duni-kl | [Kullback-Leibler divergence for the discrete uniform distribution](/P/duni-kl) | JoramSoch | 2023-11-17 |

I/PbT.md

Lines changed: 3 additions & 2 deletions
@@ -169,6 +169,7 @@ title: "Proof by Topic"
 - [Kullback-Leibler divergence for the binomial distribution](/P/bin-kl)
 - [Kullback-Leibler divergence for the continuous uniform distribution](/P/cuni-kl)
 - [Kullback-Leibler divergence for the Dirichlet distribution](/P/dir-kl)
+- [Kullback-Leibler divergence for the discrete uniform distribution](/P/duni-kl)
 - [Kullback-Leibler divergence for the gamma distribution](/P/gam-kl)
 - [Kullback-Leibler divergence for the matrix-normal distribution](/P/matn-kl)
 - [Kullback-Leibler divergence for the multivariate normal distribution](/P/mvn-kl)
@@ -253,8 +254,8 @@ title: "Proof by Topic"
 - [Median of the normal distribution](/P/norm-med)
 - [Method of moments for beta-binomial data](/P/betabin-mome)
 - [Method of moments for beta-distributed data](/P/beta-mome)
-- [Method of moments for ex-Gaussian distributed data](/P/exg-mome)
-- [Method of moments for Wald distributed data](/P/wald-mome)
+- [Method of moments for ex-Gaussian-distributed data](/P/exg-mome)
+- [Method of moments for Wald-distributed data](/P/wald-mome)
 - [Mode of the continuous uniform distribution](/P/cuni-med)
 - [Mode of the exponential distribution](/P/exp-mode)
 - [Mode of the log-normal distribution](/P/lognorm-mode)

I/PwS.md

Lines changed: 3 additions & 2 deletions
@@ -47,6 +47,7 @@ title: "Proofs without Source"
 - [Joint likelihood is the product of likelihood function and prior density](/P/jl-lfnprior)
 - [Kullback-Leibler divergence for the Bernoulli distribution](/P/bern-kl)
 - [Kullback-Leibler divergence for the continuous uniform distribution](/P/cuni-kl)
+- [Kullback-Leibler divergence for the discrete uniform distribution](/P/duni-kl)
 - [Kullback-Leibler divergence for the matrix-normal distribution](/P/matn-kl)
 - [Kullback-Leibler divergence for the normal distribution](/P/norm-kl)
 - [Linear combination of independent normal random variables](/P/norm-lincomb)
@@ -85,8 +86,8 @@ title: "Proofs without Source"
 - [Median of the exponential distribution](/P/exp-med)
 - [Median of the log-normal distribution](/P/lognorm-med)
 - [Median of the normal distribution](/P/norm-med)
-- [Method of moments for ex-Gaussian distributed data](/P/exg-mome)
-- [Method of moments for Wald distributed data](/P/wald-mome)
+- [Method of moments for ex-Gaussian-distributed data](/P/exg-mome)
+- [Method of moments for Wald-distributed data](/P/wald-mome)
 - [Mode of the continuous uniform distribution](/P/cuni-med)
 - [Mode of the exponential distribution](/P/exp-mode)
 - [Mode of the normal distribution](/P/norm-mode)

P/anova1-fols.md

Lines changed: 2 additions & 2 deletions
@@ -40,13 +40,13 @@ F = \frac{\frac{1}{k-1} \sum_{i=1}^{k} n_i \hat{\delta}_i^2}{\frac{1}{n-k} \sum_
 $$


-**Theorem:** The [F-statistic for the main effect in one-way ANOVA](/P/anova1-f) is given in terms of the [sample means](/D/mean-samp) as
+**Proof:** The [F-statistic for the main effect in one-way ANOVA](/P/anova1-f) is given in terms of the [sample means](/D/mean-samp) as

 $$ \label{eq:anova1-f}
 F = \frac{\frac{1}{k-1} \sum_{i=1}^{k} n_i (\bar{y}_i - \bar{y})^2}{\frac{1}{n-k} \sum_{i=1}^{k} \sum_{j=1}^{n_i} (y_{ij} - \bar{y}_i)^2}
 $$

-where $\bar{y} _i$ is the average of all values $y_{ij}$ from category $i$ and $\bar{y}$ is the grand mean of all values $y_{ij}$ from all categories $i = 1, \ldots, k$.
+where $\bar{y}\_i$ is the average of all values $y\_{ij}$ from category $i$ and $\bar{y}$ is the grand mean of all values $y\_{ij}$ from all categories $i = 1, \ldots, k$.

 1) The [ordinary least squares estimates for one-way ANOVA](/P/anova1-ols) are

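The F-statistic in eq. \eqref{eq:anova1-f} can be verified on toy data; a minimal pure-Python sketch (the group values are made up):

```python
# Toy data: k = 3 groups of unequal size; F computed straight from the formula.
groups = [[1.0, 2.0, 3.0], [4.0, 6.0], [5.0, 5.0, 5.0, 5.0]]

k = len(groups)
n = sum(len(g) for g in groups)
grand = sum(y for g in groups for y in g) / n        # grand mean y_bar
means = [sum(g) / len(g) for g in groups]            # group means y_bar_i

# numerator: 1/(k-1) * sum_i n_i * (y_bar_i - y_bar)^2
ms_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means)) / (k - 1)
# denominator: 1/(n-k) * sum_i sum_j (y_ij - y_bar_i)^2
ms_within = sum((y - m) ** 2 for g, m in zip(groups, means) for y in g) / (n - k)

F = ms_between / ms_within
print(F)   # 13.5 for this data
```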
P/anova2-pss.md

Lines changed: 2 additions & 2 deletions
@@ -45,7 +45,7 @@ $$ \label{eq:anova2-pss}
 \mathrm{SS}_\mathrm{tot} = \mathrm{SS}_{A} + \mathrm{SS}_{B} + \mathrm{SS}_{A \times B} + \mathrm{SS}_\mathrm{res}
 $$

-where $\mathrm{SS} _\mathrm{tot}$ is the [total sum of squares](/D/tss), $\mathrm{SS} _{A}$, $\mathrm{SS} _{B}$ and $\mathrm{SS} _{A \times B}$ are [treatment](/D/trss) and [interaction sum of squares](/D/iass) (summing into the [explained sum of squares](/D/ess)) and $\mathrm{SS} _\mathrm{res}$ is the [residual sum of squares](/D/rss).
+where $\mathrm{SS}\_\mathrm{tot}$ is the [total sum of squares](/D/tss), $\mathrm{SS}\_{A}$, $\mathrm{SS}\_{B}$ and $\mathrm{SS}\_{A \times B}$ are [treatment](/D/trss) and [interaction sum of squares](/D/iass) (summing into the [explained sum of squares](/D/ess)) and $\mathrm{SS}\_\mathrm{res}$ is the [residual sum of squares](/D/rss).


 **Proof:** The [total sum of squares](/D/tss) for [two-way ANOVA](/D/anova2) is given by
@@ -54,7 +54,7 @@ $$ \label{eq:anova2-tss}
 \mathrm{SS}_\mathrm{tot} = \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} (y_{ijk} - \bar{y}_{\bullet \bullet \bullet})^2
 $$

-where $\bar{y} _{\bullet \bullet \bullet}$ is the mean across all values $y_{ijk}$. This can be rewritten as
+where $\bar{y}\_{\bullet \bullet \bullet}$ is the mean across all values $y\_{ijk}$. This can be rewritten as

 $$ \label{eq:anova2-pss-s1}
 \begin{split}
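The partition in eq. \eqref{eq:anova2-pss} can be confirmed numerically; the sketch below uses made-up data for a balanced design (equal $n_{ij}$), for which the decomposition is exact:

```python
# Balanced 2x2 toy design (n_ij = 2 per cell); the orthogonal partition
# SS_tot = SS_A + SS_B + SS_AxB + SS_res requires equal cell sizes.
cells = {
    (0, 0): [1.0, 2.0], (0, 1): [3.0, 4.0],
    (1, 0): [2.0, 3.0], (1, 1): [6.0, 7.0],
}

def mean(vals):
    return sum(vals) / len(vals)

all_vals = [y for ys in cells.values() for y in ys]
grand = mean(all_vals)   # y_bar over all observations
row = {i: mean([y for (a, _b), ys in cells.items() if a == i for y in ys]) for i in (0, 1)}
col = {j: mean([y for (_a, b), ys in cells.items() if b == j for y in ys]) for j in (0, 1)}

ss_tot = sum((y - grand) ** 2 for y in all_vals)
ss_a = sum(len(ys) * (row[i] - grand) ** 2 for (i, _), ys in cells.items())
ss_b = sum(len(ys) * (col[j] - grand) ** 2 for (_, j), ys in cells.items())
ss_ab = sum(len(ys) * (mean(ys) - row[i] - col[j] + grand) ** 2 for (i, j), ys in cells.items())
ss_res = sum((y - mean(ys)) ** 2 for ys in cells.values() for y in ys)

print(ss_tot, ss_a + ss_b + ss_ab + ss_res)   # both 30.0
```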

P/bin-ent.md

Lines changed: 1 addition & 1 deletion
@@ -87,4 +87,4 @@ $$ \label{eq:bin-ent-s4}
 \mathrm{H}(X) = n \cdot \mathrm{H}_\mathrm{bern}(p) - \mathrm{E}_\mathrm{lbc}(n,p) \; .
 $$

-Note that, because $0 \leq \mathrm{H}_\mathrm{bern}(p) \leq 1$, we have $0 \leq n \cdot \mathrm{H}_\mathrm{bern}(p) \leq n$, and because the [entropy is non-negative](/P/ent-nonneg), it must hold that $n \geq \mathrm{E}_\mathrm{lbc}(n,p) \geq 0$.
+Note that, because $0 \leq \mathrm{H}\_\mathrm{bern}(p) \leq 1$, we have $0 \leq n \cdot \mathrm{H}\_\mathrm{bern}(p) \leq n$, and because the [entropy is non-negative](/P/ent-nonneg), it must hold that $n \geq \mathrm{E}\_\mathrm{lbc}(n,p) \geq 0$.
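The identity in eq. \eqref{eq:bin-ent-s4} and the bound discussed in the changed line can be checked numerically; a sketch using base-2 logarithms (so that $0 \leq \mathrm{H}_\mathrm{bern}(p) \leq 1$ as stated), with made-up $n$ and $p$:

```python
import math

n, p = 10, 0.3

def pmf(x):
    # binomial probability mass: C(n, x) * p^x * (1-p)^(n-x)
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

# direct Shannon entropy of Bin(n, p), in bits
H = -sum(pmf(x) * math.log2(pmf(x)) for x in range(n + 1))

# right-hand side of eq. (bin-ent-s4): n * H_bern(p) - E_lbc(n, p)
H_bern = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
E_lbc = sum(pmf(x) * math.log2(math.comb(n, x)) for x in range(n + 1))

print(H, n * H_bern - E_lbc)   # the two values agree
```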

P/duni-kl.md

Lines changed: 1 addition & 1 deletion
@@ -43,7 +43,7 @@ $$ \label{eq:KL-disc}
 \mathrm{KL}[P\,||\,Q] = \sum_{x \in \mathcal{X}} p(x) \, \ln \frac{p(x)}{q(x)} \; .
 $$

-This means that the KL divergence of $P$ from $Q$ is only defined, if for all $x \in \mathcal{X}$, $q(x) = 0$ implies $p(x) = 0$. Thus, $\mathrm{KL}[P\,||\,Q]$ only exists, if $a_2 \leq a_1$ and $b_1 \leq b_2$, i.e. if $P$ only places non-zero probability where $Q$ also places non-zero probability, such that $q(x)$ is not zero for any $x \in \mathcal{X}$ where $p(x)$ is positive.
+This means that the KL divergence of $P$ from $Q$ is only defined, if for all $x \in \mathcal{X}$, $q(x) = 0$ implies $p(x) = 0$. Thus, $\mathrm{KL}[P\,\vert\vert\,Q]$ only exists, if $a_2 \leq a_1$ and $b_1 \leq b_2$, i.e. if $P$ only places non-zero probability where $Q$ also places non-zero probability, such that $q(x)$ is not zero for any $x \in \mathcal{X}$ where $p(x)$ is positive.

 If this requirement is fulfilled, we can write
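The existence condition in the changed line can be illustrated numerically; a sketch with made-up bounds satisfying $a_2 \leq a_1$ and $b_1 \leq b_2$, where the sum works out to $\ln(n_2/n_1)$:

```python
import math

# P = U(a1, b1), Q = U(a2, b2): discrete uniforms over the integers a..b.
a1, b1 = 3, 6   # support of P: {3, ..., 6}, n1 = 4 points
a2, b2 = 1, 8   # support of Q: {1, ..., 8}, n2 = 8 points
assert a2 <= a1 and b1 <= b2   # existence condition: support(P) inside support(Q)

n1, n2 = b1 - a1 + 1, b2 - a2 + 1
p = {x: 1 / n1 for x in range(a1, b1 + 1)}
q = {x: 1 / n2 for x in range(a2, b2 + 1)}

# KL[P||Q] = sum_x p(x) * ln(p(x)/q(x)); points with p(x) = 0 contribute nothing
kl = sum(px * math.log(px / q[x]) for x, px in p.items())
print(kl)   # ln(n2/n1) = ln 2 here
```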
