
Commit a4f1f5d

Merge pull request #191 from JoramSoch/master
added 2 definitions and corrected 3 proofs
2 parents 7b635e0 + 99f9395

6 files changed: 161 additions & 490 deletions

D/iass.md

Lines changed: 45 additions & 0 deletions
@@ -0,0 +1,45 @@
+---
+layout: definition
+mathjax: true
+
+author: "Joram Soch"
+affiliation: "BCCN Berlin"
+e_mail: "joram.soch@bccn-berlin.de"
+date: 2022-12-14 13:14:00
+
+title: "Interaction sum of squares"
+chapter: "Statistical Models"
+section: "Univariate normal data"
+topic: "Analysis of variance"
+definition: "Interaction sum of squares"
+
+sources:
+- authors: "Nandy, Siddhartha"
+  year: 2018
+  title: "Two-Way Analysis of Variance"
+  in: "Stat 512: Applied Regression Analysis"
+  pages: "Purdue University, Summer 2018, Ch. 19"
+  url: "https://www.stat.purdue.edu/~snandy/stat512/topic7.pdf"
+
+def_id: "D184"
+shortcut: "iass"
+username: "JoramSoch"
+---
+
+
+**Definition:** Let there be an analysis of variance (ANOVA) model with [two](/D/anova2) or [more](/D/anovan) factors influencing the measured data $y$ (here, using the [standard formulation](/P/anova2-pss) of [two-way ANOVA](/D/anova2)):
+
+$$ \label{eq:anova}
+y_{ijk} = \mu + \alpha_i + \beta_j + \gamma_{ij} + \varepsilon_{ijk}, \; \varepsilon_{ijk} \overset{\mathrm{i.i.d.}}{\sim} \mathcal{N}(0, \sigma^2) \; .
+$$
+
+Then, the interaction sum of squares is defined as the [explained sum of squares](/D/ess) (ESS) for each interaction, i.e. as the sum of squared deviations of the average for each cell from the average across all observations, controlling for the [treatment sums of squares](/D/trss) of the corresponding factors:
+
+\begin{equation} \label{eq:iass}
+\begin{split}
+\mathrm{SS}_\mathrm{A \times B} &= \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} ([\bar{y}_{i j \bullet} - \bar{y}_{\bullet \bullet \bullet}] - [\bar{y}_{i \bullet \bullet} - \bar{y}_{\bullet \bullet \bullet}] - [\bar{y}_{\bullet j \bullet} - \bar{y}_{\bullet \bullet \bullet}])^2 \\
+&= \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} (\bar{y}_{i j \bullet} - \bar{y}_{i \bullet \bullet} - \bar{y}_{\bullet j \bullet} + \bar{y}_{\bullet \bullet \bullet})^2 \; .
+\end{split}
+\end{equation}
+
+Here, $\bar{y} _{i j \bullet}$ is the mean for the $(i,j)$-th cell (out of $a \times b$ cells), computed from $n_{ij}$ values $y_{ijk}$, $\bar{y} _{i \bullet \bullet}$ and $\bar{y} _{\bullet j \bullet}$ are the level means for the two factors, and $\bar{y} _{\bullet \bullet \bullet}$ is the mean across all values $y_{ijk}$.
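As a numerical illustration of this definition, here is a minimal Python sketch. It assumes balanced data (all $n_{ij}$ equal), so the triple sum over $(i,j,k)$ reduces to a factor of $n_{ij}$ per cell; the data are invented for illustration.

```python
# Minimal sketch of SS_AxB for balanced two-way data (all n_ij equal);
# factor levels and observations are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
a, b, n_ij = 2, 3, 4                 # a x b cells, n_ij observations each
y = rng.normal(size=(a, b, n_ij))    # y[i, j, k]

y_bar_ij = y.mean(axis=2)            # cell means      ybar_{ij.}
y_bar_i = y.mean(axis=(1, 2))        # factor A means  ybar_{i..}
y_bar_j = y.mean(axis=(0, 2))        # factor B means  ybar_{.j.}
y_bar = y.mean()                     # grand mean      ybar_{...}

# The summand is constant in k, so the inner sum contributes a factor n_ij:
dev = y_bar_ij - y_bar_i[:, None] - y_bar_j[None, :] + y_bar
ss_axb = n_ij * np.sum(dev**2)
```

For unbalanced designs, each cell's squared deviation would instead be weighted by its own $n_{ij}$, exactly as in the triple sum of \eqref{eq:iass}.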

D/trss.md

Lines changed: 42 additions & 0 deletions
@@ -0,0 +1,42 @@
+---
+layout: definition
+mathjax: true
+
+author: "Joram Soch"
+affiliation: "BCCN Berlin"
+e_mail: "joram.soch@bccn-berlin.de"
+date: 2022-12-14 13:01:00
+
+title: "Treatment sum of squares"
+chapter: "Statistical Models"
+section: "Univariate normal data"
+topic: "Analysis of variance"
+definition: "Treatment sum of squares"
+
+sources:
+- authors: "Wikipedia"
+  year: 2022
+  title: "Analysis of variance"
+  in: "Wikipedia, the free encyclopedia"
+  pages: "retrieved on 2022-11-15"
+  url: "https://en.wikipedia.org/wiki/Analysis_of_variance#Partitioning_of_the_sum_of_squares"
+
+def_id: "D183"
+shortcut: "trss"
+username: "JoramSoch"
+---
+
+
+**Definition:** Let there be an analysis of variance (ANOVA) model with [one](/D/anova1), [two](/D/anova2) or [multiple](/D/anovan) factors influencing the measured data $y$ (here, using the [reparametrized version](/P/anova1-repara) of [one-way ANOVA](/D/anova1)):
+
+$$ \label{eq:anova}
+y_{ij} = \mu + \delta_i + \varepsilon_{ij}, \; \varepsilon_{ij} \overset{\mathrm{i.i.d.}}{\sim} \mathcal{N}(0, \sigma^2) \; .
+$$
+
+Then, the treatment sum of squares is defined as the [explained sum of squares](/D/ess) (ESS) for each main effect, i.e. as the sum of squared deviations of the average for each level of the factor from the average across all observations:
+
+$$ \label{eq:trss}
+\mathrm{SS}_\mathrm{treat} = \sum_{i=1}^{k} \sum_{j=1}^{n_i} (\bar{y}_i - \bar{y})^2 \; .
+$$
+
+Here, $\bar{y}_i$ is the mean for the $i$-th level of the factor (out of $k$ levels), computed from $n_i$ values $y_{ij}$, and $\bar{y}$ is the mean across all values $y_{ij}$.
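As a numerical illustration, a minimal Python sketch of this definition; the group sizes $n_i$ and data are invented. Since the summand is constant in $j$, each level contributes $n_i (\bar{y}_i - \bar{y})^2$.

```python
# Minimal sketch of SS_treat for one-way ANOVA; invented illustrative data.
import numpy as np

groups = [np.array([4.1, 3.9, 4.4]),       # level 1, n_1 = 3
          np.array([5.0, 5.2, 4.8, 5.1]),  # level 2, n_2 = 4
          np.array([3.2, 3.5])]            # level 3, n_3 = 2

y_all = np.concatenate(groups)
y_bar = y_all.mean()                       # grand mean over all y_ij

# Inner sum over j is constant, so each level contributes n_i * (ybar_i - ybar)^2:
ss_treat = sum(g.size * (g.mean() - y_bar)**2 for g in groups)
```

A quick sanity check is the one-way partition of sums of squares: the total sum of squares equals SS_treat plus the residual sum of squares.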

I/ToC.md

Lines changed: 15 additions & 13 deletions
@@ -565,19 +565,21 @@ title: "Table of Contents"
 
 1.3. Analysis of variance <br>
 &emsp;&ensp; 1.3.1. *[One-way ANOVA](/D/anova1)* <br>
-&emsp;&ensp; 1.3.2. **[Ordinary least squares for one-way ANOVA](/P/anova1-ols)** <br>
-&emsp;&ensp; 1.3.3. **[Sums of squares in one-way ANOVA](/P/anova1-pss)** <br>
-&emsp;&ensp; 1.3.4. **[F-test for main effect in one-way ANOVA](/P/anova1-f)** <br>
-&emsp;&ensp; 1.3.5. **[F-statistic in terms of OLS estimates](/P/anova1-fols)** <br>
-&emsp;&ensp; 1.3.6. **[Reparametrization of one-way ANOVA](/P/anova1-repara)** <br>
-&emsp;&ensp; 1.3.7. *[Two-way ANOVA](/D/anova2)* <br>
-&emsp;&ensp; 1.3.8. **[Ordinary least squares for two-way ANOVA](/P/anova2-ols)** <br>
-&emsp;&ensp; 1.3.9. **[Sums of squares in two-way ANOVA](/P/anova2-pss)** <br>
-&emsp;&ensp; 1.3.10. **[Cochran's theorem for two-way ANOVA](/P/anova2-cochran)** <br>
-&emsp;&ensp; 1.3.11. **[F-test for main effect in two-way ANOVA](/P/anova2-fme)** <br>
-&emsp;&ensp; 1.3.12. **[F-test for interaction in two-way ANOVA](/P/anova2-fia)** <br>
-&emsp;&ensp; 1.3.13. **[F-test for grand mean in two-way ANOVA](/P/anova2-fgm)** <br>
-&emsp;&ensp; 1.3.14. **[F-statistics in terms of OLS estimates](/P/anova2-fols)** <br>
+&emsp;&ensp; 1.3.2. *[Treatment sum of squares](/D/trss)* <br>
+&emsp;&ensp; 1.3.3. **[Ordinary least squares for one-way ANOVA](/P/anova1-ols)** <br>
+&emsp;&ensp; 1.3.4. **[Sums of squares in one-way ANOVA](/P/anova1-pss)** <br>
+&emsp;&ensp; 1.3.5. **[F-test for main effect in one-way ANOVA](/P/anova1-f)** <br>
+&emsp;&ensp; 1.3.6. **[F-statistic in terms of OLS estimates](/P/anova1-fols)** <br>
+&emsp;&ensp; 1.3.7. **[Reparametrization of one-way ANOVA](/P/anova1-repara)** <br>
+&emsp;&ensp; 1.3.8. *[Two-way ANOVA](/D/anova2)* <br>
+&emsp;&ensp; 1.3.9. *[Interaction sum of squares](/D/iass)* <br>
+&emsp;&ensp; 1.3.10. **[Ordinary least squares for two-way ANOVA](/P/anova2-ols)** <br>
+&emsp;&ensp; 1.3.11. **[Sums of squares in two-way ANOVA](/P/anova2-pss)** <br>
+&emsp;&ensp; 1.3.12. **[Cochran's theorem for two-way ANOVA](/P/anova2-cochran)** <br>
+&emsp;&ensp; 1.3.13. **[F-test for main effect in two-way ANOVA](/P/anova2-fme)** <br>
+&emsp;&ensp; 1.3.14. **[F-test for interaction in two-way ANOVA](/P/anova2-fia)** <br>
+&emsp;&ensp; 1.3.15. **[F-test for grand mean in two-way ANOVA](/P/anova2-fgm)** <br>
+&emsp;&ensp; 1.3.16. **[F-statistics in terms of OLS estimates](/P/anova2-fols)** <br>
 
 1.4. Simple linear regression <br>
 &emsp;&ensp; 1.4.1. *[Definition](/D/slr)* <br>

P/anova2-fgm.md

Lines changed: 13 additions & 158 deletions
@@ -64,177 +64,32 @@ H_1: &\; \mu \neq 0 \; .
 $$
 
 
-**Proof:** Denote sample sizes as
+**Proof:** Applying [Cochran's theorem for two-way analysis of variance](/P/anova2-cochran), we find that the following sums of squares
 
-$$ \label{eq:samp-size}
+$$ \label{eq:anova2-ss-dist}
 \begin{split}
-n_{ij} &- \text{number of samples in category} \; (i,j) \\
-n_{i \bullet} &= \sum_{j=1}^{b} n_{ij} \\
-n_{\bullet j} &= \sum_{i=1}^{a} n_{ij} \\
-n &= \sum_{i=1}^{a} \sum_{j=1}^{b} n_{ij}
+\frac{\mathrm{SS}_M}{\sigma^2} &= \frac{1}{\sigma^2} \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} (\bar{y}_{\bullet \bullet \bullet} - \mu)^2 = \frac{1}{\sigma^2} n (\bar{y}_{\bullet \bullet \bullet} - \mu)^2 \\
+\frac{\mathrm{SS}_\mathrm{res}}{\sigma^2} &= \frac{1}{\sigma^2} \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} (y_{ijk} - \bar{y}_{i j \bullet})^2
 \end{split}
 $$
 
-and denote sample means as
+are [independent](/D/ind) and [chi-squared distributed](/D/chi2):
 
-$$ \label{eq:mean-samp}
+$$ \label{eq:anova2-cochran-s1}
 \begin{split}
-\bar{y}_{\bullet \bullet \bullet} &= \frac{1}{n} \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} y_{ijk} \\
-\bar{y}_{i \bullet \bullet} &= \frac{1}{n_{i \bullet}} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} y_{ijk} \\
-\bar{y}_{\bullet j \bullet} &= \frac{1}{n_{\bullet j}} \sum_{i=1}^{a} \sum_{k=1}^{n_{ij}} y_{ijk} \\
-\bar{y}_{i j \bullet} &= \frac{1}{n_{ij}} \sum_{k=1}^{n_{ij}} y_{ijk} \; .
+\frac{\mathrm{SS}_M}{\sigma^2} &\sim \chi^2(1) \\
+\frac{\mathrm{SS}_\mathrm{res}}{\sigma^2} &\sim \chi^2(n-ab) \; .
 \end{split}
 $$
 
-Assume that $\mu$ is zero, according to $H_0$ given by \eqref{eq:anova2-h0}. Under this null hypothesis, we have:
-
-$$ \label{eq:yijk-h0}
-y_{ijk} \sim \mathcal{N}(\alpha_i + \beta_j + \gamma_{ij}, \sigma^2) \quad \text{for all} \quad i, j, k \; .
-$$
-
-Thus, the [random variable](/D/rvar) $U_{ijk} = (y_{ijk} - \alpha_i - \beta_j - \gamma_{ij})/\sigma$ [follows a standard normal distribution](/P/norm-snorm)
-
-$$ \label{eq:Uijk-h0}
-U_{ijk} = \frac{y_{ijk} - \alpha_i - \beta_j - \gamma_{ij}}{\sigma} \sim \mathcal{N}(0, 1) \; .
-$$
-
-Now consider the following sum
-
-$$ \label{eq:sum-Uijk-s1}
-\sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} U_{ijk}^2 = \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} \left( \frac{y_{ijk} - \alpha_i - \beta_j - \gamma_{ij}}{\sigma} \right)^2
-$$
-
-which can be rewritten as follows:
-
-$$ \label{eq:sum-Uijk-s2}
-\begin{split}
-\sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} U_{ijk}^2 = \frac{1}{\sigma^2} \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} & \left[ (y_{ijk} - \alpha_i - \beta_j - \gamma_{ij}) - \right. \\
-&\left. [\bar{y}_{\bullet \bullet \bullet} + (\bar{y}_{i \bullet \bullet} - \bar{y}_{\bullet \bullet \bullet}) + (\bar{y}_{\bullet j \bullet} - \bar{y}_{\bullet \bullet \bullet}) + (\bar{y}_{i j \bullet} - \bar{y}_{i \bullet \bullet} - \bar{y}_{\bullet j \bullet} + \bar{y}_{\bullet \bullet \bullet})] \right. + \\
-&\left. [\bar{y}_{\bullet \bullet \bullet} + (\bar{y}_{i \bullet \bullet} - \bar{y}_{\bullet \bullet \bullet}) + (\bar{y}_{\bullet j \bullet} - \bar{y}_{\bullet \bullet \bullet}) + (\bar{y}_{i j \bullet} - \bar{y}_{i \bullet \bullet} - \bar{y}_{\bullet j \bullet} + \bar{y}_{\bullet \bullet \bullet})] \right]^2 \\
-= \frac{1}{\sigma^2}\sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} & \left[ (y_{ijk} - [\bar{y}_{\bullet \bullet \bullet} + (\bar{y}_{i \bullet \bullet} - \bar{y}_{\bullet \bullet \bullet}) + (\bar{y}_{\bullet j \bullet} - \bar{y}_{\bullet \bullet \bullet}) + (\bar{y}_{i j \bullet} - \bar{y}_{i \bullet \bullet} - \bar{y}_{\bullet j \bullet} + \bar{y}_{\bullet \bullet \bullet})]) + \right. \\
-&\left. (\bar{y}_{\bullet \bullet \bullet}) + ([\bar{y}_{i \bullet \bullet} - \bar{y}_{\bullet \bullet \bullet}] - \alpha_i) + ([\bar{y}_{\bullet j \bullet} - \bar{y}_{\bullet \bullet \bullet}] - \beta_j) \right. + \\
-&\left. ([\bar{y}_{i j \bullet} - \bar{y}_{i \bullet \bullet} - \bar{y}_{\bullet j \bullet} + \bar{y}_{\bullet \bullet \bullet}] - \gamma_{ij}) \right]^2 \\
-= \frac{1}{\sigma^2}\sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} & \left[ (y_{ijk} - \bar{y}_{i j \bullet}) + (\bar{y}_{\bullet \bullet \bullet}) + (\bar{y}_{i j \bullet} - \bar{y}_{\bullet \bullet \bullet} - \alpha_i - \beta_j - \gamma_{ij}) \right]^2
-\end{split}
-$$
-
-Because the following sum over $k$ is zero for all $(i,j)$
-
-$$ \label{eq:sum-yijk}
-\begin{split}
-\sum_{k=1}^{n_{ij}} (y_{ijk} - \bar{y}_{i j \bullet}) &= \sum_{k=1}^{n_{ij}} y_{ijk} - n_{ij} \bar{y}_{ij \bullet} \\
-&= \sum_{k=1}^{n_{ij}} y_{ijk} - n_{ij} \cdot \frac{1}{n_{ij}} \sum_{k=1}^{n_{ij}} y_{ijk} \\
-&= 0, \; (i,j) \in \left\lbrace 1, \ldots, a \right\rbrace \times \left\lbrace 1, \ldots, b \right\rbrace \; ,
-\end{split}
-$$
-
-the following sum over $(i,j,k)$ is also zero
-
-$$ \label{eq:sum-yib}
-\begin{split}
-\sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} (\bar{y}_{i j \bullet} - \bar{y}_{\bullet \bullet \bullet}) &= \sum_{i=1}^{a} \sum_{j=1}^{b} n_{ij} \bar{y}_{i j \bullet} - \bar{y}_{\bullet \bullet \bullet} \sum_{i=1}^{a} \sum_{j=1}^{b} n_{ij} \\
-&= \sum_{i=1}^{a} \sum_{j=1}^{b} n_{ij} \cdot \frac{1}{n_{ij}} \sum_{k=1}^{n_{ij}} y_{ijk} - n \cdot \frac{1}{n} \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} y_{ijk} \\
-&= 0
-\end{split}
-$$
-
-and the term $\bar{y}_{\bullet \bullet \bullet}$ does not depend on $i$, $j$ and $k$
-
-$$ \label{eq:yb-const}
-\bar{y}_{\bullet \bullet \bullet} = \text{const.} \; ,
-$$
-
-the non-square products in \eqref{eq:sum-Uijk-s2} disappear and the sum reduces to
-
-$$ \label{eq:sum-Uijk-s3}
-\begin{split}
-\sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} U_{ijk}^2 = \frac{1}{\sigma^2} & \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} \left[ (y_{ijk} - \bar{y}_{i j \bullet})^2 + (\bar{y}_{\bullet \bullet \bullet})^2 + (\bar{y}_{i j \bullet} - \bar{y}_{\bullet \bullet \bullet} - \alpha_i - \beta_j - \gamma_{ij})^2 \right] \\
-= \frac{1}{\sigma^2} & \left[ \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} \right. (y_{ijk} - \bar{y}_{i j \bullet})^2 + \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} (\bar{y}_{\bullet \bullet \bullet})^2 + \\
-& \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} \left. (\bar{y}_{i j \bullet} - \bar{y}_{\bullet \bullet \bullet} - \alpha_i - \beta_j - \gamma_{ij})^2 \right]
-\end{split}
-$$
-
-[Cochran's theorem](/P/snorm-cochran) states that, if a sum of squared [standard normal](/D/snorm) [random variables](/D/rvar) can be written as a sum of quadratic forms
-
-$$ \label{eq:cochran-p1}
-\begin{split}
-\sum_{i=1}^{n} U_i^2 = \sum_{j=1}^{m} Q_j \quad &\text{where} \quad Q_j = U^\mathrm{T} B^{(j)} U \\
-&\text{with} \quad \sum_{j=1}^{m} B^{(j)} = I_n \\
-&\text{and} \quad r_j = \mathrm{rank}(B^{(j)}) \; ,
-\end{split}
-$$
-
-then the terms $Q_j$ are [independent](/D/ind) and each term $Q_j$ follows a [chi-squared distribution](/D/chi2) with $r_j$ degrees of freedom:
-
-$$ \label{eq:cochran-p2}
-Q_j \sim \chi^2(r_j), \; j = 1, \ldots, m \; .
-$$
-
-First, we define the $n \times 1$ vector $U$:
-
-$$ \label{eq:U}
-U = \left[ \begin{matrix} u_{1 \bullet} \\ \vdots \\ u_{a \bullet} \end{matrix} \right] \quad \text{where} \quad u_{i \bullet} = \left[ \begin{matrix} u_{i1} \\ \vdots \\ u_{ib} \end{matrix} \right] \quad \text{where} \quad u_{ij} = \left[ \begin{matrix} (y_{i,j,1} - \alpha_i - \beta_j - \gamma_{ij})/\sigma \\ \vdots \\ (y_{i,j,n_{ij}} - \alpha_i - \beta_j - \gamma_{ij})/\sigma \end{matrix} \right] \; .
-$$
-
-Next, we specify the $n \times n$ matrices $B$
-
-$$ \label{eq:B}
-\begin{split}
-B^{(1)} &= I_n - \mathrm{diag}\left[ \mathrm{diag}\left( \frac{1}{n_{11}} J_{n_{11}}, \; \ldots, \; \frac{1}{n_{1b}} J_{n_{1b}} \right), \; \ldots, \; \mathrm{diag}\left( \frac{1}{n_{a1}} J_{n_{a1}}, \; \ldots, \; \frac{1}{n_{ab}} J_{n_{ab}} \right) \right] \\
-B^{(2)} &= \frac{1}{n} J_n \\
-B^{(3)} &= \mathrm{diag}\left[ \mathrm{diag}\left( \frac{1}{n_{11}} J_{n_{11}}, \; \ldots, \; \frac{1}{n_{1b}} J_{n_{1b}} \right), \; \ldots, \; \mathrm{diag}\left( \frac{1}{n_{a1}} J_{n_{a1}}, \; \ldots, \; \frac{1}{n_{ab}} J_{n_{ab}} \right) \right] - \frac{1}{n} J_n
-\end{split}
-$$
-
-where $J_n$ is an $n \times n$ matrix of ones, $J_{n,m}$ is an $n \times m$ matrix of ones and $\mathrm{diag}\left( A_1, \ldots, A_n \right)$ denotes a block-diagonal matrix composed of $A_1, \ldots, A_n$. We observe that those matrices satisfy
-
-$$ \label{eq:U-Q-B}
-\sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} U_{ijk}^2 = Q_1 + Q_2 + Q_3 = U^\mathrm{T} B^{(1)} U + U^\mathrm{T} B^{(2)} U + U^\mathrm{T} B^{(3)} U
-$$
-
-as well as
-
-$$ \label{eq:B-In}
-B^{(1)} + B^{(2)} + B^{(3)} = I_n
-$$
-
-and their ranks are
-
-$$ \label{eq:B-rk}
-\begin{split}
-\mathrm{rank}\left( B^{(1)} \right) &= n - a \cdot b \\
-\mathrm{rank}\left( B^{(2)} \right) &= 1 \\
-\mathrm{rank}\left( B^{(3)} \right) &= n - (n-ab) - 1 = a \cdot b - 1 \; .
-\end{split}
-$$
-
-Let's write down the [explained sum of squares](/D/ess) and the [residual sum of squares](/D/rss) for [two-way analysis of variance](/D/anova2) as
-
-$$ \label{eq:ess-rss}
-\begin{split}
-\mathrm{ESS} &= \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} (\bar{y}_{\bullet \bullet \bullet})^2 \\
-\mathrm{RSS} &= \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} (y_{ijk} - \bar{y}_{i j \bullet})^2 \; .
-\end{split}
-$$
-
-Then, using \eqref{eq:sum-Uijk-s3}, \eqref{eq:cochran-p1}, \eqref{eq:cochran-p2}, \eqref{eq:B} and \eqref{eq:B-rk}, we find that
-
-$$ \label{eq:ess-rss-dist}
-\begin{split}
-\frac{\mathrm{ESS}}{\sigma^2} = \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} \left( \frac{\bar{y}_{\bullet \bullet \bullet}}{\sigma} \right)^2 &= Q_2 = U^\mathrm{T} B^{(2)} U \sim \chi^2(1) \\
-\frac{\mathrm{RSS}}{\sigma^2} = \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} \left( \frac{y_{ijk} - \bar{y}_{i j \bullet}}{\sigma} \right)^2 &= Q_1 = U^\mathrm{T} B^{(1)} U \sim \chi^2(n-ab) \; .
-\end{split}
-$$
-
-Because $\mathrm{ESS}/\sigma^2$ and $\mathrm{RSS}/\sigma^2$ are also independent by \eqref{eq:cochran-p2}, the F-statistic from \eqref{eq:anova2-fgm} is equal to the ratio of two independent [chi-squared distributed](/D/chi2) [random variables](/D/rvar) divided by their degrees of freedom
+Thus, the F-statistic from \eqref{eq:anova2-fgm} is equal to the ratio of two [independent](/D/ind) [chi-squared distributed](/D/chi2) [random variables](/D/rvar) divided by their degrees of freedom
 
 $$ \label{eq:anova2-fgm-ess-tss}
 \begin{split}
-F_M &= \frac{(\mathrm{ESS}/\sigma^2)/(1)}{(\mathrm{RSS}/\sigma^2)/(n-ab)} \\
-&= \frac{\mathrm{ESS}/(1)}{\mathrm{RSS}/(n-ab)} \\
-&= \frac{\sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} (\bar{y}_{\bullet \bullet \bullet})^2}{\frac{1}{n-ab} \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} (y_{ijk} - \bar{y}_{i j \bullet})^2} \\
-&= \frac{(\bar{y}_{\bullet \bullet \bullet})^2 \sum_{i=1}^{a} \sum_{j=1}^{b} n_{ij}}{\frac{1}{n-ab} \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} (y_{ijk} - \bar{y}_{i j \bullet})^2} \\
-&= \frac{n (\bar{y}_{\bullet \bullet \bullet})^2}{\frac{1}{n-ab} \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} (y_{ijk} - \bar{y}_{i j \bullet})^2}
+F_M &= \frac{(\mathrm{SS}_M/\sigma^2)/(1)}{(\mathrm{SS}_\mathrm{res}/\sigma^2)/(n-ab)} \\
+&= \frac{\mathrm{SS}_M/(1)}{\mathrm{SS}_\mathrm{res}/(n-ab)} \\
+&\overset{\eqref{eq:anova2-ss-dist}}{=} \frac{n (\bar{y}_{\bullet \bullet \bullet} - \mu)^2}{\frac{1}{n-ab} \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} (y_{ijk} - \bar{y}_{i j \bullet})^2} \\
+&\overset{\eqref{eq:anova2-fgm}}{=} \frac{n (\bar{y}_{\bullet \bullet \bullet})^2}{\frac{1}{n-ab} \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{n_{ij}} (y_{ijk} - \bar{y}_{i j \bullet})^2}
 \end{split}
 $$
 
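The final expression for $F_M$ can be illustrated numerically. Below is a minimal Python sketch on invented balanced two-way data generated under the null hypothesis $\mu = 0$; the resulting statistic would be compared against the $\mathrm{F}(1, n-ab)$ distribution.

```python
# Sketch of F_M = n * ybar^2 / (SS_res / (n - a*b)) for balanced two-way data
# simulated under H0 (mu = 0); sizes and seed are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
a, b, n_ij = 2, 3, 5
y = rng.normal(size=(a, b, n_ij))     # errors only, so the grand mean mu is 0

n = y.size                            # n = a * b * n_ij
y_bar = y.mean()                      # grand mean ybar_{...}
ss_res = ((y - y.mean(axis=2, keepdims=True))**2).sum()   # sum (y_ijk - ybar_ij.)^2

f_m = (n * y_bar**2) / (ss_res / (n - a * b))   # compare to F(1, n - a*b)
```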