`D/mult.md` (1 addition, 1 deletion):

```diff
@@ -33,4 +33,4 @@ $$ \label{eq:mult}
 X \sim \mathrm{Mult}(n, \left[p_1, \ldots, p_k \right]) \; ,
 $$
 
-if $X$ are the numbers of observations belonging to $k$ distinct categories in $n$ [independent](/D/ind) trials, where each trial has [$k$ possible outcomes](/D/cat) and the category probabilities are identical across trials.
+if $X$ are the numbers of observations belonging to $k$ distinct categories in $n$ [independent](/D/ind) trials, where each trial has $k$ [possible outcomes](/D/cat) and the category probabilities are identical across trials.
```
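The revised definition can be illustrated by simulation: counting the category hits of $n$ independent categorical trials with fixed probabilities yields one multinomial draw, so the $k$ counts always sum to $n$. A minimal sketch (the values `n = 100` and `p = [0.2, 0.3, 0.5]` are hypothetical choices, not from the source):

```python
import random
from collections import Counter

random.seed(1)
n, p = 100, [0.2, 0.3, 0.5]  # illustrative trial count and category probabilities

def mult_draw(n, p):
    """One multinomial draw: count category hits in n independent categorical trials."""
    counts = Counter(random.choices(range(len(p)), weights=p, k=n))
    return [counts.get(i, 0) for i in range(len(p))]

x = mult_draw(n, p)
print(x, sum(x))  # the k counts always sum to n
```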
`P/mult-cov.md` (3 additions, 3 deletions):

```diff
@@ -46,13 +46,13 @@
 which [has the variance](/P/bin-var) $\mathrm{Var}(X_i) = n p_i(1-p_i) = n (p_i - p_i^2)$, constituting the elements of the main diagonal in $\mathrm{Cov}(X)$ in \eqref{eq:mult-cov}. To prove $\mathrm{Cov}(X_i, X_j) = -n p_i p_j$ for $i \ne j$ (which constitutes the off-diagonal elements of the covariance matrix), we first recognize that
-the indicator function $\mathbb{I}_i$ being a [Bernoulli-distributed](/D/bern) random variable with the [expected value](/P/bern-mean) $p_i$. Then, we have
+where the indicator function $\mathbb{I}_i$ is a [Bernoulli-distributed](/D/bern) random variable with the [expected value](/P/bern-mean) $p_i$. Then, we have
```
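Both covariance-matrix claims in this proof, $\mathrm{Var}(X_i) = n p_i (1-p_i)$ and $\mathrm{Cov}(X_i, X_j) = -n p_i p_j$, lend themselves to a Monte Carlo check. A minimal sketch (all numbers are illustrative assumptions; `mult_draw` and `cov` are helpers defined here, not from the source):

```python
import random
from collections import Counter

random.seed(2)
n, p, reps = 50, [0.1, 0.3, 0.6], 20000

def mult_draw(n, p):
    """One multinomial draw via n independent categorical trials."""
    counts = Counter(random.choices(range(len(p)), weights=p, k=n))
    return [counts.get(i, 0) for i in range(len(p))]

draws = [mult_draw(n, p) for _ in range(reps)]

def cov(a, b):
    """Population covariance of two equal-length samples."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / len(a)

x0 = [d[0] for d in draws]
x1 = [d[1] for d in draws]
print(cov(x0, x0), n * p[0] * (1 - p[0]))  # diagonal: both near 4.5
print(cov(x0, x1), -n * p[0] * p[1])       # off-diagonal: both near -1.5
```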
Hunk from a one-sample t-test proof (file header lost in extraction):

```diff
-Using the [linearity of the expected value](/P/mean-lin), the [additivity of the variance under independence](/P/var-add) and [scaling of the variance upon multiplication](/P/var-scal), the sample mean follows a [normal distribution](/D/norm)
+Using the [linear combination formula for normal random variables](/P/norm-lincomb), the sample mean follows a [normal distribution](/D/norm) with the following parameters:
 
 $$ \label{eq:mean-samp-dist}
-\bar{y} = \frac{1}{n} \sum_{i=1}^{n} y_i \sim \mathcal{N}\left( \frac{1}{n} n \mu, \left(\frac{1}{n}\right)^2 n \sigma^2 \right) = \mathcal{N}\left( \mu, \sigma^2/n \right)
 $$
 
-and additionally using the [invariance of the variance under addition](/P/var-inv) and applying the null hypothesis from \eqref{eq:ttest1-h0}, the distribution of $Z = \sqrt{n}(\bar{y}-\mu_0)/\sigma$ becomes [standard normal](/D/snorm)
+Again employing the linear combination theorem and applying the null hypothesis from \eqref{eq:ttest1-h0}, the distribution of $Z = \sqrt{n}(\bar{y}-\mu_0)/\sigma$ becomes [standard normal](/D/snorm)
```
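The sampling distribution used in this hunk, $\bar{y} \sim \mathcal{N}(\mu, \sigma^2/n)$, is easy to confirm numerically. A minimal sketch (the values of `mu`, `sigma`, `n` and `reps` are hypothetical):

```python
import random
import statistics

random.seed(3)
mu, sigma, n, reps = 2.0, 1.5, 25, 20000  # illustrative parameters

# Simulate many sample means of n i.i.d. N(mu, sigma^2) observations.
ybars = [statistics.fmean(random.gauss(mu, sigma) for _ in range(n))
         for _ in range(reps)]

print(statistics.fmean(ybars))      # near mu = 2.0
print(statistics.pvariance(ybars))  # near sigma**2 / n = 0.09
```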
Hunks from a two-sample t-test proof (file header lost in extraction):

```diff
-be a [univariate Gaussian data set](/D/ug) representing two groups of unequal size $n_1$ and $n_2$ with unknown means $\mu_1$ and $\mu_2$ and equal unknown variance $\sigma^2$. Then, the [test statistic](/D/tstat)
+be two [univariate Gaussian data sets](/D/ug) representing two groups of unequal size $n_1$ and $n_2$ with unknown means $\mu_1$ and $\mu_2$ and equal unknown variance $\sigma^2$. Then, the [test statistic](/D/tstat)
 
 $$ \label{eq:t}
 t = \frac{(\bar{y}_1-\bar{y}_2)-\mu_\Delta}{s_p \cdot \sqrt{\frac{1}{n_1}+\frac{1}{n_2}}}
 $$
 
-Using the [linearity of the expected value](/P/mean-lin), the [additivity of the variance under independence](/P/var-add) and [scaling of the variance upon multiplication](/P/var-scal), the sample means follow a [normal distribution](/D/norm)
+Using the [linear combination formula for normal random variables](/P/norm-lincomb), the sample means follow [normal distributions](/D/norm) with the following parameters:
 
-and additionally using the [invariance of the variance under addition](/P/var-inv) and applying the null hypothesis from \eqref{eq:ttest2-h0}, the distribution of $Z = ( ( \bar{y}_1 - \bar{y}_2 ) - \mu_{\Delta} ) / ( \sigma \sqrt{1/n_1+1/n_2} )$ becomes [standard normal](/D/snorm)
+Again employing the linear combination theorem and applying the null hypothesis from \eqref{eq:ttest2-h0}, the distribution of $Z = ( ( \bar{y}_1 - \bar{y}_2 ) - \mu_{\Delta} ) / ( \sigma \sqrt{1/n_1+1/n_2} )$ becomes [standard normal](/D/snorm)
```
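The test statistic \eqref{eq:t} is straightforward to compute for concrete data. A minimal sketch (group sizes, means and variance are illustrative; `mu_delta = 0` encodes the usual null hypothesis $\mu_1 - \mu_2 = 0$, an assumption for this example):

```python
import math
import random
import statistics

random.seed(4)
n1, n2, mu, sigma = 12, 18, 5.0, 2.0  # illustrative: both groups share mean and variance
y1 = [random.gauss(mu, sigma) for _ in range(n1)]
y2 = [random.gauss(mu, sigma) for _ in range(n2)]

# Pooled standard deviation: s_p^2 = ((n1-1) s1^2 + (n2-1) s2^2) / (n1 + n2 - 2).
s1sq, s2sq = statistics.variance(y1), statistics.variance(y2)
sp = math.sqrt(((n1 - 1) * s1sq + (n2 - 1) * s2sq) / (n1 + n2 - 2))

mu_delta = 0.0  # value of mu_1 - mu_2 under the null hypothesis
t = ((statistics.fmean(y1) - statistics.fmean(y2)) - mu_delta) \
    / (sp * math.sqrt(1 / n1 + 1 / n2))
print(t)  # follows a t-distribution with n1 + n2 - 2 degrees of freedom under H0
```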
`P/ugkv-ztest1.md` (3 additions, 3 deletions):

```diff
@@ -64,13 +64,13 @@ $$ \label{eq:mean-samp}
 \bar{y} = \frac{1}{n} \sum_{i=1}^{n} y_i \; .
 $$
 
-Using the [linearity of the expected value](/P/mean-lin), the [additivity of the variance under independence](/P/var-add) and [scaling of the variance upon multiplication](/P/var-scal), the sample mean follows a [normal distribution](/D/norm)
+Using the [linear combination formula for normal random variables](/P/norm-lincomb), the sample mean follows a [normal distribution](/D/norm) with the following parameters:
 
 $$ \label{eq:mean-samp-dist}
-\bar{y} = \frac{1}{n} \sum_{i=1}^{n} y_i \sim \mathcal{N}\left( \frac{1}{n} n \mu, \left(\frac{1}{n}\right)^2 n \sigma^2 \right) = \mathcal{N}\left( \mu, \sigma^2/n \right)
 $$
 
-and additionally using the [invariance of the variance under addition](/P/var-inv), the distribution of $z = \sqrt{n/\sigma^2} (\bar{y}-\mu_0)$ becomes
+Again employing the linear combination theorem, the distribution of $z = \sqrt{n/\sigma^2} (\bar{y}-\mu_0)$ becomes
```
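The conclusion of this proof, that $z = \sqrt{n/\sigma^2} \, (\bar{y}-\mu_0)$ is standard normal when the true mean equals $\mu_0$, can likewise be checked by simulation. A minimal sketch (parameter values are hypothetical; data are generated under the null, i.e. with mean `mu0`):

```python
import math
import random
import statistics

random.seed(5)
mu0, sigma, n, reps = 0.0, 3.0, 16, 20000  # illustrative parameters

# Simulate the z-statistic many times under the null hypothesis.
zs = []
for _ in range(reps):
    ybar = statistics.fmean(random.gauss(mu0, sigma) for _ in range(n))
    zs.append(math.sqrt(n / sigma**2) * (ybar - mu0))

print(statistics.fmean(zs))      # near 0
print(statistics.pvariance(zs))  # near 1
```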