-**Proof:** A binomial variable[is defined as](/D/bin) the number of successes observed in $n$ [independent](/D/ind) trials, where each trial has [two possible outcomes](/D/bern) (success/failure) and the [probability](/D/prob) of success and failure are identical across trials ($p$, $q = 1-p$).
+**Proof:** A [binomial variable](/D/bin) is defined as the number of successes observed in $n$ [independent](/D/ind) trials, where each trial has [two possible outcomes](/D/bern) (success/failure) and the [probability](/D/prob) of success and failure are identical across trials ($p$, $q = 1-p$).

If one has obtained $x$ successes in $n$ trials, one has also obtained $(n-x)$ failures. The probability of a particular series of $x$ successes and $(n-x)$ failures, when order does matter, is
**P/mlr-rssdist.md** (1 addition, 1 deletion)

@@ -85,7 +85,7 @@ $$ \label{eq:mvn-cochran}
x \sim \mathcal{N}(\mu, \sigma^2 I_n) \quad \Rightarrow \quad y = x^\mathrm{T} A x /\sigma^2 \sim \chi^2\left( \mathrm{tr}(A), \mu^\mathrm{T} A \mu \right) \; .
$$

-First, we formulate the residuals in terms of transformed measurements $\tilde{y}$:
+First, we [formulate the residuals](/P/mlr-mat) in terms of transformed measurements $\tilde{y}$:
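The quadratic-form result quoted in the hunk above (for $x \sim \mathcal{N}(\mu, \sigma^2 I_n)$ and idempotent $A$, $x^\mathrm{T} A x / \sigma^2$ follows a non-central chi-squared distribution) can be sanity-checked by Monte Carlo. The sketch below is not from the source: it picks an arbitrary projection matrix $A$, sets $\sigma^2 = 1$, and compares the empirical mean of $y = x^\mathrm{T} A x$ against the known mean $\mathrm{tr}(A) + \mu^\mathrm{T} A \mu$ of a non-central chi-squared variable with $\mathrm{tr}(A)$ degrees of freedom and non-centrality $\mu^\mathrm{T} A \mu$:

```python
import numpy as np

# Monte Carlo check: for x ~ N(mu, I_n) and idempotent A,
# y = x' A x is noncentral chi^2 with df = tr(A), noncentrality mu' A mu.
# Design matrix, dimensions and sample count are arbitrary illustration choices.
rng = np.random.default_rng(0)

n, p = 5, 2
X = rng.standard_normal((n, p))
A = X @ np.linalg.inv(X.T @ X) @ X.T            # projection matrix, tr(A) = p
mu = rng.standard_normal(n)

x = rng.standard_normal((200_000, n)) + mu      # draws of x ~ N(mu, I_n)
y = np.einsum('ij,jk,ik->i', x, A, x)           # x' A x for each draw

expected_mean = np.trace(A) + mu @ A @ mu       # mean of noncentral chi^2
print(y.mean(), expected_mean)
assert abs(y.mean() - expected_mean) < 0.1
```

With 200,000 draws the Monte Carlo standard error of the mean is well below the 0.1 tolerance used in the final assertion.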
**P/mlr-wlsdist.md** (5 additions, 5 deletions)

@@ -11,7 +11,7 @@ title: "Distributions of estimated parameters, fitted signal and residuals in mu
chapter: "Statistical Models"
section: "Univariate normal data"
topic: "Multiple linear regression"
-theorem: "Distribution of estimated parameters, signal and residuals"
+theorem: "Distribution of WLS estimates, signal and residuals"

sources:
- authors: "Koch, Karl-Rudolf"
@@ -116,7 +116,7 @@ $$
3) The [residuals of the linear regression model](/P/mlr-mat) are given by

$$ \label{eq:e-est}
-\hat{\varepsilon} = y - X \hat{\beta} = \left( I_n - X (X^\mathrm{T} V^{-1} X)^{-1} X^\mathrm{T} \right) y = \left( I_n - P \right) y
+\hat{\varepsilon} = y - X \hat{\beta} = \left( I_n - X (X^\mathrm{T} V^{-1} X)^{-1} X^\mathrm{T} V^{-1} \right) y = \left( I_n - P \right) y
$$

and thus, by applying \eqref{eq:mvn-ltt} to \eqref{eq:e-est}, they are distributed as
@@ -126,9 +126,9 @@ $$ \label{eq:e-est-dist}
\hat{\varepsilon} &\sim \mathcal{N}\left( \left[ I_n - X (X^\mathrm{T} V^{-1} X)^{-1} X^\mathrm{T} V^{-1} \right] X \beta, \, \sigma^2 \left[ I_n - P \right] V \left[ I_n - P \right]^\mathrm{T} \right) \\
&\sim \mathcal{N}\left( X \beta - X \beta, \, \sigma^2 \left[ V - V P^\mathrm{T} - P V + P V P^\mathrm{T} \right] \right) \\
&\sim \mathcal{N}\left( 0, \, \sigma^2 \left[ V - V V^{-1} X (X^\mathrm{T} V^{-1} X)^{-1} X^\mathrm{T} - X (X^\mathrm{T} V^{-1} X)^{-1} X^\mathrm{T} V^{-1} V + P V P^\mathrm{T} \right] \right) \\
-&\sim \mathcal{N}\left( 0, \, \sigma^2 \left[ V - 2 P + X (X^\mathrm{T} V^{-1} X)^{-1} X^\mathrm{T} V^{-1} V V^{-1} X (X^\mathrm{T} V^{-1} X)^{-1} X^\mathrm{T} \right] \right) \\
-&\sim \mathcal{N}\left( 0, \, \sigma^2 \left[ V - 2 P + P \right] \right) \\
-&\sim \mathcal{N}\left( 0, \, \sigma^2 \left[ V - X (X^\mathrm{T} V^{-1} X)^{-1} X^\mathrm{T} \right] \right) \\
+&\sim \mathcal{N}\left( 0, \, \sigma^2 \left[ V - 2 P V + X (X^\mathrm{T} V^{-1} X)^{-1} X^\mathrm{T} V^{-1} V V^{-1} X (X^\mathrm{T} V^{-1} X)^{-1} X^\mathrm{T} \right] \right) \\
+&\sim \mathcal{N}\left( 0, \, \sigma^2 \left[ V - 2 P V + P V \right] \right) \\
+&\sim \mathcal{N}\left( 0, \, \sigma^2 \left[ V - P V \right] \right) \\
&\sim \mathcal{N}\left( 0, \, \sigma^2 \left[ I_n - P \right] V \right) \; .
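The corrected covariance collapse above rests on identities of the oblique projection matrix $P = X (X^\mathrm{T} V^{-1} X)^{-1} X^\mathrm{T} V^{-1}$: it is idempotent, $P V = V P^\mathrm{T}$ (so $P V$ is symmetric), and $P V P^\mathrm{T} = P V$, which together give $(I_n - P) V (I_n - P)^\mathrm{T} = (I_n - P) V$. A quick numerical check, with an arbitrary design matrix and covariance matrix chosen only for illustration:

```python
import numpy as np

# Check the identities used in the derivation above for the WLS projection
# P = X (X' V^-1 X)^-1 X' V^-1. X is an arbitrary design matrix and V an
# arbitrary symmetric positive definite covariance matrix.
rng = np.random.default_rng(42)

n, p = 6, 3
X = rng.standard_normal((n, p))
B = rng.standard_normal((n, n))
V = B @ B.T + n * np.eye(n)                     # symmetric positive definite
Vinv = np.linalg.inv(V)

P = X @ np.linalg.inv(X.T @ Vinv @ X) @ X.T @ Vinv
I = np.eye(n)

assert np.allclose(P @ P, P)                    # P is idempotent
assert np.allclose(P @ V, V @ P.T)              # P V is symmetric
assert np.allclose(P @ V @ P.T, P @ V)          # P V P' = P V
assert np.allclose((I - P) @ V @ (I - P).T, (I - P) @ V)
print("all projection identities hold")
```

Note that $P$ itself is not symmetric unless $V \propto I_n$; only the product $P V$ is, which is exactly why the cross terms $V P^\mathrm{T}$ and $P V$ merge in the second line of the derivation.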
-**Proof:** A multinomial variable[is defined as](/D/mult) a vector of the numbers of observations belonging to $k$ distinct categories in $n$ [independent](/D/ind) trials, where each trial has [$k$ possible outcomes](/D/cat) and the category [probabilities](/D/prob) are identical across trials.
+**Proof:** A [multinomial variable](/D/mult) is defined as a vector of the numbers of observations belonging to $k$ distinct categories in $n$ [independent](/D/ind) trials, where each trial has [$k$ possible outcomes](/D/cat) and the category [probabilities](/D/prob) are identical across trials.

The probability of a particular series of $x_1$ observations for category $1$, $x_2$ observations for category $2$ etc., when order does matter, is