D/iglm.md (+2 −2)

@@ -7,7 +7,7 @@ affiliation: "BCCN Berlin"
 e_mail: "joram.soch@bccn-berlin.de"
 date: 2021-10-21 15:31:00
 
-title: "General linear model"
+title: "Inverse general linear model"
 chapter: "Statistical Models"
 section: "Multivariate normal data"
 topic: "Inverse general linear model"
@@ -28,7 +28,7 @@ username: "JoramSoch"
 ---
 
 
-**Definition:** Let there be a [general linear models](/D/glm) of measured data $Y \in \mathbb{R}^{n \times v}$ in terms of the [design matrix](/D/glm) $X \in \mathbb{R}^{n \times p}$:
+**Definition:** Let there be a [general linear model](/D/glm) of measured data $Y \in \mathbb{R}^{n \times v}$ in terms of the [design matrix](/D/glm) $X \in \mathbb{R}^{n \times p}$:
 
 $$ \label{eq:glm}
 Y = X B + E, \; E \sim \mathcal{MN}(0, V, \Sigma) \; .
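As an aside, the relation between this GLM and the inverse model it is being retitled for can be sanity-checked numerically. A minimal sketch, assuming NumPy; the dimensions, seed, and the pseudoinverse choice of $W$ are illustrative assumptions, not part of the source:

```python
import numpy as np

# Hypothetical sanity check: for Y = X B + E with rank(B) = p, a weight
# matrix W with B W = I_p exists (e.g. the Moore-Penrose right inverse),
# and right-multiplying the model by W recovers X = Y W - E W.
rng = np.random.default_rng(0)
n, p, v = 20, 3, 5                       # observations, regressors, variables

X = rng.standard_normal((n, p))          # design matrix
B = rng.standard_normal((p, v))          # coefficients, full row rank p (a.s.)
E = rng.standard_normal((n, v))          # noise realization
Y = X @ B + E                            # general linear model

W = np.linalg.pinv(B)                    # right inverse: B W = I_p
print(np.allclose(B @ W, np.eye(p)))     # True
print(np.allclose(X, Y @ W - E @ W))     # True
```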
P/cfm-exist.md (+2 −2)

@@ -7,7 +7,7 @@ affiliation: "BCCN Berlin"
 e_mail: "joram.soch@bccn-berlin.de"
 date: 2021-10-21 17:43:00
 
-title: "Existence of the corresponding forward model"
+title: "Existence of a corresponding forward model"
 chapter: "Statistical Models"
 section: "Multivariate normal data"
 topic: "Inverse general linear model"
@@ -28,7 +28,7 @@ username: "JoramSoch"
 ---
 
 
-**Theorem:** Let there be observations $Y \in \mathbb{R}^{n \times v}$ and $X \in \mathbb{R}^{n \times p}$ and consider a weight matrix $W \in \mathbb{R}^{v \times p}$ predicting $X$ from $Y$:
+**Theorem:** Let there be observations $Y \in \mathbb{R}^{n \times v}$ and $X \in \mathbb{R}^{n \times p}$ and consider a weight matrix $W = f(Y,X) \in \mathbb{R}^{v \times p}$ predicting $X$ from $Y$:
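For intuition on the theorem's setup (a weight matrix $W = f(Y,X)$ predicting $X$ from $Y$), here is one concrete, hypothetical choice of $f$, ordinary least squares; nothing about this particular choice is prescribed by the source:

```python
import numpy as np

# Hypothetical example of a weight matrix W = f(Y, X): the least-squares
# solution predicting the design matrix X from the data Y.
rng = np.random.default_rng(1)
n, p, v = 50, 2, 6
Y = rng.standard_normal((n, v))          # measured data
X = rng.standard_normal((n, p))          # design matrix to be predicted

W = np.linalg.pinv(Y) @ X                # one possible f(Y, X), shape (v, p)
X_hat = Y @ W                            # predicted design matrix
print(W.shape)                           # (6, 2)
```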
P/cfm-para.md (+2 −2)

@@ -28,7 +28,7 @@ username: "JoramSoch"
 ---
 
 
-**Theorem:** Let there be observations $Y \in \mathbb{R}^{n \times v}$ and $X \in \mathbb{R}^{n \times p}$ and consider a weight matrix $W \in \mathbb{R}^{v \times p}$ predicting $X$ from $Y$:
+**Theorem:** Let there be observations $Y \in \mathbb{R}^{n \times v}$ and $X \in \mathbb{R}^{n \times p}$ and consider a weight matrix $W = f(Y,X) \in \mathbb{R}^{v \times p}$ predicting $X$ from $Y$:
P/iglm-blue.md (+17 −5)

@@ -73,7 +73,13 @@ $$ \label{eq:W-hat-dist}
 \tilde{W} = M X \sim \mathcal{MN}(M Y W, M V M^T, \Sigma_x) \;
 $$
 
-which requires that $M Y = I_v$. This is fulfilled by any matrix $M = (Y^\mathrm{T} V^{-1} Y)^{-1} Y^\mathrm{T} V^{-1} + D$ where $D$ is a $v \times n$ matrix which satisfies $D Y = 0$.
+[which requires](/P/matn-mean) that $M Y = I_v$. This is fulfilled by any matrix
+
+$$ \label{eq:M-D}
+M = (Y^\mathrm{T} V^{-1} Y)^{-1} Y^\mathrm{T} V^{-1} + D
+$$
+
+where $D$ is a $v \times n$ matrix which satisfies $D Y = 0$.
 
 <br>
 3) Third, the [best linear unbiased estimator](/D/blue) is the one with minimum [variance](/D/var), i.e. the one that minimizes the expected Frobenius norm
@@ -85,26 +91,32 @@ $$
 Using the [matrix-normal distribution](/D/matn) of $\tilde{W}$ from \eqref{eq:W-hat-dist}
 
 $$ \label{eq:W-hat-W-dist}
-\left( \tilde{W} - W \right) \sim \mathcal{MN}(0, M V M^T, \Sigma_x)
+\left( \tilde{W} - W \right) \sim \mathcal{MN}(0, M V M^\mathrm{T}, \Sigma_x)
 $$
 
 and the property of the [Wishart distribution](/D/wish)
 
 $$ \label{eq:E-XX}
-X \sim \mathcal{MN}(0, U, V) \quad \Rightarrow \quad \left\langle X X^T \right\rangle = \mathrm{tr}(V) \, U \; ,
+X \sim \mathcal{MN}(0, U, V) \quad \Rightarrow \quad \left\langle X X^\mathrm{T} \right\rangle = \mathrm{tr}(V) \, U \; ,
 $$
 
 this [variance](/D/var) can be evaluated as a function of $M$:
+&\overset{\eqref{eq:E-XX}}{=} \mathrm{tr}\left[ \mathrm{tr}(\Sigma_x) \, M V M^\mathrm{T} \right] \\
+&= \mathrm{tr}(\Sigma_x) \; \mathrm{tr}(M V M^\mathrm{T}) \; .
 \end{split}
 $$
 
 As a function of $D$ and using $D Y = 0$, it becomes:
 
 $$ \label{eq:Var-D}
 \begin{split}
-\mathrm{Var}\left[ \tilde{W}(D) \right] &= \mathrm{tr}(\Sigma_x) \; \mathrm{tr}\!\left[ \left( (Y^\mathrm{T} V^{-1} Y)^{-1} Y^\mathrm{T} V^{-1} + D \right) V \left( (Y^\mathrm{T} V^{-1} Y)^{-1} Y^\mathrm{T} V^{-1} + D \right)^\mathrm{T} \right] \\
+\mathrm{Var}\left[ \tilde{W}(D) \right] &\overset{\eqref{eq:M-D}}{=} \mathrm{tr}(\Sigma_x) \; \mathrm{tr}\!\left[ \left( (Y^\mathrm{T} V^{-1} Y)^{-1} Y^\mathrm{T} V^{-1} + D \right) V \left( (Y^\mathrm{T} V^{-1} Y)^{-1} Y^\mathrm{T} V^{-1} + D \right)^\mathrm{T} \right] \\
 &= \mathrm{tr}(\Sigma_x) \; \mathrm{tr}\!\left[ (Y^\mathrm{T} V^{-1} Y)^{-1} \, Y^\mathrm{T} V^{-1} V V^{-1} Y \; (Y^\mathrm{T} V^{-1} Y)^{-1} + \right. \\
 &\hphantom{=\mathrm{tr}(\Sigma_x) \; \mathrm{tr}\!\left[\right.} \left. \, (Y^\mathrm{T} V^{-1} Y)^{-1} Y^\mathrm{T} V^{-1} V D^\mathrm{T} + D V V^{-1} Y (Y^\mathrm{T} V^{-1} Y)^{-1} + D V D^\mathrm{T} \right] \\
 &= \mathrm{tr}(\Sigma_x) \left[ \mathrm{tr}\!\left( (Y^\mathrm{T} V^{-1} Y)^{-1} \right) + \mathrm{tr}\!\left( D V D^\mathrm{T} \right) \right] \; .
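The trace decomposition in the final line above can be verified numerically: the cross terms vanish because $D Y = 0$, so perturbing $M$ by any admissible $D$ only adds the non-negative term $\mathrm{tr}(D V D^\mathrm{T})$. A minimal sketch, assuming NumPy; the particular $Y$, $V$, and $D$ are illustrative assumptions:

```python
import numpy as np

# Hypothetical check of the BLUE argument: M0 = (Yt V^-1 Y)^-1 Yt V^-1
# satisfies M0 Y = I_v, and for any D with D Y = 0 the cross terms vanish,
# so tr((M0+D) V (M0+D)t) = tr((Yt V^-1 Y)^-1) + tr(D V Dt).
rng = np.random.default_rng(2)
n, v = 12, 4
Y = rng.standard_normal((n, v))          # full column rank (a.s.)
A = rng.standard_normal((n, n))
V = A @ A.T + n * np.eye(n)              # s.p.d. covariance across rows
Vi = np.linalg.inv(V)

M0 = np.linalg.inv(Y.T @ Vi @ Y) @ Y.T @ Vi
print(np.allclose(M0 @ Y, np.eye(v)))    # True: unbiasedness condition holds

D = rng.standard_normal((v, n)) @ (np.eye(n) - Y @ np.linalg.pinv(Y))
print(np.allclose(D @ Y, 0))             # True: D Y = 0 by construction

lhs = np.trace((M0 + D) @ V @ (M0 + D).T)
rhs = np.trace(np.linalg.inv(Y.T @ Vi @ Y)) + np.trace(D @ V @ D.T)
print(np.isclose(lhs, rhs))              # True: variance splits as derived
```

Since $\mathrm{tr}(D V D^\mathrm{T}) \geq 0$ for s.p.d. $V$, the minimum is attained at $D = 0$, which is the point of the proof.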
P/iglm-dist.md (+3 −3)

@@ -40,7 +40,7 @@ $$ \label{eq:iglm}
 X = Y W + N, \; N \sim \mathcal{MN}(0, V, \Sigma_x)
 $$
 
-where $W \in \mathbb{R}^{v \times p}$ is a matrix, such that $B \, W = I_p$, and the covariance across columns is $\Sigma_x = W^\mathrm{T} \Sigma W$.
+where $W \in \mathbb{R}^{v \times p}$ is a matrix, such that $B \, W = I_p$, and the [covariance across columns](/D/matn) is $\Sigma_x = W^\mathrm{T} \Sigma W$.
 
 
 **Proof:** The [linear transformation theorem for the matrix-normal distribution](/P/matn-ltt) states:
@@ -49,13 +49,13 @@ $$ \label{eq:matn-ltt}
 X \sim \mathcal{MN}(M, U, V) \quad \Rightarrow \quad Y = AXB + C \sim \mathcal{MN}(AMB+C, AUA^\mathrm{T}, B^\mathrm{T}VB) \; .
 $$
 
-The matrix $W$ exists, if the rows of $B \in \mathbb{R}^{p \times v}$ are linearly independent, such that $\mathrm{rk}(B) = p$. Then, right-multiplying the model \eqref{eq:glm} and applying \eqref{eq:matn-ltt} with $W$ yields
+The matrix $W$ exists, if the rows of $B \in \mathbb{R}^{p \times v}$ are linearly independent, such that $\mathrm{rk}(B) = p$. Then, right-multiplying the model \eqref{eq:glm} with $W$ and applying \eqref{eq:matn-ltt} yields
 
 $$ \label{eq:iglm-s1}
 Y W = X B W + E W, \; E W \sim \mathcal{MN}(0, V, W^\mathrm{T} \Sigma W) \; .
 $$
 
-Applying $B \, W = I_p$ and rearranging, we have
+Employing $B \, W = I_p$ and rearranging, we have
 
 $$ \label{eq:iglm-s2}
 X = Y W - E W, \; E W \sim \mathcal{MN}(0, V, W^\mathrm{T} \Sigma W) \; .
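The claimed column covariance $\Sigma_x = W^\mathrm{T} \Sigma W$ of $E W$ can be checked by simulation. A minimal Monte Carlo sketch, assuming NumPy and taking $V = I_n$ for simplicity; all specific values are illustrative assumptions:

```python
import numpy as np

# Hypothetical Monte Carlo check: with V = I_n the rows of E are i.i.d.
# N(0, Sigma), so the rows of E W should have covariance
# Sigma_x = Wt Sigma W, as stated for the inverse GLM.
rng = np.random.default_rng(3)
n, p, v = 200_000, 2, 3

B = rng.standard_normal((p, v))
W = np.linalg.pinv(B)                    # satisfies B W = I_p
C = rng.standard_normal((v, v))
Sigma = C @ C.T + np.eye(v)              # covariance across columns, s.p.d.

E = rng.standard_normal((n, v)) @ np.linalg.cholesky(Sigma).T
Sigma_x = W.T @ Sigma @ W                # predicted column covariance of E W
Sigma_x_hat = np.cov((E @ W).T)          # empirical estimate

print(np.allclose(Sigma_x_hat, Sigma_x, rtol=0.05, atol=0.05))  # True
```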