**`D/cvlme.md`** (+7 −1)

```diff
@@ -50,7 +50,7 @@ username: "JoramSoch"
 **Definition:** Let there be a [data set](/D/data) $y$ with mutually exclusive and collectively exhaustive subsets $y_1, \ldots, y_S$. Assume a [generative model](/D/gm) $m$ with model parameters $\theta$ implying a [likelihood function](/D/lf) $p(y \vert \theta, m)$ and a [non-informative](/D/prior-inf) [prior density](/D/prior) $p_{\mathrm{ni}}(\theta \vert m)$.
 
-Then, the cross-validated log model evidence of $m$ is given by
+Then, the cross-validated log model evidence (cvLME) of $m$ is given by
```
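The hunk above ends mid-sentence at "is given by"; the formula itself lies outside the extracted lines. As a sketch consistent with the definition's own symbols (and hedged, since the exact notation in the file is not shown here), the cvLME sums out-of-sample log model evidences over the $S$ subsets, where each subset's evidence uses the posterior obtained from the remaining subsets (starting from the non-informative prior) as its effective prior:

$$\mathrm{cvLME}(m) = \sum_{i=1}^{S} \log p\!\left( y_i \,\Big\vert\, \bigcup_{j \neq i} y_j, m \right) .$$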
The following table rows were added in the same commit (the file header for this hunk was lost in extraction):

```diff
 | P466 | cdf-probexc |[Exceedance probability for a random variable in terms of cumulative distribution function](/P/cdf-probexc)| JoramSoch | 2024-09-06 |
+| P467 | postpred-jl |[Posterior predictive distribution is a marginal distribution of the joint likelihood](/P/postpred-jl)| aloctavodia | 2024-09-11 |
+| P468 | mean-wlln |[Weak law of large numbers](/P/mean-wlln)| JoramSoch | 2024-09-13 |
+| P469 | mean-mse |[The expected value minimizes the mean squared error](/P/mean-mse)| salbalkus | 2024-09-13 |
```
**`P/blr-lbf.md`** (+1 −1)

```diff
@@ -21,7 +21,7 @@ username: "JoramSoch"
 ---
 
-**Theorem:** Let $y = \left[ y_1, \ldots, y_n \right]^\mathrm{T}$ be an $n \times 1$ vector of a [measured univariate signal](/D/data) and consider two [linear regression models](/D/mlr) with [design matrices](/D/mlr) $X_1, X_2$ and [precision matrices](/P/blr-prior) $P_1, P_2$, entailing potentially different [regression coefficients](/D/mlr) $\beta_1, \beta_2$ and [noise precisions](/D/blr-prior) $\tau_1, \tau_2$:
+**Theorem:** Let $y = \left[ y_1, \ldots, y_n \right]^\mathrm{T}$ be an $n \times 1$ vector of a [measured univariate signal](/D/data) and consider two [linear regression models](/D/mlr) with [design matrices](/D/mlr) $X_1, X_2$ and [precision matrices](/P/blr-prior) $P_1, P_2$, entailing potentially different [regression coefficients](/D/mlr) $\beta_1, \beta_2$ and [noise precisions](/P/blr-prior) $\tau_1, \tau_2$:
```
**`P/blr-postind.md`** (+1 −1)

```diff
@@ -21,7 +21,7 @@ username: "JoramSoch"
 ---
 
-**Theorem:** Let $y = \left\lbrace y_1, \ldots, y_S \right\rbrace$ be a set of $S$ [conditionally independent data sets](/D/ind-cond) assumed to follow [linear regression models](/D/mlr) with [design matrices](/D/mlr) $X_1, \ldots, X_S$, [number of data points](/D/mlr) $n_1, \ldots, n_S$ and [precision matrices](/P/blr-prior) $P_1, \ldots, P_n$, governed by identical [regression coefficients](/D/mlr) $\beta$ and identical [noise precision](/D/blr-prior) $\tau$:
+**Theorem:** Let $y = \left\lbrace y_1, \ldots, y_S \right\rbrace$ be a set of $S$ [conditionally independent data sets](/D/ind-cond) assumed to follow [linear regression models](/D/mlr) with [design matrices](/D/mlr) $X_1, \ldots, X_S$, [number of data points](/D/mlr) $n_1, \ldots, n_S$ and [precision matrices](/P/blr-prior) $P_1, \ldots, P_n$, governed by identical [regression coefficients](/D/mlr) $\beta$ and identical [noise precision](/P/blr-prior) $\tau$:
```