
Commit ad45378

Merge pull request #270 from salbalkus/master
added "mean-ls"
2 parents cef64d9 + af721ec commit ad45378

2 files changed: 44 additions & 1 deletion


I/ToC.md

Lines changed: 2 additions & 1 deletion
@@ -122,6 +122,7 @@ title: "Table of Contents"
 &emsp;&ensp; 1.10.13. **[Weak law of large numbers](/P/mean-wlln)** <br>
 &emsp;&ensp; 1.10.14. *[Expected value of a random vector](/D/mean-rvec)* <br>
 &emsp;&ensp; 1.10.15. *[Expected value of a random matrix](/D/mean-rmat)* <br>
+&emsp;&ensp; 1.10.16. **[Expected value minimizes expected squared error](/P/mean-ls)** <br>

 1.11. Variance <br>
 &emsp;&ensp; 1.11.1. *[Definition](/D/var)* <br>
@@ -907,4 +908,4 @@ title: "Table of Contents"
 3.5. Bayesian model averaging <br>
 &emsp;&ensp; 3.5.1. *[Definition](/D/bma)* <br>
 &emsp;&ensp; 3.5.2. **[Derivation](/P/bma-der)** <br>
-&emsp;&ensp; 3.5.3. **[Calculation from log model evidences](/P/bma-lme)** <br>
+&emsp;&ensp; 3.5.3. **[Calculation from log model evidences](/P/bma-lme)** <br>

P/mean-ls.md

Lines changed: 42 additions & 0 deletions
@@ -0,0 +1,42 @@
---
layout: proof
mathjax: true

author: "Salvador Balkus"
affiliation: "Harvard T.H. Chan School of Public Health"
e_mail: "sbalkus@g.harvard.edu"
date: 2024-09-13 23:30:00

title: "Expected value minimizes expected squared error"
chapter: "General Theorems"
section: "Probability theory"
topic: "Expected value"
theorem: "Expected value minimizes expected squared error"

sources:

proof_id: "P468"
shortcut: "mean-ls"
username: "salbalkus"
---

**Theorem:** Let $X_1, X_2, \ldots, X_n$ be a collection of [random variables](/D/rvar) with common [mean](/D/mean) $E(X_i) = \mu$. Then, $\mu$ minimizes the expected squared error; that is,

$$
\text{argmin}_{a \in \mathbb{R}} E\Big((X_i - a)^2\Big) = \mu
$$

**Proof:** Using the [linearity of expectation](/P/mean-lin), we can simplify the objective function as follows:

$$
E\Big((X_i - a)^2\Big) = E\Big(X_i^2 - 2aX_i + a^2\Big) = E(X_i^2) - 2a E(X_i) + a^2 = a^2 - 2a\mu + E(X_i^2)
$$

To perform a [derivative test](https://en.wikipedia.org/wiki/Derivative_test), we set the derivative of this expression with respect to $a$ to zero, $\frac{d}{da}\left(a^2 - 2a\mu + E(X_i^2)\right) = 0$, and obtain

$$
2a - 2\mu = 0 \implies a = \mu
$$
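
As a quick illustration (not part of the committed proof file), the same derivative test can be checked symbolically. This is a minimal sketch assuming SymPy is available, with the symbol `Ex2` standing in for the constant $E(X_i^2)$:

```python
# Symbolic check of the derivative test: minimize a^2 - 2*a*mu + E(X_i^2) over a.
import sympy as sp

a, mu, Ex2 = sp.symbols('a mu Ex2', real=True)
objective = a**2 - 2*a*mu + Ex2                        # objective after applying linearity of expectation

critical_points = sp.solve(sp.diff(objective, a), a)   # first derivative set to zero
second_derivative = sp.diff(objective, a, 2)           # constant second derivative

print(critical_points)    # [mu] -> the sole critical point is a = mu
print(second_derivative)  # 2    -> positive, so a = mu is a minimum
```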

The second derivative is equal to 2, which is greater than 0; since $a = \mu$ is the sole critical point, we can conclude by the second derivative test that it is the unique global minimum. This completes the proof that $\text{argmin}_{a \in \mathbb{R}} E\Big((X_i - a)^2\Big) = \mu$.
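
As an aside (again, not part of the commit), the theorem admits a small numerical sanity check: for simulated data, the sample average of $(X_i - a)^2$ over a grid of candidate values $a$ should be smallest near the mean. A rough sketch using NumPy, with an arbitrarily chosen mean of 1.5:

```python
# Numerical sanity check: the empirical squared error is minimized near the mean.
import numpy as np

rng = np.random.default_rng(0)
mu = 1.5
x = rng.normal(loc=mu, scale=2.0, size=100_000)     # draws X_i with mean mu

grid = np.linspace(-2.0, 5.0, 701)                  # candidate values of a
mse = np.array([np.mean((x - a)**2) for a in grid])

print(grid[np.argmin(mse)])   # close to 1.5, i.e. close to mu
```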
