
Commit da144af (2 parents: 53e9097 + 24ab806)

Merge pull request #279 from JoramSoch/master: added 1 definition and 1 proof

3 files changed: 108 additions & 8 deletions

D/est-bias.md

Lines changed: 34 additions & 0 deletions
@@ -0,0 +1,34 @@
---
layout: definition
mathjax: true

author: "Joram Soch"
affiliation: "BCCN Berlin"
e_mail: "joram.soch@bccn-berlin.de"
date: 2024-11-08 10:53:01

title: "Biased vs. unbiased estimator"
chapter: "General Theorems"
section: "Estimation theory"
topic: "Basic concepts of estimation"
definition: "Biased vs. unbiased"

sources:
- authors: "Wikipedia"
  year: 2024
  title: "Estimator"
  in: "Wikipedia, the free encyclopedia"
  pages: "retrieved on 2024-11-08"
  url: "https://en.wikipedia.org/wiki/Estimator#Bias"

def_id: "D209"
shortcut: "est-bias"
username: "JoramSoch"
---


**Definition:** Let $\hat{\theta}: \mathcal{Y} \rightarrow \Theta$ be an [estimator](/D/est) of a [parameter](/D/para) $\theta \in \Theta$ from [data](/D/data) $y \in \mathcal{Y}$. Then,

* $\hat{\theta}$ is called an unbiased estimator when its [expected value](/D/mean) is equal to the parameter that it is estimating: $\mathrm{E}_{\hat{\theta}}\left[ \hat{\theta} \right] = \theta$, where the expectation is calculated over all possible samples $y$ leading to values of $\hat{\theta}$;

* $\hat{\theta}$ is called a biased estimator otherwise, i.e. when $\mathrm{E}_{\hat{\theta}}\left[ \hat{\theta} \right] \neq \theta$.
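The distinction can be illustrated numerically. The following sketch (not part of the committed files; all names are illustrative) compares the sample variance with denominator $n$, which is biased, against the denominator $n-1$, which is unbiased, by averaging both estimators over many simulated samples:

```python
import random
import statistics

random.seed(42)

def sample_variance(y, unbiased=True):
    """Sample variance with denominator n-1 (unbiased) or n (biased)."""
    n = len(y)
    m = sum(y) / n
    ss = sum((v - m) ** 2 for v in y)
    return ss / (n - 1) if unbiased else ss / n

# True distribution: N(0, 1), so the parameter being estimated is sigma^2 = 1.
n, reps = 5, 100_000
biased_vals, unbiased_vals = [], []
for _ in range(reps):
    y = [random.gauss(0.0, 1.0) for _ in range(n)]
    biased_vals.append(sample_variance(y, unbiased=False))
    unbiased_vals.append(sample_variance(y, unbiased=True))

# E[biased] = (n-1)/n * sigma^2 = 0.8, while E[unbiased] = sigma^2 = 1.0
print(statistics.mean(biased_vals))    # close to 0.8
print(statistics.mean(unbiased_vals))  # close to 1.0
```

The averages approximate the expectations $\mathrm{E}_{\hat{\theta}}[\hat{\theta}]$ in the definition: the $1/n$ estimator systematically undershoots $\sigma^2$, the $1/(n-1)$ estimator does not.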

I/ToC.md

Lines changed: 10 additions & 8 deletions
@@ -112,14 +112,15 @@ title: "Table of Contents"
 &emsp;&ensp; 1.9.3. *[Characteristic function](/D/cf)* <br>
 &emsp;&ensp; 1.9.4. **[Characteristic function of arbitrary function](/P/cf-fct)** <br>
 &emsp;&ensp; 1.9.5. *[Moment-generating function](/D/mgf)* <br>
-&emsp;&ensp; 1.9.6. **[Moment-generating function of arbitrary function](/P/mgf-fct)** <br>
-&emsp;&ensp; 1.9.7. **[Moment-generating function of linear transformation](/P/mgf-ltt)** <br>
-&emsp;&ensp; 1.9.8. **[Moment-generating function of linear combination](/P/mgf-lincomb)** <br>
-&emsp;&ensp; 1.9.9. *[Probability-generating function](/D/pgf)* <br>
-&emsp;&ensp; 1.9.10. **[Probability-generating function in terms of expected value](/P/pgf-mean)** <br>
-&emsp;&ensp; 1.9.11. **[Probability-generating function of zero](/P/pgf-zero)** <br>
-&emsp;&ensp; 1.9.12. **[Probability-generating function of one](/P/pgf-one)** <br>
-&emsp;&ensp; 1.9.13. *[Cumulant-generating function](/D/cgf)* <br>
+&emsp;&ensp; 1.9.6. **[Moment-generating function of sum of independents](/P/mgf-sumind)** <br>
+&emsp;&ensp; 1.9.7. **[Moment-generating function of arbitrary function](/P/mgf-fct)** <br>
+&emsp;&ensp; 1.9.8. **[Moment-generating function of linear transformation](/P/mgf-ltt)** <br>
+&emsp;&ensp; 1.9.9. **[Moment-generating function of linear combination](/P/mgf-lincomb)** <br>
+&emsp;&ensp; 1.9.10. *[Probability-generating function](/D/pgf)* <br>
+&emsp;&ensp; 1.9.11. **[Probability-generating function in terms of expected value](/P/pgf-mean)** <br>
+&emsp;&ensp; 1.9.12. **[Probability-generating function of zero](/P/pgf-zero)** <br>
+&emsp;&ensp; 1.9.13. **[Probability-generating function of one](/P/pgf-one)** <br>
+&emsp;&ensp; 1.9.14. *[Cumulant-generating function](/D/cgf)* <br>

 <p id="Expected value"></p>
 1.10. Expected value <br>
@@ -279,6 +280,7 @@ title: "Table of Contents"
 <p id="Basic concepts of estimation"></p>
 3.1. Basic concepts of estimation <br>
 &emsp;&ensp; 3.1.1. *[Estimator](/D/est)* <br>
+&emsp;&ensp; 3.1.2. *[Biased vs. unbiased](/D/est-bias)* <br>

 <p id="Point estimates"></p>
 3.2. Point estimates <br>

P/mgf-sumind.md

Lines changed: 64 additions & 0 deletions
@@ -0,0 +1,64 @@
---
layout: proof
mathjax: true

author: "Joram Soch"
affiliation: "BCCN Berlin"
e_mail: "joram.soch@bccn-berlin.de"
date: 2024-11-08 10:46:00

title: "Moment-generating function of a sum of independent random variables"
chapter: "General Theorems"
section: "Probability theory"
topic: "Other probability functions"
theorem: "Moment-generating function of sum of independents"

sources:
- authors: "Probability Fact"
  year: 2021
  title: "If X and Y are independent, the moment generating function (MGF)"
  in: "X"
  pages: "retrieved on 2024-11-08"
  url: "https://x.com/ProbFact/status/1468264616706859016"

proof_id: "P478"
shortcut: "mgf-sumind"
username: "JoramSoch"
---


**Theorem:** Let $X$ and $Y$ be two [independent](/D/ind) [random variables](/D/rvar) and let $Z = X + Y$. Then, the [moment-generating function](/D/mgf) of $Z$ is given by

$$ \label{eq:mgf-sumind}
M_Z(t) = M_X(t) \cdot M_Y(t)
$$

where $M_X(t)$, $M_Y(t)$ and $M_Z(t)$ are the [moment-generating functions](/D/mgf) of $X$, $Y$ and $Z$.


**Proof:** The [moment-generating function of a random variable](/D/mgf) $X$ is

$$ \label{eq:mgf}
M_X(t) = \mathrm{E} \left( \exp \left[ t X \right] \right)
$$

and therefore the moment-generating function of the sum $Z$ is given by

$$ \label{eq:mgf-sumind-s1}
\begin{split}
M_Z(t)
&= \mathrm{E} \left( \exp \left[ t Z \right] \right) \\
&= \mathrm{E} \left( \exp \left[ t (X + Y) \right] \right) \\
&= \mathrm{E} \left( \exp \left[ t X \right] \cdot \exp \left[ t Y \right] \right) \; .
\end{split}
$$

Because the [expected value is multiplicative for independent random variables](/P/mean-mult), we have

$$ \label{eq:mgf-sumind-s2}
\begin{split}
M_Z(t)
&= \mathrm{E} \left( \exp \left[ t X \right] \right) \cdot \mathrm{E} \left( \exp \left[ t Y \right] \right) \\
&= M_X(t) \cdot M_Y(t) \; .
\end{split}
$$
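The theorem is easy to check numerically. The following sketch (not part of the commit; the distributions and the value of $t$ are illustrative choices) estimates $M_X(t)$, $M_Y(t)$ and $M_Z(t)$ by Monte Carlo for independent $X \sim N(0,1)$ and $Y \sim \mathrm{Exp}(1)$, whose MGFs exist for $t < 1$:

```python
import math
import random

random.seed(1)

def mgf_estimate(samples, t):
    """Monte Carlo estimate of M(t) = E[exp(t * X)]."""
    return sum(math.exp(t * x) for x in samples) / len(samples)

N, t = 300_000, 0.3
x = [random.gauss(0.0, 1.0) for _ in range(N)]   # X ~ N(0, 1)
y = [random.expovariate(1.0) for _ in range(N)]  # Y ~ Exp(1), independent of X
z = [xi + yi for xi, yi in zip(x, y)]            # Z = X + Y

mx, my, mz = mgf_estimate(x, t), mgf_estimate(y, t), mgf_estimate(z, t)
print(mz, mx * my)  # the two estimates should agree closely
# Closed forms for comparison: M_X(t) = exp(t^2/2), M_Y(t) = 1/(1-t) for t < 1
print(math.exp(t ** 2 / 2) / (1 - t))
```

Both the product of the marginal estimates and the direct estimate for $Z$ converge to the closed-form value $e^{t^2/2}/(1-t)$, consistent with the factorization $M_Z(t) = M_X(t) \cdot M_Y(t)$.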
