**Definition:** Let $\hat{\theta}: \mathcal{Y} \rightarrow \Theta$ be an [estimator](/D/est) of a [parameter](/D/para) $\theta \in \Theta$ from [data](/D/data) $y \in \mathcal{Y}$. Then,
* $\hat{\theta}$ is called an unbiased estimator when its [expected value](/D/mean) is equal to the parameter that it is estimating: $\mathrm{E}_{\hat{\theta}}\left[ \hat{\theta} \right] = \theta$, where the expectation is calculated over all possible samples $y$ leading to values of $\hat{\theta}$.
* $\hat{\theta}$ is called a biased estimator otherwise, i.e. when $\mathrm{E}_{\hat{\theta}}\left[ \hat{\theta} \right] \neq \theta$.
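The definition above can be illustrated with a small Monte Carlo sketch (the distribution, parameter values and sample size below are arbitrary choices for illustration): the sample mean is an unbiased estimator of $\mu$, while the plug-in variance estimator that divides by $n$ is biased, with expected value $\frac{n-1}{n} \sigma^2$.

```python
import random

random.seed(42)
mu, sigma2 = 3.0, 4.0    # true parameters of the sampled normal distribution
n, n_sims = 5, 200_000   # a small sample size makes the bias clearly visible

mean_estimates, var_estimates = [], []
for _ in range(n_sims):
    y = [random.gauss(mu, sigma2 ** 0.5) for _ in range(n)]
    ybar = sum(y) / n
    mean_estimates.append(ybar)  # sample mean: unbiased for mu
    # plug-in variance estimator dividing by n (not n-1): biased for sigma^2
    var_estimates.append(sum((yi - ybar) ** 2 for yi in y) / n)

print(sum(mean_estimates) / n_sims)  # close to mu = 3.0
print(sum(var_estimates) / n_sims)   # close to (n-1)/n * sigma2 = 3.2, not 4.0
```

Averaging each estimator over many simulated samples approximates its expected value, so the two printed numbers exhibit the unbiasedness of the sample mean and the downward bias of the plug-in variance.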
**Theorem:** Let $X$ and $Y$ be two [independent](/D/ind) [random variables](/D/rvar) and let $Z = X + Y$. Then, the [moment-generating function](/D/mgf) of $Z$ is given by
$$ \label{eq:mgf-sumind}
M_Z(t) = M_X(t) \cdot M_Y(t)
$$
where $M_X(t)$, $M_Y(t)$ and $M_Z(t)$ are the [moment-generating functions](/D/mgf) of $X$, $Y$ and $Z$, respectively.
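The theorem can be checked numerically (the particular distributions, rate, and evaluation point below are arbitrary assumptions for the sketch): take $X \sim \mathrm{Exp}(2)$ and $Y \sim \mathcal{N}(0,1)$, estimate $M_Z(t) = \mathrm{E}\left(\exp\left[t(X+Y)\right]\right)$ by Monte Carlo, and compare it to the product of the known closed-form MGFs.

```python
import math
import random

random.seed(1)
n_sims = 500_000
t = 0.3

# Closed-form MGFs: M_X(t) = lambda/(lambda - t) for Exp(lambda=2), t < 2;
# M_Y(t) = exp(t^2/2) for the standard normal.
mx = lambda t: 2 / (2 - t)
my = lambda t: math.exp(t ** 2 / 2)

xs = [random.expovariate(2.0) for _ in range(n_sims)]
ys = [random.gauss(0.0, 1.0) for _ in range(n_sims)]

# Monte Carlo estimate of M_Z(t) = E(exp[t(X+Y)]) with Z = X + Y
mz_mc = sum(math.exp(t * (x + y)) for x, y in zip(xs, ys)) / n_sims

print(mz_mc)           # Monte Carlo estimate of M_Z(t)
print(mx(t) * my(t))   # product of the closed-form MGFs; the two agree
```

The agreement of the two printed values at the chosen $t$ illustrates the factorization $M_Z(t) = M_X(t) \cdot M_Y(t)$; it holds only because the samples of $X$ and $Y$ are drawn independently.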
**Proof:** The [moment-generating function of a random variable](/D/mgf) $X$ is
$$ \label{eq:mgf}
M_X(t) = \mathrm{E} \left( \exp \left[ t X \right] \right)
$$
and therefore the moment-generating function of the sum $Z$ is given by
$$ \label{eq:mgf-sumind-s1}
\begin{split}
M_Z(t)
&= \mathrm{E} \left( \exp \left[ t Z \right] \right) \\
&= \mathrm{E} \left( \exp \left[ t (X + Y) \right] \right) \\
&= \mathrm{E} \left( \exp \left[ t X \right] \cdot \exp \left[ t Y \right] \right) \\
&= \mathrm{E} \left( \exp \left[ t X \right] \right) \cdot \mathrm{E} \left( \exp \left[ t Y \right] \right) \\
&= M_X(t) \cdot M_Y(t)
\end{split}
$$

where the penultimate step follows from the [independence](/D/ind) of $X$ and $Y$, because the expected value of a product of independent random variables equals the product of their expected values.