
Commit 9cc65b7

Merge pull request #274 from JoramSoch/master
added 1 definition and 1 proof
2 parents 6f1ee2a + 720f2ef

3 files changed

Lines changed: 188 additions & 1 deletion

D/stat.md

Lines changed: 32 additions & 0 deletions
@@ -0,0 +1,32 @@
---
layout: definition
mathjax: true

author: "Joram Soch"
affiliation: "BCCN Berlin"
e_mail: "joram.soch@bccn-berlin.de"
date: 2024-10-04 10:54:22

title: "Statistic"
chapter: "General Theorems"
section: "Probability theory"
topic: "Random experiments"
definition: "Sample statistic"

sources:
- authors: "Wikipedia"
  year: 2024
  title: "Statistic"
  in: "Wikipedia, the free encyclopedia"
  pages: "retrieved on 2024-10-04"
  url: "https://en.wikipedia.org/wiki/Statistic"

def_id: "D205"
shortcut: "stat"
username: "JoramSoch"
---


**Definition:** A statistic, also "sample statistic", is any quantity calculated from the [data points](/D/data) in a [sample](/D/samp) that is used for statistical purposes.

Examples include statistics used to estimate the [parameters](/D/para) of a [probability distribution](/D/dist) and [test statistics](/D/tstat) to evaluate [statistical hypotheses](/D/hyp).
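As an aside to the definition above (illustrative only, not part of the committed file), two common sample statistics can be computed directly from a sample's data points; the sample here is hypothetical:

```python
import random

random.seed(0)

# Hypothetical sample of measured data points (standard normal draws)
sample = [random.gauss(0.0, 1.0) for _ in range(1000)]
n = len(sample)

# Sample mean: a statistic often used to estimate a location parameter
sample_mean = sum(sample) / n

# Unbiased sample variance: a statistic often used to estimate a scale parameter
sample_var = sum((x - sample_mean) ** 2 for x in sample) / (n - 1)

# Both quantities are functions of the sample alone, hence statistics
print(sample_mean, sample_var)
```

Both values depend only on the observed sample, which is exactly what makes them statistics in the sense of the definition.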

I/ToC.md

Lines changed: 3 additions & 1 deletion
@@ -21,6 +21,7 @@ title: "Table of Contents"
 &emsp;&ensp; 1.1.3. *[Event space](/D/eve-spc)* <br>
 &emsp;&ensp; 1.1.4. *[Probability space](/D/prob-spc)* <br>
 &emsp;&ensp; 1.1.5. *[Measured data](/D/data)* <br>
+&emsp;&ensp; 1.1.6. *[Sample statistic](/D/stat)* <br>
 
 <p id="Random variables"></p>
 1.2. Random variables <br>
@@ -481,7 +482,8 @@ title: "Table of Contents"
 &emsp;&ensp; 3.2.23. **[Differential entropy](/P/norm-dent)** <br>
 &emsp;&ensp; 3.2.24. **[Kullback-Leibler divergence](/P/norm-kl)** <br>
 &emsp;&ensp; 3.2.25. **[Maximum entropy distribution](/P/norm-maxent)** <br>
-&emsp;&ensp; 3.2.26. **[Linear combination](/P/norm-lincomb)** <br>
+&emsp;&ensp; 3.2.26. **[Linear combination of independent normals](/P/norm-lincomb)** <br>
+&emsp;&ensp; 3.2.27. **[Normal and uncorrelated does not imply independent](/P/norm-corrind)** <br>
 
 <p id="t-distribution"></p>
 3.3. t-distribution <br>

P/norm-corrind.md

Lines changed: 153 additions & 0 deletions
@@ -0,0 +1,153 @@
---
layout: proof
mathjax: true

author: "Joram Soch"
affiliation: "BCCN Berlin"
e_mail: "joram.soch@bccn-berlin.de"
date: 2024-10-04 10:59:19

title: "Normally distributed and uncorrelated does not imply independent"
chapter: "Probability Distributions"
section: "Univariate continuous distributions"
topic: "Normal distribution"
theorem: "Normal and uncorrelated does not imply independent"

sources:
- authors: "Wikipedia"
  year: 2024
  title: "Misconceptions about the normal distribution"
  in: "Wikipedia, the free encyclopedia"
  pages: "retrieved on 2024-10-04"
  url: "https://en.wikipedia.org/wiki/Misconceptions_about_the_normal_distribution#A_symmetric_example"

proof_id: "P473"
shortcut: "norm-corrind"
username: "JoramSoch"
---

**Theorem:** Consider two [random variables](/D/rvar) $X$ and $Y$. If each of them is [normally distributed](/D/norm) and both are [uncorrelated](/D/corr), then $X$ and $Y$ are not necessarily [independent](/D/ind).

**Proof:** As an example, let $V$ follow a [Bernoulli distribution](/D/bern) with [success probability](/D/bern) $1/2$ and let $W$ be defined as a transformation of $V$:

$$ \label{eq:V-W}
\begin{split}
V &\sim \mathrm{Bern}\left( \frac{1}{2} \right) \\
W &= 2V-1 \; .
\end{split}
$$

By [definition of the Bernoulli distribution](/D/bern), it follows that

$$ \label{eq:V-W-dist}
p(V=0) = p(V=1) = \frac{1}{2}
\quad \Rightarrow \quad
p(W=-1) = p(W=+1) = \frac{1}{2} \; .
$$

Moreover, let $X$ follow a [standard normal distribution](/D/snorm) and let $Y$ be defined as a transformation of $X$ and $W$:

$$ \label{eq:X-Y}
\begin{split}
X &\sim \mathcal{N}(0,1) \\
Y &= WX \; .
\end{split}
$$

Then, by the nature of the [random variable](/D/rvar) $W$, it follows that

$$ \label{eq:X-Y-dist}
p(W=-1) = p(W=+1) = \frac{1}{2}
\quad \Rightarrow \quad
p(Y=-X) = p(Y=+X) = \frac{1}{2} \; .
$$

Since the negative of a [standard normal](/D/snorm) random variable [is also standard normally distributed](/P/norm-lincomb),

$$ \label{eq:X-dist}
X \sim \mathcal{N}(0,1)
\quad \Rightarrow \quad
-X \sim \mathcal{N}(0,1) \; ,
$$

we can calculate the [probability density function](/D/pdf) belonging to the [mixture distribution](/D/dist-mixt) of $Y$ as follows:

$$ \label{eq:Y-pdf}
\begin{split}
p(y)
&= p(y|Y=-X) \cdot p(Y=-X) + p(y|Y=+X) \cdot p(Y=+X) \\
&\overset{\eqref{eq:X-Y-dist}}{=} \mathcal{N}(y; 0, 1) \cdot \frac{1}{2} + \mathcal{N}(y; 0, 1) \cdot \frac{1}{2} \\
&= \mathcal{N}(y; 0, 1)
\end{split}
$$

where we have used the [law of marginal probability](/D/prob-marg) in the first line and $\mathcal{N}(x; \mu, \sigma^2)$ denotes the [probability density function of the normal distribution](/P/norm-pdf). Thus, $Y$ is also [standard normally distributed](/D/snorm):

$$ \label{eq:Y-dist}
Y \sim \mathcal{N}(0,1) \; .
$$

This means that both $X$ and $Y$ have expected value zero:

$$ \label{eq:X-Y-mean}
\mathrm{E}(X) = \mathrm{E}(Y) = 0 \; .
$$

With that, we can start to work out the covariance of $X$ and $Y$:

$$ \label{eq:X-Y-cov-s1}
\begin{split}
\mathrm{Cov}(X,Y)
&= \mathrm{E}\left[ \left( X-\mathrm{E}(X) \right) \left( Y-\mathrm{E}(Y) \right) \right] \\
&\overset{\eqref{eq:X-Y-mean}}{=} \mathrm{E}\left[ XY \right] \\
&\overset{\eqref{eq:X-Y}}{=} \mathrm{E}\left[ XWX \right] \\
&= \mathrm{E}\left[ WX^2 \right] \; .
\end{split}
$$

Since $W$ and $X$ are [independent](/D/ind) by construction, their [expected values factorize](/P/mean-mult):

$$ \label{eq:X-Y-cov-s2}
\begin{split}
\mathrm{Cov}(X,Y)
&= \mathrm{E}[W] \cdot \mathrm{E}[X^2] \\
&= \left( (-1) \cdot p(W=-1) + (+1) \cdot p(W=+1) \right) \cdot \mathrm{E}[X^2] \\
&\overset{\eqref{eq:V-W-dist}}{=} \left( (-1) \cdot \frac{1}{2} + (+1) \cdot \frac{1}{2} \right) \cdot \mathrm{E}[X^2] \\
&= 0 \cdot \mathrm{E}[X^2] \\
&= 0 \; .
\end{split}
$$
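The zero covariance derived above can be checked by simulation; a minimal Monte Carlo sketch (illustrative only, not part of the committed proof; variable names are this editor's choice):

```python
import random

random.seed(42)
n = 100_000

# Construction from the proof: W = 2V - 1 with V ~ Bern(1/2), X ~ N(0,1), Y = WX
w = [2 * random.randint(0, 1) - 1 for _ in range(n)]
x = [random.gauss(0.0, 1.0) for _ in range(n)]
y = [wi * xi for wi, xi in zip(w, x)]

# Sample covariance of X and Y
mean_x = sum(x) / n
mean_y = sum(y) / n
cov_xy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / (n - 1)

print(cov_xy)  # close to zero, consistent with Cov(X,Y) = 0
```

With $n = 100{,}000$ draws, the Monte Carlo standard error of the sample covariance is roughly $\sqrt{3/n} \approx 0.006$, so the estimate concentrates tightly around zero.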

Thus, $X$ and $Y$ are [uncorrelated](/D/corr):

$$ \label{eq:X-Y-corr}
\mathrm{Corr}(X,Y) = \frac{\mathrm{Cov}(X,Y)}{\sqrt{\mathrm{Var}(X)} \sqrt{\mathrm{Var}(Y)}} = 0 \; .
$$

Yet, $X$ and $Y$ are not [independent](/D/ind), since the [marginal density](/D/dist-marg) of $Y$ is

$$ \label{eq:Y-dist-marg}
p(y) = \mathcal{N}(y; 0, 1) \; ,
$$

but the [conditional density](/D/dist-cond) of $Y$ given $X$ is

$$ \label{eq:Y-dist-cond}
p(y|x) = \left\{
\begin{array}{rl}
1/2 \; , & \text{if} \; y = -x \\
1/2 \; , & \text{if} \; y = +x \\
0 \; , & \text{otherwise}
\end{array}
\right. \; ,
$$

thus violating the [behavior of probability under independence](/P/prob-ind):

$$ \label{eq:X-Y-dep}
p(Y) \neq p(Y|X) \; .
$$

Therefore, $X$ and $Y$ defined by \eqref{eq:X-Y} and \eqref{eq:V-W} constitute an example of two [random variables](/D/rvar) that are [normally distributed](/D/norm) and [uncorrelated](/D/corr), but not [independent](/D/ind).
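The failure of independence in this construction is also easy to exhibit numerically: since $Y = \pm X$, the equality $|Y| = |X|$ holds with probability one, and consequently $X^2$ and $Y^2$ are perfectly correlated with $\mathrm{Cov}(X^2, Y^2) = \mathrm{Var}(X^2) = 2$ for a standard normal $X$. A simulation sketch (illustrative only, not part of the committed proof):

```python
import random

random.seed(1)
n = 100_000

# Same construction as in the proof: W uniform on {-1,+1}, X ~ N(0,1), Y = WX
w = [2 * random.randint(0, 1) - 1 for _ in range(n)]
x = [random.gauss(0.0, 1.0) for _ in range(n)]
y = [wi * xi for wi, xi in zip(w, x)]

# |Y| = |X| holds exactly in every draw, which already rules out independence
assert all(abs(yi) == abs(xi) for xi, yi in zip(x, y))

# Y^2 = X^2 exactly, so Cov(X^2, Y^2) = Var(X^2), which is 2 for X ~ N(0,1);
# independence of X and Y would force this covariance to be 0
mean_x2 = sum(xi ** 2 for xi in x) / n
cov_sq = sum((xi ** 2 - mean_x2) * (yi ** 2 - mean_x2)
             for xi, yi in zip(x, y)) / (n - 1)
print(cov_sq)  # close to 2
```

The nonzero covariance between $X^2$ and $Y^2$ is one concrete functional of the joint distribution that independence would force to vanish, complementing the density-based argument in the proof.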
