The central limit theorem (CLT) states that the average of many independent, identically distributed random variables with finite variance tends toward a normal distribution, even if the original variables themselves are not normally distributed.
Let \(\{X_1, \dots, X_n\}\) be a sequence of \(n\) independent and identically distributed random variables with expected value \(\mu\) and finite variance \(\sigma^2\). Let \(S_n\) be the sample average: \[S_n = \frac{X_1 + X_2 + \dots + X_n}{n} \] Then, as \(n\) tends to infinity, the standardized \(S_n\) converges in distribution to a standard normal distribution: \[\frac{S_n-\mu}{\sigma/\sqrt{n}} \longrightarrow \mathcal{N}(0,1)\]
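The convergence above can be checked numerically. A minimal sketch (the exponential distribution, sample sizes, and seed are illustrative choices, not part of the original): standardize the average of \(n\) exponential draws and verify that the resulting values have mean close to 0 and standard deviation close to 1.

```python
import random
import statistics

def standardized_mean(n, mu, sigma):
    # Average n i.i.d. Exp(1) draws (mean 1, std 1), then standardize
    # as (S_n - mu) / (sigma / sqrt(n)).
    s = sum(random.expovariate(1.0) for _ in range(n)) / n
    return (s - mu) / (sigma / n ** 0.5)

random.seed(0)
z = [standardized_mean(500, 1.0, 1.0) for _ in range(2000)]
# By the CLT, z should look like draws from N(0, 1).
print(round(statistics.mean(z), 2), round(statistics.stdev(z), 2))
```

Even though the exponential distribution is strongly skewed, the standardized averages already look approximately standard normal at \(n = 500\).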
We run 1000 simulations, each computing the sum of \(n\) draws from a uniform distribution \(\mathcal{U}(0,1)^{*}\).
Select \(n\):
\(^{*}\) The mean of \(\mathcal{U}(0,1)\) is \(\mu = 0.5\) and its standard deviation is \(\sigma = \frac{1}{2\sqrt{3}} = \frac{1}{\sqrt{12}}\).
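The experiment described above can be reproduced in a few lines. A sketch, assuming \(n = 30\) and the 1000 repetitions mentioned in the text: each sum of \(n\) uniform draws should be approximately \(\mathcal{N}(n\mu,\, \sigma\sqrt{n})\).

```python
import random
import statistics

MU = 0.5                    # mean of U(0, 1)
SIGMA = 1 / (2 * 3 ** 0.5)  # standard deviation of U(0, 1)

def simulate_sums(n, repeats=1000):
    # Sum n uniform draws, repeated `repeats` times.
    return [sum(random.random() for _ in range(n)) for _ in range(repeats)]

random.seed(1)
n = 30
sums = simulate_sums(n)
# By the CLT, the sums are approximately N(n*MU, SIGMA*sqrt(n)),
# i.e. mean 15 and standard deviation sqrt(30)/(2*sqrt(3)) ~ 1.58 here.
print(round(statistics.mean(sums), 1), round(statistics.stdev(sums), 2))
```

Plotting a histogram of `sums` for increasing `n` shows the bell shape emerging, which is what the interactive selector above illustrates.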