What is the Central Limit Theorem? An Explanation

The Central Limit Theorem is one of the most important and most widely used results in probability theory and statistics. It explains why sums and averages of sequences of random variables tend toward a normal distribution, and it forms a foundation of mathematical statistics and error analysis.

The Central Limit Theorem (CLT) states that the distribution of sample means approaches a normal distribution as the sample size gets larger, regardless of the shape of the population's distribution.
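To make that statement concrete, the classical (Lindeberg–Lévy) form of the theorem can be written out as follows; this is a standard formulation, not a formula taken from the original article:

```latex
% X_1, X_2, ... are i.i.d. random variables with mean \mu and finite variance \sigma^2,
% and \bar{X}_n is the mean of the first n observations.
\[
  \bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i,
  \qquad
  \frac{\bar{X}_n - \mu}{\sigma/\sqrt{n}}
  \;\xrightarrow{\;d\;}\;
  \mathcal{N}(0,1)
  \quad \text{as } n \to \infty .
\]
```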

For many beginners, the theorem is an abstract concept that is not easy to grasp. To make its basic ideas easier to understand and master, an interactive simulation program is used here, allowing us to explore the Central Limit Theorem through hands-on experiments.

Introduction 

The theorem explains why the normal distribution is so widely used, and why it usually provides an accurate approximation for the average of a set of data even with modest sample sizes. Moreover, in nature and in production, many phenomena are the combined result of a large number of independent random variables; if the impact of each individual variable is tiny, the total effect tends to follow a normal distribution.

Definition 0.1 The Central Limit Theorem concerns a population with an arbitrary distribution. If we randomly draw n samples from this population, repeat the process m times, and compute the average of each of the m groups of samples, then the distribution of these averages is close to a normal distribution.
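The article refers to an interactive simulation program; the sketch below is a minimal, non-interactive version of the same experiment in Python with NumPy. The choice of an exponential population and the values of n and m are purely illustrative assumptions, not details taken from the original program.

```python
import numpy as np

# Sketch of the experiment in Definition 0.1:
# draw n samples from an arbitrarily distributed population,
# repeat m times, and look at the distribution of the m sample means.

rng = np.random.default_rng(seed=0)

n = 50        # samples drawn per group (illustrative choice)
m = 10_000    # number of groups (illustrative choice)

# A deliberately non-normal population: exponential with mean 1.
samples = rng.exponential(scale=1.0, size=(m, n))

# One mean per group of n samples.
sample_means = samples.mean(axis=1)

# The CLT predicts the means cluster around the population mean (1.0)
# with standard deviation roughly sigma / sqrt(n) = 1 / sqrt(50).
print("mean of sample means:", sample_means.mean())
print("std  of sample means:", sample_means.std())
print("CLT prediction for std:", 1.0 / np.sqrt(n))

# A histogram of sample_means (e.g. with matplotlib) would show the
# familiar bell shape even though the population itself is skewed.
```

Changing the population (uniform, Bernoulli, Poisson, and so on) and increasing n shows the histogram of sample means drawing closer to a normal curve, which is the behavior the interactive program is meant to demonstrate.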

The first demonstration of the central limit theorem was given by the French mathematician Pierre-Simon Laplace [1] in 1810. It explains that "the sum or average of an infinite sequence of independent and identically distributed random variables, when suitably rescaled, tends to a normal distribution."

Fourteen years later, another French mathematician, Siméon-Denis Poisson, continued the study of Laplace's earlier results, standardizing and strengthening them. Laplace and his contemporaries devoted themselves to the theorem because of its power in measuring mean quantities and its applications in other fields.

Definition 0.2 The mean of a set of measurements can be approximated by the normal distribution when the measurements are IID (independent and identically distributed) data. IID requires that the random variables be mutually independent and that they all share the same probability distribution.

References

[1] Gerald James Whitrow. "Pierre-Simon, marquis de Laplace". Encyclopædia Britannica, July 20, 1998. https://www.britannica.com/biography/Pierre-Simon-marquis-de-Laplace.