The mean of the squared differences between the sample data and the sample mean is called the sample variance; the arithmetic square root of the sample variance is called the sample standard deviation. Sample variance and sample standard deviation both measure how much the sample fluctuates: the larger the sample variance or standard deviation, the greater the fluctuation of the sample data.
In probability theory, E{[X - E(X)]^2} is generally used to measure the degree to which a random variable X deviates from its mean E(X); this quantity is called the variance of X.
Definition
Let X be a random variable. If E{[X - E(X)]^2} exists, then E{[X - E(X)]^2} is called the variance of X, denoted D(X) or DX; that is, D(X) = E{[X - E(X)]^2}. The quantity σ(X) = [D(X)]^0.5, which has the same dimension as X, is called the standard deviation or mean square deviation.
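For example, if X is the outcome of rolling a fair six-sided die, then E(X) = 3.5 and D(X) = [(1 - 3.5)^2 + (2 - 3.5)^2 + … + (6 - 3.5)^2] / 6 = 35/12 ≈ 2.92, so σ(X) = (35/12)^0.5 ≈ 1.71.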
From the definition of variance, the following commonly used calculation formulas can be obtained:
D(X)=E(X^2)-[E(X)]^2
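This follows by expanding the square and using the linearity of expectation: D(X) = E{[X - E(X)]^2} = E{X^2 - 2X·E(X) + [E(X)]^2} = E(X^2) - 2[E(X)]^2 + [E(X)]^2 = E(X^2) - [E(X)]^2. For the die example above, E(X^2) = 91/6 and [E(X)]^2 = 12.25, so D(X) = 91/6 - 12.25 = 35/12, in agreement with the direct calculation.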
S^2 = [(x1 - x̄)^2 + (x2 - x̄)^2 + (x3 - x̄)^2 + … + (xn - x̄)^2] / n, where x̄ is the sample mean.
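A minimal Python sketch of this computation, using a small made-up data set (the numbers are illustrative, not from the text):

data = [2, 4, 4, 4, 5, 5, 7, 9]                    # hypothetical sample
n = len(data)
mean = sum(data) / n                               # sample mean (x-bar), here 5.0
variance = sum((x - mean) ** 2 for x in data) / n  # S^2, here 4.0
std_dev = variance ** 0.5                          # sample standard deviation, here 2.0
print(mean, variance, std_dev)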
Several important properties of variance (assuming that each variance involved exists); a numerical illustration follows the list.
(1) If C is a constant, then D(C) = 0.
(2) If X is a random variable and C is a constant, then D(CX) = C^2 · D(X).
(3) If X and Y are two independent random variables, then D(X+Y) = D(X) + D(Y).
(4) D(X) = 0 if and only if X takes a constant value c with probability 1, that is, P{X = c} = 1, where c = E(X).
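The sketch below checks properties (2) and (3) numerically for two small, made-up discrete distributions (a fair die and a fair coin); the distributions and variable names are illustrative assumptions, not taken from the text above.

def expectation(dist):
    # E(X) for a discrete distribution given as {value: probability}
    return sum(x * p for x, p in dist.items())

def variance(dist):
    # D(X) = E(X^2) - [E(X)]^2
    m = expectation(dist)
    return sum((x - m) ** 2 * p for x, p in dist.items())

die = {x: 1/6 for x in range(1, 7)}    # fair six-sided die
coin = {0: 0.5, 1: 0.5}                # fair coin, values 0 and 1

# Property (2): D(CX) = C^2 * D(X), checked here with C = 3
scaled = {3 * x: p for x, p in die.items()}
print(variance(scaled), 9 * variance(die))                 # both about 26.25

# Property (3): for independent X and Y, D(X + Y) = D(X) + D(Y)
sum_dist = {}
for x, px in die.items():
    for y, py in coin.items():
        sum_dist[x + y] = sum_dist.get(x + y, 0) + px * py
print(variance(sum_dist), variance(die) + variance(coin))  # both about 3.17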
Variance is the square of standard deviation.