Variance is the average of the squared deviations of the observed values from their mean.
In other words, take each number in a set of data, subtract the mean of the set from it, square those differences, and average the squares.

For example:

Take the five numbers 1, 2, 3, 4, 5 and find their variance:

Find their average first.

(1 + 2 + 3 + 4 + 5) / 5 = 3

Then find the square of the difference between each number and the average.

(1 − 3)² + (2 − 3)² + (3 − 3)² + (4 − 3)² + (5 − 3)² = 10

Because there are 5 numbers, divide 10 by 5: 10 / 5 = 2.

So the variance is 2.
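As a quick check on the arithmetic above, here is a minimal Python sketch (the variable names are my own, for illustration only) that repeats the calculation:

```python
# Worked example: variance of 1, 2, 3, 4, 5.
data = [1, 2, 3, 4, 5]

mean = sum(data) / len(data)                     # (1 + 2 + 3 + 4 + 5) / 5 = 3.0
squared_diffs = [(x - mean) ** 2 for x in data]  # [4.0, 1.0, 0.0, 1.0, 4.0]
variance = sum(squared_diffs) / len(data)        # 10.0 / 5 = 2.0

print(mean, variance)  # 3.0 2.0
```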

1. Variance is a measure of dispersion used in probability theory and statistics to describe a random variable or a set of data. In probability theory, variance measures the deviation of a random variable from its mathematical expectation (that is, its mean).

In statistics, the variance (sample variance) is the average of the squared differences between each data value and the mean.

2. Variance is the average of the squared differences between the actual values and the expected value, and the standard deviation is the arithmetic square root of the variance. [4] In actual calculation, we use the following formula for the variance.

Variance is the average of the sum of squared differences between each data value and the mean, that is, S² = (1/N) Σ (xᵢ − X̄)², where X̄ represents the sample mean, N the number of samples, xᵢ the individual values, and S² the variance.
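A minimal Python sketch of this formula follows (assuming the divide-by-N form written above; Python's statistics.pvariance uses the same convention, while statistics.variance divides by N − 1 for the sample estimate):

```python
import statistics

def variance_formula(values):
    """S² = (1/N) · Σ(xᵢ − X̄)², as written above."""
    n = len(values)
    x_bar = sum(values) / n
    return sum((x - x_bar) ** 2 for x in values) / n

data = [1, 2, 3, 4, 5]
s2 = variance_formula(data)
print(s2)                          # 2.0
print(statistics.pvariance(data))  # 2.0, same divide-by-N definition
print(s2 ** 0.5)                   # standard deviation = square root of variance ≈ 1.414
```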

3. The mean deviation (average difference) is the arithmetic mean of the absolute values of the deviations of all units in a population from their arithmetic mean.

The mean deviation is the average of the deviations. A deviation is the difference between the value of each unit in the population and the arithmetic mean. Because the sum of the deviations is zero, dividing that sum by the number of deviations cannot give an average deviation; the absolute values of the deviations must be taken to eliminate the signs.

The mean deviation reflects the average difference between each value and the arithmetic mean. The larger the mean deviation, the larger the differences between the values and the arithmetic mean, and the less representative the arithmetic mean is; the smaller the mean deviation, the smaller those differences, and the more representative the arithmetic mean is.
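For contrast with the variance computed earlier, here is a small Python sketch (the function name is my own, hypothetical) of the mean deviation, which averages absolute rather than squared deviations:

```python
def mean_deviation(values):
    """Arithmetic mean of the absolute deviations from the arithmetic mean."""
    n = len(values)
    mean = sum(values) / n
    return sum(abs(x - mean) for x in values) / n

data = [1, 2, 3, 4, 5]
print(mean_deviation(data))  # (2 + 1 + 0 + 1 + 2) / 5 = 1.2
```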