First, E(X):
① Definition of expectation:
In probability theory and mathematical statistics, the mathematical expectation (also called the mean, or simply the expectation) of a random variable is the sum of each possible outcome multiplied by the probability of that outcome. It is one of the most basic numerical characteristics of a random variable and reflects its average value.
Note that the expected value is not necessarily the "expectation" in the everyday sense: it need not equal any particular outcome. The expected value is the probability-weighted average of the values the variable can take, and it is not necessarily contained in the set of those values.
The law of large numbers states that, as the number of repetitions of an experiment approaches infinity, the arithmetic average of the observed values almost surely converges to the expected value.
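A quick simulation sketch of this convergence (the fair six-sided die, the seed, and the sample size are arbitrary illustrative choices, not from the source):

```python
import numpy as np

# Simulate repeated rolls of a fair six-sided die, for which E(X) = 3.5.
rng = np.random.default_rng(0)
rolls = rng.integers(1, 7, size=100_000)

# Running arithmetic average after each roll.
running_mean = np.cumsum(rolls) / np.arange(1, len(rolls) + 1)

# The running mean drifts toward 3.5 as the number of rolls grows.
print(running_mean[9], running_mean[999], running_mean[-1])
```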
A random variable that takes only finitely many values, or whose values can be listed in a sequence, is called a discrete random variable; a random variable whose range consists of one or several finite or infinite intervals is a continuous random variable.
② Calculating the expectation:
Discrete case: E(X) = x1*P(x1) + x2*P(x2) + ... + xn*P(xn).
Continuous case: E(X) is the integral of x*f(x) from negative infinity to positive infinity, where f(x) is the probability density.
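A minimal numerical sketch of both formulas (the fair die and the uniform density on [0, 2] are assumptions made only for illustration):

```python
import numpy as np

# Discrete case: E(X) = x1*P(x1) + x2*P(x2) + ... + xn*P(xn).
values = np.array([1, 2, 3, 4, 5, 6])   # outcomes of a fair six-sided die
probs = np.full(6, 1 / 6)               # P(X = xi) for each outcome
e_discrete = np.sum(values * probs)     # -> 3.5

# Continuous case: E(X) = integral of x*f(x) dx over (-inf, +inf).
# Here f(x) = 1/2 on [0, 2] and 0 elsewhere (a uniform density), so the
# integral reduces to [0, 2] and the exact answer is 1.
dx = 0.0002
x = np.arange(0.0, 2.0, dx) + dx / 2    # midpoints of small subintervals
f = np.full_like(x, 0.5)                # density value on [0, 2]
e_continuous = np.sum(x * f) * dx       # midpoint-rule approximation of the integral

print(e_discrete, e_continuous)         # 3.5 and approximately 1.0
```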
Second, D(X):
① Definition of variance:
In probability theory and statistics, variance measures the dispersion of a random variable or of a set of data. In probability theory, the variance of a random variable measures its deviation from its mathematical expectation (that is, its mean).
In statistics, the (sample) variance is the average of the squared differences between each sample value and the sample mean. In many practical problems, studying this variance or deviation is of great importance.
In short, variance measures how far the data deviate from the expected value.
② Statistical significance of variance:
When the data are spread out (that is, they fluctuate widely around the mean), the sum of the squared differences between each value and the mean is large, so the variance is large; when the data are concentrated, that sum is small, so the variance is small. Therefore, the larger the variance, the more the data fluctuate; the smaller the variance, the less they fluctuate.
The average of the squared differences between the sample values and the sample mean is called the sample variance, and its arithmetic square root is called the sample standard deviation. Both measure how much the sample fluctuates: the larger the sample variance or standard deviation, the larger the fluctuation of the sample data.
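A minimal sketch of the sample variance and standard deviation just described (the data values are made up for illustration; the divisor n matches the "average of squared differences" wording above, whereas many texts divide by n-1):

```python
import numpy as np

data = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])

mean = data.mean()
# Sample variance: average of the squared differences from the mean.
variance = np.mean((data - mean) ** 2)   # same result as np.var(data)
# Sample standard deviation: arithmetic square root of the variance.
std_dev = np.sqrt(variance)              # same result as np.std(data)

print(mean, variance, std_dev)           # 5.0  4.0  2.0
```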
Additional information:
I. Mean and variance:
1. When X and Y are uncorrelated, E(XY) = E(X)E(Y).
2. D(X) = E(X^2) - (E(X))^2.
Using these properties, for example, E(X(X+Y-2)) = E(X^2 + XY - 2X) = E(X^2) + E(XY) - 2E(X).
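A quick simulation sketch of properties 1 and 2 (the normal and uniform distributions below are arbitrary independent examples chosen for illustration, not from the source):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

x = rng.normal(loc=2.0, scale=3.0, size=n)   # X with E(X) = 2, D(X) = 9
y = rng.uniform(0.0, 1.0, size=n)            # Y ~ U(0, 1), independent of X

# Property 1: for uncorrelated X and Y, E(XY) = E(X)*E(Y).
print(np.mean(x * y), np.mean(x) * np.mean(y))        # both close to 2 * 0.5 = 1

# Property 2: D(X) = E(X^2) - (E(X))^2.
print(np.mean(x ** 2) - np.mean(x) ** 2, np.var(x))   # both close to 9
```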
II. Important distributions:
1. 0-1 distribution (Bernoulli): E(X) = p, D(X) = p(1-p).
2. Binomial distribution B(n, p): P(X = k) = C(n, k) p^k (1-p)^(n-k); E(X) = np, D(X) = np(1-p).
3. Poisson distribution P(λ): P(X = k) = (λ^k / k!) e^(-λ); E(X) = λ, D(X) = λ.
4. Uniform distribution U(a, b): E(X) = (a+b)/2, D(X) = (b-a)^2 / 12.
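A sketch comparing these formulas with simulated samples (the parameter values p = 0.3, n = 10, λ = 4, a = 2, b = 5 and the sample size are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
size = 1_000_000
p, n, lam, a, b = 0.3, 10, 4.0, 2.0, 5.0

# For each distribution: (simulated sample, formula for E(X), formula for D(X)).
cases = {
    "0-1 (Bernoulli)": (rng.binomial(1, p, size), p, p * (1 - p)),
    "Binomial B(n,p)": (rng.binomial(n, p, size), n * p, n * p * (1 - p)),
    "Poisson(lambda)": (rng.poisson(lam, size), lam, lam),
    "Uniform U(a,b)":  (rng.uniform(a, b, size), (a + b) / 2, (b - a) ** 2 / 12),
}

for name, (x, mean_formula, var_formula) in cases.items():
    # The empirical mean and variance should be close to E(X) and D(X) above.
    print(f"{name}: mean {x.mean():.3f} vs {mean_formula:.3f}, "
          f"var {x.var():.3f} vs {var_formula:.3f}")
```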
References:
Baidu Encyclopedia - Mathematical Expectation
Baidu Encyclopedia - Variance