Probability theory and mathematical statistics
Part One: Short-answer questions

1. Sets and events belong to different branches of mathematics and are not completely equivalent, but many of their properties are similar.

For example, the set operations of union, intersection and complement correspond to the event operations of sum, product and inverse (complementary) events.

In fact, a random event can be regarded as a subset of the sample space, which makes the relationship between the two easy to see.
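As a minimal illustration of this correspondence (the die-roll sample space and the events A and B below are made-up examples, not taken from the text), Python's built-in set type can stand in for events:

```python
# Sample space of a single die roll, with two events as subsets of it.
omega = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}        # event "the roll is even"
B = {4, 5, 6}        # event "the roll is at least 4"

print(A | B)         # sum of events  = set union:         {2, 4, 5, 6}
print(A & B)         # product of events = intersection:   {4, 6}
print(omega - A)     # inverse of A = complement in omega: {1, 3, 5}
```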

2. Repeating the same experiment n times under identical conditions, with the repetitions independent of one another, is called n independent repeated trials, also known as a Bernoulli experiment (Bernoulli trials).
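As an illustrative sketch (the success probability p = 0.3, n = 10 trials, and k = 3 are arbitrary example values, not from the text), the number of successes in n independent repeated trials can be simulated and compared with the binomial formula P(X = k) = C(n, k) p^k (1 - p)^(n - k):

```python
import random
from math import comb

n, p, k = 10, 0.3, 3   # example values only, not from the text

def bernoulli_experiment(n, p):
    """Run n independent repeated trials; return the number of successes."""
    return sum(random.random() < p for _ in range(n))

# Empirical frequency of exactly k successes vs. the binomial probability.
runs = 100_000
freq = sum(bernoulli_experiment(n, p) == k for _ in range(runs)) / runs
exact = comb(n, k) * p**k * (1 - p)**(n - k)
print(freq, exact)     # the two numbers should be close
```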

3. An example of conditional probability:

Suppose a student fails math with probability 0.15, fails Chinese with probability 0.05, and fails both with probability 0.03. Given that he failed math on an exam, what is the probability that he also failed Chinese?

Let A be the event "failing math" and B the event "failing Chinese", so P(A) = 0.15, P(B) = 0.05, and P(AB) = 0.03. Then P(B|A) = P(AB)/P(A) = 0.03/0.15 = 0.2.
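As a sanity check, here is a simulation sketch only; the joint distribution below is constructed so that it matches the three probabilities given above:

```python
import random

def one_exam():
    """Draw (fails_math, fails_chinese) so that P(A)=0.15, P(B)=0.05, P(AB)=0.03."""
    u = random.random()
    if u < 0.03:               # both subjects failed  (probability 0.03)
        return True, True
    if u < 0.15:               # math only             (probability 0.12)
        return True, False
    if u < 0.17:               # Chinese only          (probability 0.02)
        return False, True
    return False, False        # neither               (probability 0.83)

results = [one_exam() for _ in range(1_000_000)]
chinese_given_math = [b for a, b in results if a]
print(sum(chinese_given_math) / len(chinese_given_math))   # should be close to 0.2
```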

4. Simple random samples are samples obtained by simple random sampling.

Simple random sampling, also called pure random sampling or SRS, is a sampling method in which n units are selected at random from a population of N units in such a way that every possible sample of size n has the same probability of being drawn.

The observations in a simple random sample are independent and identically distributed, whereas a general sample need not be.
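A minimal code sketch of the idea (the population size N = 100 and sample size n = 10 are assumed example values):

```python
import random

N, n = 100, 10                        # population size and sample size (example values)
population = list(range(1, N + 1))    # units numbered 1..N

# random.sample draws n distinct units so that every subset of size n
# is equally likely -- exactly the defining property of simple random sampling.
sample = random.sample(population, n)
print(sample)
```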

5. The distribution function of a continuous random variable is the integral of its density function.

One property of the density function is that its integral over (-∞, +∞) equals 1.

Another is that if the distribution function of X is F(x), then the integral of the density function over (-∞, x] equals F(x).

Another important property is that the density function of a continuous random variable is not unique.

Specifically:

The distribution function of a random variable is unique, whether it is continuous or discrete.

But the density function of a continuous random variable is not unique.

If the distribution function of X is F(x), then any function whose integral over (-∞, x] equals F(x) for every x can serve as a density function of X. Changing the value of the integrand at finitely many points (in fact, even at countably many points) does not change the value of the integral. Therefore, when computing a density function from a known distribution function, there is no need to take the derivative at the points where the formula changes; the value assigned to the density at those points can be chosen arbitrarily.

For another example, if X follows the uniform distribution on [0, 1], its density function can be written as

f(x) = 1 for 0 < x < 1, and 0 otherwise, or equivalently as f(x) = 1 for 0 ≤ x ≤ 1, and 0 otherwise; both are acceptable.
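As a quick numerical sketch of this point (the integration grid and step count below are arbitrary choices for the example, not part of the original answer), the two ways of writing the density give the same distribution function, because they differ only at the endpoints 0 and 1:

```python
def f_open(t):
    """Density written as 1 on the open interval (0, 1), 0 elsewhere."""
    return 1.0 if 0 < t < 1 else 0.0

def f_closed(t):
    """Density written as 1 on the closed interval [0, 1], 0 elsewhere."""
    return 1.0 if 0 <= t <= 1 else 0.0

def cdf(f, x, lo=-1.0, steps=100_000):
    """Approximate F(x), the integral of f over (-inf, x], by a midpoint Riemann sum."""
    h = (x - lo) / steps
    return sum(f(lo + (i + 0.5) * h) for i in range(steps)) * h

for x in (-0.5, 0.25, 0.5, 1.0, 2.0):
    print(x, round(cdf(f_open, x), 3), round(cdf(f_closed, x), 3))
```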

I can't help you with the remaining questions; you'll have to work them out yourself.