What does the statistics of r square mean?
R² (R-squared) refers to the goodness of fit, that is, the degree to which the regression line fits the observed values.

Expression: R² = SSR / SST = 1 − SSE / SST.

Where SST = SSR + SSE: SST is the total sum of squares, SSR is the regression sum of squares, and SSE is the residual (error) sum of squares.

Regression sum of squares: SSR is also written ESS (explained sum of squares).

Residual sum of squares: SSE is also written RSS (residual sum of squares).

Total sum of squares: SST is also written TSS (total sum of squares).

SSE + SSR = SST, or equivalently RSS + ESS = TSS.
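The decomposition above can be checked numerically. The following sketch (with made-up sample data, chosen only for illustration) fits an ordinary least-squares line in plain Python, then computes SST, SSR, and SSE from their definitions and confirms that SST = SSR + SSE and that both forms of the R² expression agree:

```python
# Hypothetical data, roughly y = 2x, for illustration only
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)

x_bar = sum(x) / n
y_bar = sum(y) / n

# Least-squares slope b1 and intercept b0
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / \
     sum((xi - x_bar) ** 2 for xi in x)
b0 = y_bar - b1 * x_bar
y_hat = [b0 + b1 * xi for xi in x]  # fitted values

sst = sum((yi - y_bar) ** 2 for yi in y)               # total sum of squares
ssr = sum((yh - y_bar) ** 2 for yh in y_hat)           # regression sum of squares
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))  # residual sum of squares

r_squared = ssr / sst
print(round(sst, 6), round(ssr + sse, 6))              # the two values match
print(round(r_squared, 6), round(1 - sse / sst, 6))    # both give R-squared
```

Note that the identity SST = SSR + SSE holds exactly (up to floating-point rounding) only for least-squares fits that include an intercept term.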

The R-squared statistic is used in linear regression analysis. When the parameters are estimated by the least-squares method, R-squared is the ratio of the regression sum of squares to the total sum of squares, that is, the proportion of the total variation that the regression can explain. The larger this proportion, the better.

The higher R-squared is, the more of the variation the model explains and the more significant the regression effect. R-squared lies between 0 and 1, and the closer it is to 1, the better the regression fits the data. As a rough rule of thumb, a model with R-squared above 0.8 is considered to fit well.
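To illustrate the interpretation, the sketch below (again with made-up numbers) packages the R² = 1 − SSE/SST computation into a small helper and compares two data sets against the same x values: one that is nearly linear in x, giving R² close to 1, and one with no linear trend, giving R² close to 0:

```python
def r_squared(x, y):
    """R-squared of a simple least-squares line fit of y on x."""
    n = len(x)
    x_bar, y_bar = sum(x) / n, sum(y) / n
    b1 = sum((a - x_bar) * (b - y_bar) for a, b in zip(x, y)) / \
         sum((a - x_bar) ** 2 for a in x)
    b0 = y_bar - b1 * x_bar
    sse = sum((b - (b0 + b1 * a)) ** 2 for a, b in zip(x, y))  # residual SS
    sst = sum((b - y_bar) ** 2 for b in y)                     # total SS
    return 1 - sse / sst

x = [1, 2, 3, 4, 5, 6]
print(r_squared(x, [1.0, 2.1, 2.9, 4.2, 4.8, 6.1]))  # close to 1: strong fit
print(r_squared(x, [3.0, 1.0, 4.0, 1.5, 3.5, 2.0]))  # close to 0: weak fit
```

A value near 1 means the regression line accounts for almost all of the variation in y; a value near 0 means the line explains almost none of it.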