The partial least squares method builds on the least squares principle, a mathematical optimization technique that seeks the best functional fit to a set of data by minimizing the sum of squared errors. Unknown quantities can be estimated in a simple way by choosing the values that minimize the sum of squared errors between the fitted and observed data. Many other optimization problems can also be expressed in least-squares form, for example by minimizing an energy or maximizing entropy.
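As a minimal sketch of the least-squares idea, the example below fits a straight line to a few points by minimizing the sum of squared errors; the data and variable names are hypothetical, and NumPy's `lstsq` is used purely for convenience.

```python
import numpy as np

# Hypothetical data: observations y measured at points x
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

# Design matrix for the line y ≈ a*x + b
A = np.column_stack([x, np.ones_like(x)])

# Least squares: choose (a, b) minimizing sum((A @ [a, b] - y)**2)
(a, b), *_ = np.linalg.lstsq(A, y, rcond=None)

print(f"slope a = {a:.3f}, intercept b = {b:.3f}")
print("sum of squared errors:", np.sum((A @ np.array([a, b]) - y) ** 2))
```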
Additional information
Compared with a traditional multiple linear regression model, partial least squares regression has the following characteristics (a brief code sketch follows the list):
(1) It can be used for regression modeling when the independent variables are strongly multicollinear;
(2) It allows regression modeling when the number of sample points is smaller than the number of variables;
(3) The final partial least squares regression model retains all of the original independent variables;
(4) A partial least squares regression model makes it easier to separate system information from noise (including some non-random noise);
(5) In a partial least squares regression model, the regression coefficient of each independent variable is easier to interpret.
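As a hedged illustration of points (1) and (2), the sketch below fits a partial least squares regression with scikit-learn's `PLSRegression` on made-up data in which the predictors are strongly correlated and the number of variables exceeds the number of samples; the data, component count, and variable names are assumptions for illustration, not taken from the source.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Hypothetical data: 10 samples, 20 strongly correlated predictors
n_samples, n_features = 10, 20
latent = rng.normal(size=(n_samples, 2))  # two hidden factors drive all predictors
X = latent @ rng.normal(size=(2, n_features)) + 0.01 * rng.normal(size=(n_samples, n_features))
y = latent @ np.array([1.5, -2.0]) + 0.1 * rng.normal(size=n_samples)

# PLS extracts a small number of latent components, so it copes with
# multicollinearity and with n_samples < n_features.
pls = PLSRegression(n_components=2)
pls.fit(X, y)

print("R^2 on the training data:", pls.score(X, y))
print("coefficient array shape (one entry per original variable):", pls.coef_.shape)
```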
When calculating the variance and covariance, there are two common choices for the coefficient in front of the summation sign: 1/(n-1) when the sample points are drawn randomly from a larger population, and 1/n when the data are not treated as a random sample.
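The difference between the two normalizations can be seen directly with NumPy's `ddof` argument; the numbers below are made up for illustration.

```python
import numpy as np

x = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])  # hypothetical sample
n = len(x)

# Coefficient 1/n (data treated as the whole population)
var_population = np.var(x, ddof=0)   # divides by n

# Coefficient 1/(n-1) (data treated as a random sample)
var_sample = np.var(x, ddof=1)       # divides by n - 1

print(var_population, np.sum((x - x.mean()) ** 2) / n)
print(var_sample, np.sum((x - x.mean()) ** 2) / (n - 1))
```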