1. Gradient method: The gradient method is a commonly used optimization algorithm for finding a local maximum or minimum of a function. It computes the gradient (the vector of partial derivatives), which points in the direction of steepest ascent, and then iteratively updates the point along that direction (to maximize) or against it (to minimize) until it converges to a local maximum or minimum.
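As a minimal sketch of the idea, the following gradient descent routine minimizes a toy quadratic; the function, step size, and iteration count are illustrative choices, not part of any particular library's API:

```python
import numpy as np

# Toy objective f(x, y) = (x - 1)^2 + (y + 2)^2, minimum at (1, -2).
def f(p):
    x, y = p
    return (x - 1) ** 2 + (y + 2) ** 2

# Its gradient, computed analytically.
def grad_f(p):
    x, y = p
    return np.array([2 * (x - 1), 2 * (y + 2)])

def gradient_descent(p0, lr=0.1, steps=200):
    p = np.asarray(p0, dtype=float)
    for _ in range(steps):
        p -= lr * grad_f(p)  # step against the gradient to decrease f
    return p

p_min = gradient_descent([5.0, 5.0])
print(p_min)  # converges near [1, -2]
```

To search for a maximum instead, one would step along the gradient (`p += lr * grad_f(p)`) on a function that has a maximum.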
2. Convex optimization method: If a function is convex, any local minimum is also a global minimum, so the global optimum can be determined reliably by solving a convex optimization problem. Convex optimization methods include linear programming and quadratic programming; they exploit the convexity of the objective and the feasible region to simplify the solution process.
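Linear programming is the simplest convex case. A small sketch using SciPy's `linprog` (the objective and constraints below are made up for illustration; `linprog` minimizes, so the objective is negated to maximize):

```python
from scipy.optimize import linprog

# Illustrative linear program: maximize x + 2y subject to
#   x + y <= 4,  x <= 2,  x >= 0,  y >= 0.
res = linprog(c=[-1, -2],               # negate to turn max into min
              A_ub=[[1, 1], [1, 0]],    # left-hand sides of <= constraints
              b_ub=[4, 2],              # right-hand sides
              bounds=[(0, None), (0, None)])
print(res.x, -res.fun)  # optimum at x=0, y=4 with value 8
```

Because the problem is convex, the reported solution is the global optimum, not merely a local one.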
3. Constrained optimization method: For optimization problems with constraints, constrained optimization methods can be used to determine the maximum and minimum values. These methods incorporate the constraints into the problem, for example via penalty terms or the Karush-Kuhn-Tucker (KKT) conditions, and use appropriate algorithms to solve the resulting constrained problem.
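A short sketch with SciPy's `minimize` and the SLSQP solver, which supports equality and inequality constraints (the objective and constraint here are a hypothetical example):

```python
from scipy.optimize import minimize

# Minimize (x - 2)^2 + (y - 1)^2 subject to x + y = 1.
objective = lambda p: (p[0] - 2) ** 2 + (p[1] - 1) ** 2
constraint = {"type": "eq", "fun": lambda p: p[0] + p[1] - 1}

res = minimize(objective, x0=[0.0, 0.0], method="SLSQP",
               constraints=[constraint])
print(res.x)  # approximately [1, 0]
```

The solver searches only along the feasible line x + y = 1 and returns the closest point on it to (2, 1).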
4. Numerical optimization method: For complex multivariate functions, an analytical solution may not exist, so numerical optimization methods can be used to approximate the maximum and minimum values. Numerical optimization methods include the Nelder-Mead simplex method, genetic algorithms, and simulated annealing. They approximate the maximum or minimum of the function by iterative search.
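As one example of derivative-free iterative search, the Nelder-Mead simplex method in SciPy applied to the Rosenbrock function (a standard test function with its minimum at (1, 1); the starting point and tolerances are illustrative):

```python
from scipy.optimize import minimize

# Rosenbrock function: hard for naive search, minimum at (1, 1).
rosenbrock = lambda p: (1 - p[0]) ** 2 + 100 * (p[1] - p[0] ** 2) ** 2

# Nelder-Mead needs no gradients; it only evaluates the function.
res = minimize(rosenbrock, x0=[-1.0, 2.0], method="Nelder-Mead",
               options={"xatol": 1e-8, "fatol": 1e-8})
print(res.x)  # approximately [1, 1]
```

Genetic algorithms and simulated annealing follow the same pattern of repeated function evaluation, but use randomized moves to escape local optima.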
5. Lagrange multiplier method: The Lagrange multiplier method is a common technique for solving constrained optimization problems. It converts a constrained problem into an unconstrained one by introducing the Lagrangian function, which combines the objective with the constraints weighted by multipliers, and then determines the maximum and minimum values by finding the stationary points of the Lagrangian.
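A symbolic sketch with SymPy, on a made-up toy problem: extremize f(x, y) = xy subject to x + y = 10. Setting every partial derivative of the Lagrangian to zero recovers the stationary point:

```python
import sympy as sp

x, y, lam = sp.symbols("x y lam")
f = x * y            # objective
g = x + y - 10       # constraint g = 0

L = f - lam * g      # the Lagrangian

# Stationary point: all partial derivatives of L vanish.
sols = sp.solve([sp.diff(L, v) for v in (x, y, lam)], [x, y, lam], dict=True)
print(sols)  # x = 5, y = 5, lam = 5
```

Here the single stationary point x = y = 5 is the constrained maximum of xy; in general one must still classify each stationary point as a maximum, minimum, or saddle.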
It should be noted that a multivariate function may have several local maxima or minima, and iterative methods may converge to different ones depending on the choice of initial point and on the constraints of the problem. Therefore, different starting points and the range of possible outcomes should be considered when determining the maximum and minimum values.