The first is linear algebra. In neural networks, most of the computation consists of matrix multiplications, which require a working knowledge of linear algebra. The inner product is used to compute the cosine similarity between vectors, and matrix decompositions appear in techniques such as principal component analysis (PCA) and singular value decomposition (SVD).
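As a minimal sketch of the inner-product idea mentioned above, the following computes cosine similarity from scratch (the function names are illustrative, not from any particular library):

```python
import math

def dot(u, v):
    # Inner product of two equal-length vectors: sum of elementwise products.
    return sum(a * b for a, b in zip(u, v))

def cosine_similarity(u, v):
    # cos(theta) = <u, v> / (||u|| * ||v||), where ||u|| = sqrt(<u, u>).
    return dot(u, v) / (math.sqrt(dot(u, u)) * math.sqrt(dot(v, v)))

print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # same direction -> 1.0
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # orthogonal -> 0.0
```

In practice one would use a library such as NumPy for this, but the definition is exactly the inner-product formula above.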
Next is probability theory and statistics. Broadly speaking, the core of machine learning is statistical inference, and many giants of the field, such as Michael I. Jordan, Yann LeCun, and Geoffrey Hinton, have deep roots in statistics. In addition, Bayes' formula and hidden Markov models are widely used in machine learning.
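To make Bayes' formula concrete, here is a toy calculation (the numbers are made up for illustration): given a test's sensitivity, false-positive rate, and a disease's prior probability, it computes the posterior probability of disease given a positive result.

```python
# Bayes' formula: P(A | B) = P(B | A) * P(A) / P(B)
p_disease = 0.01             # prior P(disease), assumed for illustration
p_pos_given_disease = 0.99   # sensitivity P(positive | disease)
p_pos_given_healthy = 0.05   # false-positive rate P(positive | healthy)

# Law of total probability: P(positive) over both causes.
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Posterior: probability of disease given a positive test.
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 4))  # -> 0.1667
```

Despite the test's high sensitivity, the low prior keeps the posterior around 17% — a classic illustration of why the prior term in Bayes' formula matters.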
Finally, calculus. This is one of the core pieces of knowledge in machine learning: it is needed to compute the gradient in gradient descent and to derive how errors propagate in backpropagation.
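A minimal sketch of gradient descent on a one-dimensional function: the derivative of f(x) = x² is 2x, and repeatedly stepping against the gradient drives x toward the minimum at 0 (the step size and iteration count here are illustrative choices).

```python
def grad(x):
    # Derivative of f(x) = x**2, obtained by basic calculus.
    return 2 * x

x = 5.0            # starting point (arbitrary)
learning_rate = 0.1
for _ in range(100):
    # Move against the gradient to decrease f.
    x -= learning_rate * grad(x)

print(x)  # close to 0, the minimizer of f
```

Each step multiplies x by (1 - 2 * learning_rate) = 0.8, so the iterate shrinks geometrically toward the minimum; backpropagation applies the same idea, using the chain rule to get gradients through many layers.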