Linear Algebra
Linear algebra is the branch of mathematics concerning linear transformations over vector spaces, and it is used throughout science and engineering to model natural phenomena. Even nonlinear equations are most often approximated by linear ones for computational purposes. It could be claimed that the primary design pattern of scientific computing is to reduce a problem to linear algebra computations.
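As a minimal sketch of this reduction (the particular nonlinear system and the use of NumPy are illustrative assumptions, not something the section prescribes), consider Newton's method: each iteration replaces the nonlinear system with its linearization and then solves an ordinary linear system.

```python
import numpy as np

def f(x):
    # A small made-up nonlinear system: x0^2 + x1^2 = 1 and x1 = x0^2
    return np.array([x[0]**2 + x[1]**2 - 1.0, x[1] - x[0]**2])

def jacobian(x):
    # Analytic Jacobian of f at x
    return np.array([[2 * x[0], 2 * x[1]],
                     [-2 * x[0], 1.0]])

x = np.array([1.0, 1.0])  # initial guess
for _ in range(10):
    # Newton step: linearize f around x and solve J(x) * delta = -f(x)
    delta = np.linalg.solve(jacobian(x), -f(x))
    x = x + delta

print("approximate root:", x, "residual:", f(x))
```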
The concept of a vector (and, more generally, a tensor) is at the heart of Machine Learning computations. Essentially, data fields are represented as tensors, and a Machine Learning algorithm amounts to linear algebra computations over these tensors. A strong background in Linear Algebra helps data practitioners understand the inner workings of ML algorithms, formulate extensions to existing algorithms, and create new ones. Along with Calculus, Probability Theory and Optimization Theory, Linear Algebra lies at the mathematical foundations of Machine Learning.
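To make this concrete, here is a minimal sketch (NumPy and the toy data are assumptions made only for illustration): fitting a linear regression model is nothing more than a least-squares computation over two tensors, the feature matrix and the target vector.

```python
import numpy as np

# Toy data: each row is a sample, each column a feature (a rank-2 tensor).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                 # 100 samples, 3 features
true_w = np.array([2.0, -1.0, 0.5])           # hypothetical "true" weights
y = X @ true_w + 0.1 * rng.normal(size=100)   # targets with a little noise

# Fitting the model is a least-squares problem over these tensors:
# find w minimizing ||X w - y||_2, solved with a standard linear algebra routine.
w_hat, residuals, rank, singular_values = np.linalg.lstsq(X, y, rcond=None)
print("estimated weights:", w_hat)
```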
Sample Topics
- The concepts of linearity and non-linearity
- Vector norms, dot products, various distance metrics
- Quadratic forms
- Matrix computations
- Linear independence, span, basis and rank
- Systems of linear equations, Gaussian elimination, matrix inversion
- Eigenvectors, eigenvalues
- Matrix factorizations, SVD, PCA, ICA, LSA, NMF (see the sketch after this list)
- Least squares problems and regularization
- Special matrices in Machine Learning
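As one small illustration of how the factorization topics above show up in practice (again a hedged sketch in NumPy with made-up data, not a prescribed implementation), a PCA can be computed directly from the SVD of a centered data matrix:

```python
import numpy as np

# Small data matrix: rows are samples, columns are features.
rng = np.random.default_rng(1)
A = rng.normal(size=(50, 4))
A_centered = A - A.mean(axis=0)          # center each feature (column)

# Thin SVD: A_centered = U @ diag(S) @ Vt
U, S, Vt = np.linalg.svd(A_centered, full_matrices=False)

# PCA via SVD: rows of Vt are the principal directions,
# and S**2 / (n - 1) are the variances explained by each direction.
n = A_centered.shape[0]
explained_variance = S**2 / (n - 1)
scores = A_centered @ Vt[:2].T           # project onto the first two components

print("principal directions:\n", Vt[:2])
print("explained variance:", explained_variance[:2])
```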