
Ian Stewart’s book “17 Equations that Changed the World” catalogs the equations below; here is how each one connects to machine learning:
- Pythagorean Theorem: Best known for its geometric uses, it also underpins distance-based machine learning techniques such as k-nearest neighbors, where Euclidean distance is a direct application of it.
- Logarithms: Used in feature engineering and normalization to rescale skewed data so that extreme values do not dominate a model.
- Derivatives: The core of optimization algorithms such as gradient descent, which train machine learning models by repeatedly adjusting parameters in the direction that reduces error.
- The Law of Gravity: Although it seems unrelated, gravitational physics has inspired optimization methods such as the gravitational search algorithm.
- Imaginary Numbers: The basis of complex algebra, which enters machine learning through signal processing and data analysis.
- Euler’s Formula: A remarkable equation tying together complex exponentials, trigonometric functions, and calculus; it provides insight into the mathematics behind neural networks.
- Normal Distribution: A cornerstone of statistics and probability, and therefore a fundamental idea in machine learning.
- The Wave Equation: Helpful for treating data as waves; it appears in signal processing and image analysis.
- Fourier Transform: A key tool for feature extraction in image and signal processing, a core machine learning task.
- The Navier-Stokes Equations: Central to fluid dynamics simulations and used in AI-assisted computational fluid dynamics, for example in aerodynamics.
- Maxwell’s Equations: Essential to electromagnetic theory, and the foundation of the imaging and sensing technologies that feed machine learning image processing.
- The Second Law of Thermodynamics: More obliquely related, but its notion of entropy informs how machine learning models and training procedures are analyzed and improved.
- Relativity: Einstein’s theories make GPS possible, and GPS in turn underpins location-based machine learning applications.
- Schrödinger’s Equation: The key equation of quantum mechanics, and the starting point for quantum machine learning.
- Information Theory: Entropy and information gain, which are essential to decision tree algorithms, are based on Shannon’s work.
- Chaos Theory: The randomness and unpredictability of chaotic systems have applications in machine learning, for example in modeling nonlinear dynamical systems.
- Black-Scholes Equation: Mainly employed in finance, but also useful in risk modeling, a crucial component of machine learning for trading.
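
A few of these connections translate directly into code. As a minimal sketch of the Pythagorean theorem at work in k-nearest neighbors (the function names here are illustrative, not from any particular library), Euclidean distance in n dimensions is just the theorem generalized:

```python
import math

def euclidean_distance(a, b):
    # Pythagorean theorem generalized to n dimensions:
    # d = sqrt(sum_i (a_i - b_i)^2)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(train, query, k=3):
    # train: list of (point, label) pairs. Classify query by majority
    # vote among its k nearest neighbors under Euclidean distance.
    neighbors = sorted(train, key=lambda pair: euclidean_distance(pair[0], query))[:k]
    labels = [label for _, label in neighbors]
    return max(set(labels), key=labels.count)
```

For a right triangle with legs 3 and 4, `euclidean_distance((0, 0), (3, 4))` returns 5.0, the familiar hypotenuse.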
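
The logarithm bullet is easy to make concrete. A common feature-engineering trick (sketched here with a hypothetical helper name) is a log transform that compresses heavy-tailed positive features:

```python
import math

def log_scale(values):
    # log1p(x) = log(1 + x): keeps zero at zero and shrinks the gap
    # between moderate and extreme values, so outliers no longer
    # dominate distance computations or gradient magnitudes.
    return [math.log1p(v) for v in values]
```

Applied to `[0, 9, 99]`, the spread of three orders of magnitude collapses to roughly `[0, 2.3, 4.6]`.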
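
The Fourier transform bullet can be illustrated with the discrete Fourier transform, written out directly from its definition (real systems use the FFT, e.g. `numpy.fft`, but this naive version shows the math):

```python
import cmath

def dft(signal):
    # Discrete Fourier transform:
    # X_k = sum_n x_n * exp(-2*pi*i * k * n / N)
    n = len(signal)
    return [
        sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
            for i, x in enumerate(signal))
        for k in range(n)
    ]
```

A constant signal like `[1, 1, 1, 1]` has all its energy in the zero-frequency bin, which is exactly what feature extraction exploits: frequency coefficients summarize structure that is hard to see in raw samples.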
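
Shannon's entropy and information gain, mentioned under information theory, are what a decision tree computes when choosing a split. A compact sketch (illustrative helper names, not a library API):

```python
import math
from collections import Counter

def entropy(labels):
    # Shannon entropy in bits: H = -sum_c p_c * log2(p_c)
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(labels, split_groups):
    # Gain = parent entropy minus the size-weighted entropy of the children.
    n = len(labels)
    return entropy(labels) - sum(len(g) / n * entropy(g) for g in split_groups)
```

A split that separates `['a', 'a', 'b', 'b']` into pure groups `['a', 'a']` and `['b', 'b']` has a gain of 1 bit, the maximum possible here, which is why a tree would choose it.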
