
Mathematics For Machine Learning 스터디 노트

by 별준, July 15, 2022

Home - MML Study Note (junstar92.github.io)
While studying PRML (Pattern Recognition and Machine Learning) recently — a book with a reputation for sending readers off to brush up on their math first — I realized I genuinely needed to do exactly that. So I have been working through MML (Mathematics for Machine Learning) to revisit the linear algebra, probability, and other mathematics that underpin machine learning and deep learning. I originally planned to post my notes on this blog, but managing them on GitHub turned out to be more convenient, so that is where the study notes live. Links to each chapter are listed below.


Ch1. Introduction

  • 0. Intro (link)
  • 1. Finding Words for Intuitions (link)

Ch2. Linear Algebra

  • 0. Intro (link)
  • 1. Systems of Linear Equations (link)
  • 2. Matrices (link)
  • 3. Solving Systems of Linear Equations (link)
  • 4. Vector Spaces (link)
  • 5. Linear Independence (link)
  • 6. Basis and Rank (link)
  • 7. Linear Mappings (link)
  • 8. Affine Spaces (link)

Ch3. Analytic Geometry

  • 0. Intro (link)
  • 1. Norms (link)
  • 2. Inner Products (link)
  • 3. Lengths and Distances (link)
  • 4. Angles and Orthogonality (link)
  • 5. Orthonormal Basis (link)
  • 6. Orthogonal Complement (link)
  • 7. Inner Product of Functions (link)
  • 8. Orthogonal Projections (link)
  • 9. Rotations (link)

Ch4. Matrix Decompositions

  • 0. Intro (link)
  • 1. Determinant and Trace (link)
  • 2. Eigenvalues and Eigenvectors (link)
  • 3. Cholesky Decomposition (link)
  • 4. Eigendecomposition and Diagonalization (link)
  • 5. Singular Value Decomposition (link)
  • 6. Matrix Approximation (link)
  • 7. Matrix Phylogeny (link)

Ch5. Vector Calculus

  • 0. Intro (link)
  • 1. Differentiation of Univariate Functions (link)
  • 2. Partial Differentiation and Gradients (link)
  • 3. Gradients of Vector-Valued Functions (link)
  • 4. Gradients of Matrices (link)
  • 5. Useful Identities for Computing Gradients (link)
  • 6. Backpropagation and Automatic Differentiation (link)
  • 7. Higher-order Derivatives (link)
  • 8. Linearization and Multivariate Taylor Series (link)

Ch6. Probability and Distributions

  • 0. Intro (link)
  • 1. Construction of a Probability Space (link)
  • 2. Discrete and Continuous Probabilities (link)
  • 3. Sum Rule, Product Rule, and Bayes' Theorem (link)
  • 4. Summary Statistics and Independence (link)
  • 5. Gaussian Distribution (link)
  • 6. Conjugacy and the Exponential Family (link)
  • 7. Change of Variables/Inverse Transform (link)

Ch7. Continuous Optimization

  • 0. Intro (link)
  • 1. Optimization Using Gradient Descent (link)
  • 2. Constrained Optimization and Lagrange Multipliers (link)
  • 3. Convex Optimization (link)

Ch8. When Models Meet Data

  • 0. Intro (link)
  • 1. Data, Models, and Learning (link)
  • 2. Empirical Risk Minimization (link)
  • 3. Parameter Estimation (link)
  • 4. Probabilistic Modeling and Inference (link)
  • 5. Directed Graphical Models (link)
  • 6. Model Selection (link)

Ch9. Linear Regression

  • 0. Intro (link)
  • 1. Problem Formulation (link)
  • 2. Parameter Estimation (link)
  • 3. Bayesian Linear Regression (link)
  • 4. Maximum Likelihood as Orthogonal Projection (link)

Ch10. Dimensionality Reduction with Principal Component Analysis

todo

Ch11. Density Estimation with Gaussian Mixture Models

todo

Ch12. Classification with Support Vector Machines

todo

 


References

  • SaVAnNa Lab (link)
  • 공돌이의 수학정리노트 (link)
  • Friedberg Linear Algebra YouTube lectures (link)
  • Essence of Linear Algebra - 3Blue1Brown (link)
