AI/Book Review - Mathematics for Machine Learning (2)

11. Density Estimation
Chap. 9: Regression. Chap. 10: Dimensionality Reduction. Chap. 11: Density Estimation.
1. Expectation Maximization Algorithm
2. Latent Variable Perspective of density estimation with Mixture Models
When is it needed? For huge datasets, to represent their characteristics with a parametric distribution (e.g. a Gaussian or Beta distribution).
p_k: the mixture components, for example Gaussians, Bernoullis, or Gammas.
pi_k: the mixture weights.
11.1 Gaussian Mixture Model / 11.2 Parameter ..
2020. 1. 27.

Chapter 10 - 1/2
10. Dimensionality Reduction with Principal Component Analysis
Many dimensions are redundant and can be explained by a combination of other dimensions. Principal component analysis (PCA) is an algorithm for linear dimensionality reduction. PCA draws on basis / basis change, projection, eigenvalues, the Gaussian distribution, and constrained optimization.
10.1 Problem setting: we are interested in findin..
2020. 1. 20.
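For reference, the symbols p_k and pi_k quoted in the Chapter 11 entry above fit the standard mixture-model density; this is only a minimal sketch in the book's usual notation, and K (the number of components) does not appear in the excerpt itself:

```latex
% Mixture density built from K components p_k weighted by pi_k.
% The components p_k can be Gaussians, Bernoullis, Gammas, etc.
p(\boldsymbol{x}) = \sum_{k=1}^{K} \pi_k \, p_k(\boldsymbol{x}),
\qquad 0 \leq \pi_k \leq 1, \qquad \sum_{k=1}^{K} \pi_k = 1
```

For the Gaussian mixture model of Section 11.1, each component takes the form p_k(x) = N(x | mu_k, Sigma_k), and the EM algorithm mentioned in the excerpt is one way to fit the parameters.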
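Likewise, the "basis change / projection / eigenvalues" keywords in the Chapter 10 entry correspond to the usual PCA projection; the sketch below assumes standard notation (data covariance S, eigenvector matrix B, code z_n), none of which is spelled out in the excerpt:

```latex
% B = [b_1, ..., b_M] collects the M eigenvectors of the data covariance
% matrix S with the largest eigenvalues (an orthonormal basis, B^T B = I).
z_n = B^\top \boldsymbol{x}_n \quad \text{(low-dimensional code)},
\qquad
\tilde{\boldsymbol{x}}_n = B z_n = B B^\top \boldsymbol{x}_n \quad \text{(projection back into } \mathbb{R}^D\text{)}
```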