[Machine Learning] 4 Gaussian models

Chapter contents

4 Gaussian models

4.1 Introduction

4.1.1 Notation

4.1.2 Basics

4.1.3 MLE for an MVN

4.1.4 Maximum entropy derivation of the Gaussian *

4.2 Gaussian discriminant analysis

4.2.1 Quadratic discriminant analysis (QDA)

4.2.2 Linear discriminant analysis (LDA)

4.2.3 Two-class LDA

4.2.4 MLE for discriminant analysis

4.2.5 Strategies for preventing overfitting

4.2.6 Regularized LDA *

4.2.7 Diagonal LDA

4.2.8 Nearest shrunken centroids classifier *

4.3 Inference in jointly Gaussian distributions

4.3.1 Statement of the result

4.3.2 Examples

4.3.3 Information form

4.3.4 Proof of the result *

4.4 Linear Gaussian systems

4.4.1 Statement of the result

4.4.2 Examples

4.4.3 Proof of the result *

4.5 Digression: The Wishart distribution *

4.5.1 Inverse Wishart distribution

4.5.2 Visualizing the Wishart distribution *

4.6 Inferring the parameters of an MVN

4.6.1 Posterior distribution of μ

4.6.2 Posterior distribution of Σ *

4.6.3 Posterior distribution of μ and Σ *

4.6.4 Sensor fusion with unknown precisions *

GitHub download link: https://github.com/916718212/Machine-Learning-A-Probabilistic-Perspective-.git
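As a small illustration of one of the topics listed above (§4.1.3, MLE for an MVN): the maximum-likelihood estimates are the sample mean and the scatter matrix of the centered samples scaled by 1/N. Below is a minimal sketch using numpy; the function and variable names are my own and are not taken from the book or the linked repository.

```python
import numpy as np

def mvn_mle(X):
    """MLE of a multivariate normal fitted to samples X of shape (N, D).

    mu_hat    = (1/N) * sum_i x_i
    Sigma_hat = (1/N) * sum_i (x_i - mu_hat)(x_i - mu_hat)^T
    """
    N = X.shape[0]
    mu_hat = X.mean(axis=0)
    centered = X - mu_hat
    Sigma_hat = centered.T @ centered / N  # divide by N (MLE), not N - 1 (unbiased)
    return mu_hat, Sigma_hat
```

The divisor N (rather than N − 1) is what makes Sigma_hat the maximum-likelihood estimate instead of the unbiased sample covariance.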