Table of Contents

Chapter 1  Basis of Matrix Calculation
  1.1 Fundamental Concepts
    1.1.1 Notation
    1.1.2 “Bigger-Block” Interpretations of Matrix Multiplication
    1.1.3 Fundamental Linear Algebra
    1.1.4 Four Fundamental Subspaces of a Matrix
    1.1.5 Vector Norms
    1.1.6 Determinants
    1.1.7 Properties of Determinants
  1.2 The Most Basic Matrix Decomposition
    1.2.1 Gaussian Elimination
    1.2.2 The LU Decomposition
    1.2.3 The LDM Factorization
    1.2.4 The LDL Decomposition for Symmetric Matrices
    1.2.5 Cholesky Decomposition
    1.2.6 Applications and Examples of the Cholesky Decomposition
    1.2.7 Eigendecomposition
    1.2.8 Matrix Norms
    1.2.9 Covariance Matrices
  1.3 Singular Value Decomposition (SVD)
    1.3.1 Orthogonalization
    1.3.2 Existence Proof of the SVD
    1.3.3 Partitioning the SVD
    1.3.4 Properties and Interpretations of the SVD
    1.3.5 Relationship between SVD and ED
    1.3.6 Ellipsoidal Interpretation of the SVD
    1.3.7 An Interesting Theorem
  1.4 The Quadratic Form
    1.4.1 Quadratic Form Theory
    1.4.2 The Gaussian Multivariate Probability Density Function
    1.4.3 The Rayleigh Quotient

Chapter 2  The Solution of Least Squares Problems
  2.1 Linear Least Squares Estimation
    2.1.1 Example: Autoregressive Modelling
    2.1.2 The Least-Squares Solution
    2.1.3 Interpretation of the Normal Equations
    2.1.4 Properties of the LS Estimate
    2.1.5 Linear Least-Squares Estimation and the Cramér-Rao Lower Bound
  2.2 A Generalized “Pseudo-Inverse” Approach to Solving the Least-Squares Problem
    2.2.1 Least-Squares Solution Using the SVD
    2.2.2 Interpretation of the Pseudo-Inverse

Chapter 3  Principal Component Analysis
  3.1 Introductory Example
  3.2 Theory
    3.2.1 Taking Linear Combinations
    3.2.2 Explained Variation
    3.2.3 PCA as a Model
    3.2.4 Taking More Components
  3.3 History of PCA
  3.4 Practical Aspects