Chapter 1  Introduction
1.1  Feature Extraction
1.1.1  PCA and Subspace Tracking
1.1.2  PCA Neural Networks
1.1.3  Extension or Generalization of PCA
1.2  Basis for Subspace Tracking
1.2.1  Concept of Subspace
1.2.2  Subspace Tracking Method
1.3  Main Features of This Book
1.4  Organization of This Book
References

Chapter 2  Matrix Analysis Basics
2.1  Introduction
2.2  Singular Value Decomposition
2.2.1  Theorem and Uniqueness of SVD
2.2.2  Properties of SVD
2.3  Eigenvalue Decomposition
2.3.1  Eigenvalue Problem and Eigen Equation
2.3.2  Eigenvalue and Eigenvector
2.3.3  Eigenvalue Decomposition of Hermitian Matrix
2.3.4  Generalized Eigenvalue Decomposition
2.4  Rayleigh Quotient and Its Characteristics
2.4.1  Rayleigh Quotient
2.4.2  Gradient and Conjugate Gradient Algorithm for RQ
2.4.3  Generalized Rayleigh Quotient
2.5  Matrix Analysis
2.5.1  Differential and Integral of Matrix with Respect to Scalar
2.5.2  Gradient of Real Function with Respect to Real Vector
2.5.3  Gradient Matrix of Real Function
2.5.4  Gradient Matrix of Trace Function
2.5.5  Gradient Matrix of Determinant
2.5.6  Hessian Matrix
2.6  Summary
References

Chapter 3  Neural Networks for Principal Component Analysis
3.1  Introduction
3.2  Review of Neural-Based PCA Algorithms
3.3  Neural-Based PCA Algorithms' Foundation
3.3.1  Hebbian Learning Rule
3.3.2  Oja's Learning Rule
3.4  Hebbian/Anti-Hebbian Rule-Based Principal Component Analysis
3.4.1  Subspace Learning Algorithms
3.4.2  Generalized Hebbian Algorithm
3.4.3  Learning Machine for Adaptive Feature Extraction via PCA
3.4.4  The Dot-Product-Decorrelation Algorithm
3.4.5  Anti-Hebbian Rule-Based Principal Component Analysis
3.5  Least Mean Squared Error-Based Principal Component Analysis
3.5.1  Least Mean Square Error Reconstruction Algorithm
3.5.2  Projection Approximation Subspace Tracking Algorithm
3.5.3  Robust RLS Algorithm
3.6  Optimization-Based Principal Component Analysis
3.6.1  Novel Information Criterion Algorithm
3.6.2  Coupled Principal Component Analysis
3.7  Nonlinear Principal Component Analysis
3.7.1  Kernel Principal Component Analysis
3.7.2  Robust/Nonlinear Principal Component Analysis
3.7.3  Autoassociative Network-Based Nonlinear PCA
3.8  Other PCA or Extensions of PCA
3.9  Summary
References

Chapter 4  Neural Networks for Minor Component Analysis
4.1  Introduction
4.2  Review of Neural Network-Based MCA Algorithms
4.2.1  Extracting the First Minor Component
4.2.2  Oja's Minor Subspace Analysis
4.2.3  Self-Stabilizing MCA
4.2.4  Orthogonal Oja Algorithm
4.2.5  Other MCA Algorithms
4.3  MCA EXIN Linear Neuron
4.3.1  The Sudden Divergence
4.3.2  The Instability Divergence
4.3.3  The Numerical Divergence
4.4  A Novel Self-Stabilizing MCA Linear Neuron
4.4.1  A Self-Stabilizing Algorithm for Tracking One MC
4.4.2  MS Tracking Algorithm
4.4.3  Computer Simulations
4.5  Total Least Squares Problem Application
4.5.1  A Novel Neural Algorithm for Total Least Squares Filtering
4.5.2  Computer Simulations
4.6  Summary
References

Chapter 5  Dual Purpose for Principal and Minor Component Analysis
5.1  Introduction
5.2  Review of Neural Network-Based Dual-Purpose Methods
5.2.1  Chen's Unified Stabilization Approach
5.2.2  Hasan's Self-Normalizing Dual Systems
5.2.3  Peng's Unified Learning Algorithm to Extract Principal and Minor Components
5.2.4  Manton's Dual-Purpose Principal and Minor Component Flow
5.3  A Novel Dual-Purpose Method for Principal and Minor Subspace Tracking
5.3.1  Preliminaries
5.3.2  A Novel Information Criterion and Its Landscape
5.3.3  Dual-Purpose Subspace Gradient Flow
5.3.4  Global Convergence Analysis
5.3.5  Numerical Simulations
5.4  Another Novel Dual-Purpose Algorithm for Principal and Minor Subspace Analysis
5.4.1  The Criterion for PSA and MSA and Its Landscape
5.4.2  Dual-Purpose Algorithm for PSA and MSA
5.4.3  Experimental Results
5.5  Summary
References

Chapter 6  Deterministic Discrete-Time System for the Analysis of Iterative Algorithms
6.1  Introduction
6.2  Review of Performance Analysis Methods for Neural Network-Based PCA Algorithms
6.2.1  Deterministic Continuous-Time System Method
6.2.2  Stochastic Discrete-Time System Method
6.2.3  Lyapunov Function Approach
6.2.4  Deterministic Discrete-Time System Method
6.3  DDT System of a Novel MCA Algorithm
6.3.1  Self-Stabilizing MCA Extraction Algorithms
6.3.2  Convergence Analysis via DDT System
6.3.3  Computer Simulations
6.4  DDT System of a Unified PCA and MCA Algorithm
6.4.1  Introduction
6.4.2  A Unified Self-Stabilizing Algorithm for PCA and MCA
6.4.3  Convergence Analysis
6.4.4  Computer Simulations
6.5  Summary
References

Chapter 7  Generalized Principal Component Analysis
7.1  Introduction
7.2  Review of Generalized Feature Extraction Algorithms
7.2.1  Mathew's Quasi-Newton Algorithm for the Generalized Symmetric Eigenvalue Problem
7.2.2  Self-Organizing Algorithms for Generalized Eigen Decomposition
7.2.3  Fast RLS-Like Algorithm for Generalized Eigen Decomposition
7.2.4  Generalized Eigenvector Extraction Algorithm Based on RLS Method
7.2.5  Fast Adaptive Algorithm for the Generalized Symmetric Eigenvalue Problem
7.2.6  Fast Generalized Eigenvector Tracking Based on the Power Method
7.2.7  Generalized Eigenvector Extraction Algorithm Based on Newton Method
7.2.8  Online Algorithms for Extracting Minor Generalized Eigenvectors
7.3  A Novel Principal Generalized Eigenvector Extraction Algorithm
7.3.1  Algorithm Description
7.3.2  Convergence Analysis
7.3.3  Computer Simulations
7.4  Novel Multiple GMC Extraction Algorithm
7.4.1  A Weighted Information Criterion
7.4.2  Multiple GMCs Extraction Algorithm
7.4.3  Simulations and Application Experiments
7.5  Summary
References

Chapter 8  Coupled Principal Component Analysis
8.1  Introduction
8.2  Review of Coupled Principal Component Analysis
8.2.1  Möller's Coupled PCA Algorithm
8.2.2  Nguyen's Coupled Generalized Eigen-Pairs Extraction Algorithm
8.2.3  Coupled Singular Value Decomposition of a Cross-Covariance Matrix
8.3  Unified and Coupled Algorithm for Minor and Principal Eigen-Pair Extraction
8.3.1  Coupled Dynamical System
8.3.2  The Unified and Coupled Learning Algorithms
8.3.3  Analysis of Convergence and Self-Stabilizing Property
8.3.4  Simulation Experiments
8.4  Adaptive Coupled Generalized Eigen-Pairs Extraction Algorithms
8.4.1  A Coupled Generalized System for GMCA and GPCA
8.4.2  Adaptive Implementation of Coupled Generalized Systems
8.4.3  Convergence Analysis
8.4.4  Numerical Examples
8.5  Summary
References

Chapter 9  Singular Feature Extraction and Its Neural Networks
9.1  Introduction
9.2  Review of Cross-Correlation Feature Method
9.2.1  Cross-Correlation Neural Networks Model and Deflation Method
9.2.2  Parallel SVD Learning Algorithms on Double Stiefel Manifold
9.2.3  Double Generalized Hebbian Algorithm (DGHA) for SVD
9.2.4  Cross-Associative Neural Network for SVD (CANN)
9.2.5  Coupled SVD of a Cross-Covariance Matrix
9.3  An Effective Neural Learning Algorithm for Extracting Cross-Correlation Features
9.3.1  Preliminaries
9.3.2  Novel Information Criterion Formulation for PSS
9.3.3  Adaptive Learning Algorithm and Performance Analysis
9.3.4  Computer Simulations
9.4  Coupled Cross-Correlation Neural Network Algorithm for Principal Singular Triplet Extraction of a Cross-Covariance Matrix
9.4.1  A Novel Information Criterion and a Coupled System
9.4.2  Online Implementation and Stability Analysis
9.4.3  Simulation Experiments
9.5  Summary
References