Contents

Preface

1 Introduction
  Mathematical Formulation
  Example: A Transportation Problem
  Continuous versus Discrete Optimization
  Constrained and Unconstrained Optimization
  Global and Local Optimization
  Stochastic and Deterministic Optimization
  Optimization Algorithms
  Convexity
  Notes and References

2 Fundamentals of Unconstrained Optimization
  2.1 What Is a Solution?
    Recognizing a Local Minimum
    Nonsmooth Problems
  2.2 Overview of Algorithms
    Two Strategies: Line Search and Trust Region
    Search Directions for Line Search Methods
    Models for Trust-Region Methods
    Scaling
    Rates of Convergence
    R-Rates of Convergence
  Notes and References
  Exercises

3 Line Search Methods
  3.1 Step Length
    The Wolfe Conditions
    The Goldstein Conditions
    Sufficient Decrease and Backtracking
  3.2 Convergence of Line Search Methods
  3.3 Rate of Convergence
    Convergence Rate of Steepest Descent
    Quasi-Newton Methods
    Newton's Method
    Coordinate Descent Methods
  3.4 Step-Length Selection Algorithms
    Interpolation
    The Initial Step Length
    A Line Search Algorithm for the Wolfe Conditions
  Notes and References
  Exercises

4 Trust-Region Methods
  Outline of the Algorithm
  4.1 The Cauchy Point and Related Algorithms
    The Cauchy Point
    Improving on the Cauchy Point
    The Dogleg Method
    Two-Dimensional Subspace Minimization
    Steihaug's Approach
  4.2 Using Nearly Exact Solutions to the Subproblem
    Characterizing Exact Solutions
    Calculating Nearly Exact Solutions
    The Hard Case
    Proof of Theorem 4.3
  4.3 Global Convergence
    Reduction Obtained by the Cauchy Point
    Convergence to Stationary Points
    Convergence of Algorithms Based on Nearly Exact Solutions
  4.4 Other Enhancements
    Scaling
    Non-Euclidean Trust Regions
  Notes and References
  Exercises

5 Conjugate Gradient Methods
  5.1 The Linear Conjugate Gradient Method
    Conjugate Direction Methods
    Basic Properties of the Conjugate Gradient Method
    A Practical Form of the Conjugate Gradient Method
    Rate of Convergence
    Preconditioning
    Practical Preconditioners
  5.2 Nonlinear Conjugate Gradient Methods
    The Fletcher-Reeves Method
    The Polak-Ribière Method
    Quadratic Termination and Restarts
    Numerical Performance
    Behavior of the Fletcher-Reeves Method
    Global Convergence
  Notes and References
  Exercises

6 Practical Newton Methods
  6.1 Inexact Newton Steps
  6.2 Line Search Newton Methods
    Line Search Newton-CG Method
    Modified Newton's Method
  6.3 Hessian Modifications
    Eigenvalue Modification
    Adding a Multiple of the Identity
    Modified Cholesky Factorization
    Gershgorin Modification
    Modified Symmetric Indefinite Factorization
  6.4 Trust-Region Newton Methods
    Newton-Dogleg and Subspace-Minimization Methods
    Accurate Solution of the Trust-Region Problem
    Trust-Region Newton-CG Method
    Preconditioning the Newton-CG Method
    Local Convergence of Trust-Region Newton Methods
  Notes and References
  Exercises

7 Calculating Derivatives
  7.1 Finite-Difference Derivative Approximations
    Approximating the Gradient
    Approximating a Sparse Jacobian
    Approximating the Hessian
    Approximating a Sparse Hessian
  7.2 Automatic Differentiation
    An Example
    The Forward Mode
    The Reverse Mode
    Vector Functions and Partial Separability
    Calculating Jacobians of Vector Functions
    Calculating Hessians: Forward Mode
    Calculating Hessians: Reverse Mode
    Current Limitations
  Notes and References
  Exercises

8 Quasi-Newton Methods
  8.1 The BFGS Method
    Properties of the BFGS Method
    Implementation
  8.2 The SR1 Method
    Properties of SR1 Updating
  8.3 The Broyden Class
    Properties of the Broyden Class
  8.4 Convergence Analysis
    Global Convergence of the BFGS Method
    Superlinear Convergence of BFGS
    Convergence Analysis of the SR1 Method
  Notes and References
  Exercises

9 Large-Scale Quasi-Newton and Partially Separable Optimization
  9.1 Limited-Memory BFGS
    Relationship with Conjugate Gradient Methods
  9.2 General Limited-Memory Updating
    Compact Representation of BFGS Updating
    SR1 Matrices
    Unrolling the Update
  9.3 Sparse Quasi-Newton Updates
  9.4 Partially Separable Functions
    A Simple Example
    Internal Variables
  9.5 Invariant Subspaces and Partial Separability
    Sparsity vs. Partial Separability
    Group Partial Separability
  9.6 Algorithms for Partially Separable Functions
    Exploiting Partial Separability in Newton's Method
    Quasi-Newton Methods for Partially Separable Functions
  Notes and References
  Exercises

……

10 Nonlinear Least-Squares Problems
11 Nonlinear Equations
12 Theory of Constrained Optimization
13 Linear Programming: The Simplex Method
14 Linear Programming: Interior-Point Methods
15 Fundamentals of Algorithms for Nonlinear Constrained Optimization
16 Quadratic Programming
17 Penalty, Barrier, and Augmented Lagrangian Methods
18 Sequential Quadratic Programming

A Background Material

References

Index