Contents

Preface
Preface to the Second Edition

1  Introduction
   Mathematical Formulation
   Example: A Transportation Problem
   Continuous versus Discrete Optimization
   Constrained and Unconstrained Optimization
   Global and Local Optimization
   Stochastic and Deterministic Optimization
   Convexity
   Optimization Algorithms
   Notes and References

2  Fundamentals of Unconstrained Optimization
   2.1  What Is a Solution?
        Recognizing a Local Minimum
        Nonsmooth Problems
   2.2  Overview of Algorithms
        Two Strategies: Line Search and Trust Region
        Search Directions for Line Search Methods
        Models for Trust-Region Methods
        Scaling
   Exercises

3  Line Search Methods
   3.1  Step Length
        The Wolfe Conditions
        The Goldstein Conditions
        Sufficient Decrease and Backtracking
   3.2  Convergence of Line Search Methods
   3.3  Rate of Convergence
        Convergence Rate of Steepest Descent
        Newton's Method
        Quasi-Newton Methods
   3.4  Newton's Method with Hessian Modification
        Eigenvalue Modification
        Adding a Multiple of the Identity
        Modified Cholesky Factorization
        Modified Symmetric Indefinite Factorization
   3.5  Step-Length Selection Algorithms
        Interpolation
        Initial Step Length
        A Line Search Algorithm for the Wolfe Conditions
   Notes and References
   Exercises

4  Trust-Region Methods
   Outline of the Trust-Region Approach
   4.1  Algorithms Based on the Cauchy Point
        The Cauchy Point
        Improving on the Cauchy Point
        The Dogleg Method
        Two-Dimensional Subspace Minimization
   4.2  Global Convergence
        Reduction Obtained by the Cauchy Point
        Convergence to Stationary Points
   4.3  Iterative Solution of the Subproblem
        The Hard Case
        Proof of Theorem 4.1
        Convergence of Algorithms Based on Nearly Exact Solutions
   4.4  Local Convergence of Trust-Region Newton Methods
   4.5  Other Enhancements
        Scaling
        Trust Regions in Other Norms
   Notes and References
   Exercises

5  Conjugate Gradient Methods
   5.1  The Linear Conjugate Gradient Method
        Conjugate Direction Methods
        Basic Properties of the Conjugate Gradient Method
        A Practical Form of the Conjugate Gradient Method
        Rate of Convergence
        Preconditioning
        Practical Preconditioners
   5.2  Nonlinear Conjugate Gradient Methods
        The Fletcher-Reeves Method
        The Polak-Ribière Method and Variants
        Quadratic Termination and Restarts
        Behavior of the Fletcher-Reeves Method
        Global Convergence
        Numerical Performance
   Notes and References
   Exercises

6  Quasi-Newton Methods
   6.1  The BFGS Method
        Properties of the BFGS Method
        Implementation
   6.2  The SR1 Method
        Properties of SR1 Updating
   6.3  The Broyden Class
   6.4  Convergence Analysis
        Global Convergence of the BFGS Method
        Superlinear Convergence of the BFGS Method
        Convergence Analysis of the SR1 Method
   Notes and References
   Exercises

7  Large-Scale Unconstrained Optimization
   7.1  Inexact Newton Methods
        Local Convergence of Inexact Newton Methods
        Line Search Newton-CG Method
        Trust-Region Newton-CG Method
        Preconditioning the Trust-Region Newton-CG Method
        Trust-Region Newton-Lanczos Method
   7.2  Limited-Memory Quasi-Newton Methods
        Limited-Memory BFGS
        Relationship with Conjugate Gradient Methods
        General Limited-Memory Updating
        Compact Representation of BFGS Updating
        Unrolling the Update
   7.3  Sparse Quasi-Newton Updates
   7.4  Algorithms for Partially Separable Functions
   7.5  Perspectives and Software
   Notes and References
   Exercises

8  Calculating Derivatives
   8.1  Finite-Difference Derivative Approximations
        Approximating the Gradient
        Approximating a Sparse Jacobian
        Approximating the Hessian
        Approximating a Sparse Hessian
   8.2  Automatic Differentiation
        An Example
        The Forward Mode
        The Reverse Mode
        Vector Functions and Partial Separability
        Calculating Jacobians of Vector Functions
        Calculating Hessians: Forward Mode
        Calculating Hessians: Reverse Mode
        Current Limitations
   Notes and References
   Exercises

9  Derivative-Free Optimization
   9.1  Finite Differences and Noise
   9.2  Model-Based Methods
        Interpolation and Polynomial Bases
        Updating the Interpolation Set
        A Method Based on Minimum-Change Updating
   9.3  Coordinate and Pattern-Search Methods
        Coordinate Search Method
        Pattern-Search Methods
   9.4  A Conjugate-Direction Method
   9.5  Nelder-Mead Method
   9.6  Implicit Filtering
   Notes and References
   Exercises

10  Least-Squares Problems
   10.1  Background
   10.2  Linear Least-Squares Problems
   10.3  Algorithms for Nonlinear Least-Squares Problems
         The Gauss-Newton Method
         Convergence of the Gauss-Newton Method
         The Levenberg-Marquardt Method
         Implementation of the Levenberg-Marquardt Method
         Convergence of the Levenberg-Marquardt Method
         Methods for Large-Residual Problems
   10.4  Orthogonal Distance Regression
   Notes and References
   Exercises

11  Nonlinear Equations
   11.1  Local Algorithms
         Newton's Method for Nonlinear Equations
         Inexact Newton Methods
         Broyden's Method
         Tensor Methods
   11.2  Practical Methods
         Merit Functions
         Line Search Methods
         Trust-Region Methods
   11.3  Continuation/Homotopy Methods
         Motivation
         Practical Continuation Methods
   Notes and References
   Exercises

12  Theory of Constrained Optimization
   Local and Global Solutions
   Smoothness
   12.1  Examples
         A Single Equality Constraint
         A Single Inequality Constraint
         Two Inequality Constraints
   12.2  Tangent Cone and Constraint Qualifications
   12.3  First-Order Optimality Conditions
   12.4  First-Order Optimality Conditions: Proof
         Relating the Tangent Cone and the First-Order Feasible Direction Set
         A Fundamental Necessary Condition
         Farkas' Lemma
         Proof of Theorem 12.1
   12.5  Second-Order Conditions
         Second-Order Conditions and Projected Hessians
   12.6  Other Constraint Qualifications
   12.7  A Geometric Viewpoint
   12.8  Lagrange Multipliers and Sensitivity
   12.9  Duality
   Notes and References
   Exercises

13  Linear Programming: The Simplex Method
   Linear Programming
   13.1  Optimality and Duality
         Optimality Conditions
         The Dual Problem
   13.2  Geometry of the Feasible Set
         Bases and Basic Feasible Points
         Vertices of the Feasible Polytope
   13.3  The Simplex Method
         Outline
         A Single Step of the Method
   13.4  Linear Algebra in the Simplex Method
   13.5  Other Important Details
         Pricing and Selection of the Entering Index
         Starting the Simplex Method
         Degenerate Steps and Cycling
   13.6  The Dual Simplex Method
   13.7  Presolving
   13.8  Where Does the Simplex Method Fit?
   Notes and References
   Exercises

14  Linear Programming: Interior-Point Methods
   14.1  Primal-Dual Methods
         Outline
         The Central Path
         Central Path Neighborhoods and Path-Following Methods
   14.2  Practical Primal-Dual Algorithms
         Corrector and Centering Steps
         Step Lengths
         Starting Point
         A Practical Algorithm
         Solving Linear Systems
   14.3  Other Primal-Dual Algorithms and Extensions
         Other Path-Following Methods
         Potential-Reduction Methods
         Extensions
   14.4  Perspectives and Software
   Notes and References
   Exercises

15  Fundamentals of Algorithms for Nonlinear Constrained Optimization
   15.1  Categorizing Optimization Algorithms
   15.2  The Combinatorial Difficulty of Inequality-Constrained Problems
   15.3  Elimination of Variables
         Simple Elimination using Linear Constraints
         General Reduction Strategies for Linear Constraints
         Effect of Inequality Constraints
   15.4  Merit Functions and Filters
         Merit Functions
         Filters
   15.5  The Maratos Effect
   15.6  Second-Order Correction and Nonmonotone Techniques
         Nonmonotone (Watchdog) Strategy
   Notes and References
   Exercises

16  Quadratic Programming
   16.1  Equality-Constrained Quadratic Programs
         Properties of Equality-Constrained QPs
   16.2  Direct Solution of the KKT System
         Factoring the Full KKT System
         Schur-Complement Method
         Null-Space Method
   16.3  Iterative Solution of the KKT System
         CG Applied to the Reduced System
         The Projected CG Method
   16.4  Inequality-Constrained Problems
         Optimality Conditions for Inequality-Constrained Problems
         Degeneracy
   16.5  Active-Set Methods for Convex QPs
         Specification of the Active-Set Method for Convex QP
         Further Remarks on the Active-Set Method
         Finite Termination of Active-Set Algorithm on Strictly Convex QPs
         Updating Factorizations
   16.6  Interior-Point Methods
         Solving the Primal-Dual System
         Step Length Selection
         A Practical Primal-Dual Method
   16.7  The Gradient Projection Method
         Cauchy Point Computation
         Subspace Minimization
   16.8  Perspectives and Software
   Notes and References
   Exercises

17  Penalty and Augmented Lagrangian Methods