Chapter 1 Introduction to Control Theory
1.1 Historical Review and Classical Control Theory
1.2 Modern Control Theory
1.3 Design of Control Systems
1.4 Outline of This Book
Chapter 2 State-Space Description of Dynamic Systems
2.1 State-Space Representation of Dynamic Systems
2.2 Obtaining State Equations
2.3 Transfer Function Matrix and Realizations
2.4 State-Space Representation of Linear Discrete-Time Systems
2.5 Summaries
Exercises
Problems
Chapter 3 Dynamic Analysis of Linear Systems
3.1 Solution of LTI State Equations
3.2 Numerical Solution of State Equations
3.3 Solution of Linear Discrete-Time State Equations
3.4 Discretization of Continuous-Time Systems
Exercises
Problems
Chapter 4 Controllability and Observability
4.1 Preliminary: Cayley-Hamilton Theorem
4.2 Controllability and Observability of LTI Systems
4.3 Structural Decomposition of LTI Systems
4.4 Controllability, Observability and Transfer Function
4.5 Controllability and Observability of Discrete-Time Systems
Exercises
Problems
Chapter 5 Lyapunov Stability
5.1 Preliminary Examples
5.2 Stability Concepts
5.3 First Method of Lyapunov
5.4 Second Method of Lyapunov
5.5 Lyapunov Equation
Exercises
Problems
Chapter 6 State Feedback and State Observer
6.1 State Feedback and Output Feedback
6.2 Pole Placement Using State Feedback
6.3 Pole Placement Using Output Feedback
6.4 State Observer
6.5 Feedback From Estimated States
6.6 The Engineering Applications of State Feedback and Observer
Exercises
Problems
Chapter 7 An Introduction to Optimal Control Theory
7.1 Problem Formulation
7.2 Preliminaries: The Extremum Problem of Functionals
7.3 The Variational Approach to Optimal Control Problems
7.4 Minimum Principle and Its Application
Exercises
Problems
Answers to Selected Exercises
References