Contents

Series Preface
Preface to First Edition
1 Introduction
2 Systems
3 Reachability and Controllability
4 Nonlinear Controllability
5 Feedback and Stabilization
6 Outputs
7 Observers and Dynamic Feedback
8 Linear-Quadratic Optimal Control
9 Time-Optimal Control of Linear Systems
10 Remarks on Nonlinear Optimal Control
Appendixes
A Linear Algebra
B Differentials
C Ordinary Differential Equations
Bibliography
List of Symbols
Index
Geared primarily to mathematically advanced undergraduate or beginning graduate students, this text may also be used by engineering students interested in a rigorous, proof-oriented systems course that goes beyond the classical frequency-domain material and more applied courses. The minimal mathematical background required is a working knowledge of linear algebra and differential equations. The book covers what constitutes the common core of control theory and is unique in its emphasis on foundational aspects. While covering a wide range of topics in a standard theorem/proof style, it also develops the necessary techniques from scratch. In this second edition, new chapters and sections have been added on time-optimal control of linear systems, variational and numerical approaches to nonlinear control, nonlinear controllability via Lie-algebraic methods, and controllability of recurrent nets and of linear systems with bounded controls.
This textbook introduces the basic concepts and results of mathematical control and system theory.