Calculus of variations and optimal control theory : a concise introduction / Daniel Liberzon.

By: Liberzon, Daniel, 1973-.
Material type: Book
Publisher: Princeton, N.J. : Princeton University Press, 2012
Description: 1 online resource (xv, 235 pages) : illustrations
Content type: text
Media type: computer
Carrier type: online resource
ISBN: 9781400842643; 1400842646; 9781680159080; 1680159089
Subject(s): Calculus of variations | Control theory | Calcul des variations | Théorie de la commande | MATHEMATICS -- Calculus | MATHEMATICS -- Mathematical Analysis | MATHEMATICS -- Applied | Optimale Kontrolle | Variationsrechnung
Genre/Form: Electronic books
Additional physical formats: Print version: Calculus of Variations and Optimal Control Theory : A Concise Introduction
DDC classification: 515.64
Other classification: SK 660 | MAT 490f
Online resources: Available online
Contents:
Chapter 1. Introduction -- Chapter 2. Calculus of Variations -- Chapter 3. From Calculus of Variations to Optimal Control -- Chapter 4. The Maximum Principle -- Chapter 5. The Hamilton-Jacobi-Bellman Equation -- Chapter 6. The Linear Quadratic Regulator -- Chapter 7. Advanced Topics.
Summary: "This textbook offers a concise yet rigorous introduction to calculus of variations and optimal control theory, and is a self-contained resource for graduate students in engineering, applied mathematics, and related subjects. Designed specifically for a one-semester course, the book begins with calculus of variations, preparing the ground for optimal control. It then gives a complete proof of the maximum principle and covers key topics such as the Hamilton-Jacobi-Bellman theory of dynamic programming and linear-quadratic optimal control"--Provided by publisher
No physical items for this record

"This textbook offers a concise yet rigorous introduction to calculus of variations and optimal control theory, and is a self-contained resource for graduate students in engineering, applied mathematics, and related subjects. Designed specifically for a one-semester course, the book begins with calculus of variations, preparing the ground for optimal control. It then gives a complete proof of the maximum principle and covers key topics such as the Hamilton-Jacobi-Bellman theory of dynamic programming and linear-quadratic optimal control"--Provided by publisher

Includes bibliographical references and index.

Description based on print version record.

Language: English.

IEEE Xplore (Princeton University Press eBooks Library).
