Academic Press, 1970. — 299 p.
The object of this book is to present stochastic control theory — analysis, parametric optimization and optimal stochastic control. The treatment is limited to linear systems with quadratic criteria. It covers discrete time as well as continuous time systems.
The first three chapters provide motivation and background material on stochastic processes. Chapter 4 is devoted to analysis of dynamical systems whose inputs are stochastic processes. A simple version of the problem of optimal control of stochastic systems is discussed in Chapter 6; this chapter also contains an example of an industrial application of this theory. Filtering and prediction theory are covered in Chapter 7, and the general stochastic control problem for linear systems with quadratic criteria is treated in Chapter 8.
In each chapter we shall first discuss the discrete time version of a problem. We shall then turn to the continuous time version of the same problem. The continuous time problems are more difficult both analytically and conceptually. Chapter 6 is an exception because it deals only with discrete time systems.
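As a minimal sketch of the problem class the description refers to (the notation below is generic and not taken from the book), a discrete time linear system with a quadratic criterion is typically stated as

x_{t+1} = \Phi x_t + \Gamma u_t + e_t,
J = \mathrm{E}\left[ \sum_{t=0}^{N-1} \left( x_t^{T} Q_1 x_t + u_t^{T} Q_2 u_t \right) \right],

where e_t is a sequence of independent, zero-mean random vectors (white noise) and the admissible control u_t may depend only on the observations available at time t. The continuous time counterpart replaces the difference equation by a stochastic differential equation and the sum by an integral.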
Stochastic Control
Theory of Feedback Control
How to Characterize Disturbances
Stochastic Control Theory
Outline of the Contents of the Book
Bibliography and Comments
Stochastic Processes
The Concept of a Stochastic Process
Some Special Stochastic Processes
The Covariance Function
The Concept of Spectral Density
Analysis of Stochastic Processes
Bibliography and Comments
Stochastic State Models
Discrete Time Systems
Solution of Stochastic Difference Equations
Continuous Time Systems
Stochastic Integrals
Linear Stochastic Differential Equations
Nonlinear Stochastic Differential Equations
Stochastic Calculus—The Ito Differentiation Rule
Modeling of Physical Processes by Stochastic Differential Equations
Sampling a Stochastic Differential Equation
Bibliography and Comments
Analysis of Dynamical Systems Whose Inputs Are Stochastic Processes
Discrete Time Systems
Spectral Factorization of Discrete Time Processes
Analysis of Continuous Time Systems Whose Input Signals Are Stochastic Processes
Spectral Factorization of Continuous Time Processes
Bibliography and Comments
Parametric Optimization
Evaluation of Loss Functions for Discrete Time Systems
Evaluation of Loss Functions for Continuous Time Systems
Reconstruction of State Variables for Discrete Time Systems
Reconstruction of State Variables for Continuous Time Systems
Bibliography and Comments
Minimal Variance Control Strategies
A Simple Example
Optimal Prediction of Discrete Time Stationary Processes
Minimal Variance Control Strategies
Sensitivity of the Optimal System
An Industrial Application
Bibliography and Comments
Prediction and Filtering Theory
Formulation of Prediction and Estimation Problems
Preliminaries
State Estimation for Discrete Time Systems
Duality
State Estimation for Continuous Time Processes
Bibliography and Comments
Linear Stochastic Control Theory
Formulation
Preliminaries
Complete State Information
Incomplete State Information 1
Incomplete State Information 2
Continuous Time Problems
Bibliography and Comments