Publication | Closed Access
Discrete-time LQ-optimal control problems for infinite Markov jump parameter systems
Citations: 224 | References: 32 | Year: 1995
Topics: Stochastic Hybrid System, Stochastic System, Mathematical Control Theory, Stochastic Dynamical System, Markovian Jumps, Discrete-time Linear Systems, Systems Engineering, Controllability, Stochastic Control, Markov Chain, Markov Decision Process, Dynamic Optimization
Optimal control problems for discrete-time linear systems subject to Markovian jumps in the parameters are considered for the case in which the Markov chain takes values in a countably infinite set. Two situations are considered: the noiseless case and the case in which an additive noise is appended to the model. The solution for these problems relies, in part, on the study of a countably infinite set of coupled algebraic Riccati equations (ICARE). Conditions for existence and uniqueness of a positive semidefinite solution to the ICARE are obtained via the extended concepts of stochastic stabilizability (SS) and stochastic detectability (SD), which turn out to be equivalent to the spectral radius of certain infinite dimensional linear operators in a Banach space being less than one. For the long-run average cost, SS and SD guarantee existence and uniqueness of a stationary measure and consequently existence of an optimal stationary control policy. Furthermore, an extension of a Lyapunov equation result is derived for the countably infinite Markov state-space case.
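The coupled Riccati equations described above can be illustrated numerically. The sketch below is not from the paper: it truncates the countably infinite mode set to a finite number of modes and runs a plain fixed-point iteration on the coupled equations, which converges when the stabilizability and detectability conditions hold. All function and variable names are illustrative assumptions.

```python
import numpy as np

def solve_coupled_riccati(A, B, Q, R, P, iters=1000, tol=1e-10):
    """Fixed-point iteration for the coupled algebraic Riccati equations
    of a discrete-time Markov jump linear system
        x[k+1] = A[theta_k] x[k] + B[theta_k] u[k],
    truncated to finitely many modes (the paper treats a countably
    infinite mode set).  A, B, Q, R are lists of per-mode matrices and
    P is the mode transition matrix with entries p_ij."""
    N, n = len(A), A[0].shape[0]
    X = [np.eye(n) for _ in range(N)]
    for _ in range(iters):
        # Conditional expectation operator: E_i(X) = sum_j p_ij X_j
        EX = [sum(P[i, j] * X[j] for j in range(N)) for i in range(N)]
        Xn = []
        for i in range(N):
            S = R[i] + B[i].T @ EX[i] @ B[i]
            # Per-mode optimal feedback gain K_i
            K = np.linalg.solve(S, B[i].T @ EX[i] @ A[i])
            # Coupled Riccati update for mode i
            Xn.append(Q[i] + A[i].T @ EX[i] @ A[i]
                      - A[i].T @ EX[i] @ B[i] @ K)
        done = max(np.max(np.abs(Xn[i] - X[i])) for i in range(N)) < tol
        X = Xn
        if done:
            break
    return X
```

For a two-mode scalar example with stable per-mode dynamics, the iteration returns positive solutions X_i whose residual in each coupled equation is numerically zero; the optimal mode-dependent policy is then u[k] = -K_{theta_k} x[k] with the gains computed inside the loop.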