An optimal control theory for discrete event systems

Article ID: iaor20011957
Country: United States
Volume: 36
Issue: 2
Start Page Number: 488
End Page Number: 541
Publication Date: Mar 1998
Journal: SIAM Journal on Control and Optimization
Authors:
Abstract:

In certain discrete event applications it may be desirable to find a particular controller, within the set of acceptable controllers, which optimizes some quantitative performance measure. In this paper we propose a theory of optimal control to meet such design requirements for deterministic systems. The discrete event system (DES) is modeled by a formal language. Event and cost functions are defined which induce costs on controlled system behavior. The event costs associated with the system behavior can be reduced, in general, only by increasing the control costs. Thus it is nontrivial to find the optimal amount of control to use, and the formulation captures the fundamental tradeoff motivating classical optimal control. Results on the existence of minimally restrictive optimal solutions are presented. Communication protocols are analyzed to motivate the formulation and demonstrate optimal controller synthesis. Algorithms for the computation of optimal controllers are developed for the special case of DES modeled by regular languages. It is shown that this framework generalizes some of the existing literature.
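For the regular-language special case mentioned above, the tradeoff between event costs and control costs can be illustrated on a toy finite automaton. The sketch below is a hypothetical example, not the paper's algorithm: states, events, and all costs are invented; the controller picks a subset of controllable events to disable at each state (paying a disable cost), the environment then fires any still-enabled event adversarially, and a dynamic program over the acyclic automaton finds the cheapest admissible control pattern.

```python
from itertools import combinations

# Toy acyclic DES (hypothetical, not from the paper): each transition is
# (event, controllable?, event_cost, next_state).
TRANS = {
    "q0": [("a", True, 2, "q1"), ("b", True, 5, "q2")],
    "q1": [("u", False, 1, "q3")],   # 'u' is uncontrollable: cannot be disabled
    "q2": [("c", True, 1, "q3")],
    "q3": [],                        # marked (terminal) state, cost-to-go 0
}
DISABLE_COST = {"a": 4, "b": 1, "c": 2}  # control cost for disabling each event

def optimal_cost(state, memo={}):
    """Worst-case cost-to-go under the best admissible control pattern.

    The controller chooses controllable events to disable (paying their
    disable costs); the environment then fires any remaining enabled event.
    At least one event must stay enabled at a non-terminal state, so the
    controller cannot simply block the system.
    """
    if state in memo:
        return memo[state]
    trans = TRANS[state]
    if not trans:
        return 0
    controllable = [t for t in trans if t[1]]
    best = float("inf")
    # enumerate every subset of controllable events the controller may disable
    for r in range(len(controllable) + 1):
        for disabled in combinations(controllable, r):
            enabled = [t for t in trans if t not in disabled]
            if not enabled:          # would block the system: inadmissible
                continue
            ctrl = sum(DISABLE_COST[e] for (e, _, _, _) in disabled)
            worst = max(c + optimal_cost(nxt) for (_, _, c, nxt) in enabled)
            best = min(best, ctrl + worst)
    memo[state] = best
    return best

print(optimal_cost("q0"))  # → 4
```

Here disabling the expensive event `b` (control cost 1) is cheaper than letting it fire (event cost 5), while disabling `a` would cost more than it saves, capturing the nontrivial "optimal amount of control" the abstract describes.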
