Annals of Operations Research, Volume 29
Volume 29, Number 1, 1991
- Eugene A. Feinberg: Non-randomized strategies in stochastic decision processes. 315-332
- Sjur Didrik Flåm, Alain Fougères: Infinite horizon programs; convergence of approximate solutions. 333-350
- Alain Haurie, Christian van Delft: Turnpike properties for a class of piecewise deterministic systems arising in manufacturing flow control. 351-373
- Masami Kurano: Average cost Markov decision processes under the hypothesis of Doeblin. 375-385
- Suresh P. Sethi, Gerhard Sorger: A theory of rolling horizon decision making. 387-415
- I. M. Sonin: On an extremal property of Markov chains and sufficiency of Markov strategies in Markov decision processes with the Dubins-Savage criterion. 417-426
- Vivek S. Borkar: A remark on control of partially observed Markov chains. 429-438
- Emmanuel Fernández-Gaucherand, Aristotle Arapostathis, Steven I. Marcus: On the average cost optimality equation and the structure of optimal policies for partially observable Markov decision processes. 439-469
- Enrique L. Sernik, Steven I. Marcus: On the computation of the optimal cost function for discrete time Markov models with partial observations. 471-511
- Nico M. van Dijk: On truncations and perturbations of Markov decision problems with an application to queueing network overflow control. 515-535
- Kevin D. Glazebrook: Competing Markov decision processes. 537-563
- Karl Hinderer: Increasing Lipschitz continuous maximizers of some dynamic programs. 565-585
- Rhonda Righter, Susan H. Xu: Scheduling jobs on heterogeneous processors. 587-601
- Paul J. Schweitzer: Block-scaling of value-iteration for discounted Markov renewal programming. 603-630
- Paul J. Schweitzer, Ushio Sumita, Katsuhisa Ohno: Replacement process decomposition for discounted Markov renewal programming. 631-645