
Algorithm 972: jMarkov: An Integrated Framework for Markov Chain Modeling

Published: 09 January 2017

Abstract

Markov chains (MC) are a powerful tool for modeling complex stochastic systems. Whereas a number of tools exist for solving different types of MC models, the first step in MC modeling is to define the model parameters. This step is, however, error-prone and far from trivial when modeling complex systems. In this article, we introduce jMarkov, a framework for MC modeling that provides the user with the ability to define MC models from the basic rules underlying the system dynamics. From these rules, jMarkov automatically obtains the MC parameters and solves the model to determine steady-state and transient performance measures. The jMarkov framework is composed of four modules: (i) the main module supports MC models with a finite state space; (ii) the jQBD module enables the modeling of Quasi-Birth-and-Death processes, a class of MCs with infinite state space; (iii) the jMDP module offers the capabilities to determine optimal decision rules based on Markov Decision Processes; and (iv) the jPhase module supports the manipulation and inclusion of phase-type variables to represent more general behaviors than those of the standard exponential distribution. In addition, jMarkov is highly extensible, allowing users to introduce new modeling abstractions and solvers.
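To make the abstract's workflow concrete, here is a minimal, language-agnostic sketch of what "deriving the MC parameters from the system's rules and solving for steady-state measures" means, for the simplest case of a finite birth-death chain (an M/M/1/K queue). Note that jMarkov itself is a Java framework and this Python snippet does not use its API; the model, rates, and function names here are illustrative assumptions only.

```python
# Illustration only: jMarkov is a Java framework; this standalone Python
# sketch shows the kind of computation such a framework automates --
# turning simple dynamic rules into a Markov chain and solving it for
# steady-state performance measures. Model: an M/M/1/K queue.

def mm1k_steady_state(lam, mu, K):
    """Steady-state distribution of an M/M/1/K queue.

    The "rules" of the system are just two transition rates:
    arrivals at rate lam (state n -> n+1, for n < K) and
    services at rate mu (state n -> n-1, for n > 0).
    For a birth-death chain, detailed balance gives
    pi[n+1] = pi[n] * lam / mu, which is then normalized.
    """
    weights = [1.0]
    for _ in range(K):
        weights.append(weights[-1] * lam / mu)
    total = sum(weights)
    return [w / total for w in weights]

pi = mm1k_steady_state(lam=1.0, mu=2.0, K=2)

# Performance measures derived from the steady-state distribution:
utilization = 1.0 - pi[0]                        # P(server busy)
mean_in_system = sum(n * p for n, p in enumerate(pi))

print(pi)              # [4/7, 2/7, 1/7] ~= [0.5714, 0.2857, 0.1429]
print(utilization)     # 3/7 ~= 0.4286
print(mean_in_system)  # 4/7 ~= 0.5714
```

In jMarkov, the analogous step is declaring the events (arrival, service) and their rates, after which the framework builds the state space and transition matrix automatically; the closed-form normalization above is a hand-worked stand-in for that machinery in the special birth-death case.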

Supplementary Material

ZIP File (972.zip)
Software for jMarkov: An Integrated Framework for Markov Chain Modeling




Published In

ACM Transactions on Mathematical Software  Volume 43, Issue 3
September 2017
232 pages
ISSN:0098-3500
EISSN:1557-7295
DOI:10.1145/2988516
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 09 January 2017
Accepted: 01 October 2016
Revised: 01 April 2016
Received: 01 March 2015
Published in TOMS Volume 43, Issue 3

Author Tags

  1. Markov chains
  2. Markov decision processes
  3. Stochastic modeling
  4. phase-type distributions
  5. quasi-birth-and-death processes

Qualifiers

  • Research-article
  • Research
  • Refereed

Funding Sources

  • ARC Centre of Excellence for Mathematical and Statistical Frontiers (ACEMS)

Article Metrics

  • Downloads (Last 12 months)8
  • Downloads (Last 6 weeks)0
Reflects downloads up to 22 Nov 2024


Cited By

  • (2022) "Fuzzy MAS for Power Prediction Based on the Markov Chains: Modeling and Simulation in a HEV." Distributed Sensing and Intelligent Systems, 769--785. DOI: 10.1007/978-3-030-64258-7_65. Online publication date: 28-Jun-2022.
  • (2021) "On the shortest α-reliable path problem." TOP. DOI: 10.1007/s11750-021-00592-3. Online publication date: 8-Mar-2021.
  • (2020) "Modeling and simulation of an evolutionary approach based on MAS strategy: for intelligent energy management in EV." 2020 International Conference on Intelligent Systems and Computer Vision (ISCV), 1--8. DOI: 10.1109/ISCV49265.2020.9204060. Online publication date: Jun-2020.
  • (2018) "Modelling and implementation of an energy management simulator based on agents using optimised fuzzy rules." International Journal of Innovative Computing and Applications 9(4), 203--215. DOI: 10.5555/3302729.3302731. Online publication date: 1-Jan-2018.
  • (2017) "Cutting Latency Tail: Analyzing and Validating Replication without Canceling." IEEE Transactions on Parallel and Distributed Systems 28(11), 3128--3141. DOI: 10.1109/TPDS.2017.2706268. Online publication date: 1-Nov-2017.
