
DOI: 10.1145/3555858.3555898
Research article
Open access

Static Analysis for Automated Identification of Valid Game Actions During Exploration

Published: 04 November 2022

Abstract

Automated exploration has many important applications in the testing and analysis of games. Techniques for automated exploration require the capability of identifying the set of available user actions at a given game state and then performing the action selected by the exploration logic. Traditionally, this has been supported either by having the game developer provide an API for this purpose or by randomly guessing inputs. In this paper we develop a program-analysis-based technique that automatically analyzes the input-handling logic of the game code and then uses this information to provide the set of player actions available at a game state (as well as the device inputs that should be simulated to perform a chosen action). We focus on developing such a technique for games built with the Unity game engine. We implemented an automatic exploration tool based on our technique and evaluated its state-exploration performance on six open-source Unity games. We found that our approach is competitive with manually specified actions and is fast enough to play the games in real time.
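
To make the idea concrete, the sketch below is a hypothetical illustration (not the paper's actual analysis) of how one might textually scan a Unity project's C# scripts for calls to Unity's polling input API, such as Input.GetKeyDown(KeyCode.Space) or Input.GetButtonDown("Jump"), and collect them as candidate player actions for an exploration tool to simulate. The function candidate_actions and the path MyUnityGame/Assets are placeholder names. A textual scan of this kind cannot determine which actions are valid at a particular game state; providing that state-dependent set of actions, along with the exact device inputs to simulate, is what the paper's static analysis of the input-handling logic contributes.

    import re
    from pathlib import Path

    # Hypothetical illustration only: enumerate Unity polling-input calls such as
    # Input.GetKeyDown(KeyCode.Space) or Input.GetButtonDown("Jump") by scanning
    # the project's C# source files with a regular expression.
    INPUT_CALL = re.compile(
        r'Input\.(GetKeyDown|GetKeyUp|GetKey|GetButtonDown|GetButtonUp|GetButton)'
        r'\s*\(\s*([^)]+?)\s*\)'
    )

    def candidate_actions(project_dir: str) -> set[str]:
        """Collect (method, argument) pairs from Unity input-API calls."""
        actions: set[str] = set()
        for script in Path(project_dir).rglob("*.cs"):
            text = script.read_text(encoding="utf-8", errors="ignore")
            for method, arg in INPUT_CALL.findall(text):
                actions.add(f"{method}({arg})")
        return actions

    if __name__ == "__main__":
        # "MyUnityGame/Assets" is a placeholder path, not a project from the paper.
        for action in sorted(candidate_actions("MyUnityGame/Assets")):
            print(action)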


Cited By

  • (2024) Program Synthesis Meets Visual What-Comes-Next Puzzles. Proceedings of the 39th IEEE/ACM International Conference on Automated Software Engineering, 418–429. https://doi.org/10.1145/3691620.3695015. Online publication date: 27 Oct 2024.



Published In

FDG '22: Proceedings of the 17th International Conference on the Foundations of Digital Games
September 2022
664 pages
ISBN: 9781450397957
DOI: 10.1145/3555858
This work is licensed under a Creative Commons Attribution 4.0 International License.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 04 November 2022


Author Tags

  1. automated testing
  2. game testing
  3. program analysis

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

FDG '22

Acceptance Rates

Overall Acceptance Rate 152 of 415 submissions, 37%

Article Metrics

  • Downloads (last 12 months): 214
  • Downloads (last 6 weeks): 34
Reflects downloads up to 22 Nov 2024

