DOI: 10.1145/3286960.3286975

ArAl: An Online Tool for Source Code Snapshot Metadata Analysis

Published: 29 January 2019

Abstract

Several systems exist that collect data from students' problem-solving processes. Within computing education research, such data has been used for multiple purposes, ranging from assessing students' problem-solving strategies to detecting struggling students. To date, however, the majority of this analysis has been conducted by individual researchers or research groups using case-by-case methodologies.
We believe that, as the possibilities for collecting data from students' learning processes grow, researchers and instructors will benefit from ready-made analysis tools.
In this study, we present ArAl, an online machine-learning-based platform for analyzing programming source code snapshot data. The benefit of ArAl is twofold. First, computing education researchers can use ArAl to analyze source code snapshot data collected at their own institutions. Second, the website provides a collection of well-documented machine learning and statistics tools for investigating possible correlations between variables. The web portal is available at online-analysis-demo.herokuapp.com. Given appropriate performance data, the tool could be applied in many other subject areas as well.
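
As an illustration of the kind of snapshot-metadata analysis the abstract describes, the following minimal Python sketch correlates per-student snapshot counts with exam scores and flags low-activity students. It is not part of ArAl; the data file, column names, and quartile threshold are hypothetical assumptions for the example.

    # Illustrative sketch only: correlates snapshot-metadata features with exam
    # performance. The CSV file, column names, and threshold are hypothetical.
    import pandas as pd
    from scipy import stats

    # One row per student: snapshot count and final exam score (0-100).
    df = pd.read_csv("snapshot_metadata.csv")

    # Spearman correlation between activity volume and exam outcome.
    rho, p_value = stats.spearmanr(df["snapshot_count"], df["exam_score"])
    print(f"snapshot_count vs exam_score: rho={rho:.2f}, p={p_value:.3f}")

    # A simple "possibly struggling" flag: bottom quartile of snapshot counts.
    threshold = df["snapshot_count"].quantile(0.25)
    df["flagged"] = df["snapshot_count"] < threshold
    print(df.loc[df["flagged"], ["student_id", "snapshot_count", "exam_score"]])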


Cited By

  • (2022) CodeProcess Charts: Visualizing the Process of Writing Code. Proceedings of the 24th Australasian Computing Education Conference, 46-55. DOI: 10.1145/3511861.3511867. Online publication date: 14-Feb-2022.
  • (2021) The importance of using the CodeInsights monitoring tool to support teaching programming in the context of a pandemic. 2021 IEEE Frontiers in Education Conference (FIE), 1-8. DOI: 10.1109/FIE49875.2021.9637292. Online publication date: 13-Oct-2021.
  • (2019) A study on the students' perspective about the usage of a programming monitoring tool. 2019 International Symposium on Computers in Education (SIIE), 1-6. DOI: 10.1109/SIIE48397.2019.8970110. Online publication date: Nov-2019.


Information

Published In

ACE '19: Proceedings of the Twenty-First Australasian Computing Education Conference
January 2019
131 pages
ISBN:9781450366229
DOI:10.1145/3286960
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

In-Cooperation

  • The University of Newcastle, Australia
  • CORE - Computing Research and Education
  • The University of Auckland

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 29 January 2019

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

ACE '19: Twenty-First Australasian Computing Education Conference
January 29 - 31, 2019
Sydney, NSW, Australia

Acceptance Rates

ACE '19 Paper Acceptance Rate 15 of 36 submissions, 42%;
Overall Acceptance Rate 161 of 359 submissions, 45%

