DOI: 10.1145/3304221.3319769

Quantifying Activity and Collaboration Levels in Programming Assignments

Published: 02 July 2019

Abstract

This paper presents an experience report from a third-year undergraduate compiler design course taught as part of a four-year computer science degree. We analyse data from a study of practical programming assignments, evaluated in the context of take-home formative assignments and a supervised summative examination. We implement metrics that quantify the degree of similarity between submissions and measure the level of student activity. We present the results of our study and discuss the utility of these metrics for our teaching practice.
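The paper's exact similarity metric is not given in the abstract; the "jaccard index" author tag suggests a set-based comparison over tokenized submissions. The sketch below is an assumption of how such a metric might be computed, not the authors' implementation: each submission is reduced to a set of tokens, and the Jaccard index is the size of the intersection over the size of the union.

```python
import re

def tokens(source: str) -> set:
    """Reduce a source string to a set of identifier and number tokens."""
    return set(re.findall(r"[A-Za-z_]\w*|\d+", source))

def jaccard(a: str, b: str) -> float:
    """Jaccard index |A ∩ B| / |A ∪ B| of the two submissions' token sets."""
    ta, tb = tokens(a), tokens(b)
    if not ta and not tb:
        return 1.0  # convention: two empty submissions are identical
    return len(ta & tb) / len(ta | tb)

# Two near-identical submissions that differ only in variable names:
s1 = "int add(int a, int b) { return a + b; }"
s2 = "int add(int x, int y) { return x + y; }"
print(round(jaccard(s1, s2), 3))  # 3 shared tokens of 7 total -> 0.429
```

A score of 1.0 means identical token sets and 0.0 means disjoint ones; note that simple identifier renaming, as above, already lowers the score, so practical plagiarism detectors often normalise identifiers before comparing.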


Cited By

  • (2020) Calibration and Analysis of Source Code Similarity Measures for Verilog Hardware Description Language Projects. Proceedings of the 51st ACM Technical Symposium on Computer Science Education, 420-426. DOI: 10.1145/3328778.3366928. Online publication date: 26-Feb-2020.


Published In

ITiCSE '19: Proceedings of the 2019 ACM Conference on Innovation and Technology in Computer Science Education
July 2019, 583 pages
ISBN: 9781450368957
DOI: 10.1145/3304221

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. jaccard index
  2. program similarity
  3. programming assignments

Qualifiers

  • Research-article

Conference

ITiCSE '19
Acceptance Rates

Overall Acceptance Rate 552 of 1,613 submissions, 34%

