Article

Evaluating Learner Engagement with Gamification in Online Courses

by Anna Puig 1,2,*,†, Inmaculada Rodríguez 1,3,†, Álex Rodríguez 1,† and Ianire Gallego 1,†

1 Departament de Matemàtiques i Informàtica, Universitat de Barcelona, Avda Corts Catalanes, 595, 08007 Barcelona, Spain
2 Institut de Matemàtica (IMUB Research Center), Universitat de Barcelona, Avda Corts Catalanes, 585, 08007 Barcelona, Spain
3 Institute of Complex Systems (UBICS Research Center), Universitat de Barcelona, Martí Franquès, 1, 08028 Barcelona, Spain
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Appl. Sci. 2023, 13(3), 1535; https://doi.org/10.3390/app13031535
Submission received: 19 December 2022 / Revised: 12 January 2023 / Accepted: 18 January 2023 / Published: 24 January 2023
(This article belongs to the Special Issue Gamification and Data-Driven Approaches in Education)
Figure 1
Overview of the StaticAG method, which involves three steps: Step 1 obtains the scorings of the game elements depending on an individual Player Profile vector and the Matching Matrix; in Step 2 we normalise these scorings; and finally, in Step 3, we select the Game Element. We also display two Selection methods: Selection 1, Lavoué-based [15], obtains the top-scored Game Element, while Selection 2 obtains a weighted random Game Element according to the probabilities computed in Step 3.
Figure 2
Gamified MOOC itinerary: students’ statuses are depicted from 0 to 1 relying on the Learning Activities (Newbie, Medium, and Pro avatars are shown in different parts of the itinerary). Game Elements (GE) between activities are displayed using small red circles. We developed a new XBlock in the edX platform able to add a Game Element to the MOOC itinerary; see the video “Adding a Game Element XBlock in edX” (https://drive.google.com/file/d/1W1NVmi2aJ9wSrYGgvaOJCW60QrFUL0KN/view?usp=share_link, accessed on 12 January 2023).
Figure 3
Gamified MOOC structure: on the left, the learning itinerary; on the right, the embedded gamification in the MOOC (the dashboard and the game elements). See the videos on the initial Player Type Questionnaire (https://drive.google.com/file/d/1tkqPYT5d4hQ23W8X55icWPsAvAU6jKUm/view?usp=share_link), the Dashboard (https://drive.google.com/file/d/183eHJGl9NaL20rEOOyZvZR1iPfUSOKvP/view?usp=sharing), and the Gamification Element Widgets (https://drive.google.com/file/d/1gwu5hr6tnRL_rOiQkrLXVuiLrCA-Ijkh/view?usp=sharing), accessed on 12 January 2023.
Figure 4
Test of the initial motivation of both experimental groups: DynamicAG and StaticAG. Measures: IM—Intrinsic Motivation, A—Amotivation, G—Gamer.
Figure 5
Top: GEDisplays are collected using clicks on the button “I want an award”. Bottom left: GEClicks are monitored using clicks on the GE, while Evaluations refer to the scorings of each displayed game element using stars. Bottom right: DashboardClicks refer to those clicks made in the dashboard window.
Figure 6
Comparison of DashboardClicks. Left side: histogram. Right side: the mean of the number of clicks and the variance of all students in each group. “# users” means number of users.
Figure 7
Comparison of GEDisplays. Left side: histogram. Right side: the mean of the number of game elements shown and the variance of all students in each group. “# users” means number of users.
Figure 8
Comparison of GEClicks. Left side: histogram. Right side: the mean of the number of interactions with elements and the variance of all students in each group. “# users” means number of users.
Figure 9
Comparison of InteractionTime. Left side: histogram. Right side: the average time students interacted with gamification elements and the variance of all students in each group. “# users” means number of users.
Figure 10
Evaluations, i.e., mean scoring of GEs grouped by GE type. The left side includes changeable GEs (in both properties and appearance). In the middle, GEs that only change their properties according to learning progress. On the right side, GEs with fixed appearance and properties.

Abstract

Several reasons underlie the low retention rates in MOOCs. These reasons can be analysed from different perspectives, either in terms of the course design or the enrolled students. On the student side, we find little social interaction, boredom, tiredness, and a lack of motivation and time. These challenges can be addressed by adaptive gamification, which proposes the design of personalised, hedonic learning experiences. Studies to date have adopted either the one-fits-all approach or the adaptive approach. Nevertheless, the adaptive solutions have considered a static player profile throughout the entire experience. This paper presents the design and evaluation of a dynamic adaptive gamification approach which, based on students’ interactions with game elements and also their opinions about these elements, dynamically updates the students’ player profiles to better determine which game elements suit them. We evaluated the engagement of students with gamification elements by means of a course composed of a knowledge “pill” on the topic of “recycling plastics from the sea”, offered through the nanoMOOCs learning platform. We propose metrics such as the mean number of interactions with the gamification dashboard, the time spent by participants with game elements, and the opinions of students about these elements to compare the Dynamic Adaptive Gamification (DynamicAG) and Static Adaptive Gamification (StaticAG) approaches. An experimental study with 63 high school students showed significant differences between the two approaches. Specifically, the DynamicAG group spent twice as much time with the Dashboard as the StaticAG group. Moreover, students in the DynamicAG group were more engaged with game elements (mean number of interactions = 12.13) than those in the StaticAG group (mean number of interactions = 3.21).

1. Introduction

Massive Open Online Courses (MOOCs) have been with us for decades [1] and, more recently, have been recognised as facilitators of the fourth of the United Nations (UN) 17 Sustainable Development Goals (SDGs), to “ensure inclusive and equitable quality education” [2,3]. Indeed, the inherent characteristics of online learning platforms, such as interactivity and their social and emotional dimensions, make them valuable learning tools for learners around the world. As far as interactivity is concerned, the growth of the digital ecosystem has favoured the emergence of numerous tools for creating dynamic and student-centred resources in MOOCs [4]. These resources include digital storytelling [5], infographics [6], interactive videos, and questionnaires, among others. Nevertheless, the social dimension has not yet been successfully exploited, since research studies have revealed that, despite the huge number of students usually enrolled in MOOCs, students usually learn in isolation [7]. Regarding the emotional dimension, recent findings show that MOOC students value content quality, user interface design, and enjoyment [8,9].
Moreover, MOOCs have to contend with low retention rates and a high no-show rate. Bezerra et al. [10] divided the reasons behind the high dropout rates into two groups. On the one hand, some reasons are inherent to the instructional design of the course—duration, content, length, and types of activity—including the fact that some students sign up for the course out of curiosity, because it is free, or because they are interested in particular sections of it. In response to this problem, the educational community has contributed solutions such as the design of short “pills” of knowledge containing rich and compelling interactive content, as nanoMOOCs (nanoMOOCs are a new audio-visual educational format for knowledge pills; https://nanomoocs.cat/, accessed on 12 January 2023) do. On the other hand, other causes of low retention rates can be attributed to the student, such as little social interaction, boredom, tiredness, lack of motivation and time, and content that does not suit their skills. Thus, the design of personalised solutions, the increase of cooperative activities among students, and the design of gamified experiences are potential ways of addressing these challenges.
Adaptive gamification, i.e., the design of personalised emotional learning experiences through gamification, is the approach that this paper focuses on. Gamification is the integration of Game Elements (from now on also referred to as GE) in non-game contexts in order to increase user participation and engagement in activities [11]. In fact, research has shown that gamification has a positive impact on participation and retention in MOOCs [12]. However, integrating game elements/mechanics in MOOCs in a way that effectively enhances student engagement remains a challenge.
Additionally, the literature presents gamified experiences that mostly adopt the one-fits-all approach [13], that is, a standard approach that is not tailored to individual needs and thus assumes that all users have the same gaming profile. In contrast, adaptive gamification has emerged as a means of adapting the gamified experience to user profiles [14,15]. These adaptive approaches use a static profile of the player, which is computed at the beginning of the experience through a player type questionnaire. Then, the gamified system presents the users with game elements that fit their initial (static) playing profile.
In a previous research paper [16], we presented an adaptive gamification method that took this a step further, considering that the results of the player type questionnaire can be imprecise and also that player profiles can be dynamic. Thus, our method takes player profiles as initial information but also considers how these profiles change over time based on users’ interactions and opinions. Then, the users are provided with a personalised experience through the use of game elements that correspond to their dynamic playing profile. In [17], we presented a case study on the nanoMOOCs educational platform, which provides users with gamified knowledge pills, and performed a preliminary evaluation of the approach using a simulator with bots.
In this paper, we experimentally evaluate the contribution of the dynamic adaptive approach to participants’ experience with gamification. Concretely, we want to know if our method has an impact on the hedonic dimension of user experience (UX). To do so, we compared Dynamic Adaptive Gamification (DynamicAG) and Static Adaptive Gamification (StaticAG) in terms of metrics that help us to measure user engagement with gamification. Note that this study is concerned with how students experienced (felt about) the gamification, i.e., to what extent did they interact with the game elements that appear throughout the gamified course? Did students like the look and feel of the game elements? Therefore, the study of the impact of gamification on course completion and engagement with educational activities is beyond the scope of this study. We evaluated our method with 63 students at schools in the city of Viladecans, Barcelona (Spain). Our results show that the dynamic adaptive gamification group interacted significantly more than the group in the static adaptive gamification condition, with an increase of 40% in the mean number of interactions with game elements. Moreover, the dynamic adaptive strategy revealed a broader range of game elements to the users. Since the game elements were the same in both conditions, as expected, students in both groups had almost equal opinions regarding the elements’ look and feel.
The rest of the paper is structured as follows. The related work section reviews research that aims to provide users with gamified, personalised learning experiences. Next, we describe two adaptive gamification strategies: one that considers the profile of the player but uses a static player profile throughout the course, and our approach, which considers this profile as dynamic. The next section details the strategy followed to situate the game elements along the activities in the course. Then, the rewarding logic defines the integration of students’ progress in the MOOC in terms of the gamification element rewards. The Materials and Methods section describes the details of the experiment that compares the StaticAG and DynamicAG approaches. The following sections present metrics, analyse the data gathered, and reflect upon the findings and limitations of our research. The last section provides conclusions drawn from this research and establishes directions for future work.

2. Related Work

The research literature on both MOOCs and blended learning environments has reported a positive influence of gamification on students’ motivation [18], performance [19,20,21], and collaboration [22,23]. In consequence, more and more commercial and open-source MOOC platforms are considering gamification either as a core feature or as a plug-in. For example, EdX’s Digital Badging system (https://blog.edx.org/digital-badges-on-the-open-edx-platform, accessed on 12 January 2023) awards participants with badges when they complete a course, and Moodle’s Ranking Block (https://moodle.org/plugins/block_ranking, accessed on 12 January 2023) shows a leaderboard displaying the points accumulated by students when performing educational activities.
However, the majority of MOOCs implement shallow gamification based simply on extrinsic motivation rewards like points, badges, or leaderboards. These basic approaches are evolving to include elements of entertainment, challenge, and social interaction, since these elements enhance students’ intention to continue using MOOCs as well as their course performance [24]. In this study, we opt for deep gamification mechanics [25], based on intrinsic motivation rewards such as helping others and collaborative features like social status and social networks.
Another challenge that gamified online courses face is to engage a variety of student personalities and preferences, and also to sustain participants’ engagement over time. A response to this challenge must account for the diversity of participants, which points to adaptive approaches, as suggested by [26]. Indeed, literature reviews dealing with player type research [27,28,29] have provided thorough overviews of player type theories, such as the one proposed by Bartle [30], BrainHex [31], and Hexad [32]. Some of these taxonomies propose classifications focused solely on player types, while others consider both learning (learner type) and gameplay aspects (player type), but all of them aim to find a relationship between typologies of users and the game elements that best suit them [33]. Moreover, these works conclude that users exhibit features of more than one player type, as indicated by the results of publicly available questionnaires (see the average type scores of the HEXAD questionnaire: https://gamified.uk/UserTypeTest/user-type-test-results.php, accessed on 12 January 2023).
Approaches to personalised gamified learning experiences reveal two main aspects to consider: what to adapt and the criteria on which to base the adaptation. Regarding what to adapt, we can highlight the game elements to be shown to the student [34,35], the values of the rewards given to the student [36], and the look and feel of the game elements. In relation to the criteria, the adaptation can be based on students’ (learner and player) profiles [14,37], their progress throughout the course, and their interaction with the game elements [17]. In [38], the authors suggested using their theoretical framework to modify the system at runtime based on the user’s preferences and behaviour, but with neither a real application nor an experimental study that put the theoretical framework into practice. Another study [15] proposed a matrix factorization method to obtain game element scores for each user and thus show students the element with the highest score. The previously mentioned works used either standard or self-reported questionnaires to capture static data, considering neither that the characteristics and preferences of individuals are dynamic in nature and can evolve over time, nor the possible lack of reliability of questionnaires [39]. Our adaptive method explores this gap and proposes to dynamically capture real-time data from user interactions and, based on these data, provide the user with a personalised experience [16,17].
Regarding the AI techniques that supported the adaptation task in different studies, we mainly found machine learning (ML)-based solutions. One study [40] used machine learning to predict—based on the facial keypoint data of users—the results of a gamified activity. Other studies were not directly related to gamification in learning environments but to educational institutions or games. In [41], the authors proposed a CBR-based architecture for designing adaptive text-based computer games. Data mining was also used for measuring institutional performance based on key performance indicators such as the number of schools, teachers, school locations, enrolment, and available facilities [42]. Another approach, by Hallifax [43], combined two machine learning techniques, ANN and CBR, to adapt gamification and learning content based on the learner’s profile. In line with [15], our adaptive approach is based on matrix factorization, a well-known collaborative filtering method in recommender systems.

3. Static Adaptive Gamification versus Dynamic Adaptive Gamification

In this section, we describe two Adaptive Gamification strategies, considering that diverse groups of users have differing motivations during an ongoing MOOC. Both strategies are based on the HEXAD player type classification [44] and use player types to guide the selection of the game elements shown to students during the MOOC, in contrast to “one-fits-all” strategies, which assume that all students have the same profile.
One of the first approaches to adaptive gamification was proposed by Lavoué et al. [15], who obtained a player profile at the beginning of the experiment, assuming that this player profile remained constant (static) over time. They relied on the BrainHex typology [31], which represents the player profile as a vector of values of player types, so several player types were considered, not only the predominant one [45]. Moreover, they proposed a matrix factorization model to obtain the most suitable gamification element to be shown. They used two matrices, the first of which contains the player types of all the users, while the second matches the game elements and the player types (i.e., the Matching Matrix). The combination of these two matrices yields the game element scores for each user. Finally, the algorithm selects and shows the element with the highest score to the user.
We use the term Static Adaptive Gamification strategy, StaticAG, to refer to a modified version of Lavoué’s [15] original idea. In this static version [16], we keep the main idea of initially obtaining a player profile and keeping it constant throughout the experience. However, we used the HEXAD classification of player types [32], which is slightly different from the BrainHex classification they used. While BrainHex is a classification oriented towards video game players, HEXAD focuses more on the psychological aspects of gamified virtual environments. We also consider the case of participants who do not want to play (non-player type). Thus, in the StaticAG version, we define the Player Profile vector as a 7-length array containing the HEXAD player types plus a non-player type. Consequently, we added the set of game elements related to these new player types, which is a subset of those proposed by Tondello [44]. In Figure 1, the Player Profile is shown as a 7-length vector, where each component represents the rating corresponding to each player type in the range [0, 1]. For example, a HEXAD Individual Player Profile defined as (0.25, 0.15, 0.2, 0.4, 0.4, 0.1, 0) means that the student is 25% disruptor, 15% free spirit, 20% achiever, 40% player, 40% socializer, 10% philanthropist, and 0% non-player. Additionally, Figure 1 shows the list of the considered Game Elements related to those Player Types. Following these changes in the player type classification and the addition of the non-player type, we adapted the Matching Matrix proposed by Lavoué [15] in order to define the relationships between the new player types and the game elements most suitable for them (shown in Figure 1, where the higher the value of a cell, the more suitable the game element is). Note that empty cells contain zero values, denoting the incompatibility of the game element with the player type. The combination of a Player Profile and the Matching Matrix produces one score per game element (Step 1 in Figure 1).
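To make Step 1 concrete, the following Python sketch combines an individual Player Profile with a Matching Matrix. The matrix values and the three-element catalogue are illustrative placeholders, not the actual values shown in Figure 1.

```python
import numpy as np

# Player types, in order: disruptor, free spirit, achiever, player,
# socializer, philanthropist, non-player (the example profile from the text).
player_profile = np.array([0.25, 0.15, 0.2, 0.4, 0.4, 0.1, 0.0])

# Matching Matrix: one row per game element, one column per player type;
# each cell in [0, 1] encodes how well the element suits the type
# (0 = incompatible). These values are illustrative only.
matching_matrix = np.array([
    [1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],  # development tool -> disruptor
    [0.0, 0.8, 0.6, 0.0, 0.0, 0.0, 0.0],  # easter egg
    [0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.0],  # points -> all player types
])

# Step 1: one score per game element.
ge_scores = matching_matrix @ player_profile
print(ge_scores)
```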
Note that always selecting the top-scored game element constrains the experience of the user, who may end up losing their engagement with the gamification. In the static adaptive gamification approach, since the Player Profile does not fluctuate during the experience, the top-scored game element will always be the same, and the user will therefore only interact with a single game element, even if that game element varies in appearance or reward values. In order to provide more variability in the gamified experience, we selected game elements by treating the scores of the game elements as probabilities. Thus, we used the normalised scores of the GE Scorings vector (Step 2 in Figure 1) as the probability that each game element is shown. We selected a weighted random value from the GE Probabilities vector, shown in Step 3 of Figure 1, instead of selecting the top-scored one, as Lavoué did.
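A minimal, self-contained sketch of Steps 2 and 3 follows, assuming NumPy; the scores are hypothetical Step 1 outputs rather than values from the paper.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

game_elements = ["development tool", "easter egg", "points"]
ge_scores = np.array([0.25, 0.30, 0.75])  # illustrative Step 1 output

# Step 2: normalise the scores so they can be read as probabilities.
ge_probabilities = ge_scores / ge_scores.sum()

# Step 3: weighted random draw instead of always taking the top score.
selected = rng.choice(game_elements, p=ge_probabilities)
print(selected)
```

This weighted draw is what keeps the static approach from showing the same single element over and over while still respecting the profile.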
However, as the initial player profile may be inaccurately defined by the user when filling in the questionnaire, or may evolve slightly over time [46], the Dynamic Adaptive Gamification strategy, DynamicAG, relies on the fact that the behaviour (i.e., interactions) of each student is useful for tuning the values of his/her player profile during the experience [16]. In our strategy, we re-estimate the Player Profile at certain milestones in the MOOC, based on the learner’s interactions with, and opinions about, the Game Elements shown to him/her earlier in the MOOC [17]. Basically, the dynamic algorithm recomputes the player profile, i.e., all the player types, at the end of certain learning activities in the MOOC in order to select the game element that best fits at that moment. Once the new Player Profile is obtained, we use the Matching Matrix, as in the static adaptive gamification strategy (shown in Figure 1, Step 1), to obtain the next game element to be shown to the user. Note that the Matching Matrix has as many rows as the number of considered Game Elements (GE), where each row contains values between 0 and 1 representing how well the Game Element suits the corresponding player type. For instance, the GE development tool applies to the Disruptor player type (first row) and the Points Game Element applies to all the player types (seventh row).
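The exact update rule is described in [16,17]; purely as an illustration of the idea, a hedged sketch of such an update might look as follows, where the learning rates `alpha` and `beta` and the interaction cap are hypothetical choices, not the paper’s parameters.

```python
import numpy as np

def update_player_profile(profile, ge_type_affinities, interactions, rating,
                          alpha=0.05, beta=0.05):
    """Illustrative DynamicAG-style profile update (not the paper's rule).

    The idea: nudge the profile towards the player types associated with a
    game element the student interacted with and rated well.

    profile            -- current 7-length player profile, values in [0, 1]
    ge_type_affinities -- Matching Matrix row of the displayed element
    interactions       -- number of clicks on that element (>= 0)
    rating             -- star rating normalised to [0, 1]
    alpha, beta        -- small rates that limit profile fluctuations
    """
    signal = alpha * min(interactions, 5) / 5 + beta * rating
    new_profile = profile + signal * ge_type_affinities
    return np.clip(new_profile, 0.0, 1.0)
```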
The DynamicAG strategy is especially useful for profiles that are mostly non-players. In this case, the system shows random game elements during the experience, giving the students some opportunities to interact with them and to join in the gamified experience. If they do not, we interpret this non-interaction as the student’s desire not to play, and gamification is disabled.

4. The Design of Adaptive Gamification in MOOCs

Embedding gamification in MOOCs to engage students is not an easy task. Gamification elements must be integrated into MOOCs in terms of graphic design, maintaining the same appearance as the other course activities, but they must also be strongly connected to the student’s current progress in the MOOC. Moreover, as MOOCs host diverse player profiles, selecting game elements during the course based on different player types is crucial for engaging students in the gamified experience.
In the following sections, we describe how we managed the appearance of the game elements during the course, as well as the rewards logic that integrates a student’s progress in the MOOC with the gamification element rewards.

4.1. Game Element Distribution in the MOOC

A common non-gamified MOOC is a course organised as a sequence of learning activities following an itinerary (see Figure 2). These learning activities are usually grouped into formative units to achieve diverse learning outcomes. Each formative unit contains formative activities, such as interactive infographics, presentations, videos, and readings, and graded activities, such as questionnaires and problems. When gamifying a MOOC, the main objective is to engage students in the course by making boring activities more fun and rewarding students according to their achievements. Thus, we propose incorporating game elements after the most difficult activities, such as those that require additional work from the student, which at the same time allows us to grant a prize based on a student’s effort or performance.
Additionally, we incorporated a dashboard so that students could be informed at all times of their progress in the game (see top right of Figure 3). In this figure, we also show how the gamification is embedded in the learning itinerary. In Step 1, students answer a Player Type Questionnaire to define their Player Profile, which is needed for both the StaticAG and DynamicAG adaptive strategies. Students have the choice of not answering the questionnaire, in which case we consider the player profile to be a non-player. In [17], we proposed the use of the player type questionnaire [32] (https://www.gamified.uk/UserTypeTest2016, accessed on 1 December 2022). Users see their initial Player Profiles, i.e., the results of the questionnaire, in the Dashboard embedded in the course. The Dashboard highlights the Player Profile, showing the students their player types with the highest scores (see green circles). It also allows students to interact with already obtained game elements and to select their avatar. In Step 2, the course starts with the learning activities of the first formative unit. At the end of some activities, gamification becomes active when students ask for their award, and only when they claim it (Step 3) is the gamification algorithm triggered to suggest the game element to be displayed (Step 4). In this step, as we can see at the bottom right of Figure 3, the student can perform several actions: (i) ask for more information, (ii) check their game status on the dashboard, (iii) play with the game element (for instance, play the lottery), and (iv) rate the current game element with stars. All these interactions are recorded by the gamification system and, in the case of dynamic adaptive gamification, are used to recompute the Player Profile. Finally, in Step 5, control passes to the learning system, and the next activity in the formative unit of the course follows.

4.2. Rewarding Logic

It is not only important to know which game element to show next; it is also crucial that the gamification reflects the student’s achievements in the course. Thus, the gamification system uses the progress of the students and their learning outcomes in the graded activities to rate the value of the reward contained in the game elements.
Our rewarding logic is based on a global score computed from the performance in the learning activities previously performed by the user, weighted by the student’s current status (as shown in Figure 2). We define the GlobalScore at some activity t as:
GlobalScore(t) = LearningPerformance(t) × status(t)
where LearningPerformance(t) is the normalised mean of the ratings obtained up to the current activity t, i.e., a value in the range [0, 1], and status(t) indicates where the game element appears in the learning itinerary. It is also a value between 0 and 1, where 0 indicates that the learner is at the beginning of the course and 1 means that the student has completed all course activities. The status allows us to reward students more as they approach the end of the course. Moreover, we select the avatar look based on the status(t) value: we use the newbie, medium, and pro avatars when a student’s status values are in the intervals [0, 0.5), [0.5, 0.75), and [0.75, 1], respectively (see the different looks in Figure 2).
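To make the rewarding logic concrete, the following minimal sketch (our own illustration, with hypothetical function names) computes the GlobalScore and picks the avatar look from the status intervals above.

```python
def global_score(learning_performance: float, status: float) -> float:
    """Both inputs are in [0, 1]; the product rewards students more as
    they both perform well and approach the end of the course."""
    return learning_performance * status

def avatar_for(status: float) -> str:
    """Avatar look follows the status intervals given in the text."""
    if status < 0.5:
        return "newbie"
    elif status < 0.75:
        return "medium"
    else:
        return "pro"

# Example: a mid-course student (status 0.6) with a mean rating of 0.8.
print(global_score(0.8, 0.6), avatar_for(0.6))  # -> 0.48 medium
```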
The GlobalScore(t) allows for the setup of some game elements, being used either for computing their properties (e.g., rewards in lotteries) or for guiding changes to their visual appearance (e.g., icons of badges and gifts). The properties of a game element that are susceptible to change depend on the type of game element: (i) the values of rewards for Points, Challenges, Easter Eggs, Lotteries, Unlockables, and Gifts; (ii) the difficulty of Challenges; (iii) the types of Badges, i.e., badges, stars, medals, and cups; (iv) the number of elements allowed to be modified with the Development Tools; and (v) the number of recommended friends in social networks. The visual elements that can be modified are the colours of the Easter Eggs, the symbols of the icons in Points and Badges, and some animations in the Challenges.
In summary, Table 1 shows the game elements that are modifiable in terms of both their properties and appearance. For instance, challenges rely on the GlobalScore to shape their visual look and feel, showing different numbers of mountains to climb, and also to set up their levels of difficulty and the reward that the users receive. Some other game elements (see Table 2), such as lotteries, adjust only their rewards according to the GlobalScore but maintain the same look and feel. Others only reflect information about the student’s status in the learning itinerary (i.e., levels) and their gamification status (i.e., leaderboards), without using the global score to configure their properties or appearance (see Table 3).

5. Materials and Methods

This section describes the experimental setup required to answer our research question regarding the impact of the DynamicAG approach on user engagement with gamification. We first present the course used for testing our dynamic adaptive gamification approach. We then give a detailed description of other aspects of the experiment, including ethical approval, recruitment of participants, the experimental and control groups, the data we collected based on our research question, and the statistical methods we used to analyse the data.

5.1. The Gamified Course: “How to Recycle Plastics from the Sea?”

The course was designed by experts in pedagogy at the Edebé company, based in Barcelona, Spain (https://edebe.com/, accessed on 12 January 2023). It is addressed to high school students aged between 15 and 16. The goal of the course is to provide them with practical and theoretical knowledge about recycling plastics found in the sea and also to raise awareness about the problem of plastic spills. It contains four formative units, each one consisting of activities including readings, videos, interactive infographics, and questionnaires. These activities encourage students, through critical thinking, to reason about the problem of plastics in the sea and to propose creative and effective solutions.
The gamification of the course is designed so that the game elements appear when the participants have completed a series of activities. For example, a gamification element (GE) appears to motivate and reward the students when they have watched a video and then taken a quiz. As the gamification system is adaptive, different students are rewarded with different gamification elements depending on their player type.
The course is offered to students through the nanoMOOCs platform, which is based on the Open edX learning software technology (https://openedx.org/about-open-edx/, accessed on 12 January 2023). This technology supports extensions through so-called XBlocks, a component architecture designed to make it easier to create new online educational experiences (https://edx.readthedocs.io/projects/xblock-tutorial/en/latest/overview/introduction.html, accessed on 12 January 2023), which in our case we used to implement the gamification of the course.
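As a rough illustration of the mechanism, the sketch below shows a minimal XBlock that renders the award button and counts clicks, assuming the standard XBlock API (xblock.core, xblock.fields, web_fragments); it is an illustrative sketch, not the XBlock actually deployed on the nanoMOOCs platform.

```python
from xblock.core import XBlock
from xblock.fields import Integer, Scope
from web_fragments.fragment import Fragment


class GameElementXBlock(XBlock):
    """Displays a game element trigger and counts a student's clicks."""

    ge_clicks = Integer(default=0, scope=Scope.user_state,
                        help="Clicks on the displayed game element")

    def student_view(self, context=None):
        # Render the "I want an award" button; a click would reach the
        # handler below through the XBlock runtime's JavaScript bindings.
        return Fragment('<button class="award">I want an award</button>')

    @XBlock.json_handler
    def record_click(self, data, suffix=""):
        # Persist the interaction so the adaptation logic can use it later.
        self.ge_clicks += 1
        return {"ge_clicks": self.ge_clicks}
```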

5.2. Ethical Considerations

Prior to the beginning of our study, we formulated a statement regarding the ethical implications of interactions with human beings, for the approval of an ethics committee. It included a description of the expected impact of the research on the participants, and information about the voluntary nature of participation in the study, the data to be collected and anonymously stored following standards of encryption and security, and the informed consent documents for participants. As the participants were not adults, we prepared two consent forms, one for the parents and another for the school directors. The institutional ethical review board of the University of Barcelona approved our study in December 2021.

5.3. Recruitment of Participants

As the study was planned to be held from January to February 2022, we started the recruitment three months earlier so that the course could fit into the schools’ educational schedules. We looked for students who were in the 3rd and/or 4th year of high school, and we restricted the search to the city of Viladecans (Barcelona), since it is sufficiently large to enable us to recruit students from diverse social and educational backgrounds. In addition, thanks to our experience in previous projects, we knew that schools in Viladecans were open to participating in innovative educational initiatives.
Firstly, a call was made to all schools in the town to explain the main characteristics of the project. Subsequently, groups of students from the five schools that voluntarily confirmed their interest were randomly assigned to the experimental (DynamicAG) and control (StaticAG) groups. The initial number of participants was 46 in the dynamic adaptive gamification group and 51 in the static adaptive gamification group, but due to circumstances mainly related to the COVID-19 pandemic (forced isolation of teachers and students), these numbers dropped to 30 and 33, respectively. Gender was fairly evenly distributed across both groups (44.68% female and 55.32% male).

5.4. Experiment Design

We adopted a quasi-experimental design with a control group. The experiment followed a between-subjects design in which each participant in the same class took the course in one of the two conditions (dynamic adaptive gamification or static adaptive gamification). Once the groups of students were randomly assigned to either the experimental or the control group, to ensure that these groups were comparable, we administered the SIMS motivation test, with two additional questions, to individuals in both groups before starting the experiment.
The Situational Motivation Scale (SIMS) questionnaire measures intrinsic motivation in students. Four questions refer to the measurement of intrinsic motivation, while another four measure the level of amotivation. All of them were extracted from the Spanish version of the SIMS validated by [47]. We added two more questions to gauge whether there were significant differences in the preference for games/gamification between the subjects in the different experimental groups.
The data gathered with the SIMS questionnaire helped us to test the internal validity of the experiment, since significant differences would have an impact on the interpretation of the results of the study. We observed that students started from a very similar state of motivation, because no significant differences were found (see Figure 4). Specifically, although the DynamicAG group had a slightly higher intrinsic motivation (IM) than the other group, this difference was not significant (Mann–Whitney U, p-value = 0.9265460; p > 0.05, so we cannot reject the null hypothesis). Amotivation was almost the same in both groups (a Mann–Whitney U test confirms this; p-value = 0.99498195). Similarly, the DynamicAG group started with higher results than the StaticAG group in the G-Gamer measure, but the difference was not significant either (Mann–Whitney U; p-value = 0.37294404). In the two groups, we observed that the “gamer motivation” (G) indicator was the highest, followed by the “intrinsic motivation” (IM) indicator and finally by the “amotivation” (A) indicator. The value of the G indicator reflects the willingness of both groups to “play”, which is desirable for performing our experiment.

5.5. Procedure

We created two versions of the course on the nanoMOOCs platform. They had identical activities but different gamification configurations: static adaptive and dynamic adaptive, respectively. Player profile updating was disabled in the course with static adaptive gamification. It is worth noting that participants could not tell which version of the course they were taking.
The experiment consisted of the following stages:
  • Presentation session in the schools. We visited the schools to instruct the students on platform registration as well as filling in the motivation pre-test and the HEXAD gamification user type questionnaire.
  • Course realisation. The students had 20 days to perform the course activities, which required a total of between 10 and 20 h. Since the course was online, participants were able to carry out the activities in any place where they had access to a computer with an Internet connection (such as at school, a library or their home). Although participating teachers were somewhat free to organise and monitor the progress of the educational content of the course, we provided them with a recommended plan for completing the course over the weeks of the experiment. During this stage we performed data collection.
  • Final session in the schools. This was performed face-to-face in one school and online in others, due to COVID-19 restrictions. We wanted to meet again with the participants to record their opinions and comments as well as to share the interesting data gathered throughout the experiment with the students.

5.6. Data Collection and Statistical Analysis

To answer our proposed research question regarding user engagement with gamification, we collected the following data (see Figure 5):
  • DashboardClicks: Number of interactions with the gamification dashboard by a student.
  • GEDisplays: Number of times that a Game Element was shown to a student during the experience.
  • GEClicks: Number of user interactions with Game Elements during the whole experiment.
  • InteractionTime: The total time a student spent with game elements.
  • Evaluations: The mean score that a user gave to game elements during the course.
Note that we assume that the more engaged the students are with the gamification, the more they will interact with these GEs. Thus, all these metrics are related to the interaction with gamification elements and not to the interaction with other user interface elements.
Data gathered during the experiment were analysed using both descriptive statistics and non-parametric comparative tests to evaluate differences between the experimental and control groups. Before comparing means, we checked normality using the Kolmogorov–Smirnov test and variance equality using the Levene test. The Brunner–Munzel test was used to compare means for non-normal distributions with unequal variances, while in measurements with equal variances, mean comparisons were computed using Mann–Whitney U tests. Tests obtaining p-values lower than 0.05 were considered statistically significant. The calculations were performed using the SciPy package [48].
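A sketch of this test-selection logic, assuming the scipy.stats implementations of the named tests, might look as follows; it is illustrative, not the exact analysis script used in the study.

```python
from scipy import stats

def compare_groups(a, b, alpha=0.05):
    """Compare two groups of measurements (NumPy arrays, one value per
    student), choosing the test as described above."""
    # Normality check: Kolmogorov-Smirnov against a fitted normal.
    _, p_norm_a = stats.kstest(a, "norm", args=(a.mean(), a.std()))
    _, p_norm_b = stats.kstest(b, "norm", args=(b.mean(), b.std()))
    non_normal = p_norm_a < alpha or p_norm_b < alpha

    # Variance equality check: Levene test.
    _, p_levene = stats.levene(a, b)
    unequal_var = p_levene < alpha

    if non_normal and unequal_var:
        # Non-normal distributions with unequal variances: Brunner-Munzel.
        name, (_, p) = "Brunner-Munzel", stats.brunnermunzel(a, b)
    else:
        # Equal variances: Mann-Whitney U.
        name, (_, p) = "Mann-Whitney U", stats.mannwhitneyu(a, b)
    return name, p, p < alpha
```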

6. Results

This section analyses the gamification data collected during the experiment. First, we report on how the individual player profiles changed in the DynamicAG experimental group. Recall that the player profile did not change in the StaticAG control group. We analysed the evolution of the player type, computing the difference between the player profile at the beginning and at the end of the course. We observed that the player profile changed in 40% of the participants (18 out of the total of 44). In fact, these differences were very small: all were greater than the maximum double-precision error, 10^-16, with the largest being 8.27 × 10^-5, but they were relevant enough to change the GE Probabilities associated with each Game Element (see Step 2 in Figure 1). Consequently, these changes had an impact on the selection of Game Elements in the DynamicAG group and thus on the overall experience of the students in this group. Specifically, these dynamic changes in players’ profiles occurred in participants with a high number of interactions and a high average time spent with the gamified experience.
Figure 6 presents a comparison of the DashboardClicks values between the StaticAG (blue bars) and DynamicAG (orange bars) groups. On the left, the histogram shows that the majority of users in the StaticAG group interacted little with the Dashboard (all values are between 0 and 10 clicks), while users in DynamicAG were more active in using the dashboard. Indeed, the comparison of the means (right side of the figure) confirms that the difference is significant (Brunner–Munzel test, p-value = 0.040 < 0.05; Cohen’s d = 0.5821). In relation to the dispersion in DashboardClicks, we analysed the data thoroughly and found that it corresponded to students who played games regularly, suggesting that this element may have been familiar to them and that they were therefore more inclined to visit the dashboard.
Figure 7 presents a between-group comparison of GEDisplays. The figure shows the same trend as the previous one. Note that the display of a GE requires the user to claim a reward by selecting the “I want an award” button (see the left side of Figure 5). Even though this button was equally available to both groups, the DynamicAG group was more engaged, and participants in this group demanded awards more frequently. In fact, a comparison of the means (right side of the figure) shows a significant difference (Brunner–Munzel, p-value = 0.031749 < 0.05; Cohen’s d = 0.7175, between a medium and large effect size).
Game elements are not only aesthetically appealing but also interactive, since they allow students to get rewards, e.g., the lottery, the gift, or sending a tip to workmates. Figure 8 presents the average number of users’ interactions with game elements during the whole experience (GEClicks). The distribution of orange bars along the entire x axis indicates that students in the experimental condition (DynamicAG) interacted more with game elements than those in the control condition (Brunner–Munzel, p-value = 0.011 < 0.05; Cohen’s d = 1.0001, a large effect size). However, this metric yields low values in both conditions, which can be explained by the discoverability of the designed elements [49]. That is, the students did not notice that some elements were interactive.
Another measure of students’ engagement with gamification is the average time that they spent interacting with game elements (InteractionTime). Figure 9 shows a certain parallel with the previous figure (GEClicks), especially in the low values for minutes. In fact, the two measures follow a similar pattern (their distributions do not differ significantly; Mann–Whitney U, p-value = 0.27 > 0.05), because the more interactions students had with the game elements, the more time they spent on them. Nevertheless, measuring time is a non-trivial task, as students may leave, for instance, the dashboard open while they are doing other things (such as getting up from their chair or chatting with other classmates). Note that in this case we obtained some outliers (i.e., values higher than 70 min), as the right part of the figure reveals.
We also gathered the scores that students gave to the game elements that appeared throughout the course (see Evaluations in the middle of Figure 5). To do so, we computed the mean score of each mechanic (normalised between 0 and 1) as the quotient between the sum of the scores s_i and the number of times n the game element was shown:
MeanScore = (∑_{i=1}^{n} s_i) / n
If a game element was shown more than once to a particular user, we added the score for each occurrence of the game element.
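As an illustration of this aggregation, the following sketch computes the mean rating per game element type from a list of (type, score) pairs; this log format is a hypothetical assumption, not the platform’s actual data schema.

```python
from collections import defaultdict

def mean_scores_by_ge(evaluations):
    """Mean star rating (already normalised to [0, 1]) per game element
    type; every rated display contributes one score to its element."""
    totals = defaultdict(lambda: [0.0, 0])
    for ge_type, score in evaluations:
        totals[ge_type][0] += score
        totals[ge_type][1] += 1
    return {ge: s / n for ge, (s, n) in totals.items()}

# Example with hypothetical ratings:
print(mean_scores_by_ge([("badge", 0.4), ("badge", 0.6), ("lottery", 0.8)]))
# -> {'badge': 0.5, 'lottery': 0.8}
```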
In general, DynamicAG scores equalled or slightly exceeded those obtained in StaticAG. Participants in both groups scored the mechanics almost equally (DynamicAG = 0.456, StaticAG = 0.483); indeed, this difference is non-significant (Mann–Whitney U, p-value = 0.699). To further analyse the Evaluations of Game Elements, Figure 10 shows a histogram of scores detailed by game element. The majority of students in both conditions gave a medium score to GEs, but scores higher than 0.65 only occurred in the dynamic adaptive gamification condition.
As can be seen in the figure, several game elements have no score in StaticAG (badge, gift, lottery, development tool, knowledge share, and unlockable). The reason for this is that the fixed game elements corresponding to the initial player profile limited the number of mechanics shown to the students, and, as a consequence, the students were not able to score them. This was not the case for the DynamicAG group, where all 14 game elements were shown and scored by participants.
Specifically, the game elements with both changeable properties and appearances shown in Table 1, such as Easter eggs and challenges, scored well. The low score of the badge element may be due to its visual design: the badge variants were similar, and the students were probably not able to perceive the differences, believing that they were receiving the same badge all the time. Nevertheless, this is a supposition that should be verified. Moreover, the game elements with changeable properties but a fixed visual appearance (see Table 2), such as unlockables, lotteries, development tools, and gifts, obtained low scores. We think that, again, the discoverability of the elements may be the reason behind these results. If the users had been aware of how they functioned, their opinion might have been more positive. Finally, game elements targeting socializers (social status, social network) and the leaderboard were rated positively by participants.

7. Discussion

Overall, our results show that approaches to gamification that take player profiles into account have a positive impact on the user experience. Indeed, these results are in line with previous research [15]. However, we went further, updating the player profile during the experience. Our findings show that the dynamic approach fits students’ preferences better than a static player profile, as evidenced by the differences found in the metrics of engagement with gamification.
Moreover, we used weighted random probabilities to select, from the entire set of GEs, the one to show to the student in both the static and dynamic approaches (see Selection 2 in Figure 1), making our static approach more “dynamic” than a purely static approach that would have selected the game element with the highest score (see Selection 1 in Figure 1). Notice that our study could not evaluate the behaviour of non-player profiles because there were no participants who chose not to “play”, as expected with 15–16-year-old participants. Even so, the results of the DynamicAG group surpassed those of the StaticAG group.
Throughout the development of this project, we learned some lessons worth sharing regarding three aspects: (i) the setup of the DynamicAG approach, (ii) the instructional design of the course, and (iii) the design of the gamification elements.
First, in relation to the DynamicAG approach, certain parameters control how the Player Profile changes depending on the interactions and opinions of the students regarding game elements. To avoid large fluctuations in the Player Profile values, our method uses small values for these two parameters. However, these small values produced only slight variations in the player profile. Consequently, the method for tuning these parameter values needs to be revisited.
Second, another aspect to be considered is the influence of the instructional design. We proposed this experiment in a nanoMOOC, which is a short and very focused learning itinerary. Therefore, the adaptive gamification method had to tune the Player Profile with limited data in a limited time. A possible extension is to maintain the gamification across several courses and thereby collect new data to continually improve the Player Profile model.
Third, an important factor is the design and distribution of the game elements across the span of the instructional activities. In the gamification design, we considered incorporating game elements after the most arduous activities. However, the gamification elements also require feedback on student performance in those activities in order to adapt their visual aspect or to change their properties, such as rewards. This implies that activities that cannot be tracked by the learning platform, such as reading a PDF, cannot be rewarded with a game element. Moreover, both adaptive gamification approaches automatically select the game elements to display, but the DynamicAG performed better: first, it distributed a wider range of GEs than the StaticAG; second, its GEs were generally scored higher by students. Future studies may analyse the impact that changes in the appearance and properties of GEs have on the whole experience.
Additionally, the architecture of the Open edX platform has some limitations. For instance, when students finish an activity, they must select the I want an award button to display the Game Element, which means that users may overlook such game elements.
Another challenge to face in the context of online courses is how to measure the “real”/“effective” time spent by students on activities: tracking the presence of the student in the course implies detecting the active window, capturing the student’s attention in this active window, and capturing the student’s face, among other techniques. Thus, we recommend a careful choice of learning platform, since technological limitations can complicate the seamless integration of gamification strategies.

8. Conclusions and Future Work

This paper presents the design and evaluation of a new strategy for adapting gamification in MOOCs. We also propose some guidelines for designing and integrating adaptive gamification into the itinerary of a course. Notice that these guidelines are general enough to be used in courses from different areas of knowledge. The dynamic adaptive gamification strategy was evaluated in a real scenario: a course on the topic of recycling plastics from the sea, offered through the nanoMOOCs online learning platform. A total of 63 students from five high schools were distributed between the two versions of the course, with Static and Dynamic Adaptive Gamification. Our preliminary results highlight the benefits of the Dynamic Adaptive Gamification approach, showing a positive impact on the user experience, with greater engagement with gamification. In fact, the DynamicAG group interacted with the Dashboard twice as much as the StaticAG group. Additionally, students in the DynamicAG group were more engaged with game elements, with a mean number of interactions of 12.13 compared to 3.21 in the StaticAG group.
As future work, we propose to aggregate user data gathered across several courses in the learning platform, so that our method can recompute the player profiles more precisely. Learning platforms do not facilitate gathering information about performance and students’ feelings in some activities, such as reading a PDF. Thus, capturing emotions could be considered for distributing GEs in the learning itinerary, and also for enhancing the rewarding logic. Another open line of research relates to the parameters used to predict changes in a player profile. Similarly to other studies that performed opinion extraction in governance [50] and business [51] contexts, we could gather and analyse student opinions in natural language, instead of using a five-star evaluation.

Author Contributions

Conceptualization, A.P., I.R., Á.R. and I.G.; methodology, A.P., I.R., Á.R. and I.G.; software, Á.R.; validation, A.P., I.R., Á.R. and I.G.; formal analysis, A.P., I.R., Á.R. and I.G.; investigation, A.P., I.R., Á.R. and I.G.; writing—original draft preparation, A.P. and I.R.; writing—review and editing, A.P. and I.R.; supervision, A.P. and I.R.; project administration, A.P. and I.R.; funding acquisition, A.P. and I.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by ACCIO project number COMRDI18-1-0010, PID2021-124361OB-C33 (FairTrans project) and PID2019-104156GB-I00 (CI-SUSTAIN project).

Institutional Review Board Statement

The study was approved by the Institutional Bioethics Board of University of Barcelona (Date of approval 21 December 2021) for studies involving humans.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study. Written informed consent was obtained from the student(s) to publish this paper.

Data Availability Statement

The data are not available due to privacy restrictions.

Acknowledgments

Thanks to Raquel Mayordomo from the Edebé board for her support in the organization of the experiments in Viladecans. Thanks also to the Viladecans Council and the schools “Escola Goar”, “Escola Sagrada Familia”, “INS Torres Rojas”, “Escola J. Mestres i Busquets”, and “Escola Teide”, their directors, teachers, and all the participants in this study.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
MOOCs: Massive Open Online Courses
nanoMOOCs: nano Massive Open Online Courses
GE: Game Elements
StaticAG: Static Adaptive Gamification
DynamicAG: Dynamic Adaptive Gamification
DashboardClicks: Number of user interactions with the Dashboard
GEDisplays: Number of times that any Game Element is shown to a student
GEClicks: Number of user interactions with Game Elements
InteractionTime: Time that a student spent with Game Elements
Evaluations: Mean score of the user opinions about Game Elements
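These behavioural metrics can be derived by aggregating the platform's raw event log per student. The short sketch below illustrates the idea; the event schema is hypothetical, not the actual nanoMOOCs log format.

```python
from collections import defaultdict

# Hypothetical raw event log: one (student_id, metric_name) pair per event.
events = [
    ("s1", "DashboardClicks"), ("s1", "GEDisplays"), ("s1", "GEClicks"),
    ("s2", "GEDisplays"), ("s2", "GEDisplays"), ("s2", "GEClicks"),
]

# Per-student tally of each behavioural metric.
metrics = defaultdict(lambda: defaultdict(int))
for student, metric in events:
    metrics[student][metric] += 1

for student, tally in sorted(metrics.items()):
    print(student, dict(tally))
# s1 {'DashboardClicks': 1, 'GEDisplays': 1, 'GEClicks': 1}
# s2 {'GEDisplays': 2, 'GEClicks': 1}
```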

References

1. Conole, G.G. MOOCs as disruptive technologies: Strategies for enhancing the learner experience and quality of MOOCs. Rev. Educ. Distancia (RED) 2013, 39.
2. Ferguson, T.; Roofe, C.G. SDG 4 in higher education: Challenges and opportunities. Int. J. Sustain. High. Educ. 2020, 21, 959–975.
3. Lambert, S.R. Do MOOCs contribute to student equity and social inclusion? A systematic review 2014–18. Comput. Educ. 2020, 145, 103693.
4. Dede, C.; Whitehouse, P.; Brown-L'Bahy, T. Designing and studying learning experiences that use multiple interactive media to bridge distance and time. Curr. Perspect. Appl. Inf. Technol. 2002, 1, 1–30.
5. Van Gils, F. Potential applications of digital storytelling in education. In Proceedings of the 3rd Twente Student Conference on IT, University of Twente, Department of Electrical Engineering, Mathematics and Computer Science, Twente, The Netherlands, 17–18 February 2005; Volume 7.
6. Tarkhova, L.; Tarkhov, S.; Nafikov, M.; Akhmetyanov, I.; Gusev, D.; Akhmarov, R. Infographics and their application in the educational process. Int. J. Emerg. Technol. Learn. (IJET) 2020, 15, 63–80.
7. Kim, D.; Lee, Y.; Leite, W.L.; Huggins-Manley, A.C. Exploring student and teacher usage patterns associated with student attrition in an open educational resource-supported online learning platform. Comput. Educ. 2020, 156, 103961.
8. Tao, D.; Fu, P.; Wang, Y.; Zhang, T.; Qu, X. Key characteristics in designing massive open online courses (MOOCs) for user acceptance: An application of the extended technology acceptance model. Interact. Learn. Environ. 2022, 30, 882–895.
9. Khalil, M.; Wong, J.; de Koning, B.; Ebner, M.; Paas, F. Gamification in MOOCs: A review of the state of the art. In Proceedings of the 2018 IEEE Global Engineering Education Conference (EDUCON), Santa Cruz de Tenerife, Spain, 17–20 April 2018; pp. 1629–1638.
10. Bezerra, L.N.; Silva, M.T. A review of literature on the reasons that cause the high dropout rates in the MOOCs. Rev. Espac. 2017, 38, 11.
11. Deterding, S.; Khaled, R.; Nacke, L.E.; Dixon, D. Gamification: Toward a definition. In Proceedings of the CHI 2011 Gamification Workshop, Vancouver, BC, Canada, 7–12 May 2011; Volume 12, pp. 1–79.
12. Jarnac de Freitas, M.; Mira da Silva, M. Systematic literature review about gamification in MOOCs. Open Learn. J. Open Distance Learn. 2020, 38, 1–23.
13. Morschheuser, B.; Hamari, J.; Werder, K.; Abe, J. How to Gamify? A Method for Designing Gamification. In Proceedings of the Hawaii International Conference on System Sciences (HICSS), Waikoloa Village, HI, USA, 4–7 January 2017; Volume 50.
14. Hallifax, S.; Serna, A.; Marty, J.C.; Lavoué, É. Adaptive gamification in education: A literature review of current trends and developments. In Proceedings of the European Conference on Technology Enhanced Learning, Delft, The Netherlands, 16–19 September 2019; pp. 294–307.
15. Lavoué, E.; Monterrat, B.; Desmarais, M.; George, S. Adaptive Gamification for Learning Environments. IEEE Trans. Learn. Technol. 2019, 12, 16–28.
16. Rodríguez, I.; Puig, A.; Rodriguez, A. We Are Not the Same Either Playing: A Proposal for Adaptive Gamification. In Proceedings of the CCIA 23rd International Conference of the Catalan Association for Artificial Intelligence, Lleida, Spain, 20–22 October 2021; pp. 185–194.
17. Rodríguez, I.; Puig, A.; Rodríguez, À. Towards Adaptive Gamification: A Method Using Dynamic Player Profile and a Case Study. Appl. Sci. 2022, 12, 486.
18. Barata, G.; Gama, S.; Jorge, J.; Gonçalves, D. Studying student differentiation in gamified education: A long-term study. Comput. Hum. Behav. 2017, 71, 550–585.
19. Lister, M. Gamification: The effect on student motivation and performance at the post-secondary level. Issues Trends Learn. Technol. 2015, 3.
20. Qiao, S.; Yeung, S.S.S.; Zainuddin, Z.; Ng, D.T.K.; Chu, S.K.W. Examining the effects of mixed and non-digital gamification on students' learning performance, cognitive engagement and course satisfaction. Br. J. Educ. Technol. 2022.
21. Taşkın, N.; Kılıç Çakmak, E. Effects of gamification on behavioral and cognitive engagement of students in the online learning environment. Int. J. Hum.–Comput. Interact. 2022, 1–12.
22. Knutas, A.; Ikonen, J.; Nikula, U.; Porras, J. Increasing collaborative communications in a programming course with gamification: A case study. In Proceedings of the 15th International Conference on Computer Systems and Technologies, Ruse, Bulgaria, 27–28 June 2014; pp. 370–377.
23. Antonaci, A.; Klemke, R.; Specht, M. The effects of gamification in online learning environments: A systematic literature review. Informatics 2019, 6, 32.
24. Yang, Q.; Lee, Y.C. The critical factors of student performance in MOOCs for sustainable education: A case of Chinese universities. Sustainability 2021, 13, 8089.
25. Gurjanow, I.; Oliveira, M.; Zender, J.; Santos, P.A.; Ludwig, M. Mathematics trails: Shallow and deep gamification. Int. J. Serious Games 2019, 6, 65–79.
26. Yamani, H.A. A Conceptual Framework for Integrating Gamification in eLearning Systems Based on Instructional Design Model. Int. J. Emerg. Technol. Learn. 2021, 16, 14–33.
27. Kocadere, S.A.; Çağlar, Ş. Gamification from player type perspective: A case study. J. Educ. Technol. Soc. 2018, 21, 12–22.
28. Bennani, S.; Maalel, A.; Ben Ghezala, H. Adaptive gamification in E-learning: A literature review and future challenges. Comput. Appl. Eng. Educ. 2022, 30, 628–642.
29. Oliveira, W.; Hamari, J.; Shi, L.; Toda, A.M.; Rodrigues, L.; Palomino, P.T.; Isotani, S. Tailored gamification in education: A literature review and future agenda. Educ. Inf. Technol. 2022, 1–34.
30. Bartle, R. Hearts, clubs, diamonds, spades: Players who suit MUDs. J. MUD Res. 1996, 1, 19.
31. Nacke, L.E.; Bateman, C.; Mandryk, R.L. BrainHex: A neurobiological gamer typology survey. Entertain. Comput. 2014, 5, 55–62.
32. Marczewski, A. Even Ninja Monkeys Like to Play: Gamification, Game Thinking and Motivational Design; CreateSpace Independent Publishing Platform: Scotts Valley, CA, USA, 2015.
33. Klock, A.C.T.; Gasparini, I.; Pimenta, M.S.; Hamari, J. Tailored gamification: A review of literature. Int. J. Hum. Comput. Stud. 2020, 144, 102495.
34. Fadhil, A.; Villafiorita, A. An adaptive learning with gamification & conversational UIs: The rise of CiboPoliBot. In Proceedings of the Adjunct Publication of the 25th Conference on User Modeling, Adaptation and Personalization, Bratislava, Slovakia, 9–12 July 2017; pp. 408–412.
35. Torio, J.; Bigueras, R.; Maligat, D., Jr.; Arispe, M.; Cruz, J.D. An Adaptive Gamification Learning Approach on Digital Logic Gates: LogIO. Adv. Asp. Eng. Res. 2021, 12, 54–62.
36. Zichermann, G.; Cunningham, C. Gamification by Design: Implementing Game Mechanics in Web and Mobile Apps; O'Reilly Media, Inc.: Sebastopol, CA, USA, 2011.
37. Hassan, M.A.; Habiba, U.; Majeed, F.; Shoaib, M. Adaptive gamification in e-learning based on students' learning styles. Interact. Learn. Environ. 2021, 29, 545–565.
38. El Gammal, W.; Sherief, N.; Abdelmoez, W. User-based Adaptive Software Development for Gamified Systems. In Proceedings of the 2020 3rd International Conference on Geoinformatics and Data Analysis, Marseille, France, 17–19 April 2020; pp. 115–122.
39. Sabourin, J.; Lester, J. Affect and Engagement in Game-Based Learning Environments. IEEE Trans. Affect. Comput. 2014, 5, 45–56.
40. Lopez, C.; Tucker, C. Towards Personalized Adaptive Gamification. IEEE Trans. Games 2020, 12, 155–168.
41. Assiroj, P.; Warnars, H.L.H.S.; Heryadi, Y.; Trisetyarso, A.; Suparta, W.; Abbas, B.S. Adaptive Game Design using Case-based Reasoning Method for High Performance Computing Learning. In Proceedings of the Indonesian Association for Pattern Recognition International Conference (INAPR), Jakarta, Indonesia, 7–8 September 2018; pp. 177–181.
42. Alam, T.M.; Mushtaq, M.; Shaukat, K.; Hameed, I.A.; Umer Sarwar, M.; Luo, S. A Novel Method for Performance Measurement of Public Educational Institutions Using Machine Learning Models. Appl. Sci. 2021, 11, 9296.
43. Hallifax, S. Adaptive Gamification of Digital Learning Environments. Ph.D. Thesis, Université Jean Moulin Lyon 3, Lyon, France, 2020.
44. Tondello, G.F.; Wehbe, R.; Diamond, L.; Busch, M.; Marczewski, A.; Nacke, L.E. The Gamification User Types Hexad Scale. In Proceedings of CHI PLAY 2016, Austin, TX, USA, 16–19 October 2016; pp. 229–243.
45. Hallifax, S.; Serna, A.; Marty, J.; Lavoué, G.; Lavoué, E. Factors to consider for tailored gamification. In Proceedings of CHI PLAY 2019, Barcelona, Spain, 22–25 October 2019; pp. 559–572.
46. Charles, D.; Black, M. Dynamic Player Modelling: A Framework for Player-centred Digital Games. In Proceedings of the 5th International Conference on Computer Games: Artificial Intelligence, Design and Education (CGAIDE'04), Wolverhampton, UK, 8–10 November 2004; pp. 29–35.
47. Martín-Albo, J.; Núñez, J.L.; Navarro, J.G. Validation of the Spanish version of the Situational Motivation Scale (EMSI) in the educational context. Span. J. Psychol. 2009, 12, 799–807.
48. Virtanen, P.; Gommers, R.; Oliphant, T.E.; Haberland, M.; Reddy, T.; Cournapeau, D.; Burovski, E.; Peterson, P.; Weckesser, W.; Bright, J.; et al. SciPy 1.0: Fundamental Algorithms for Scientific Computing in Python. Nat. Methods 2020, 17, 261–272.
49. Mackamul, E. Improving the Discoverability of Interactions in Interactive Systems. In Proceedings of the CHI Conference on Human Factors in Computing Systems Extended Abstracts, New Orleans, LA, USA, 29 April–5 May 2022; pp. 1–5.
50. Shaukat Dar, K.; Mahboob Alam, T.; Ahmed, M.; Luo, S.; Hameed, I.; Iqbal, M.; Li, J.; Iqbal, M. A Model to Enhance Governance Issues through Opinion Extraction. In Proceedings of the 2020 11th IEEE Annual Information Technology, Electronics and Mobile Communication Conference (IEMCON), Vancouver, BC, Canada, 4–7 November 2020.
51. Păvăloaia, V.D.; Teodor, E.M.; Fotache, D.; Danileţ, M. Opinion mining on social media data: Sentiment analysis of user preferences. Sustainability 2019, 11, 4459.
Figure 1. Overview of the StaticAG method, which involves three steps: Step 1 obtains the scorings of the game elements depending on an individual Player Profile vector and the Matching Matrix; in Step 2 we normalise these scorings; and finally, in Step 3, we select the Game Element. We also display two Selection methods: Selection 1, Lavoué-based [15], obtains the top-scored Game Element, while Selection 2 obtains a weighted random Game Element according to the probabilities computed in Step 3.
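Read as an algorithm, the three steps in Figure 1 reduce to a matrix-vector product, a normalisation, and a selection. The following minimal sketch illustrates this; the 3 x 4 Matching Matrix, the profile vector, and the element names are invented placeholders, not the study's actual values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented example: 3 game elements x 4 player-profile dimensions.
matching_matrix = np.array([
    [0.8, 0.1, 0.3, 0.2],   # e.g., Badges
    [0.2, 0.7, 0.4, 0.1],   # e.g., Leaderboards
    [0.1, 0.2, 0.6, 0.9],   # e.g., Easter Eggs
])
player_profile = np.array([0.5, 0.2, 0.2, 0.1])

# Step 1: score each game element for this player.
scores = matching_matrix @ player_profile
# Step 2: normalise the scores into probabilities.
probs = scores / scores.sum()

# Step 3, Selection 1 (Lavoué-based): always take the top-scored element.
top_element = int(np.argmax(scores))
# Step 3, Selection 2: weighted random draw using the probabilities.
random_element = int(rng.choice(len(probs), p=probs))
print(top_element, random_element, np.round(probs, 2))
```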
Figure 2. Gamified MOOC itinerary: students’ statuses are depicted from 0 to 1 relying on the Learning Activities (Newbie, Medium, and Pro avatars are shown in different parts of the itinerary). Game Elements (GE) between activities are displayed using small red circles. We developed a new XBlock in the edX platform able to add a Game Element to the MOOC itinerary; see video Adding a Game element Xblock in edX, accessed on 12 January 2023.
Figure 3. Gamified MOOC structure: on the left, the learning itinerary, on the right the embedded gamification in the MOOC (the dashboard and the game elements). See videos. See more details in: the initial Player Type Questionnaire, the Dashboard, and the Gamification Element Widgets, accessed on 12 January 2023.
Figure 4. Test of the initial motivation of both experimental groups: DynamicAG and StaticAG. Measures: IM—Intrinsic Motivation, A—Amotivation, G—Gamer.
Figure 5. Top: GEDisplays are collected using clicks on the button I want an award. Bottom left: GEClicks are monitored using clicks on the GE, while Evaluations refer to the scorings of each displayed game element using stars. Bottom right: DashboardClicks refer to those clicks made in the dashboard window.
Figure 6. Comparison of DashboardClicks. Left side: histogram. Right side: the mean of the number of clicks and the variance of all students in each group. "# users" means number of users.
Figure 7. Comparison of GEDisplays. Left side: histogram. Right side: the mean of the number of game elements shown and the variance of all students in each group. "# users" means number of users.
Figure 8. Comparison of GEClicks. Left side: histogram. Right side: the mean of the number of interactions with elements and the variance of all students in each group. "# users" means number of users.
Figure 9. Comparison of InteractionTime. Left side: histogram. Right side: the average time students interacted with gamification elements and the variance of all students in each group. "# users" means number of users.
Figure 10. Evaluations, i.e., mean scoring of GEs grouped by GE type. The left side includes changeable GEs (in both properties and appearance). In the middle, GEs that only change their properties according to the learning progress. On the right side, GEs with fixed appearances and properties.
Table 1. Details of the Game Element properties and appearances that change as a function of the global score values. These game elements change both their properties and their appearance depending on the global scores.

Game Elements | Changeable Properties | Changeable Visual Appearance
Challenges | Difficulty, reward value | [image: Applsci 13 01535 i001]
Points | Reward value | [image: Applsci 13 01535 i002]
Badges | Badge types | [image: Applsci 13 01535 i003]
Easter Eggs | Reward value | [image: Applsci 13 01535 i004]
Table 2. Details of the Game Element properties that change as a function of the global score values. These game elements keep the same appearance during the whole itinerary, and the global score changes only the values of their properties.

Game Elements | Changeable Properties | Fixed Visual Appearance
Development Tools | Number of elements allowed to be modified | [image: Applsci 13 01535 i005]
Lotteries | Reward value | [image: Applsci 13 01535 i006]
Gift Openers | Number of openable gifts | [image: Applsci 13 01535 i007]
Unlockables | Reward value | [image: Applsci 13 01535 i008]
Gifts | Reward value allowed to send | [image: Applsci 13 01535 i009]
Social Network | Number of recommended friends | [image: Applsci 13 01535 i010]
Table 3. Game Elements that show the student's status in the learning process and in the gamification.

Leaderboards [image: Applsci 13 01535 i011] | Progression Levels [image: Applsci 13 01535 i012] | Knowledge Sharing [image: Applsci 13 01535 i013] | Social Status [image: Applsci 13 01535 i014]