Abstract
In the transition from secondary to higher education, students are expected to develop a set of learning skills. This paper reports on a dashboard designed and implemented to support this development, thereby bridging the gap between Learning Analytics research and the daily practice of supporting students. To demonstrate the scalability and usefulness of the dashboard, this paper reports on an intervention with 1406 first-year students in 12 different study programs. The results show that the dashboard is perceived as clear and useful. While students who did not access the dashboard have lower learning skill scores on average, the students with lower scores who did access the dashboard made more use of the extra remediation possibilities it offers.
1 Introduction
This paper discusses the implementation and evaluation of a Learning Analytics (LA) intervention, which resulted in a student dashboard that provided 1406 first-year STEM (Science, Technology, Engineering, and Mathematics) students at the University of Leuven with actionable feedback on their learning skills. The study is an attempt to bridge the gap between LA research and the daily practice of supporting students in their first year of higher education. LA research is rigorous and open to scientific scrutiny, but studies are often limited to smaller groups of students or to course settings favorable to the experiment. Massive Open Online Courses (MOOCs) and other born-digital forms of e-learning are attractive producers of data. Applying LA to the traditional context of higher education is challenging, as many interactions and learning activities are face-to-face, offline, and difficult to capture digitally. Therefore, the search for available digital traces to support personalized, actionable feedback to larger groups of students is of particular interest. The digital traces used in the dashboard discussed in this paper build on the work of educational scientists related to the assessment and development of learning skills. More specifically, the experiment uses data gathered from the Learning and Study Strategies Inventory (LASSI), a questionnaire measuring learning skills [16].
2 Related Work
LA aims to improve learning and learning environments by collecting and analyzing the traces produced by learners [13]. Data from a wide variety of sources can be studied using statistical and data mining techniques. To deliver the resulting insights back to the stakeholders, the data can be visualized using learning dashboards. First overviews of such LA dashboards (LADs) have been presented in [14, 15]. Examples include Course Signals [1], SAM [4], and StepUp! [11]. Course Signals is a prominent example that predicts and visualizes learning outcomes based on three data sources: grades in the course so far, time on task, and past performance. If grades in the course so far are below a certain threshold specified by the teacher, a student will see a red color signal for the course. When they are above the threshold, past performance in other courses and time on task are used to calculate whether a student is on track (green light) or whether she may need to improve her activities for the course (orange light). As discussed in [15], research has shown that such a dashboard has an impact on student retention and drop-out, although this particular example is one of the few dashboards that have been thoroughly evaluated for their impact on learning effectiveness.
A first systematic review of LADs was conducted by Schwendimann et al. [12]. The authors analyzed 53 articles. Their findings include that dashboards are mostly developed for instructors and that the impact on learning is rarely assessed. A recent systematic review of LADs has been presented by [2]. The authors reviewed 94 papers that use LA to support students. A key outcome of this analysis is that “None of the studies included in the student use category broke down student use by demographic, learner characteristics, or student achievement levels. In order to better personalize recommendations and dashboards to students, we need to put more emphasis on understanding student use of these systems”. In addition, while most information in LADs is typically represented in a visual way, it has been noted [10] that complementary textual information can provide additional guidance. Leitner et al. [7] examined the state of the art of LA by analyzing 101 articles. One of the limitations they identify in current research is the small group size of most studies and the question whether existing work has the potential to scale beyond small groups to a wider context.
3 Situation of This Study
The aim of this study is to learn about the use of a dashboard in a realistic context to provide first-year students in higher education with actionable insight into their learning skills and possible actions to improve these skills. This aim is translated into the following objectives.
(1) To demonstrate and test the feasibility of a scalable approach to learning analytics, targeting a sizable group of students in STEM study programs.

(2) To construct a dashboard based on information that is readily available within the institution, but not yet shared with students.

(3) To collect user metrics and feedback to assess perceived usefulness and usability and to uncover areas for further research.
We apply the six critical dimensions of the LA framework of Greller and Drachsler [5] to offer a shareable description of our intervention.
The stakeholders, both data clients and data subjects, are first-year students. The intervention uses two populations. Firstly, all students in the first year of a particular bachelor program, such that a student’s learning skills can be compared with those of peer students. Secondly, first-year students of previous academic years, such that the relation between a student’s learning skills and first-year academic achievement (percentage of obtained study points) can be shown.
The objective of the dashboard is to unveil information on learning skills to students. The data was gathered for a study considering the relation between learning skills and study success, but was not offered to the data subjects, the students themselves. The dashboard combines reflection and prediction [5]. Concerning reflection, students receive feedback on their learning skills and a comparison with peer students (Fig. 1). Concerning prediction, the dashboard uses a “mild” form: to make students reflect on the importance of learning skills in their learning process, the study efficiency of the previous year’s students is shown in relation to their learning skills (Fig. 1).
This intervention takes advantage of linking the learning skill data, gathered using paper-and-pencil questionnaires, with data from the university’s data warehouse regarding the study points obtained by each student. Regarding instruments, the intervention does not rely on advanced technology and rather provides a visualization of the underlying raw data to students.
Now, we discuss internal and external limitations. Regarding conventions, both privacy and ethics are important. Students were asked to consent to the use of the learning skill questionnaire data for research, feedback, and the connection to their study results. The ethical soundness of the intervention was supported by the inclusion of study counselors and advisors in the development of the dashboard. Regarding the time scale, the intervention is just-in-time: students received feedback in the middle of the first semester. Regarding the ability of first-year students to interpret LA data, the dashboard uses simple visualizations complemented with textual explanations.
4 Learning Skills Dashboard
4.1 Learning Skills
In the transition from secondary to higher education, students are expected to develop a set of learning skills that will help them in their learning path, as well as in their future professional career. Higher education institutions provide information and activities to support students in improving their learning skills, e.g. through coaching, counseling, or training sessions. To direct these efforts and to measure their effectiveness, institutions need to assess the level of learning skills of their students. The Learning and Study Strategies Inventory (LASSI) is a diagnostic instrument that can be used to measure a student’s level of learning skills [6]. Based on a 60-item questionnaire (third edition), a LASSI test reveals strengths and weaknesses of an individual and relates these to the scores of other students. Being both a diagnostic and prescriptive instrument, LASSI does not focus on student characteristics that are invariable or difficult to change, such as gender, socioeconomic status, or ethnic background. Rather, it delivers indicators for areas that offer a perspective for growth and remediation. Students’ learning and study strategies are summarized in ten scales, each targeting a specific skill shown to be relevant to study success [16]. According to its publisher, the test is currently being used in over 3000 institutions worldwide [6].
This paper reports on an interactive dashboard that provides students with individualized feedback on their learning skills as measured through LASSI. The feedback targets five scales that were earlier shown to be most predictive of study success for our specific target group of STEM students: performance anxiety, concentration, motivation, the use of test strategies, and time management [9].
4.2 Dashboard Design
Figure 1 provides a screenshot of the student view of the dashboard. The main components of the dashboard are introduced below.
The dashboard is divided into six tabs. On access, the first tab shows an introductory text, explaining the purpose and components of the intervention and the origin of the data it is based on. The other five tabs, alphabetically ordered, provide a separate space for each of the five learning skills.
The dashboard contains a visualization of the data underlying the intervention that allows students to compare their learning skills with those of peers in the same program. A simple unit chart (Fig. 1) uses dots to represent the number of students within the respective norm scales for each of the five included learning skills. Each dot represents a single student and is attributed to one of five norm scales, ranging from very weak, over weak, average, and good, to very good. The norm group that applies to the active student is marked with a blue border and background hatching. For each skill, a second unit chart (Fig. 1) relates the skill level of the previous year’s students of the study program to their study success obtained in the first year. Again, each dot represents one student. In addition, the color of the dots represents the study success of these students using three categories depending on the percentage of obtained study points (orange: < 30%; yellow: \(\ge \) 30% and < 80%; green: \(\ge \) 80%). The colors are adapted for students with color vision deficiency [8]. To support the interpretation of the graphs, textual explanations are provided. To ease the interpretation of the unit charts, dots are grouped in clusters of 25 and the number of students in each norm group is mentioned explicitly. As the reception of the colored dots to clarify the relation between the norm scale groups of last year’s students and their study success was mixed among the domain experts, an alternative bar chart representation was added, which appears when students touch (mobile) or hover over (desktop) the dot representation (Fig. 2).
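As an illustration (not the actual dashboard code, which is not included here), the following Python sketch draws a unit chart of this kind for hypothetical norm-group counts, using color-blind-friendly colors in the spirit of Okabe and Ito [8] for the three study-success categories; all counts and layout parameters are assumptions.

```python
import matplotlib.pyplot as plt

# Hypothetical counts of last year's students per norm scale and study-success
# category (low < 30%, middle 30-80%, high >= 80% of obtained study points).
norm_scales = ["very weak", "weak", "average", "good", "very good"]
counts = {
    "very weak": (12, 6, 2),
    "weak":      (10, 14, 6),
    "average":   (8, 25, 17),
    "good":      (3, 18, 29),
    "very good": (1, 9, 24),
}
colors = ["#E69F00", "#F0E442", "#009E73"]  # orange, yellow, green (color-blind safe)

fig, ax = plt.subplots(figsize=(8, 4))
per_row = 5  # number of dot columns within each norm-scale group
for i, scale in enumerate(norm_scales):
    dot_index = 0
    for category, n in enumerate(counts[scale]):
        for _ in range(n):
            x = i + (dot_index % per_row) * 0.15  # horizontal position within the group
            y = dot_index // per_row              # stack dots upwards
            ax.scatter(x, y, s=40, color=colors[category])
            dot_index += 1

ax.set_xticks([i + 0.3 for i in range(len(norm_scales))])
ax.set_xticklabels(norm_scales)
ax.set_yticks([])
ax.set_xlabel("norm scale")
ax.set_title("Learning skill vs. first-year study success (one dot = one student)")
plt.tight_layout()
plt.show()
```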
For each of the five skills, the dashboard provides detailed textual guidance for remediation. The advice includes simple tips, signposts to more extensive information and existing improvement activities provided by the institution, and an invitation to make a personal appointment with a student adviser. To avoid cluttering the initial message in the dashboard, this actionable improvement guidance is not shown at first sight. Rather, at the bottom of each learning skill tab, a button labeled “Okay, what now?” can be clicked to make the extra content visible (Fig. 1).
All textual content, including the introduction, learning skill information, and improvement tips on each tab, is adapted to the study program and to the situation of the student using text parameterization, drawing on experience from the field. We invited study counselors from participating study programs to adapt messages based on their expertise. To facilitate this process, messages are chunked into parts and made editable using Markdown, a lightweight text markup language, extended with our own dashboard-specific features such as @studentName@ to insert the name of the student or @yourGroup@ to embed a part of the chart legend within the text.
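A minimal sketch of how such parameterized messages could be rendered is shown below (Python, using the third-party markdown package for the Markdown-to-HTML conversion). The placeholder names follow the paper; the message text, parameter values, and rendering function are illustrative, not the actual implementation.

```python
import re
import markdown  # third-party package for Markdown-to-HTML conversion


def render_message(template, params):
    """Replace @placeholder@ tokens and convert the Markdown result to HTML."""
    def substitute(match):
        key = match.group(1)
        return str(params.get(key, match.group(0)))  # leave unknown tokens untouched
    filled = re.sub(r"@(\w+)@", substitute, template)
    return markdown.markdown(filled)


# Illustrative message chunk, editable by a study counselor.
template = (
    "Hi @studentName@, your **concentration** score places you in the "
    "group @yourGroup@. Have a look at the tips below, or book an "
    "appointment with your student adviser."
)
html = render_message(template, {"studentName": "Alex", "yourGroup": "'average'"})
print(html)
```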
4.3 Data Sources and System Infrastructure
The central (SAP) ERP infrastructure is the system of record for all official data on students, programs, courses, and results. The LASSI test data is collected and processed separately and was made available to the project as a comma-separated values (CSV) file. Data from these two sources was consolidated and loaded into a relational database using an Extract, Load, Transform (ELT) process.
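A minimal sketch of this consolidation step is shown below (Python with pandas and SQLAlchemy). The connection string, file names, and column names are assumptions for illustration only, as the actual exports and schema are not described here.

```python
import pandas as pd
from sqlalchemy import create_engine, text

# Hypothetical connection string, file names, and column names.
engine = create_engine("postgresql://dashboard:secret@localhost/dashboarddb")

# Extract: LASSI scores from the CSV export, enrolment data from the ERP extract.
lassi = pd.read_csv("lassi_scores.csv")      # student_id, concentration, anxiety, ...
students = pd.read_csv("erp_students.csv")   # student_id, program, obtained_points_pct

# Load the raw extracts into staging tables first, then transform inside the
# database (the "ELT" order: extract, load, transform).
lassi.to_sql("stg_lassi", engine, if_exists="replace", index=False)
students.to_sql("stg_students", engine, if_exists="replace", index=False)

with engine.begin() as conn:
    conn.execute(text("""
        CREATE TABLE IF NOT EXISTS dashboard_student AS
        SELECT s.student_id, s.program, s.obtained_points_pct,
               l.concentration, l.anxiety, l.motivation,
               l.test_strategies, l.time_management
        FROM stg_students s
        JOIN stg_lassi l ON l.student_id = s.student_id
    """))
```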
The dashboard is accessible indirectly through the university’s reverse proxy infrastructure, enforcing authentication by a central single sign-on system (Shibboleth). This improves security and provides student with a familiar entry point, similar to other campus software. All data access requires invoking a database stored procedure that writes the request into an audit table. The dashboard was set up as a Single Page Application (SPA) that stands on its own once loaded. Each user action, like opening a tab or clicking a button, is transmitted back to the web server using a simple AJAX call with a dummy response. These events are stored into the web server’s access log, to be handled by an extraction routine that operates outside busy hours.
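As a sketch of what such an off-peak extraction routine could look like, the following Python script pulls dashboard event calls out of a standard web server access log. The event URL pattern, log path, and output format are assumptions, not the actual configuration.

```python
import csv
import re

# Assumed: the SPA reports events as GET requests such as
# /dashboard/event?name=openTab&tab=anxiety, logged in combined log format.
LOG_LINE = re.compile(
    r'^(?P<ip>\S+) \S+ (?P<user>\S+) \[(?P<time>[^\]]+)\] '
    r'"GET /dashboard/event\?(?P<query>[^" ]*) HTTP/[^"]*" 200'
)


def extract_events(access_log, out_csv):
    """Run outside busy hours: turn raw access-log lines into an event table."""
    with open(access_log) as log, open(out_csv, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["user", "time", "event"])
        for line in log:
            match = LOG_LINE.match(line)
            if match:
                writer.writerow([match["user"], match["time"], match["query"]])


extract_events("/var/log/apache2/access.log", "dashboard_events.csv")
```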
5 Results
5.1 Target Group and Data Collection
A total of 1406 first-year students in 12 different STEM programs received a personalized invitation by email to access the dashboard, stating that it provides actionable feedback on their learning skills based on the pen-and-paper questionnaire they had filled out earlier in class. Students who did not complete the questionnaire or did not consent to the use of their data for research were excluded. Apart from the background recording of user activity, the dashboard contains a short feedback form with three questions: (1) I find this information useful; (2) I find this information clear; (3) I would like to receive more of this type of information.
5.2 Dashboard Interaction
1135 (80.7%) of the students clicked on the link in the invitation email and entered the dashboard. The click-through rate differs between study programs and ranges from 63.5% to 89.1% (Fig. 3). Of the students who did click through, 67.7% did so within 48 h of the dashboard launch, 81.2% within 72 h, and 98.1% within 168 h (Fig. 4).
Fig. 3. Click-through per study program, expressed as the percentage of invited students. The 12 study programs are grouped as follows: Bio-Engineering; CBBGG (Chemistry, Biology, Biochemistry-Biotechnology, Geography, Geology); Engineering Science; Engineering Science: Architecture; Engineering Technology; and MIP (Mathematics, Informatics, Physics). The width of the bars is proportional to the number of students in the grouped study programs.
Most students clicked through using a desktop browser (74.6%) or a smartphone (22.9%) (Table 1). The use of tablets and other media devices was limited. Students using a non-desktop device clicked through faster, as indicated by the initial peak in their user share (Fig. 4).
5.3 Feedback
Although the effort required to answer the three survey questions was minimal, only 14.7% of accessing students provided feedback on all three questions. Students using mobile devices such as smartphones and tablets were less inclined to provide feedback than desktop users (Table 1). Most of the students who provided feedback indicated that they find the dashboard useful (71%) and clear (89%). The preference for more information of the same type is also positive, but less pronounced (55%). Figure 5 summarizes the student feedback.
5.4 Student Profile and Behavior
On average, the 80.7% of students who clicked through to the dashboard have higher learning skill scores (Fig. 6). This difference is significant for each of the learning skills, as shown by a one-sided Mann-Whitney U test (p-values 0.01921* for concentration, 0.01043* for anxiety, 0.00223** for motivation, 0.00001**** for test strategy, and 0.00104** for time management).
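This type of comparison can be reproduced with a one-sided Mann-Whitney U test as sketched below (Python with scipy and pandas); the input file, column names, and the boolean click-through flag are hypothetical stand-ins for the study data.

```python
import pandas as pd
from scipy.stats import mannwhitneyu

# Hypothetical table: one row per invited student, with LASSI scores and a
# boolean flag indicating whether the student clicked through to the dashboard.
df = pd.read_csv("dashboard_access_and_scores.csv")

skills = ["concentration", "anxiety", "motivation", "test_strategies", "time_management"]
for skill in skills:
    accessed = df.loc[df["clicked_through"], skill]
    not_accessed = df.loc[~df["clicked_through"], skill]
    # One-sided test: do students who accessed the dashboard score higher?
    stat, p = mannwhitneyu(accessed, not_accessed, alternative="greater")
    print(f"{skill}: U={stat:.0f}, p={p:.5f}")
```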
Of the 1135 students who did click through to the dashboard, 399 (35.2%) clicked on ‘Okay, what now?’ to read the improvement tips on the concentration tab, 200 (17.6%) on the anxiety tab, 172 (15.2%) on the motivation tab, 173 (15.2%) on the test strategies tab, and 229 (20.2%) on the time management tab. We compared the proportion of students viewing the tips for concentration (the first tab) with that of every other learning skill using a Kruskal-Wallis test, followed by Dunn’s multiple comparison test with Bonferroni correction, yielding p-values below \(2 \times 10^{-16}\). The tips for concentration are thus viewed significantly more often than the other tips.
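A sketch of this analysis with scipy and the scikit-posthocs package is shown below, assuming a hypothetical long-format table with a binary indicator of whether a student opened the tips on a given tab; all file and column names are illustrative.

```python
import pandas as pd
import scikit_posthocs as sp
from scipy.stats import kruskal

# Hypothetical long-format table: one row per (student, tab), with a binary
# indicator of whether the student opened the improvement tips on that tab.
views = pd.read_csv("tip_views_long.csv")  # columns: student_id, tab, viewed (0/1)

# Kruskal-Wallis test across the five tabs.
groups = [g["viewed"].values for _, g in views.groupby("tab")]
h, p = kruskal(*groups)
print(f"Kruskal-Wallis: H={h:.2f}, p={p:.2e}")

# Dunn's post-hoc test with Bonferroni correction for pairwise comparisons.
pairwise = sp.posthoc_dunn(views, val_col="viewed", group_col="tab", p_adjust="bonferroni")
print(pairwise)
```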
On average, students who read the improvement tips for a specific learning skill tend to have a lower score for the corresponding learning skill (Fig. 7). This result is significant for each of the learning skills when applying a one-sided Mann-Whitney U test (p-values 0.01681* for concentration, <0.00001**** for anxiety, 0.00360** for motivation, <0.00001**** for test strategy, and 0.00016*** for time management).
6 Discussion, Conclusion, and Future Work
In this paper we presented a dashboard to provide actionable feedback on learning skills to students. Our aims were to study the feasibility of deploying such dashboards in a scalable way, to assess the potential of available data and to collect feedback and metrics about utilization, usability and perceived usefulness. Below, we discuss the results from this study with respect to these aims.
One of the objectives was to explore LA applications that are scalable and widely applicable. The dashboard demonstrates this ability, as it is based on data that is either already available in digital format within a typical higher-education institution (grades) or can easily become available (LASSI questionnaires). This data, however, was not yet fed back to students in a direct, coherent, and personalized way. From a technical perspective, we chose not to reinvent the wheel and to rely on existing IT services within the organization where possible. We believe that this increases the acceptability of the solution. Potential scalability issues were avoided by limiting the transactional load: most of the data was prepared in advance, while the processing of event data was deferred until after peak utilization.
We involved domain experts and practitioners early on in the process and relied on them for the preparation and distribution of the dashboard. We enabled student counselors to adapt the messages delivered to the student based on the study program and individual learning skills. We believe that this approach may enhance the acceptance of the dashboard within the institution, while at the same time improving its overall quality. We noticed that the click-through rate differs between study programs. Further involvement of stakeholders of the respective study programs may help to explain this difference in a follow-up study.
A simple visual representation using unit charts was chosen because of the limited statistical background of the target group of first-year students. For example, in some cases the size of a norm group can be small. A typical bar or column chart may be misleading if the reader does not take the absence of statistical significance into consideration. In this case, a unit representation depicts a small number of dots (students) set against a larger population of peers, which should appeal to an intuitive caution not to rush into conclusions. However, the usefulness of unit charts is disputed. Some argue that such visualizations, while appealing because of their conceptual simplicity, should be avoided in favor of bar charts, because the latter display the same information without slowing the reader down by encouraging counting [3]. The goal of our dashboard, however, is not to bring a message across as quickly as possible. We expect that the notion that each dot represents not just a number, but a real individual, a peer student, may contribute to the purpose of provoking self-reflection. The validation of this hypothesis is the subject of future work.
The proportion of students providing feedback is limited. Although the questions had a prominent place in the dashboard design, there may be an underlying usability problem that discourages students from providing feedback, especially for students accessing the dashboard using a mobile device, as the feedback rate is even lower for these users. In a subsequent study, feedback gathered from focus groups may help to complement the embedded feedback instrument. In their feedback, students tend to appreciate the usefulness and clarity of the dashboard. When asked whether they would like to receive more feedback of this type, however, the response is only moderately positive. The wording of the last question may have been ambiguous: it can be read as referring to more feedback about learning skills in particular rather than to a more general interest in any type of learning-related data, as was intended.
An order effect may explain the increased number of students reading the tips on the concentration tab. Students are likely to try out what happens when clicking on the first instance of the “Okay, what now?” button they encounter, regardless of their specific interest or profile. Based on this finding, which we did not initially intend to investigate, we recommend studying an optimized ordering of the dashboard content that places the skills the student most needs to improve first.
An interesting finding is that students who click through to the dashboard have higher learning skill scores on average. LAD design should take into consideration that reaching different target groups may require different approaches and levels of effort, especially when targeting students with an at-risk profile. On the other hand, this observation also suggests that students with stronger profiles should not be overlooked in the design and that LA should not restrict itself to at-risk profiles. At the same time, once students had clicked through to the dashboard, we noticed that those who engage more (view the improvement tips for a particular learning skill) are students with a higher “need” (lower scores for the corresponding learning skill on average). Therefore, we conclude from this study that the biggest challenge is to get at-risk students onto the dashboard rather than to keep them engaged on the dashboard, which is the subject of further research.
References
Arnold, K.E., Pistilli, M.D.: Course signals at Purdue: using learning analytics to increase student success. In: Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, pp. 267–270. ACM (2012)
Bodily, R., Verbert, K.: Trends and issues in student-facing learning analytics reporting systems research. In: Proceedings of the 7th International Learning Analytics and Knowledge Conference, pp. 1–10 (2017)
Few, S.: Show me the Numbers: Designing Tables and Graphs to Enlighten. Analytics Press, Burlingame (2012)
Govaerts, S., Verbert, K., Duval, E., Pardo, A.: The student activity meter for awareness and self-reflection. In: CHI 2012 Extended Abstracts on Human Factors in Computing Systems, pp. 869–884. ACM (2012)
Greller, W., Drachsler, H.: Translating learning into numbers: a generic framework for learning analytics. Educ. Technol. Soc. 15(3), 42–57 (2012)
H&H Publishing: LASSI, Dutch version. © H&H Publishing Company, Inc., 1231 Kapp Drive, Clearwater, Florida 33765. Authors: Weinstein, Claire Ellen (1987, 2002, 2016). Dutch version: Lacante, Lens, Briers (1999, 2017). http://www.hhpublishing.com/_assessments/lassi
Leitner, P., Khalil, M., Ebner, M.: Learning analytics in higher education-a literature review. In: Peña-Ayala, A. (ed.) Learning Analytics: Fundaments, Applications, and Trends, vol. 94, pp. 1–23. Springer, Cham (2017). doi:10.1007/978-3-319-52977-6_1
Okabe, M., Ito, K.: How to make figures and presentations that are friendly to color blind people. University of Tokyo (2002)
Pinxten, M.: At-risk at the gate: prediction of study success of first-year science and engineering students in an open-admission university in Flanders. Any incremental validity of study strategies? (submitted for publication)
Ramos-Soto, A., Lama, M., Vazquez-Barreiros, B., Bugarin, A., Mucientes, M., Barro, S.: Towards textual reporting in learning analytics dashboards. In: 2015 IEEE 15th International Conference on Advanced Learning Technologies, pp. 260–264, July 2015
Santos, J.L., Govaerts, S., Verbert, K., Duval, E.: Goal-oriented visualizations of activity tracking: a case study with engineering students. In: Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, pp. 143–152. ACM (2012)
Schwendimann, B.A., Rodríguez-Triana, M.J., Vozniuk, A., Prieto, L.P., Boroujeni, M.S., Holzer, A., Gillet, D., Dillenbourg, P.: Understanding learning at a glance: An overview of learning dashboard studies. In: Proceedings of the Sixth International Conference on Learning Analytics & Knowledge, pp. 532–533. ACM (2016)
Siemens, G., Long, P.: Penetrating the fog: analytics in learning and education. EDUCAUSE Rev. 46(5), 30 (2011)
Verbert, K., Duval, E., Klerkx, J., Govaerts, S., Santos, J.L.: Learning analytics dashboard applications. Am. Behav. Sci. 57(10), 1500–1509 (2013)
Verbert, K., Govaerts, S., Duval, E., Santos, J.L., Van Assche, F., Parra, G., Klerkx, J.: Learning dashboards: an overview and future research opportunities. Pers. Ubiquit. Comput. 18(6), 1499–1514 (2014)
Weinstein, C.E., Zimmerman, S., Palmer, D.: Assessing learning strategies: the design and development of the LASSI. In: Learning and Study Strategies: Issues in Assessment, Instruction, and Evaluation, pp. 25–40 (1988)
Acknowledgement
This research is co-funded by the Erasmus+ program of the European Union (562167-EPP-1-2015-1-BE-EPPKA3-PI-FORWARD). We thank Kurt De Wit, Tine Overloop and Maarten Pinxten for their assistance and advice and ICTS for providing the technical infrastructure to support the dashboard.