
Research article | Open access

EduSense: Practical Classroom Sensing at Scale

Published: 09 September 2019

Abstract

Providing university teachers with high-quality opportunities for professional development cannot happen without data about the classroom environment. Currently, the most effective mechanism is for an expert to observe one or more lectures and provide personalized formative feedback to the instructor. Of course, this is expensive and unscalable, and perhaps most critically, precludes a continuous learning feedback loop for the instructor. In this paper, we present the culmination of two years of research and development on EduSense, a comprehensive sensing system that produces a plethora of theoretically-motivated visual and audio features correlated with effective instruction, which could feed professional development tools in much the same way as a Fitbit sensor reports step count to an end user app. Although previous systems have demonstrated some of our features in isolation, EduSense is the first to unify them into a cohesive, real-time, in-the-wild evaluated, and practically-deployable system. Our two studies quantify where contemporary machine learning techniques are robust, and where they fall short, illuminating where future work remains to bring the vision of automated classroom analytics to reality.
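As a concrete rendering of the Fitbit analogy above, the sketch below shows the general shape of such a pipeline: capture video frames, run a per-frame featurizer, and aggregate session-level statistics that a feedback app could consume. This is only a minimal illustration under stated assumptions, not EduSense's implementation: it assumes OpenCV (cv2) for capture, and the extract_features function, its brightness statistic, and the lecture.mp4 input are hypothetical stand-ins for the paper's pose, face, and audio models.

```python
# Minimal sketch of a classroom-sensing loop (illustrative only, NOT EduSense).
# Assumes opencv-python is installed; "lecture.mp4" is a hypothetical recording.
import cv2


def extract_features(frame):
    """Hypothetical per-frame featurizer. A real system would run body-pose,
    face, and audio models here; a trivial brightness statistic keeps the
    sketch self-contained and runnable."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return {"mean_brightness": float(gray.mean())}


def sense(video_source, max_frames=300):
    """Average per-frame features into a session-level summary, the way a
    step counter rolls individual strides into a daily total."""
    cap = cv2.VideoCapture(video_source)
    totals, count = {}, 0
    while count < max_frames:
        ok, frame = cap.read()
        if not ok:  # end of file or capture error
            break
        for name, value in extract_features(frame).items():
            totals[name] = totals.get(name, 0.0) + value
        count += 1
    cap.release()
    return {name: total / max(count, 1) for name, total in totals.items()}


if __name__ == "__main__":
    print(sense("lecture.mp4"))  # or sense(0) for a live camera
```

A real deployment would swap the brightness statistic for model-driven features (body pose, hand raises, speech segments) and stream the aggregates to an instructor-facing dashboard.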

Supplementary Material

ahuja.zip: supplemental movie, appendix, image, and software files for EduSense: Practical Classroom Sensing at Scale.

    Published In

    Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Volume 3, Issue 3
    September 2019
    1415 pages
    EISSN: 2474-9567
    DOI: 10.1145/3361560

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 09 September 2019
    Published in IMWUT Volume 3, Issue 3

    Author Tags

    1. Audio
    2. Classroom
    3. Computer Vision
    4. Instructor
    5. Machine Learning
    6. Pedagogy
    7. Sensing
    8. Speech Detection
    9. Teacher

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Article Metrics

    • Downloads (last 12 months): 664
    • Downloads (last 6 weeks): 87
    Reflects downloads up to 13 Nov 2024

    Cited By

    • (2024) Evaluating the accuracy of automated processing of child and adult language production in preschool classrooms. Frontiers in Psychology, 15. DOI: 10.3389/fpsyg.2024.1322665. Online publication date: 26-Jun-2024.
    • (2024) Uncovering insights from big data: change point detection of classroom engagement. Smart Learning Environments, 11(1). DOI: 10.1186/s40561-024-00317-6. Online publication date: 1-Jul-2024.
    • (2024) Classroom Sensing Tools: Revolutionizing Classroom-Based Research in the 21st Century. Topics in Early Childhood Special Education, 44(3), 229-242. DOI: 10.1177/02711214231220800. Online publication date: 11-Jan-2024.
    • (2024) EduLive: Re-Creating Cues for Instructor-Learners Interaction in Educational Live Streams with Learners' Transcript-Based Annotations. Proceedings of the ACM on Human-Computer Interaction, 8(CSCW2), 1-33. DOI: 10.1145/3686960. Online publication date: 8-Nov-2024.
    • (2024) ClassID: Enabling Student Behavior Attribution from Ambient Classroom Sensing Systems. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 8(2), 1-28. DOI: 10.1145/3659586. Online publication date: 15-May-2024.
    • (2024) VizGroup: An AI-assisted Event-driven System for Collaborative Programming Learning Analytics. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1-22. DOI: 10.1145/3654777.3676347. Online publication date: 13-Oct-2024.
    • (2024) Getting it Just Right: Towards Balanced Utility, Privacy, and Equity in Shared Space Sensing. ACM Transactions on Internet of Things, 5(2), 1-26. DOI: 10.1145/3648479. Online publication date: 29-Feb-2024.
    • (2024) TeamSlides: a Multimodal Teamwork Analytics Dashboard for Teacher-guided Reflection in a Physical Learning Space. Proceedings of the 14th Learning Analytics and Knowledge Conference, 112-122. DOI: 10.1145/3636555.3636857. Online publication date: 18-Mar-2024.
    • (2024) ERUDITE: Human-in-the-Loop IoT for an Adaptive Personalized Learning System. IEEE Internet of Things Journal, 11(8), 14532-14550. DOI: 10.1109/JIOT.2023.3343462. Online publication date: 15-Apr-2024.
    • (2024) The Use of Artificial Intelligence Techniques in Smart Classrooms is in Its Infancy. IEEE Access, 12, 125179-125193. DOI: 10.1109/ACCESS.2024.3454372. Online publication date: 2024.
