Miao et al., 2023 - Google Patents
Analysis of facial expressions to estimate the level of engagement in online lectures (Miao et al., 2023)
- Document ID
- 5763017454718614472
- Author
- Miao R
- Kato H
- Hatori Y
- Sato Y
- Shioiri S
- Publication year
- 2023
- Publication venue
- IEEE Access
Snippet
The present study aimed to develop a method for estimating students' attentional state from facial expressions during online lectures. We estimated the level of attention while students watched a video lecture by measuring reaction time (RT) to detect a target sound that was …
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B5/00—Electrically-operated educational appliances
- G09B5/06—Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
- G09B5/065—Combinations of audio and video presentations, e.g. videotapes, videodiscs, television systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/04—Detecting, measuring or recording bioelectric signals of the body or parts thereof
- A61B5/0476—Electroencephalography
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F19/00—Digital computing or data processing equipment or methods, specially adapted for specific applications
- G06F19/30—Medical informatics, i.e. computer-based analysis or dissemination of patient or disease data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B7/00—Electrically-operated teaching apparatus or devices working with questions and answers
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B19/00—Teaching not covered by other main groups of this subclass
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
- A61B5/168—Evaluating attention deficit, hyperactivity
Similar Documents
Publication | Title
---|---
Niehorster et al. | The impact of slippage on the data quality of head-worn eye trackers
Lin et al. | Mental effort detection using EEG data in E-learning contexts
Stöckli et al. | Facial expression analysis with AFFDEX and FACET: A validation study
Woźniak et al. | Prioritization of arbitrary faces associated to self: An EEG study
Matlovic et al. | Emotions detection using facial expressions recognition and EEG
US10089895B2 (en) | Situated simulation for training, education, and therapy
US20180285442A1 (en) | Systems and methods for sensory and cognitive profiling
Val-Calvo et al. | Affective robot story-telling human-robot interaction: exploratory real-time emotion estimation analysis using facial expressions and physiological signals
Chang et al. | Physiological emotion analysis using support vector regression
De Carolis et al. | “Engaged Faces”: Measuring and Monitoring Student Engagement from Face and Gaze Behavior
Spapé et al. | The semiotics of the message and the messenger: How nonverbal communication affects fairness perception
Danner et al. | Automatic facial expressions analysis in consumer science
Haines et al. | Using automated computer vision and machine learning to code facial expressions of affect and arousal: Implications for emotion dysregulation research
Miao et al. | Analysis of facial expressions to estimate the level of engagement in online lectures
Chen et al. | Dyadic affect in parent-child multimodal interaction: Introducing the dami-p2c dataset and its preliminary analysis
Acarturk et al. | Gaze aversion in conversational settings: An investigation based on mock job interview
Matlovič | Emotion Detection using EPOC EEG device
Varela et al. | Looking at faces in the wild
Shaw et al. | Cognitive-aware lecture video recommendation system using brain signal in flipped learning pedagogy
Westermann et al. | Measuring facial mimicry: Affdex vs. EMG
Banire et al. | One size does not fit all: detecting attention in children with autism using machine learning
Taherisadr et al. | Erudite: Human-in-the-loop iot for an adaptive personalized learning system
Calot et al. | Multimodal biometric recording architecture for the exploitation of applications in the context of affective computing
Miao et al. | Analysis of facial expressions for the estimation of concentration on online lectures
Turan et al. | Facial expressions of comprehension (FEC)