Castellano et al., 2007 - Google Patents
Recognising human emotions from body movement and gesture dynamics (Castellano et al., 2007)
- Document ID
- 2836426535172749543
- Author
- Castellano G
- Villalba S
- Camurri A
- Publication year
- 2007
- Publication venue
- International conference on affective computing and intelligent interaction
Snippet
We present an approach for the recognition of acted emotional states based on the analysis of body movement and gesture expressivity. According to research showing that distinct emotions are often associated with different qualities of body movement, we use non …
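The snippet describes the general pattern of mapping gesture-level movement descriptors to emotion labels with a standard classifier. The Python fragment below is a minimal illustrative sketch of that pattern only; the feature names, example values, and choice of a decision tree are assumptions made here for illustration and are not taken from the paper's pipeline.

```python
# Illustrative sketch (not the authors' implementation): classify acted
# emotions from a few hand-crafted movement-expressivity descriptors.
# Feature names and values below are hypothetical placeholders.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Each row summarises one gesture, e.g.
# [quantity of motion, peak hand velocity, gesture duration in seconds]
X = np.array([
    [0.82, 1.9, 0.6],   # energetic, fast gesture
    [0.75, 1.7, 0.7],
    [0.20, 0.4, 2.1],   # low-energy, slow gesture
    [0.25, 0.5, 1.9],
    [0.60, 1.2, 1.0],   # moderate gesture
    [0.55, 1.1, 1.1],
])
y = np.array(["anger", "anger", "sadness", "sadness", "joy", "joy"])

clf = DecisionTreeClassifier(max_depth=3, random_state=0)
clf.fit(X, y)

# Predict the label of a new, unseen gesture description.
print(clf.predict([[0.78, 1.8, 0.65]]))  # -> "anger" on this toy data
```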
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/62—Methods or arrangements for recognition using electronic means
- G06K9/6267—Classification techniques
- G06K9/6268—Classification techniques relating to the classification paradigm, e.g. parametric or non-parametric approaches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/00221—Acquiring or recognising human faces, facial parts, facial sketches, facial expressions
- G06K9/00268—Feature extraction; Face representation
- G06K9/00281—Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/62—Methods or arrangements for recognition using electronic means
- G06K9/6217—Design or setup of recognition systems and techniques; Extraction of features in feature space; Clustering techniques; Blind source separation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/00335—Recognising movements or behaviour, e.g. recognition of gestures, dynamic facial expressions; Lip-reading
- G06K9/00355—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/62—Methods or arrangements for recognition using electronic means
- G06K9/6288—Fusion techniques, i.e. combining data from various sources, e.g. sensor fusion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/00362—Recognising human body or animal bodies, e.g. vehicle occupant, pedestrian; Recognising body parts, e.g. hand
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N99/00—Subject matter not provided for in other groups of this subclass
- G06N99/005—Learning machines, i.e. computer in which a programme is changed according to experience gained by the machine itself during a complete run
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F19/00—Digital computing or data processing equipment or methods, specially adapted for specific applications
Similar Documents
Publication | Title |
---|---|
Castellano et al. | Recognising human emotions from body movement and gesture dynamics |
Kumar et al. | A multimodal framework for sensor based sign language recognition |
Leo et al. | Computer vision for assistive technologies |
Karg et al. | Recognition of affect based on gait patterns |
Saini et al. | Kinect sensor-based interaction monitoring system using the BLSTM neural network in healthcare |
Monkaresi et al. | Automated detection of engagement using video-based estimation of facial expressions and heart rate |
Gunes et al. | Bodily expression for automatic affect recognition |
Valstar et al. | Fully automatic recognition of the temporal phases of facial actions |
Zhao et al. | Multimodal gait recognition for neurodegenerative diseases |
Gunes et al. | Categorical and dimensional affect analysis in continuous input: Current trends and future directions |
Zhang et al. | Intelligent affect regression for bodily expressions using hybrid particle swarm optimization and adaptive ensembles |
Coppola et al. | Social activity recognition based on probabilistic merging of skeleton features with proximity priors from rgb-d data |
Happy et al. | Automated alertness and emotion detection for empathic feedback during e-learning |
Al Osman et al. | Multimodal affect recognition: Current approaches and challenges |
Liang et al. | Barehanded music: real-time hand interaction for virtual piano |
Müller et al. | Emotion recognition from embedded bodily expressions and speech during dyadic interactions |
Wang et al. | Automated student engagement monitoring and evaluation during learning in the wild |
Samadani et al. | Affective movement recognition based on generative and discriminative stochastic dynamic models |
Randhavane et al. | The liar's walk: Detecting deception with gait and gesture |
Roy et al. | A novel technique to develop cognitive models for ambiguous image identification using eye tracker |
Patwardhan | Multimodal mixed emotion detection |
Amara et al. | Emotion recognition for affective human digital twin by means of virtual reality enabling technologies |
Cheng et al. | Computer-aided autism spectrum disorder diagnosis with behavior signal processing |
Yin | Real-time continuous gesture recognition for natural multimodal interaction |
Zhang et al. | Engagement estimation of the elderly from wild multiparty human–robot interaction |