Manns et al., 2021 - Google Patents
Identifying human intention during assembly operations using wearable motion capturing systems including eye focus
- Document ID: 1026500941863376979
- Authors: Manns M, Tuli T, Schreiber F
- Publication year: 2021
- Publication venue: Procedia CIRP
Snippet
Simulating human motion behavior in assembly operations helps to create efficient collaboration plans for humans and robots. However, identifying human intention may require high-quality human motion capture data in order to discriminate micro-actions and …
Classifications
- G—PHYSICS
  - G05—CONTROLLING; REGULATING
    - G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
      - G05B19/00—Programme-control systems
        - G05B19/02—Programme-control systems electric
          - G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
      - G05B2219/00—Program-control systems
        - G05B2219/30—Nc systems
          - G05B2219/39—Robotics, robotics to robotics hand
          - G05B2219/40—Robotics, robotics mapping to robotics vision
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
        - G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
          - G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
          - G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
    - G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
      - G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
        - G06K9/00221—Acquiring or recognising human faces, facial parts, facial sketches, facial expressions
        - G06K9/00335—Recognising movements or behaviour, e.g. recognition of gestures, dynamic facial expressions; Lip-reading
          - G06K9/00355—Recognition of hand or arm movements, e.g. recognition of deaf sign language
        - G06K9/00624—Recognising scenes, i.e. recognition of a whole field of perception; recognising scene-specific objects
    - G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
      - G06T7/00—Image analysis
        - G06T7/20—Analysis of motion
      - G06T2207/00—Indexing scheme for image analysis or image enhancement
Similar Documents

| Publication | Title |
|---|---|
| Wang et al. | A deep learning-enhanced Digital Twin framework for improving safety and reliability in human–robot collaborative manufacturing |
| Manns et al. | Identifying human intention during assembly operations using wearable motion capturing systems including eye focus |
| Xiao et al. | Vision-based method for tracking workers by integrating deep learning instance segmentation in off-site construction |
| Voronin et al. | Action recognition for the robotics and manufacturing automation using 3-D binary micro-block difference |
| Li et al. | Transfer learning-enabled action recognition for human-robot collaborative assembly |
| Urgo et al. | A human modelling and monitoring approach to support the execution of manufacturing operations |
| Wang et al. | Immersive human–computer interactive virtual environment using large-scale display system |
| Moutinho et al. | Deep learning-based human action recognition to leverage context awareness in collaborative assembly |
| Núñez et al. | Real-time human body tracking based on data fusion from multiple RGB-D sensors |
| Liu et al. | Obstacle avoidance through gesture recognition: Business advancement potential in robot navigation socio-technology |
| Papanagiotou et al. | Egocentric gesture recognition using 3D convolutional neural networks for the spatiotemporal adaptation of collaborative robots |
| Kozamernik et al. | Visual quality and safety monitoring system for human-robot cooperation |
| Yang et al. | Skeleton-based hand gesture recognition for assembly line operation |
| Zelenskii et al. | Control of collaborative robot systems and flexible production cells on the basis of deep learning |
| Terreran et al. | Multi-view human parsing for human-robot collaboration |
| Kim et al. | Visual multi-touch air interface for barehanded users by skeleton models of hand regions |
| Jha et al. | FI-CAP: Robust framework to benchmark head pose estimation in challenging environments |
| Fernando et al. | Computer vision based privacy protected fall detection and behavior monitoring system for the care of the elderly |
| Park et al. | HMPO: Human motion prediction in occluded environments for safe motion planning |
| Li et al. | A comparison of human skeleton extractors for real-time human-robot interaction |
| Moughlbay et al. | Reliable workspace monitoring in safe human-robot environment |
| Vincze et al. | Integrated vision system for the semantic interpretation of activities where a person handles objects |
| Piciarelli et al. | An augmented reality system for technical staff training |
| Abdul-Khalil et al. | A review on object detection for autonomous mobile robot |
| Halder et al. | Natural Interaction Modalities for Human-CPS Interaction in Construction Progress Monitoring |