

Identifying human intention during assembly operations using wearable motion capturing systems including eye focus

Manns et al., 2021

Document ID: 1026500941863376979
Authors: Manns M, Tuli T, Schreiber F
Publication year: 2021
Publication venue: Procedia CIRP


Snippet

Simulating human motion behavior in assembly operations helps to create efficient collaboration plans for humans and robots. However, identifying human intention may require high quality human motion capture data in order to discriminate micro-actions and …
Continue reading at www.sciencedirect.com (PDF)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/00335 Recognising movements or behaviour, e.g. recognition of gestures, dynamic facial expressions; Lip-reading
    • G06K9/00355 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/40 Robotics, robotics mapping to robotics vision
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/39 Robotics, robotics to robotics hand
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/418 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/00221 Acquiring or recognising human faces, facial parts, facial sketches, facial expressions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/00624 Recognising scenes, i.e. recognition of a whole field of perception; recognising scene-specific objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement

Similar Documents

Publication Title
Wang et al. A deep learning-enhanced Digital Twin framework for improving safety and reliability in human–robot collaborative manufacturing
Manns et al. Identifying human intention during assembly operations using wearable motion capturing systems including eye focus
Xiao et al. Vision-based method for tracking workers by integrating deep learning instance segmentation in off-site construction
Voronin et al. Action recognition for the robotics and manufacturing automation using 3-D binary micro-block difference
Li et al. Transfer learning-enabled action recognition for human-robot collaborative assembly
Urgo et al. A human modelling and monitoring approach to support the execution of manufacturing operations
Wang et al. Immersive human–computer interactive virtual environment using large-scale display system
Moutinho et al. Deep learning-based human action recognition to leverage context awareness in collaborative assembly
Núñez et al. Real-time human body tracking based on data fusion from multiple RGB-D sensors
Liu et al. Obstacle avoidance through gesture recognition: Business advancement potential in robot navigation socio-technology
Papanagiotou et al. Egocentric gesture recognition using 3D convolutional neural networks for the spatiotemporal adaptation of collaborative robots
Kozamernik et al. Visual quality and safety monitoring system for human-robot cooperation
Yang et al. Skeleton-based hand gesture recognition for assembly line operation
Zelenskii et al. Control of collaborative robot systems and flexible production cells on the basis of deep learning
Terreran et al. Multi-view human parsing for human-robot collaboration
Kim et al. Visual multi-touch air interface for barehanded users by skeleton models of hand regions
Jha et al. FI-CAP: Robust framework to benchmark head pose estimation in challenging environments
Fernando et al. Computer vision based privacy protected fall detection and behavior monitoring system for the care of the elderly
Park et al. HMPO: Human motion prediction in occluded environments for safe motion planning
Li et al. A comparison of human skeleton extractors for real-time human-robot interaction
Moughlbay et al. Reliable workspace monitoring in safe human-robot environment
Vincze et al. Integrated vision system for the semantic interpretation of activities where a person handles objects
Piciarelli et al. An augmented reality system for technical staff training
Abdul-Khalil et al. A review on object detection for autonomous mobile robot
Halder et al. Natural Interaction Modalities for Human-CPS Interaction in Construction Progress Monitoring