Haddadi et al., 2013 - Google Patents
Analysis of task-based gestures in human-robot interaction
- Document ID
- 504742058680007739
- Author
- Haddadi A
- Croft E
- Gleeson B
- MacLean K
- Alcazar J
- Publication year
- 2013
- Publication venue
- 2013 IEEE International Conference on Robotics and Automation
Snippet
New developments, innovations, and advancements in robotic technology are paving the way for intelligent robots to enable, support, and enhance the capabilities of human workers in manufacturing environments. We envision future industrial robot assistants that support …
Concepts
- interaction — title, abstract, description (count: 23)
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1674—Programme controls characterised by safety, monitoring, diagnostic
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/18—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
- G05B19/406—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by monitoring or safety
Similar Documents
Publication | Title |
---|---|
Prati et al. | How to include User eXperience in the design of Human-Robot Interaction |
Dobra et al. | Technology jump in the industry: human–robot cooperation in production |
Hentout et al. | Human–robot interaction in industrial collaborative robotics: a literature review of the decade 2008–2017 |
Unhelkar et al. | Human-aware robotic assistant for collaborative assembly: Integrating human motion prediction with planning in time |
Tsarouchi et al. | Human–robot interaction review and challenges on task planning and programming |
Romero et al. | Towards an operator 4.0 typology: a human-centric perspective on the fourth industrial revolution technologies |
Fong et al. | Multi-robot remote driving with collaborative control |
Stadler et al. | Augmented reality for industrial robot programmers: Workload analysis for task-based, augmented reality-supported robot control |
Fang et al. | Novel AR-based interface for human-robot interaction and visualization |
Chen et al. | Robots for humanity: A case study in assistive mobile manipulation |
Moon et al. | Design and impact of hesitation gestures during human-robot resource conflicts |
Coupeté et al. | Gesture recognition using a depth camera for human robot collaboration on assembly line |
Haddadi et al. | Analysis of task-based gestures in human-robot interaction |
Martín-Barrio et al. | Application of immersive technologies and natural language to hyper-redundant robot teleoperation |
Hentout et al. | Key challenges and open issues of industrial collaborative robotics |
Choi et al. | Preemptive motion planning for human-to-robot indirect placement handovers |
Rivera-Pinto et al. | Toward programming a collaborative robot by interacting with its digital twin in a mixed reality environment |
Arntz et al. | A virtual sandbox approach to studying the effect of augmented communication on human-robot collaboration |
Hart et al. | Gesture, gaze, touch, and hesitation: Timing cues for collaborative work |
Hertkorn | Shared grasping: A combination of telepresence and grasp planning |
Aleotti et al. | Evaluation of virtual fixtures for a robot programming by demonstration interface |
Vanc et al. | Communicating human intent to a robotic companion by multi-type gesture sentences |
Sylari et al. | Hand gesture-based on-line programming of industrial robot manipulators |
Sonawani et al. | Projecting robot intentions through visual cues: Static vs. dynamic signaling |
Sheikholeslami et al. | Exploring the effect of robot hand configurations in directional gestures for human-robot interaction |