Pelz et al., 1999 - Google Patents
Development of a virtual laboratory for the study of complex human behavior
- Document ID
- 9221180726192177487
- Author
- Pelz J
- Hayhoe M
- Ballard D
- Shrivastava A
- Bayliss J
- von der Heyde M
- Publication year
- 1999
- Publication venue
- Stereoscopic Displays and Virtual Reality Systems VI
Snippet
The study of human perception has evolved from examining simple tasks executed in reduced laboratory conditions to the examination of complex, real-world behaviors. Virtual environments represent the next evolutionary step by allowing full stimulus control and …
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1121—Determining geometric values, e.g. centre of rotation or angular range of movement
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F19/00—Digital computing or data processing equipment or methods, specially adapted for specific applications
- G06F19/30—Medical informatics, i.e. computer-based analysis or dissemination of patient or disease data
- G06F19/34—Computer-assisted medical diagnosis or treatment, e.g. computerised prescription or delivery of medication or diets, computerised local control of medical devices, medical expert systems or telemedicine
- G06F19/3418—Telemedicine, e.g. remote diagnosis, remote control of instruments or remote monitoring of patient carried devices
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6825—Hand
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
- A61B5/0004—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by the type of physiological signal transmitted
- A61B5/0013—Medical image data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/04—Detecting, measuring or recording bioelectric signals of the body of parts thereof
Similar Documents
Publication | Title
---|---
Si-Mohammed et al. | Towards BCI-based interfaces for augmented reality: feasibility, design and evaluation
Perrott et al. | Aurally aided visual search under virtual and free-field listening conditions
Whitton et al. | Comparing VE locomotion interfaces
Duan et al. | Design of a multimodal EEG-based hybrid BCI system with visual servo module
Kim et al. | Quadcopter flight control using a low-cost hybrid interface with EEG-based classification and eye tracking
Lewis et al. | Human factors consideration in clinical applications of virtual reality
Zhu et al. | Novel eye gaze tracking techniques under natural head movement
EP0634031B1 (en) | Apparatus and method for eye tracking interface
CA2680462C (en) | Method for real time interactive visualization of muscle forces and joint torques in the human body
Park et al. | Eye-controlled human/computer interface using the line-of-sight and the intentional blink
Schneider et al. | Eye movement driven head-mounted camera: it looks where the eyes look
Bang et al. | New computer interface combining gaze tracking and brainwave measurements
Choi et al. | Neural applications using immersive virtual reality: a review on EEG studies
Rechy-Ramirez et al. | Impact of commercial sensors in human computer interaction: a review
Leeb et al. | Navigation in virtual environments through motor imagery
Liu et al. | A novel brain-controlled wheelchair combined with computer vision and augmented reality
Pelz et al. | Development of a virtual laboratory for the study of complex human behavior
Feick et al. | Investigating noticeable hand redirection in virtual reality using physiological and interaction data
Zhang et al. | Using the motion of the head-neck as a joystick for orientation control
Cambuzat et al. | Immersive teleoperation of the eye gaze of social robots: assessing gaze-contingent control of vergence, yaw and pitch of robotic eyes
Tada et al. | Quantifying motor and cognitive function of the upper limb using mixed reality smartglasses
Batmaz et al. | Effects of image size and structural complexity on time and precision of hand movements in head mounted virtual reality
Ciger et al. | Evaluation of gaze tracking technology for social interaction in virtual environments
Twardon et al. | Gaze-contingent audio-visual substitution for the blind and visually impaired
Venkatakrishnan et al. | Give me a hand: improving the effectiveness of near-field augmented reality interactions by avatarizing users' end effectors