WO2019015765A1 - Apparatus for measuring or training an emotion or behavior - Google Patents
Apparatus for measuring or training an emotion or behavior
- Publication number
- WO2019015765A1 (PCT/EP2017/068409)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- area
- test area
- subject
- stimulation means
- marker
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1113—Local tracking of patients, e.g. in a hospital or private home
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4884—Other medical applications inducing physiological or psychological stress, e.g. applications for stress testing
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/742—Details of notification to user or communication with user or patient ; user input means using visual displays
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/003—Navigation within 3D models or images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/70—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mental therapies, e.g. psychological therapy or autogenous training
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2505/00—Evaluating, monitoring or diagnosing in the context of a particular type of medical care
- A61B2505/09—Rehabilitation or training
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/163—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/40—Detecting, measuring or recording for evaluating the nervous system
- A61B5/4005—Detecting, measuring or recording for evaluating the nervous system for evaluating the sensory system
- A61B5/4023—Evaluating sense of balance
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4836—Diagnosis combined with treatment in closed-loop systems or methods
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A61B5/6803—Head-worn items, e.g. helmets, masks, headphones or goggles
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Medical Informatics (AREA)
- Animal Behavior & Ethology (AREA)
- Surgery (AREA)
- Molecular Biology (AREA)
- Heart & Thoracic Surgery (AREA)
- Biomedical Technology (AREA)
- Veterinary Medicine (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Psychiatry (AREA)
- Software Systems (AREA)
- Psychology (AREA)
- Human Computer Interaction (AREA)
- Hospice & Palliative Care (AREA)
- Developmental Disabilities (AREA)
- Child & Adolescent Psychology (AREA)
- Social Psychology (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Optics & Photonics (AREA)
- Physiology (AREA)
- Primary Health Care (AREA)
- Remote Sensing (AREA)
- Epidemiology (AREA)
- Radar, Positioning & Navigation (AREA)
- Architecture (AREA)
- Ophthalmology & Optometry (AREA)
- Educational Technology (AREA)
- Dentistry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)
Abstract
An apparatus (1) for measuring or training an emotion or behavior (21) of a subject (2), wherein said emotion or behavior (21) is triggered or fostered by a specific sensory stimulus (22) and wherein the apparatus (1) comprises: a test area (3) comprising at least a first area (31) and a second area (32) between which the subject (2) is able to move; and stimulation means (4) that are configured to deliver a sensory impression (41) to the subject (2), wherein in said sensory impression (41), the first area (31) and the second area (32) are associated with different amounts of the sensory stimulus (22). A method (100) to align the sensory impression (41) delivered by stimulation means (4) to a test area (3).
Description
Apparatus for measuring or training an emotion or behavior
The invention relates to tools for measuring or training an emotion or behavior of a subject, and for the alignment of stimulation means in general.
Background
There are certain stimuli to which certain humans react with a strong emotion or behavior, such as anxiety. For example, a human may be susceptible to vertigo and fear when exposed to height or depth as a stimulus, or may be susceptible to kinetosis when different senses of his body convey contradictory information about motion to his brain. Presently, there is no practical way to test whether a human subject is susceptible to vertigo, kinetosis, unconditioned anxiety or other strong reactions, because the corresponding stimulus cannot be created in a laboratory setting in a manner that appears "convincing" to the subject's brain. The subject therefore only learns that he is susceptible when a real situation containing a sufficient amount of the stimulus arises. This may have unpleasant consequences (e.g., if a boat trip is spent vomiting), and may also be a safety hazard (e.g., if a construction worker is suddenly overwhelmed by vertigo and unable to move any further on his own).
For rodents and other small laboratory animals, a test tool is available. It is known under the term "elevated plus maze", EPM, and comprises a plus-shaped structure with two open and two enclosed arms. The animal is placed at the intersection point, and it is measured whether the animal ventures onto the open arms or whether an aversion against heights or open spaces coerces the animal into the apparent safety of the enclosed arms. This test is used commercially, e.g., in the screening of drug compounds that are supposed to relieve anxiety. For humans, the EPM is not practicable. To create a sufficiently strong stimulus, the open arms would have to be elevated so high above the ground that safety measures against injury by falling off the EPM would have to be put into place. These safety measures would in turn tell the subject's brain that there is no danger, negating the effect of the intended stimulus.
Objective of the invention
It is therefore the objective of the invention to provide a tool with which a human subject can be tested for reactions to stimuli that previously could not be created in a laboratory setting in a sufficiently realistic manner. To this end, the invention provides an apparatus and a corresponding computer program product, as well as an alignment method and a corresponding computer program product. Advantageous embodiments are given in the respective dependent claims.
Disclosure of the invention
The invention provides an apparatus for measuring or training an emotion or behavior of a subject, wherein said emotion or behavior is triggered or fostered by a specific sensory stimulus. The apparatus comprises:
a test area comprising at least a first area and a second area between which the subject is able to move; and
stimulation means that are configured to deliver a sensory impression to the subject, wherein in said sensory impression, the first area and the second area are associated with different amounts of the sensory stimulus. It was found that by augmenting the test area with the stimulation means, surprisingly, the sensory stimulus may be delivered to the subject so convincingly that it is taken for real by the subject's brain. In other words, the stimulation means uncouple the amount of the sensory stimulus delivered from the physical properties of the test area.
In addition, compared with just taking the subject outside to a setting where the stimulus is present, the amount of the stimulus is more controllable. This means that the apparatus can be used to measure the degree of the emotion or behavior more accurately: the reaction of the subject can be measured as a function of the amount of stimulus delivered. The reaction of the subject may depend on the amount of stimulus in a non-linear manner. For example, there may be a threshold amount required to trigger any reaction at all. The possibility to control the amount of the stimulus also has the effect that the apparatus can be used for training. For example, a worker intending to work at heights may be exposed to increasing levels of vertigo-inducing height as a stimulus in order to become immune against vertigo.
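Purely as a non-limiting illustration of the measurement logic described above, the following Python sketch steps through increasing stimulus levels, records one reaction score per level, and reads off the threshold level at which a reaction is first triggered. The data structures, function names and units are assumptions made for this sketch and are not taken from the application; `deliver_stimulus` and `measure_reaction` stand in for the apparatus-specific calls (e.g., updating the rendered scene and reading the sensors).
```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional


@dataclass
class Trial:
    stimulus_level: float   # e.g. simulated height in metres (illustrative unit)
    reaction_score: float   # e.g. hesitation time before crossing, in seconds


@dataclass
class Session:
    trials: List[Trial] = field(default_factory=list)


def run_graded_session(levels: List[float],
                       deliver_stimulus: Callable[[float], None],
                       measure_reaction: Callable[[], float]) -> Session:
    """Deliver increasing stimulus levels and record the reaction per level."""
    session = Session()
    for level in sorted(levels):
        deliver_stimulus(level)        # e.g. render a deeper chasm in the scene
        score = measure_reaction()     # e.g. time until the subject moves on
        session.trials.append(Trial(level, score))
    return session


def threshold_level(session: Session, reaction_cutoff: float) -> Optional[float]:
    """Lowest stimulus level at which the reaction exceeded the cutoff,
    reflecting the non-linear (thresholded) response mentioned above."""
    for trial in session.trials:
        if trial.reaction_score >= reaction_cutoff:
            return trial.stimulus_level
    return None
```
For training rather than measurement, the same loop may simply be repeated over several sessions while the highest tolerated level is tracked.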
In a very simple exemplary embodiment, the test area including the first area and the second area may be drawn or projected on a surface, such as a floor. The first area may then, for example, be surrounded by pictures containing a first amount of the sensory stimulus, and the second area may be surrounded by pictures containing a second, higher amount of the sensory stimulus. The pictures therefore serve as the stimulation means.
The sensory impression may include a representation of the test area that is spatially aligned to the test area. By suitably adapting this representation, the stimulus may be pointed out even more. For example, when height or depth is conveyed as a stimulus, the representation of the first area may be something that suggests solid ground (e.g., a pavement), while the representation of the second area may be something that suggests a height (e.g., an open mesh flooring or a diving board).
However, the sensory impression may also be designed around the test area, i.e., it may be designed not to contain a representation of the test area.
In a specially advantageous embodiment of the invention, the test area comprises at least one structure that is physically distinguishable from the surrounding environment, so as to be stepped onto or into by the subject. Specifically, the structure may be walk-on-able by the subject. For example, the test area may be distinguishable from the surrounding environment in a haptic manner, i.e., the subject may be able to feel a boundary of the test area somehow. For example, the test area may be surrounded by some sort of feelable barrier against which the subject can brush when moving. The structure may also, for example, be a protrusion above the surrounding environment, a recess within the surrounding environment, or have a ground with a texture that is different from the texture of the ground in the surrounding environment, so that the subject may feel when his feet hit or cross the boundary of the test area, or stand on an edge of the test area. If the test area is made physically distinguishable by these or other means, the sensory stimulus is delivered in a more convincing manner, i.e., the brain of the subject will be more likely to take the stimulus for real: the immersion of the subject in the test scenario is higher.
As a prime example, in a specially advantageous embodiment of the invention, the test area may comprise at least two elongate structures that intersect at least at one point. The subject may then, e.g., be placed at the intersection point, and the decision when and where the subject moves may be evaluated as a measure for the emotion or behavior under test.
In a further specially advantageous embodiment of the invention, the stimulation means are configured to deliver a sensory impression of a chasm surrounding at least one of the first area and the second area, so as to convey height or depth as the sensory stimulus. As outlined before, height and depth are specific stimuli that previously could not be conveyed in a sufficiently realistic manner in a laboratory setting because the required safety measures would have at least partially negated the very effect of the stimulus.
However, the stimulation means may be configured to deliver any kind of stimulus, depending on which emotion or behavior is to be measured or trained. For example, the emotion or behavior may be a phobic reaction (due to, e.g., bananaphobia), so the stimulus may, for example, be the presence of the object of the phobia (e.g., bananas) in pictures. A visual stimulus is a prime example, but it is not required that the stimulus be a visual one. For example, the stimulation means may also be configured to deliver a sound, a smell, a taste, a haptic stimulation, a temperature or an electric stimulation to the subject.
Also, combinations of different kinds of stimuli are possible and advantageous. For example, when there is the visual impression of a chasm, this may be augmented by the sound of falling rocks.
In a specially advantageous embodiment of the invention, the apparatus comprises a physical barrier to hinder the subject from leaving the test area in at least one of the first area and the second area. The effect of this is two-fold. First, the subject can be prevented from accidentally leaving the test area in a case where he cannot see the boundary of the test area, e.g., if he is wearing a goggle-type augmented-reality or virtual-reality device that delivers the sensory impression while at the same time blocking direct sight of the surrounding environment. Second, the impression of the elevated plus maze, namely the distinction between open and closed arms, can be created.
The physical barrier, however, is not required to be actually present in order to achieve said effects. In a further specially advantageous embodiment of the invention, the stimulation means are configured to deliver a sensory impression that comprises said barrier. For example, the barrier may be included in a visible sensory impression, and/or the subject may be given some haptic feedback when approaching or hitting the barrier.
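As one possible way to realize such proximity-dependent haptic feedback, the following sketch ramps up a haptic cue as the subject approaches a virtual barrier; the coordinate convention, the warning distance and the `trigger_haptics` callback are assumptions made for illustration only.
```python
from typing import Callable, Tuple


def barrier_proximity_feedback(position: Tuple[float, float],
                               barrier_y: float,
                               trigger_haptics: Callable[[float], None],
                               warn_distance: float = 0.4) -> None:
    """Drive a haptic cue whose intensity grows as the subject approaches a
    virtual barrier located at y = barrier_y (floor coordinates, metres).

    trigger_haptics stands in for whatever actuator the apparatus provides;
    0.0 means no cue, 1.0 means full intensity at or beyond the barrier.
    """
    distance = barrier_y - position[1]
    if distance <= 0.0:
        trigger_haptics(1.0)                             # barrier reached or crossed
    elif distance < warn_distance:
        trigger_haptics(1.0 - distance / warn_distance)  # ramp the cue up
    else:
        trigger_haptics(0.0)
```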
In a specially advantageous embodiment of the invention, the stimulation means comprise at least one display or projector configured to display at least part of the sensory impression in the surroundings of the test area. This kind of impression is very strong because the major part of the sensory information that is delivered to the human brain enters via the eyes, and this information is delivered most rapidly.
In a specially advantageous embodiment of the invention, the apparatus further comprises at least one sensor to detect the position of the subject in the test area. Specifically, this sensor may be used to distinguish whether the subject is in the first area or in the second area. The information captured with the sensor may be used for the quantitative measurement of the emotion or behavior, e.g., by studying the time-position profile of the subject. It may, e.g., be measured how long the subject hesitates to change from one area into the next and how fast the subject then moves. The information may also be fed back to the stimulation means to alter the sensory impression. In the example of height as a stimulus detailed above, the visual and/or auditory impression of falling rocks may specifically be triggered when the subject crosses from the first area to the second area.
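The following sketch illustrates, under assumed floor coordinates and area bounds, how a tracked position could be classified into the first area, the second area or the outside, how a time-position (here: time-area) profile could be recorded, and how an event such as the falling-rocks impression could be triggered on a crossing. The rectangle bounds, sampling rate and callback names are illustrative assumptions, not details of the application.
```python
import time
from typing import Callable, List, Tuple

# Axis-aligned bounds of the two areas in floor coordinates (metres);
# these particular rectangles are illustrative assumptions.
FIRST_AREA = ((-0.3, 0.0), (0.3, 2.0))    # ((x_min, y_min), (x_max, y_max))
SECOND_AREA = ((-0.3, -2.0), (0.3, 0.0))


def classify(position: Tuple[float, float]) -> str:
    """Map a tracked floor position to 'first', 'second' or 'outside'."""
    x, y = position
    for name, ((xmin, ymin), (xmax, ymax)) in (("first", FIRST_AREA),
                                               ("second", SECOND_AREA)):
        if xmin <= x <= xmax and ymin <= y <= ymax:
            return name
    return "outside"


def monitor(read_position: Callable[[], Tuple[float, float]],
            on_crossing: Callable[[str, str], None],
            duration_s: float = 60.0,
            rate_hz: float = 50.0) -> List[Tuple[float, str]]:
    """Record a time-area profile and call on_crossing(old, new) whenever the
    subject changes area, e.g. to trigger the falling-rocks impression."""
    profile: List[Tuple[float, str]] = []
    last_area = None
    t0 = time.monotonic()
    while (t := time.monotonic() - t0) < duration_s:
        area = classify(read_position())
        profile.append((t, area))
        if last_area is not None and area != last_area:
            on_crossing(last_area, area)
        last_area = area
        time.sleep(1.0 / rate_hz)
    return profile
```
The hesitation mentioned above can then be read off the profile as the time spent in the first area before the first crossing into the second area.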
In a further specially advantageous embodiment of the invention, the apparatus further comprises at least one sensor to detect the gaze direction of the subject. The gaze direction is another reaction of the subject to the sensory stimulus that may be an early sign before the subject moves from one area to the other. Furthermore, making the sensory impression dependent on the gaze direction may cause the delivery of the sensory stimulus contained therein to be even more convincing and realistic. Alternatively or in combination, other sensors may be used to measure any other suitable biochemical and psychophysiological quantities that may serve as indicators for the emotion and/or behavior.
To provide a sensory impression with a high impact, in a specially advantageous embodiment of the invention, the stimulation means comprise a goggle-type wearable augmented or virtual reality device configured to deliver a sensory impression that is dependent on the position of the subject in the test area and/or on the gaze direction of the subject. The impact may be increased further by delivering audio via headphones that block other noise from the surrounding environment, and/or play sound that corresponds to the sensory impression. At the same time, the wearable device may contain, or be tracked by, sensors for determining the position, and/or the gaze direction, of the subject. For example, the wearable device may come with a scanning device that is adapted to track the wearable device in space.
Preferably, the sensory impression further comprises sound, a blowing wind, and/or heat. A blowing wind is specifically advantageous to further strengthen an impression of height, because in real life, a greater height is associated with more wind. Heat can be achieved with infrared lights that are mounted to the ceiling to strengthen the impression of being outside (simulating sun beams, a fire or other heat sources).
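For completeness, a small sketch of how the gaze-dependent part of the sensory impression could be gated is given below; gaze is approximated here by the head pose of the wearable device, and the quaternion convention, cone angle and function names are assumptions made for illustration.
```python
import numpy as np


def gaze_direction(head_quaternion: np.ndarray) -> np.ndarray:
    """Forward gaze vector from a headset orientation quaternion (w, x, y, z).

    Approximates gaze by the head pose, as a tracked headset without an eye
    tracker would; rotates the local forward axis (0, 0, -1) into world space.
    """
    w, x, y, z = head_quaternion
    forward = np.array([-(2 * x * z + 2 * w * y),
                        -(2 * y * z - 2 * w * x),
                        -(1 - 2 * x * x - 2 * y * y)])
    return forward / np.linalg.norm(forward)


def looking_at(target: np.ndarray, head_position: np.ndarray,
               head_quaternion: np.ndarray, cone_deg: float = 15.0) -> bool:
    """True if the target point (e.g. the chasm centre) lies within a viewing
    cone around the gaze direction; can be used to gate sound or wind cues."""
    to_target = target - head_position
    to_target = to_target / np.linalg.norm(to_target)
    cos_angle = float(np.dot(gaze_direction(head_quaternion), to_target))
    return cos_angle >= np.cos(np.radians(cone_deg))
```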
A computer may be used as the central element in the apparatus. The computer may, for example, create, and control the delivery of, the sensory impression. At the same time, the computer may be used to record the position, and/or the gaze direction, of the subject, and any feedback of this position or gaze direction onto the sensory impression. Therefore, a fair amount, or all, of the functionality of the apparatus may be implemented in software running on the computer. This software is a product of its own that may be sold separately to empower a computer that is coupled to stimulation means with the functionality of the invention. The invention therefore also relates to a computer program product, comprising machine-readable instructions which, when executed by a computer communicatively coupled to stimulation means, cause a combination of the stimulation means and a test area to form an apparatus according to the invention.
To have a maximum effect, the sensory impression should be spatially aligned to the test area. If there is a noticeable misalignment, this may convey to the brain of the subject that the sensory impression is not real, negating the intended effect. The invention therefore also provides a method to align the sensory impression delivered by stimulation means to a test area. This method comprises the steps of:
fixing at least one marker in the test area;
determining the position of the marker relative to an origin in the test area;
determining the dimensions of the test area;
tracking, by the stimulation means, a position of the marker relative to the stimulation means; and
determining, from the position of the marker relative to the stimulation means in combination with the position of the marker relative to the origin of the test area and the dimensions of the test area, the position and orientation of the test area. The main a priori information used in this method is the position of the marker relative to an origin in the test area. This position may, for example, be determined by measuring. However, this is not the only manner. The position may, for example, also be determined by inserting the marker into a receptacle in the test area. The marker can then be removed and re-inserted and will always be at the same defined position of the receptacle.
The dimensions of the test area may be pre-measured, or they may be determined automatically, e.g., from a photograph. The alignment may also be performed adaptively without a marker. For example, a camera may register boundaries of the test area and then successively vary the positions and sizes of elements of the sensory impression until the boundaries of the sensory impression are flush with the boundaries of the test area.
In a specially advantageous embodiment of the invention, a handheld controller of an augmented or virtual reality device that serves as stimulation means is used as at least one marker. One or even two controllers are typically packaged with each such device, and because all interaction between the user and the apparatus is via the wearable device itself, the controllers are not needed for this purpose. Augmented or virtual reality devices typically have a built-in tracking function that specifically tracks the controllers, so by using a controller as a marker, best use can be made of the existing hardware. If a marker is used, preferably, at least two markers are fixed in different positions in the test area. In this manner, in addition to the position, the orientation of the test area may be determined as well.
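To make the marker-based alignment concrete, the sketch below estimates the position and orientation (yaw) of the test area in the tracking frame of the stimulation means from two or more markers whose positions relative to the test-area origin are known a priori. It assumes floor-plane (2-D) coordinates and uses a standard least-squares (Procrustes) fit; it is an illustration of the principle, not the implementation of the application, and the function and parameter names are assumptions.
```python
from typing import Tuple

import numpy as np


def test_area_pose(markers_in_area: np.ndarray,
                   markers_tracked: np.ndarray) -> Tuple[np.ndarray, float]:
    """Position and yaw of the test area in the tracking frame.

    markers_in_area: (N, 2) marker positions relative to the test-area origin,
                     known a priori (e.g. from the fixture locations).
    markers_tracked: (N, 2) the same markers as reported by the tracking system
                     of the AR/VR device, projected onto the floor plane.
    Returns (origin_xy, yaw_radians). With N = 2 the orientation is fully
    determined; more markers are combined in a least-squares sense.
    """
    a = np.asarray(markers_in_area, dtype=float)
    b = np.asarray(markers_tracked, dtype=float)
    ca, cb = a.mean(axis=0), b.mean(axis=0)
    # 2-D Procrustes fit: rotation that best maps the centred a onto the centred b.
    h = (a - ca).T @ (b - cb)
    yaw = np.arctan2(h[0, 1] - h[1, 0], h[0, 0] + h[1, 1])
    rot = np.array([[np.cos(yaw), -np.sin(yaw)],
                    [np.sin(yaw),  np.cos(yaw)]])
    origin = cb - rot @ ca   # where the test-area origin lies in the tracking frame
    return origin, yaw
```
With the dimensions of the test area known in addition, the sensory impression can then be rendered in test-area coordinates and placed at the estimated pose.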
Advantageously, the method further comprises: providing, in the test area, a fixture that comprises an inverse shape of at least a portion of the marker. Such fixture may, for example, be 3D-printed.
Advantageously, the alignment can make use of existing hardware that comes with an augmented or virtual reality device. It can therefore be implemented in software that is a product of its own. The invention therefore also relates to a computer program product, comprising machine-readable instructions which, when executed by a computer communicatively coupled to an augmented or virtual reality device serving as stimulation means, and/or by an augmented or virtual reality device, cause the computer, and/or the augmented or virtual reality device, to perform at least the tracking of the marker (if present) and the determining of the position and orientation of the test area.
Description of the Figures
Further advantageous embodiments will now be described in detail using Figures without any intention to limit the scope of the invention. The Figures show:
Figure 1: Exemplary embodiment of an apparatus 1;
Figure 2: Delivery of different amounts of the stimulus 22 to the subject 2;
Figure 3: Alignment 100 of the sensory impression 41 delivered by the stimulation means 4 to the test area 3.
Figure 1 illustrates an exemplary embodiment of the apparatus 1. The test area 3 consists of a wooden cross comprising two bars 33 and 34 intersecting in a center 35 where the subject 2 is standing in the snapshot shown in Figure 1. Outside the area of the center 35, on top of the first bar 33, there is a first area 31, and on top of the second bar 34, there is a second area 32. A sensory impression 41 that is delivered to the subject 2 differs between the first area 31 and the second area 32 in the content of height as the stimulus 22: In the first area 31, the visual sensory impression 41 contains a railing 36, 36a that suggests safety from heights. In the second area 32, the visual sensory impression 41 contains a chasm 22a with sharp spikes 22b at its bottom that will spear through anyone falling on top of them. This suggests a great height and danger.
The sensory impression 41 is delivered through a goggle-type wearable device 43 that serves as the stimulation means 4. The wearable device 43 comprises a display 42. In addition, the subject 2 is wearing headphones 43a that are actuated by the wearable device 43. The position and orientation of the wearable device 43 are tracked by means of two laser scanning lighthouses 43b and 43c. The lighthouses 43b and 43c therefore serve as a sensor 51 that detects the position of the subject 2 in the test area 3, as well as a sensor 52 that detects the gaze direction of the subject 2.
Figure 2 illustrates the effect that the different sensory impressions 41 associated with the first area 31 and the second area 32 have on the subject 2. In the first area 31, the sensory impression 41 contains only very little of the stimulus 22, here: height. By contrast, in the second area 32, the sensory impression 41 contains very much of this stimulus. When the sensory impression 41 is delivered to the subject 2, the amount of stimulus 22 contained therein is converted into emotion or behavior 21 within the brain of the subject 2.
Figure 3 illustrates the alignment of the test area 3 with respect to the sensory impression 41 delivered by the stimulation means 4 in the example apparatus 1 according to Figure 1. For the alignment, handheld controllers 44, 45 of an augmented or virtual reality device that serves as stimulation means 4 are used. As the first step of the alignment method 100, a fixture 46 that comprises an inverse shape of at least a portion of a handheld controller 44, 45 is placed in the test area 3. In step 110, the controller 44, 45 is placed into the fixture 46, so its position relative to an origin in the test area 3 can be very easily determined in step 120: it follows from the known position of the fixture 46.
In step 130, the dimensions of the test area 3 are determined. For example, these dimensions may be pre-stored, but the dimensions may, e.g., also be actively determined by scanning.
In step 140, the position of the handheld controller 44, 45 relative to the stimulation means 4 is tracked by the laser scanning lighthouses 43b and 43c, by radio or by any other suitable means. From this position in combination with the position of the controller 44, 45 relative to the origin of the test area and the dimensions of the test area, the position and orientation of the test area are determined in step 150.
List of reference signs
1 measurement or training apparatus
2 subject
21 emotion or behavior of subject
22 sensory stimulus giving rise to emotion or behavior 21
22a chasm as stimulus 22 conveying height
22b spikes as stimulus 22 conveying danger
3 test area
31 first area in test area 3
32 second area in test area 3
33, 34 structures in test area 3
35 intersection point of structures 33, 34
36 physical barrier of test area 3
36a impression of physical barrier 36
4 stimulation means
41 sensory impression delivered by means 4 to subject 2
42 display or projector displaying impression 41
43 goggle-type wearable device in stimulation means 4
43a headphones
44, 45 marker for alignment 100
46 fixture for marker 44, 45
51 sensor to detect position of subject 2
52 sensor to detect gaze direction of subject 2
100 alignment method
105 providing fixture 46
110 fixing marker 44 in test area 3
120 determining position of marker 44, 45
130 determining dimensions of test area 3
140 tracking position of marker 44, 45
150 determining position and orientation of test area 3
Claims
1. An apparatus (1) for measuring or training an emotion or behavior (21) of a subject (2), wherein said emotion or behavior (21) is triggered or fostered by a specific sensory stimulus (22) and wherein the apparatus (1) comprises:
a test area (3) comprising at least a first area (31) and a second area (32) between which the subject (2) is able to move; and
stimulation means (4) that are configured to deliver a sensory impression (41) to the subject (2), wherein in said sensory impression (41), the first area (31) and the second area (32) are associated with different amounts of the sensory stimulus (22).
2. The apparatus (1) according to claim 1, wherein the test area (3) comprises at least one structure (33, 34) that is physically distinguishable from the surrounding environment, so as to be stepped onto or into by the subject (2).
3. The apparatus (1) according to any one of claims 1 or 2, wherein the test area (3) comprises at least two elongate structures (33, 34) that intersect at least at one point (35).
4. The apparatus (1) according to any one of claims 1 to 3, wherein the stimulation means (4) are configured to deliver a sensory impression (41) of a chasm (22a) surrounding at least one of the first area (31) and the second area (32), so as to convey height or depth as the sensory stimulus (22).
5. The apparatus (1) according to any one of claims 1 to 4, further comprising a physical barrier (36) to hinder the subject (2) from leaving the test area (3) in at least one of the first area (31) and the second area (32).
6. The apparatus (1) according to any one of claims 1 to 5, wherein the stimulation means (4) are configured to deliver a sensory impression (41) that comprises a barrier (36a) to hinder the subject (2) from leaving the test area (3) in at least one of the first area (31) and the second area (32).
7. The apparatus (1) according to any one of claims 1 to 6, wherein the stimulation means (4) comprise at least one display or projector (42) configured to display at least part of the sensory impression (41) in the surroundings of the test area (3).
8. The apparatus (1) according to any one of claims 1 to 7, further comprising at least one sensor (51) to detect the position of the subject (2) in the test area (3).
9. The apparatus (1) according to any one of claims 1 to 8, further comprising at least one sensor (52) to detect the gaze direction of the subject (2).
10. The apparatus (1) according to any one of claims 8 or 9, wherein the stimulation means (4) comprise a goggle-type wearable device (43) configured to deliver a sensory impression (41) that is dependent on the position of the subject (2) in the test area (3), and/or on the gaze direction of the subject (2).
11. The apparatus (1) according to any one of claims 1 to 10, wherein the sensory impression (41) further comprises sound, a blowing wind, and/or heat.
12. A computer program product, comprising machine-readable instructions which, when executed by a computer communicatively coupled to stimulation means (4), cause a combination of the stimulation means (4) and a test area (3) to form an apparatus (1) according to any one of claims 1 to 11.
13. A method (100) to align the sensory impression (41) delivered by stimulation means (4) to a test area (3), comprising the steps:
fixing (110) at least one marker (44, 45) in the test area (3);
determining (120) the position of the marker (44, 45) relative to an origin in the test area (3);
determining (130) the dimensions of the test area (3);
tracking (140), by the stimulation means (4), a position of the marker (44, 45) relative to the stimulation means (4); and
determining (150), from the position of the marker (44, 45) relative to the stimulation means (4) in combination with the position of the marker (44, 45) relative to the origin of the test area and the dimensions of the test area (3), the position and orientation of the test area (3).
14. The method (100) according to claim 13, wherein the fixing (110) at least one marker (44) in the test area (3) specifically is: fixing (110) at least two markers (44, 45) in different positions in the test area (3).
15. The method (100) according to any one of claims 13 to 14, wherein a handheld controller of an augmented or virtual reality device that serves as stimulation means (4) is used as at least one marker (44, 45).
16. The method (100) according to any one of claims 13 to 15, further comprising: providing (105), in the test area (3), a fixture (46) that comprises an inverse shape of at least a portion of the marker (44, 45).
17. A computer program product, comprising machine-readable instructions which, when executed by a computer communicatively coupled to an augmented or virtual reality device serving as stimulation means (4), and/or by an augmented or virtual reality device (4), cause the computer, and/or the augmented or virtual reality device (4), to perform at least steps 140 and 150 of the method (100) according to any one of claims 13 to 16.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/EP2017/068409 WO2019015765A1 (en) | 2017-07-20 | 2017-07-20 | Apparatus for measuring or training an emotion or behavior |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/EP2017/068409 WO2019015765A1 (en) | 2017-07-20 | 2017-07-20 | Apparatus for measuring or training an emotion or behavior |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019015765A1 (en) | 2019-01-24 |
Family
ID=59388073
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2017/068409 WO2019015765A1 (en) | 2017-07-20 | 2017-07-20 | Apparatus for measuring or training an emotion or behavior |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2019015765A1 (en) |
- 2017-07-20: WO PCT/EP2017/068409 patent/WO2019015765A1/en active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120154557A1 (en) * | 2010-12-16 | 2012-06-21 | Katie Stone Perez | Comprehension and intent-based content for augmented reality displays |
US20120249416A1 (en) * | 2011-03-29 | 2012-10-04 | Giuliano Maciocci | Modular mobile connected pico projectors for a local multi-user collaboration |
WO2016001902A1 (en) * | 2014-07-04 | 2016-01-07 | Libra At Home Ltd | Apparatus comprising a headset, a camera for recording eye movements and a screen for providing a stimulation exercise and an associated method for treating vestibular, ocular or central impairment |
US20160027213A1 (en) * | 2014-07-25 | 2016-01-28 | Aaron Burns | Ground plane adjustment in a virtual reality environment |
US20170116788A1 (en) * | 2015-10-22 | 2017-04-27 | Shandong University | New pattern and method of virtual reality system based on mobile devices |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021109186A1 (en) * | 2019-12-05 | 2021-06-10 | 中国科学院深圳先进技术研究院 | Animal passive fear-of-heights behavior test device |
CN113575445A (en) * | 2021-08-12 | 2021-11-02 | 中国科学技术大学 | Behaviourological device for testing negative emotion |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
ES2741023T3 (en) | Systems and methods for martial arts training devices with force, pressure and other anatomically accurate responses | |
Fajen et al. | Reconsidering the role of movement in perceiving action-scaled affordances | |
Fajen | Calibration, information, and control strategies for braking to avoid a collision. | |
US20160253917A1 (en) | Interactive Cognitive-Multisensory Interface Apparatus and Methods for Assessing, Profiling, Training, and Improving Performance of Athletes and other Populations | |
Witt et al. | Spiders appear to move faster than non-threatening objects regardless of one's ability to block them | |
WO2019015765A1 (en) | Apparatus for measuring or training an emotion or behavior | |
EP1642624A3 (en) | Image processing device and image processing method | |
KR20200073492A (en) | Acupuncture training system using mixed reality and acupuncture training method thereof | |
CN109791035A (en) | Object | |
EP3967968A1 (en) | Dart game apparatus and dart game system outputting event effect | |
US20170224265A1 (en) | Apparatus for detecting, diagnosing and exercising an individual's functionalities | |
Bancroft et al. | The throw-and-catch model of human gait: evidence from coupling of pre-step postural activity and step location | |
CN108290065A (en) | Game station for springing back movement | |
Trindade et al. | Purrfect crime: Exploring animal computer interaction through a digital game for humans and cats | |
US20160121184A1 (en) | Guided light system for athletic training and use | |
EP3473972A1 (en) | Dart game apparatus and dart game system | |
RU2017112698A (en) | HUMAN PROTECTION SYSTEM FROM SCATTERED X-RAY RADIATION | |
EP3659014A1 (en) | Visual and inertial motion tracking | |
US20220221254A1 (en) | Dart Game Apparatus and Dart Game System With an Image Projector | |
EP3473971B1 (en) | Dart game system and method for providing a lesson image | |
KR101821592B1 (en) | Walking simulator system using virtual reality and method for calculating pedestrian position thereof | |
US20210259539A1 (en) | Systems, methods, and computer program products for vision assessments using a virtual reality platform | |
KR20160026093A (en) | Putting simulation system and providing method thereof | |
Gutiérrez-Davila et al. | Time required to initiate a defensive reaction to direct and feint attacks in fencing | |
US20200033099A1 (en) | Dart game apparatus and dart game system comprising illumination unit |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17743010 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 17743010 Country of ref document: EP Kind code of ref document: A1 |