WO2019074758A1 - System and method for prevention or reduction of motion sickness - Google Patents
System and method for prevention or reduction of motion sickness
- Publication number
- WO2019074758A1 (PCT/US2018/054381)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- subject
- movement
- motion
- visual image
- display
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/211—Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/18—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state for vehicle drivers or machine operators
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M21/00—Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/90—Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
- A63F13/92—Video game devices specially adapted to be hand-held while playing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M21/00—Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis
- A61M2021/0005—Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus
- A61M2021/0027—Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus by the hearing sense
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M21/00—Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis
- A61M2021/0005—Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus
- A61M2021/0044—Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus by the sight sense
- A61M2021/005—Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus by the sight sense images, e.g. video
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M2205/00—General characteristics of the apparatus
- A61M2205/33—Controlling, regulating or measuring
- A61M2205/332—Force measuring means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M2205/00—General characteristics of the apparatus
- A61M2205/35—Communication
- A61M2205/3546—Range
- A61M2205/3569—Range sublocal, e.g. between console and disposable
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M2205/00—General characteristics of the apparatus
- A61M2205/35—Communication
- A61M2205/3576—Communication with non implanted data transmission devices, e.g. using external transmitter or receiver
- A61M2205/3584—Communication with non implanted data transmission devices, e.g. using external transmitter or receiver using modem, internet or bluetooth
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M2205/00—General characteristics of the apparatus
- A61M2205/35—Communication
- A61M2205/3576—Communication with non implanted data transmission devices, e.g. using external transmitter or receiver
- A61M2205/3592—Communication with non implanted data transmission devices, e.g. using external transmitter or receiver using telemetric means, e.g. radio or optical transmission
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M2205/00—General characteristics of the apparatus
- A61M2205/50—General characteristics of the apparatus with microprocessors or computers
- A61M2205/502—User interfaces, e.g. screens or keyboards
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M2205/00—General characteristics of the apparatus
- A61M2205/50—General characteristics of the apparatus with microprocessors or computers
- A61M2205/502—User interfaces, e.g. screens or keyboards
- A61M2205/505—Touch-screens; Virtual keyboard or keypads; Virtual buttons; Soft keys; Mouse touches
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M2205/00—General characteristics of the apparatus
- A61M2205/58—Means for facilitating use, e.g. by people with impaired vision
- A61M2205/581—Means for facilitating use, e.g. by people with impaired vision by audible feedback
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M2230/00—Measuring parameters of the user
- A61M2230/63—Motion, e.g. physical activity
Definitions
- the present invention relates to the prevention or reduction of motion sickness brought about by movement and, more particularly, to a motion sickness prevention or reduction system and method that encourages a subject to generate a compensatory movement in response to a measured movement.
- Motion sickness is generally a result of a sensory mismatch between inner ear sensation, visual information, and other body senses.
- when the inner ear, visual information, and other body senses become desynchronized (i.e., mismatched), the body responds much as it would to having been poisoned, resulting in vomiting, vertigo, and/or other symptoms.
- passengers in vehicles are more likely to experience motion sickness due to the sensory mismatch between the passenger's body senses (e.g., inner ear) and the passenger's actions (e.g., passenger sitting passively).
- drivers or operators of vehicles are much less likely to experience motion sickness due to the correspondence between body senses and the driver's actions (e.g., controlling the vehicle with expected accelerations, motion, etc.).
- An object of the invention is to solve at least the above problems and/or disadvantages and to provide at least the advantages described hereinafter.
- the present invention provides a system and method for preventing or reducing motion sickness in a subject that measures at least one degree of movement that is experienced by the subject and the system, calculates a compensatory movement in response to the at least one degree of movement, and generates a visual image that encourages the subject to generate the calculated compensatory movement.
- a method for preventing or reducing motion sickness is disclosed. The method includes a step of measuring at least one degree of movement. The measured degree of movement is a shared movement experienced by both the system executing the method and a subject.
- the at least one degree of movement can be a translational movement (e.g., forward, back, lateral, etc.) and/or a rotational movement.
- the method also includes calculating a compensatory movement in response to the at least one measured degree of movement.
- the compensatory movement can be a similar, opposite, and/or composite movement based on the at least one measured degree of movement.
- the method also includes generating an interactive visual image based on the compensatory movement and presenting the interactive visual image to the subject. The interactive visual image encourages the subject to initiate the compensatory movement.
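The measure → calculate → present loop summarized above can be sketched as follows. This is a minimal illustration only: the names, the attenuation gain, and the text "prompt" standing in for the interactive visual image are all assumptions, not terms from the specification.

```python
from dataclasses import dataclass


@dataclass
class Motion:
    """One sample of movement shared by the system and the subject."""
    forward: float = 0.0  # translational: forward/backward
    lateral: float = 0.0  # translational: left/right
    yaw: float = 0.0      # rotational


def calculate_compensatory(measured: Motion, gain: float = 0.5) -> Motion:
    """Derive a compensatory movement from the measured movement.

    Here the compensation is 'similar but attenuated' (same direction,
    smaller amplitude), one of the options the text mentions.
    """
    return Motion(measured.forward * gain,
                  measured.lateral * gain,
                  measured.yaw * gain)


def render_prompt(target: Motion) -> str:
    """Stand-in for the interactive visual image: describe the movement
    the subject should be encouraged to make."""
    return (f"tilt forward {target.forward:+.2f}, "
            f"lean {target.lateral:+.2f}, turn {target.yaw:+.2f}")


# One iteration of the method: measure -> calculate -> present.
sample = Motion(forward=1.0, lateral=-0.4)
prompt = render_prompt(calculate_compensatory(sample))
```

In a real embodiment the prompt would of course be a game or other interactive image on the visual output device rather than a text string.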
- a system for preventing or reducing motion sickness includes a processor, a motion sensor, a visual output device, and an input device.
- the motion sensor is configured to measure at least one degree of movement of the system.
- the at least one degree of movement corresponds to a shared movement experienced by the system and a user.
- the processor receives the measured movement and generates a compensatory movement.
- the compensatory movement can be a similar, opposite, and/or composite movement based on the measured movement.
- the processor generates a visual image on the visual output device based on the calculated compensatory movement.
- the visual image encourages a user to move the system to generate the compensatory movement.
- the compensatory movement compensates for movement experienced by the user and avoids motion sickness in a user.
- An embodiment of the invention is a system for preventing or reducing motion sickness in a subject, comprising a motion sensor adapted to measure movement experienced by the subject and the system; a visual output device; an input device; and a processor in signal communication with the motion sensor and the visual output device, wherein the processor is configured to: receive a motion measurement from the motion sensor, calculate a compensatory movement in response to the motion measurement, generate an interactive visual image based on the calculated compensatory movement, wherein the interactive visual image encourages the subject to initiate the compensatory movement via the input device, and display the interactive visual image on the visual output device, wherein the calculated compensatory movement and the displayed interactive visual image are adapted to synchronize inner ear information, visual information and body information of the subject when the subject initiates the compensatory movement.
- Another embodiment of the invention is a method of preventing or reducing motion sickness in a subject, comprising measuring motion experienced by the subject; calculating a compensatory movement in response to the measured motion; generating an interactive visual image based on the calculated compensatory movement, wherein the interactive visual image encourages the subject to initiate the compensatory movement via an input device; and displaying the interactive visual image to the subject; wherein the calculated compensatory movement and the displayed interactive visual image are adapted to synchronize inner ear information, visual information and body information of the subject when the subject initiates the compensatory movement.
- Figure 1A is a block diagram of a system for preventing or reducing motion sickness in a subject, in accordance with an illustrative embodiment of the present invention.
- Figure 1B is a schematic diagram showing the system of Fig. 1A positioned within and/or on a vehicle, in accordance with an illustrative embodiment of the present invention.
- FIG. 2 is a flowchart of a motion sickness compensation method for reducing or preventing motion sickness, in accordance with an illustrative embodiment of the present invention.
- FIG. 3 is a schematic diagram of a motion sickness compensation system for preventing or reducing motion sickness in a subject, in accordance with another illustrative embodiment of the present invention.
- Figure 4 is a schematic diagram of the input/output subsystem of Figure 3, in accordance with an illustrative embodiment of the present invention.
- Figure 5 is a schematic diagram of the communications interface of Figure 3, in accordance with an illustrative embodiment of the present invention.
- Figure 6 is a schematic diagram of the memory subsystem of Figure 3, in accordance with an illustrative embodiment of the present invention.
- Figure 7 is a schematic diagram of the motion sickness compensation system of the present invention in a vehicle, utilizing a handheld input device, in accordance with an illustrative embodiment of the present invention.
- Figure 8 is a schematic diagram of a handheld computing device that incorporates the motion sickness compensation system of the present invention, in accordance with an illustrative embodiment of the present invention.
- FIGS 9A-9E are schematic diagrams of sample display screens generated by the motion sickness compensation system of the present invention, in accordance with an illustrative embodiment of the present invention.
- Articles "a" and "an" are used herein to refer to one or to more than one (i.e., at least one) of the grammatical object of the article.
- "an element" means at least one element and can include more than one element.
- all technical terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs.
- Figure 1A is a block diagram of a system 2 for preventing or reducing motion sickness in a subject.
- Figure 1B is a schematic diagram showing the system 2 positioned within and/or on a vehicle 6, in accordance with an illustrative embodiment of the present invention.
- the system 2 includes a motion sensor 8, a processor 14, an input device 9 and a visual output device 12.
- the motion sensor 8 includes one or more sensors that sense inertial motion.
- the motion sensor 8 can include one or more of the following sensor types: accelerometer(s); global positioning system (GPS); gyroscope(s); mechanical sensor(s); inclinometer(s); vibration sensor(s); altimeter(s); optical-based sensor(s); and image-based sensor(s).
- the motion sensor 8 is configured to detect movement (e.g., acceleration) in one or more directions (e.g., degrees of movement).
- the one or more directions can correspond to one or more of six degrees of movement, including translational movement 15 (e.g., up/down, left/right, forward/backward) and/or rotational movement 16 (e.g., pitch, yaw, roll).
- the motion sensor 8 includes a plurality of sensors each configured to measure one or more degrees of movement. The motion M measured by the motion sensor 8 is the same movement experienced by a user 4 of the system 2.
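As a concrete picture of the six degrees of movement just described, the sensed values can be held in a single structure with translational and rotational components. The class and field names here are illustrative assumptions, not terminology from the specification:

```python
from dataclasses import dataclass


@dataclass
class SixDegrees:
    """Six degrees of movement: three translational, three rotational."""
    # Translational movement (up/down, left/right, forward/backward)
    up_down: float = 0.0
    left_right: float = 0.0
    forward_backward: float = 0.0
    # Rotational movement (pitch, yaw, roll)
    pitch: float = 0.0
    yaw: float = 0.0
    roll: float = 0.0

    def translational(self) -> tuple:
        """The translational components of the measured movement."""
        return (self.up_down, self.left_right, self.forward_backward)

    def rotational(self) -> tuple:
        """The rotational components of the measured movement."""
        return (self.pitch, self.yaw, self.roll)
```

A plurality of sensors, as the text describes, could each populate a subset of these fields.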
- the system 2 is positioned within and/or on a vehicle 6 such as, for example, a ground-based vehicle (e.g., car, bus, truck, train, snowmobile, etc.), a water-based vehicle (e.g., boat, ship, submarine, jet-ski, etc.), a flight-based vehicle (e.g., airplane, helicopter, glider, rocket, etc.), and/or any other vehicle 6.
- a user 4 is positioned within and/or on the vehicle 6 and experiences movement (e.g., acceleration and/or deceleration) of the vehicle 6.
- movement of the vehicle 6 can cause motion sickness due to a disconnect between motion of the vehicle 6 and the user 4.
- the system 2 is configured to compensate for the movement of the vehicle 6 to prevent or reduce motion sickness in the user 4.
- the processor 14 is configured to receive a motion measurement M from the motion sensor 8 and to generate a compensatory output.
- the motion sensor 8 can be integrated into system 2, as shown in Figs. 1A and 1B, or can be positioned remotely from system 2, as depicted by motion sensor 8' in Fig. 1B.
- motion sensors can be both integrated into system 2 and located remotely from system 2.
- the compensatory output generated by processor 14 can be any suitable output configured to provide motion compensation to a user 4, such as, for example, a visual output displayed by the visual output device 12.
- the processor 14 preferably generates a visual output and displays it via the visual output device 12 in response to motion M sensed by the motion sensor 8.
- the visual output generated by processor 14 and displayed by the visual output device 12 is adapted to encourage a user 4 to provide a compensatory input to the system 2 via input device 9, such that the compensatory input corresponds to motion experienced by the user 4.
- the visual output device 12 can be any type of visual display known in the art, including, but not limited to, a virtual reality display, a liquid crystal display, an organic light-emitting diode (OLED) display, a cathode ray tube display, a head-mounted display, a projection type display, a holographic display or any other display known in the art.
- the visual output generated by the processor 14 and displayed by the visual output device 12 can be, for example, a game that encourages a user 4 to provide an input via the input device 9 to move an object on a path corresponding to the direction of the measured movement.
- the input device 9 can be any type of input device known in the art, including, but not limited to, a button, keypad, keyboard, click wheel, touchscreen, or a motion sensor that detects tilt, lateral movement and/or shaking, etc. of the visual output device 12.
- any input device known in the art that is capable of allowing the user 4 to enter the recommended input into the system 2 may be used.
- the input device 9 described herein may be implemented as physical mechanical components, virtual elements, and/or combinations thereof.
- the visual output is generated by the processor 14 so as to synchronize movements of the user 4 with movement of the vehicle 6, such that information generated by an inner ear of the user 4 correlates to actions taken by the user 4, thereby preventing or reducing motion sickness.
- the visual output generated by the processor 14 and displayed by the visual output device 12 can include any suitable visual output, such as terrain, targets, tasks, and/or any other suitable visual output.
- the compensatory input suggested by the processor 14 can relate to game actions, such as, for example, shooting/tracking of a target, movement along a path, collection of one or more objects, placing of one or more objects, etc.
- the visual output generated by the processor 14 and displayed by the visual output device 12 is based on a compensatory movement calculated by the processor 14 in response to a motion measurement M received from the motion sensor 8.
- the compensatory movement calculated by the processor 14 can include a similar, opposite, and/or composite movement based on the sensed movement M.
- the compensatory movement can be movement in a similar direction to the movement M but at a smaller amplitude.
- the compensatory movement can be movement in a single direction derived from sensed movement M in multiple directions. It will be appreciated that the compensatory movement calculated by the processor 14 and the corresponding visual output generated by the processor 14 can be any suitable compensatory movement adapted to elicit an appropriate compensatory input from a user 4 via the corresponding visual output displayed by the visual output device 12.
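The "similar, opposite, and/or composite" options described above can be sketched as simple transforms of a measured movement vector. The function names, the gain values, and the dominant-axis rule for the composite case are illustrative assumptions rather than details from the specification:

```python
def similar(measured, gain=0.5):
    """Movement in the same direction as M but at a smaller amplitude."""
    return [gain * x for x in measured]


def opposite(measured, gain=1.0):
    """Movement in the direction opposite to M."""
    return [-gain * x for x in measured]


def composite_single_direction(measured):
    """Collapse movement sensed in multiple directions onto a single
    direction: keep the dominant axis and zero the others."""
    dominant = max(range(len(measured)), key=lambda i: abs(measured[i]))
    out = [0.0] * len(measured)
    out[dominant] = measured[dominant]
    return out
```

Any of these could feed the visual-output generation step, depending on which compensation the processor selects for the sensed movement.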
- the system 2 can be incorporated into and/or used in conjunction with any suitable computing device, vehicle, and/or other system.
- the system 2 can be wholly and/or partially incorporated into a mobile computing device (e.g., a mobile phone, tablet computer, laptop computer, etc.), a computing device formed integrally with a vehicle (e.g., an in-flight entertainment system, a vehicle entertainment system, a vehicle display, etc.), a user wearable device, and/or any other suitable system.
- when the system 2 is incorporated into a tablet computer, for example, the motion sensor 8 is preferably incorporated into the tablet computer.
- the tablet computer's main processor can be adapted to function as the processor 14 of system 2 or alternatively, a separate processor can be incorporated into the tablet computer to perform the functions of processor 14.
- the visual output device 12 would be the tablet computer's display, and the input device can be the tablet computer's touchscreen and/or motion sensors in the tablet computer that detect movement of the tablet computer by a user 4.
- Figure 2 is a flowchart of a method for reducing or preventing motion sickness, in accordance with an illustrative embodiment of the present invention. The method 100 is discussed with reference to Figs. 1A, 1B and 2.
- movement M (e.g., acceleration) is measured by the motion sensor 8 of system 2.
- the movement M is generated by a vehicle 6, such as a ground-based vehicle, water-based vehicle, flight-based vehicle, and/or other vehicle.
- the measured motion M can include motion in one or more degrees of movement, such as, for example, six degrees of movement.
- the motion sensor 8 can be integrated into system 2, the vehicle 6, a user's 4 clothing, and/or any other suitable device configured to detect motion experienced by the user 4.
- a compensatory output is generated by the processor 14 based on the measured motion M.
- the compensatory output can be any suitable output adapted to provide motion compensation to a user 4, and is preferably a visual output.
- the compensatory output is adapted to elicit a compensatory input from a user 4.
- a compensatory input is received by the processor 14.
- the compensatory input is generated by a user 4 via input device 9.
- the compensatory input is based on sensed movement M and is adapted to synchronize movement of the user 4 with experienced movement M.
- the compensatory input is adapted to mimic the movement of an operator of the vehicle 6.
- the compensatory input is adapted to reduce or prevent motion sickness in the user 4.
- the visual output of the system 2, via the visual output device 12, is updated in response to the compensatory input by the user 4.
- the visual output is a game or other interactive visual output and the compensatory input by the user 4 via the input device 9 is translated to movement of one or more elements within the game.
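One way to picture this update step: the user's compensatory input moves a game element, and the game can also check how closely the input tracks the calculated compensatory movement. This is a sketch under assumed names; the tolerance value and scoring rule are illustrative, not from the specification:

```python
def update_element(position, user_input, target, tolerance=0.1):
    """Apply the user's compensatory input to a game element.

    Returns the element's new position and whether the input matched
    the calculated compensatory movement within the given tolerance.
    """
    new_position = position + user_input
    matched = abs(user_input - target) <= tolerance
    return new_position, matched
```

A game built on this step could, for example, keep an object on a displayed path only while the user's inputs continue to match the calculated compensatory movements.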
- FIG. 3 is a schematic diagram of a motion sickness compensation system for preventing or reducing motion sickness in a subject, in accordance with another illustrative embodiment of the present invention.
- the motion sickness compensation system 2a is a representative device and may comprise a processor subsystem 204, an input/output subsystem 206, a memory subsystem 208, a communications interface 210, and a system bus 212.
- one or more of the motion sickness compensation system 2a components may be combined or omitted such as, for example, not including the communications interface 210.
- the motion sickness compensation system 2a may comprise other components not combined or included in the components shown in Fig. 3.
- the motion sickness compensation system 2a may also comprise a power subsystem (not shown).
- the motion sickness compensation system 2a may comprise multiple iterations of the components shown in Fig. 3.
- the motion sickness compensation system 2a may comprise multiple memory subsystems 208.
- one of each of the listed components is shown in Fig. 3.
- Processor subsystem 204 has the same functionality as the processor 14 described above in connection with Fig. 1A, and thus the following description of processor subsystem 204 applies equally to the processor 14 of Fig. 1 A.
- the processor subsystem 204 may comprise any processing circuitry operative to control the operations and performance of the motion sickness compensation system 2a.
- the processor subsystem 204 may be implemented as a general purpose processor, a chip multiprocessor (CMP), a dedicated processor, an embedded processor, a digital signal processor (DSP), a network processor, an input/output (I/O) processor, a media access control (MAC) processor, a radio baseband processor, a co-processor, a microprocessor such as a complex instruction set computer (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, and/or a very long instruction word (VLIW) microprocessor, or other processing device.
- the processor subsystem 204 also may be implemented by a controller, a microcontroller, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a programmable logic device (PLD), and so forth.
- the processor subsystem 204 may be arranged to run an operating system (OS) and various applications.
- applications comprise, for example, a telephone application, a camera (e.g., digital camera, video camera) application, a browser application, a multimedia player application, a gaming application, a messaging application (e.g., email, short message, multimedia), a viewer application, and so forth.
- the motion sickness compensation system 2a may comprise a system bus 212 that couples various system components including the processing subsystem 204, the input/output subsystem 206, the memory subsystem 208, and/or the communications subsystem 210.
- the system bus 212 can be any of several types of bus structure(s) including a memory bus or memory controller, a peripheral bus or external bus, and/or a local bus using any variety of available bus architectures including, but not limited to, 9-bit bus, Industrial Standard Architecture (ISA), Micro-Channel Architecture (MSA), Extended ISA (EISA), Intelligent Drive Electronics (IDE), VESA Local Bus (VLB), Peripheral Component Interconnect Card International Association Bus (PCMCIA), Small Computer Systems Interface (SCSI) or other proprietary bus, or any custom bus suitable for computing device applications.
- FIG. 4 is a schematic diagram of the input/output subsystem 206 of Figure 3, in accordance with an illustrative embodiment of the present invention.
- the input/output subsystem 206 may comprise any suitable mechanism or component to at least enable a user to provide input to the motion sickness compensation system 2a and to enable the motion sickness compensation system 2a to provide output to the user.
- the input/output subsystem 206 may comprise any suitable input mechanism, including but not limited to, a button, keypad, keyboard, click wheel, touchscreen, or motion sensor.
- the input/output subsystem 206 may comprise a capacitive sensing mechanism, or a multi-touch capacitive sensing mechanism. It will be appreciated that any of the input mechanisms described herein may be implemented as physical mechanical components, virtual elements, and/or combinations thereof.
- the input/output subsystem 206 preferably comprises a visual peripheral output device 214 for providing a display visible to the user.
- Visual peripheral output device 214 has the same functionality as the visual output device 12 described above in connection with Fig. 1A, and thus the following description of the visual peripheral output device 214 applies equally to the visual output device 12 of Fig. 1A.
- the visual peripheral output device 214 may comprise a screen such as, for example, a Liquid Crystal Display (LCD) screen, incorporated into the motion sickness compensation system 2a.
- the visual peripheral output device 214 may comprise a movable display or projecting system for providing a display of content on a surface remote from the motion sickness compensation system 2a.
- the visual peripheral output device 214 can comprise a coder/ decoder, also known as a Codec, to convert digital media data into analog signals.
- the visual peripheral output device 214 may comprise video Codecs, audio Codecs, or any other suitable type of Codec.
- the visual peripheral output device 214 also may comprise display drivers, circuitry for driving display drivers, or both.
- the visual peripheral output device 214 may be operative to display content under the direction of the processor subsystem 204.
- the visual peripheral output device 214 may be operative to display media playback information, application screens for an application implemented on the motion sickness compensation system 2a, information regarding ongoing communications operations, information regarding incoming communications requests, or device operation screens, to name only a few.
- the input/output subsystem 206 preferably comprises a motion sensor 216.
- Motion sensor 216 has the same functionality as the motion sensor 8 described above in connection with Fig. 1A, and thus the following description of the motion sensor 216 applies equally to the motion sensor 8 of Fig. 1A.
- the motion sensor 216 may comprise any suitable motion sensor operative to detect movement of the motion sickness compensation system 2a.
- the motion sensor 216 may be operative to detect acceleration or deceleration of the motion sickness compensation system 2a as manipulated by a user and/ or as caused by a vehicle 6.
- the motion sensor 216 may comprise one or more three-axis acceleration motion sensors (e.g., an accelerometer) operative to detect linear acceleration in three directions (i.e., the x (or left/right) direction, the y (or up/down) direction, and the z (or forward/backward) direction).
- the motion sensor 216 may comprise one or more two-axis acceleration motion sensors which may be operative to detect linear acceleration only along each of the x (or left/right) and y (or up/down) directions (or any other pair of directions).
- the motion sensor 216 may comprise an electrostatic capacitance (capacitance-coupling) accelerometer that is based on silicon micro-machined MEMS (Micro Electro Mechanical Systems) technology, a piezoelectric type accelerometer, a piezoresistance type accelerometer, or any other suitable accelerometer.
- the motion sensor 216 may be operative to directly detect rotation, rotational movement, angular displacement, tilt, position, orientation, motion along a non-linear (e.g., arcuate) path, or any other non-linear motions.
- additional processing may be used to indirectly detect some or all of the non-linear motions.
- the motion sensor 216 may be operative to calculate the tilt of the motion sickness compensation system 2a with respect to the y-axis.
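The disclosure does not give the tilt formula; a common approach treats a quasi-static accelerometer reading as the gravity vector and takes its angle to the y-axis. A minimal sketch in Python, where the function name and the gravity-vector assumption are illustrative rather than taken from the patent:

```python
import math

def tilt_about_y(ax, ay, az):
    """Estimate tilt relative to the y-axis from one 3-axis accelerometer
    sample. Assumes the device is quasi-static, so the measured vector is
    dominated by gravity (an assumption, not stated in the disclosure)."""
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    if magnitude == 0.0:
        raise ValueError("zero acceleration vector")
    # Clamp to guard against rounding pushing the ratio outside [-1, 1].
    cos_angle = max(-1.0, min(1.0, ay / magnitude))
    return math.degrees(math.acos(cos_angle))

# Upright device: gravity entirely along y, so tilt is ~0 degrees.
upright = tilt_about_y(0.0, 9.81, 0.0)
# Device lying flat: gravity along z, so tilt is ~90 degrees.
flat = tilt_about_y(0.0, 0.0, 9.81)
```

In practice such an estimate would be low-pass filtered to suppress the linear-acceleration component before the angle is taken.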
- the motion sensor 216 may instead or in addition comprise one or more gyro-motion sensors or gyroscopes for detecting rotational movement.
- the motion sensor 216 may comprise a rotating or vibrating element.
- the motion sensor 216 may comprise one or more controllers (not shown) coupled to the accelerometers or gyroscopes.
- the controllers may be used to calculate a moving vector of the motion sickness compensation system 2a.
- the moving vector may be determined according to one or more predetermined formulas based on the movement data (e.g., x, y, and z axis movement information) provided by the accelerometers or gyroscopes.
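The "predetermined formulas" are left unspecified; one simple possibility is rectangular integration of the per-axis acceleration samples into a velocity ("moving") vector. A hedged sketch, with the function name and the integration rule being assumptions:

```python
def moving_vector(samples, dt):
    """Integrate per-axis acceleration samples (ax, ay, az), taken at a
    fixed interval of dt seconds, into a velocity ("moving") vector.
    Rectangular integration is an illustrative choice; the disclosure
    leaves the predetermined formulas open."""
    vx = vy = vz = 0.0
    for ax, ay, az in samples:
        vx += ax * dt
        vy += ay * dt
        vz += az * dt
    return (vx, vy, vz)

# Constant 2 m/s^2 forward acceleration sampled 100 times at 10 ms
# accumulates to roughly 2 m/s of forward velocity.
v = moving_vector([(0.0, 0.0, 2.0)] * 100, 0.01)
```

A real controller would also correct for sensor bias and drift, which naive integration amplifies over time.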
- the input/output subsystem 206 may comprise a virtual input/output system 218.
- the virtual input/output system 218 is capable of providing input/output options by combining one or more input/output components to create a virtual input type.
- the virtual input/output system 218 may enable a user to input information through an on-screen keyboard which utilizes the touchscreen and mimics the operation of a physical keyboard, or using the motion sensor 216 to control a pointer on the screen instead of utilizing the touchscreen.
- the virtual input/output system 218 may enable alternative methods of input and output to enable use of the device by persons having various disabilities.
- the virtual input/output system 218 may convert on-screen text to spoken words to enable reading-impaired persons to operate the device.
- the input/output subsystem 206 may comprise specialized output circuitry associated with output devices such as, for example, an audio peripheral output device 220.
- the audio peripheral output device 220 may comprise an audio output including one or more speakers integrated into the motion sickness compensation system 2a.
- the speakers may be, for example, mono or stereo speakers.
- the audio peripheral output device 220 also may comprise an audio component remotely coupled to the audio peripheral output device 220 such as, for example, a headset, headphones, and/or ear buds which may be coupled to the audio peripheral output device 220 through the communications subsystem 210.
- FIG. 5 is a schematic diagram of the communications interface 210 of Figure 3, in accordance with an illustrative embodiment of the present invention.
- the communications interface 210 may comprise any suitable hardware, software, or combination of hardware and software that is capable of coupling the motion sickness compensation system 2a to one or more networks and/or devices.
- the communications interface 210 may be arranged to operate with any suitable technique for controlling information signals using a desired set of communications protocols, services or operating procedures.
- the communications interface 210 may comprise the appropriate physical connectors to connect with a corresponding communications medium, whether wired or wireless.
- a network may comprise local area networks (LAN) as well as wide area networks (WAN) including without limitation Internet, wired channels, wireless channels, communication devices including telephones, computers, wire, radio, optical or other electromagnetic channels, and combinations thereof, including other devices and/or components capable of/associated with communicating data.
- the communication environments comprise in-body communications, various devices, and various modes of communications such as wireless communications, wired communications, and combinations of the same.
- Wireless communication modes comprise any mode of communication between points (e.g., nodes) that utilize, at least in part, wireless technology including various protocols and combinations of protocols associated with wireless transmission, data, and devices.
- the points comprise, for example, wireless devices such as wireless headsets, audio and multimedia devices and equipment, such as audio players and multimedia players, telephones, including mobile telephones and cordless telephones, and computers and computer-related devices and components, such as printers.
- Wired communication modes comprise any mode of communication between points that utilize wired technology including various protocols and combinations of protocols associated with wired transmission, data, and devices.
- the points comprise, for example, devices such as audio and multimedia devices and equipment, such as audio players and multimedia players, telephones, including mobile telephones and cordless telephones, and computers and computer-related devices and components, such as printers.
- the wired communication modules may communicate in accordance with a number of wired protocols.
- wired protocols may comprise Universal Serial Bus (USB) communication, RS-232, RS-422, RS-423, RS-485 serial protocols, FireWire, Ethernet, Fibre Channel, MIDI, ATA, Serial ATA, PCI Express, T-1 (and variants), Industry Standard Architecture (ISA) parallel communication, Small Computer System Interface (SCSI) communication, or Peripheral Component Interconnect (PCI) communication, to name only a few examples.
- the communications interface 210 may comprise one or more interfaces such as, for example, a wireless communications interface 222, a wired communications interface 224, a network interface, a transmit interface, a receive interface, a media interface, a system interface 226, a component interface, a switching interface, a chip interface, a controller, and so forth.
- the communications interface 210 may comprise a wireless interface 222 comprising one or more antennas 228, transmitters, receivers, transceivers, amplifiers, filters, control logic, and so forth.
- the communications interface 210 may provide voice and/or data communications functionality in accordance with different types of cellular radiotelephone systems.
- the described aspects may communicate over wireless shared media in accordance with a number of wireless protocols.
- wireless protocols may comprise various wireless local area network (WLAN) protocols, including the Institute of Electrical and Electronics Engineers (IEEE) 802.xx series of protocols, such as IEEE 802.11a/b/g/n, IEEE 802.16, IEEE 802.20, and so forth.
- wireless protocols may comprise various wireless wide area network (WWAN) protocols, such as GSM cellular radiotelephone system protocols with GPRS, CDMA cellular radiotelephone communication systems with 1xRTT, EDGE systems, EV-DO systems, EV-DV systems, HSDPA systems, and so forth.
- Further examples of wireless protocols may comprise wireless personal area network (PAN) protocols, such as an Infrared protocol, a protocol from the Bluetooth Special Interest Group (SIG) series of protocols, including Bluetooth Specification versions v1.0, v1.1, v1.2, v2.0, v2.0 with Enhanced Data Rate (EDR), as well as one or more Bluetooth Profiles, and so forth.
- wireless protocols may comprise near-field communication techniques and protocols, such as electro-magnetic induction (EMI) techniques.
- EMI techniques may comprise passive or active radio-frequency identification (RFID) protocols and devices.
- Other suitable protocols may comprise Ultra Wide Band (UWB), Digital Office (DO), Digital Home, Trusted Platform Module (TPM), ZigBee, and so forth.
- the described aspects may comprise part of a cellular communication system.
- cellular communication systems may comprise CDMA cellular radiotelephone communication systems, GSM cellular radiotelephone systems, North American Digital Cellular (NADC) cellular radiotelephone systems, Time Division Multiple Access (TDMA) cellular radiotelephone systems, Extended-TDMA (E-TDMA) cellular radiotelephone systems, Narrowband Advanced Mobile Phone Service (NAMPS) cellular radiotelephone systems, third generation (3G) wireless standards systems such as WCDMA, CDMA-2000, UMTS cellular radiotelephone systems compliant with the Third-Generation Partnership Project (3GPP), fourth generation (4G) wireless standards, and so forth.
- FIG. 6 is a schematic diagram of the memory subsystem 208 of Figure 3, in accordance with an illustrative embodiment of the present invention.
- the memory subsystem 208 may comprise any machine-readable or computer-readable media capable of storing data, including both volatile/non-volatile memory and removable/non-removable memory.
- the memory subsystem 208 may comprise at least one non-volatile memory unit 230 and a local bus 234.
- the non-volatile memory unit 230 is capable of storing one or more software programs 232_1-232_n.
- the software programs 232_1-232_n may contain, for example, applications, user data, device data, and/or configuration data, or combinations thereof, to name only a few.
- the software programs 232_1-232_n may contain instructions executable by the various components of the motion sickness compensation system 2a.
- memory may comprise read-only memory (ROM), random-access memory (RAM), dynamic RAM (DRAM), Double-Data-Rate DRAM (DDR-RAM), synchronous DRAM (SDRAM), static RAM (SRAM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory (e.g., NOR or NAND flash memory), content addressable memory (CAM), polymer memory (e.g., ferroelectric polymer memory), phase-change memory (e.g., ovonic memory), ferroelectric memory, silicon-oxide-nitride-oxide-silicon (SONOS) memory, disk memory (e.g., floppy disk, hard drive, optical disk, magnetic disk), or card memory (e.g., magnetic or optical card).
- the memory subsystem 208 may contain a software program for motion sickness compensation using the capabilities of the motion sickness compensation system 2a and the motion sensor 204, as discussed in connection with FIGS. 1-2.
- the memory subsystem 208 may contain an instruction set, in the form of a file 232_n for executing a method of motion sickness compensation.
- the instruction set may be stored in any acceptable form of machine readable instructions, including source code or various appropriate programming languages. Some examples of programming languages that may be used to store the instruction set comprise, but are not limited to: Java, C, C++, C#, Python, Objective-C, Visual Basic, or .NET programming.
- a compiler or interpreter may be included to convert the instruction set into machine executable code for execution by the processing subsystem 204.
- FIG. 7 is a schematic diagram of the motion sickness compensation system of the present invention in a vehicle and that utilizes a handheld input device, in accordance with an illustrative embodiment of the present invention.
- the system 2b is similar to systems 2 and 2a described above, and detailed descriptions of common components are not repeated herein.
- the system 2b includes a handheld input device 9 configured to receive one or more inputs from a user 4.
- the handheld input device 9 can include any suitable inputs such as one or more buttons, d-pads, joysticks, motion sensors, toggles, triggers, and/or any other suitable input device.
- Visual output devices 12 are positioned in-line with a user's 4 line of sight.
- local processors 14 are configured to receive one or more inputs from respective motion sensors, such as a local motion sensor 8a and/or a central motion sensor 8b.
- the motion sensors 8a, 8b are configured to detect motion in one or more degrees of freedom of a vehicle 6 containing the user 4 and the system 2b.
- the central motion sensor 8b can provide a signal indicative of the one or more degrees of freedom of the vehicle 6 to local processors 14.
- the local processors 14 generate images on the visual output devices 12 (e.g., displays) configured to elicit compensatory movement and/or inputs from a respective user 4.
- the system 2b includes an audio output system 36, such as, for example, headphones.
- the audio output system 36 is configured to provide an audio component to the user 4 to further reduce motion sickness.
- the audio output system 36 is configured to provide an auditory distraction, such as game sounds, music, etc., in conjunction with the visual output from the visual output device 12.
- the audio output can be matched to the visual output of the visual output device 12, can be user-selected, and/or can be generated by the system 2b.
- FIG. 8 is a schematic diagram of a handheld computing device that incorporates the motion sickness compensation system of the present invention, in accordance with an illustrative embodiment of the present invention.
- the system 2c is similar to systems 2, 2a and 2b discussed above, and detailed descriptions of common components are not repeated herein.
- the system 2c includes a handheld computing device 40 containing one or more of the elements of the system 2c.
- the handheld computing device 40 includes a processor 44, a motion sensor 8, a visual output device 12, one or more input devices, and/or any other suitable elements of the system 2.
- the visual output device 12 includes a touchscreen or other input device formed integrally with, above, and/or below the visual output device 12 to receive input from a user 4.
- Figures 9A-9E are schematic diagrams of sample display screens generated by the motion sickness compensation system 2c of Fig. 8, in accordance with an illustrative embodiment of the present invention.
- Fig. 9A illustrates an initial or starting state of a visual output device 12.
- the visual output device 12 is configured to display a simulated vehicle or target 50 aligned with a simulated horizon 52.
- Fig. 9B illustrates the visual output device 12 during an angular acceleration (e.g., a turn) of the vehicle 6 containing system 2c.
- the visual output device 12 shows the simulated horizon 52 tilted in response to the angular acceleration.
- the vehicle 6 has a leftward angular acceleration (e.g., a left turn) which corresponds to a lowered left-side and raised right-side of the simulated horizon 52.
- tilting or rotation of the simulated horizon 52 can be in any suitable direction corresponding to movement, such as angular acceleration, of the vehicle 6.
- the tilted simulated horizon 52 encourages a user 4 to provide an input to align the simulated vehicle 50 with the simulated horizon 52.
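One way to read the behavior of Figs. 9B-9C is as a simple mapping from sensed angular acceleration to a roll of the simulated horizon 52, with the user's tilt input then judged against that roll. The gain, clamp, and tolerance below are illustrative assumptions, not values from the disclosure:

```python
def horizon_roll(angular_accel, gain=8.0, max_roll=30.0):
    """Map sensed angular acceleration (positive = leftward turn) to a
    roll of the simulated horizon 52 in degrees; a leftward turn lowers
    the left side and raises the right. Gain and clamp are assumed."""
    return max(-max_roll, min(max_roll, gain * angular_accel))

def is_aligned(user_tilt_deg, horizon_roll_deg, tolerance=2.0):
    """True when the user's tilt input has realigned the simulated
    vehicle 50 with the simulated horizon 52 (tolerance is assumed)."""
    return abs(user_tilt_deg - horizon_roll_deg) <= tolerance

roll = horizon_roll(2.0)          # 16.0 degrees of horizon roll
aligned = is_aligned(15.0, roll)  # within the assumed 2-degree tolerance
```

The compensatory loop then reduces to updating the horizon roll each frame and rewarding the user when alignment is restored.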
- the handheld computing device 40 includes the visual output device 12 and one or more inputs (not shown).
- a user 4 tilts the handheld computing device 40 to rotate the simulated vehicle 50 into alignment with the simulated horizon 52.
- the movement of the handheld computing device 40 (or other input) provides simulated movement control that corresponds to the movement detected by one or more biological systems of the user 4.
- the input device is configured to mimic a control of the vehicle 6 (such as a steering wheel) to compensate for motion experienced by a passenger in the vehicle.
- Fig. 9D illustrates the visual output device 12 during a linear acceleration (e.g., increasing/decreasing speed).
- the simulated horizon 52 is raised relative to a center line of the visual output device 12.
- the simulated horizon 52 can be moved at a rate proportional to the movement/acceleration of the vehicle 6. In other embodiments, the simulated horizon 52 is moved at a fixed rate unrelated to the magnitude of the movement/acceleration of the vehicle 6. Movement of the simulated horizon 52 encourages compensatory input from the user 4.
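Both variants of horizon movement described above, proportional and fixed-rate, can be sketched as small mapping functions. All numeric values here are assumptions for illustration:

```python
def horizon_offset(linear_accel, gain=5.0, max_offset=40.0):
    """Proportional variant: vertical displacement of the simulated
    horizon (in pixels) scales with the measured linear acceleration,
    clamped to the screen. Gain and clamp are assumed values."""
    return max(-max_offset, min(max_offset, gain * linear_accel))

def horizon_offset_step(direction, rate_px_per_s=20.0, dt=0.016):
    """Fixed-rate variant: a per-frame displacement at a constant speed
    in the direction of the detected movement, regardless of magnitude."""
    sign = (direction > 0) - (direction < 0)
    return sign * rate_px_per_s * dt
```

Either mapping produces the displaced horizon that prompts the user's compensatory input.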
- a user 4 can tilt and/or raise the handheld computing device 40 to compensate for movement of the simulated horizon 52.
- the movement of the handheld computing device 40 (or other input) provides simulated movement control that corresponds to the movement detected by one or more biological systems of the user 4.
- the compensatory input can be provided by any suitable input device, such as a controller, a handheld device, a head mounted device, a gesture tracking device, and/or any other suitable input.
Abstract
A system and method for preventing or reducing motion sickness in a subject is provided that measures at least one degree of movement that is experienced by the subject and the system, calculates a compensatory movement in response to the at least one degree of movement, and generates a visual image that encourages the subject to generate the calculated compensatory movement.
Description
SYSTEM AND METHOD FOR PREVENTION OR REDUCTION OF MOTION
SICKNESS
Statement of Related Cases
[0001] This application claims priority to U.S. Provisional Application Serial No. 62/570,289, filed October 10, 2017, whose entire disclosure is incorporated herein by reference.
Field of the Invention
[0002] The present invention relates to the prevention or reduction of motion sickness brought about by movement and, more particularly, to a motion sickness prevention or reduction system and method that encourages a subject to generate a compensatory movement in response to a measured movement.
Background of the Invention
[0003] Motion sickness is generally a result of a sensory mismatch between inner ear sensation, visual information, and other body senses. When the inner ear, visual information, and other body senses become desynchronized (i.e., mismatched), the body has a similar response to having been poisoned, resulting in vomiting, vertigo, and/or other symptoms. For example, passengers in vehicles are more likely to experience motion sickness due to the sensory mismatch between the passenger's body senses (e.g., inner ear) and the passenger's actions (e.g., passenger sitting passively). In contrast, drivers or operators of vehicles are much less likely to experience motion sickness due to the
correspondence between body senses and the driver's actions (e.g., controlling the vehicle with expected accelerations, motion, etc.).
[0004] Current treatment options include medications and static visuals. Medications treat symptoms of motion sickness but do not address the root cause (i.e., the desynchronization of senses). Static visuals can provide some relief, but fail to provide synchronization between all three of: (1) an inner ear sensation; (2) visual information; and (3) other body senses. Static visuals fail to provide compensation for frequencies that commonly cause motion sickness such as, for example, 1 vibration every 3-10 seconds (e.g., ocean waves, vehicle movement on mountain roads, etc.). Thus, there is a need for a system and method to prevent or reduce motion sickness that does not rely on medication and that is capable of compensating for movement frequencies that commonly cause motion sickness.
SUMMARY OF THE INVENTION
[0005] An object of the invention is to solve at least the above problems and/or disadvantages and to provide at least the advantages described hereinafter.
[0006] The present invention provides a system and method for preventing or reducing motion sickness in a subject that measures at least one degree of movement that is experienced by the subject and the system, calculates a compensatory movement in response to the at least one degree of movement, and generates a visual image that encourages the subject to generate the calculated compensatory movement.
[0007] In various embodiments, a method for preventing or reducing motion sickness is disclosed. The method includes a step of measuring at least one degree of movement. The measured degree of movement is a shared movement experienced by both the system executing the method and a subject. The at least one degree of movement can be a translational movement (e.g., forward, back, lateral, etc.) and/or a rotational movement. The method also includes calculating a compensatory movement in response to the at least one measured degree of movement. The compensatory movement can be a similar, opposite, and/or composite movement based on the at least one measured degree of movement. The method also includes generating an interactive visual image based on the compensatory movement and presenting the interactive visual image to the subject. The interactive visual image encourages the subject to initiate the compensatory movement.
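The three steps above (measure a shared movement, calculate a similar/opposite/composite compensation, present an image eliciting it) can be sketched as one loop iteration. The composite blend factor and the callback names are assumptions for illustration, not details from the disclosure:

```python
def compensatory_movement(measured, mode="opposite"):
    """Derive a compensatory movement vector from a measured one.
    'similar', 'opposite', and 'composite' mirror the three options
    named in the method; the composite rule is an assumed example."""
    if mode == "similar":
        return tuple(measured)
    if mode == "opposite":
        return tuple(-m for m in measured)
    if mode == "composite":
        # Assumed blend: oppose the measured movement at half strength.
        return tuple(-0.5 * m for m in measured)
    raise ValueError("unknown mode: %s" % mode)

def compensation_step(read_motion, render, mode="opposite"):
    """One iteration: measure the shared movement, calculate the
    compensation, and render a visual image that encourages it."""
    measured = read_motion()
    target = compensatory_movement(measured, mode)
    render(target)
    return target

frames = []
result = compensation_step(lambda: (1.0, -2.0, 0.5), frames.append)
```

In a running system, `read_motion` would poll the motion sensor and `render` would drive the interactive visual image on the output device.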
[0008] In various embodiments, a system for preventing or reducing motion sickness is disclosed. The system includes a processor, a motion sensor, a visual output device, and an input device. The motion sensor is configured to measure at least one degree of movement of the system. The at least one degree of movement corresponds to a shared movement experienced by the system and a user. The processor receives the measured movement and generates a compensatory movement. The compensatory movement can be a similar, opposite, and/ or composite movement based on the measured movement. The processor generates a visual image on the visual output device based on the calculated compensatory movement. The visual image encourages a user to move the system to
generate the compensatory movement. The compensatory movement compensates for movement experienced by the user and avoids motion sickness in a user.
[0009] An embodiment of the invention is a system for preventing or reducing motion sickness in a subject, comprising a motion sensor adapted to measure movement experienced by the subject and the system; a visual output device; an input device; and a processor in signal communication with the motion sensor and the visual output device, wherein the processor is configured to: receive a motion measurement from the motion sensor, calculate a compensatory movement in response to the motion measurement, generate an interactive visual image based on the calculated compensatory movement, wherein the interactive visual image encourages the subject to initiate the compensatory movement via the input device, and display the interactive visual image on the visual output device, wherein the calculated compensatory movement and the displayed interactive visual image are adapted to synchronize inner ear information, visual information and body information of the subject when the subject initiates the compensatory movement.
[0010] Another embodiment of the invention is a method of preventing or reducing motion sickness in a subject, comprising measuring motion experienced by the subject; calculating a compensatory movement in response to the measured motion; generating an interactive visual image based on the calculated compensatory movement, wherein the interactive visual image encourages the subject to initiate the compensatory movement via an input device; and displaying the interactive visual image to the subject; wherein the calculated compensatory movement and the displayed interactive visual image are adapted to
synchronize inner ear information, visual information and body information of the subject when the subject initiates the compensatory movement.
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] The invention will be described in detail with reference to the following drawings in which like reference numerals refer to like elements wherein:
[0012] Figure 1A is a block diagram of a system for preventing or reducing motion sickness in a subject, in accordance with an illustrative embodiment of the present invention;
[0013] Figure 1B is a schematic diagram showing the system of Fig. 1A positioned within and/or on a vehicle, in accordance with an illustrative embodiment of the present invention;
[0014] Figure 2 is a flowchart of a motion sickness compensation method for reducing or preventing motion sickness, in accordance with an illustrative embodiment of the present invention;
[0015] Figure 3 is a schematic diagram of a motion sickness compensation system for preventing or reducing motion sickness in a subject, in accordance with another illustrative embodiment of the present invention;
[0016] Figure 4 is a schematic diagram of the input/output subsystem of Figure 3, in accordance with an illustrative embodiment of the present invention;
[0017] Figure 5 is a schematic diagram of the communications interface of Figure 3, in accordance with an illustrative embodiment of the present invention;
[0018] Figure 6 is a schematic diagram of the memory subsystem of Figure 3, in accordance with an illustrative embodiment of the present invention;
[0019] Figure 7 is a schematic diagram of the motion sickness compensation system of the present invention in a vehicle and that utilizes a handheld input device, in accordance with an illustrative embodiment of the present invention;
[0020] Figure 8 is a schematic diagram of a handheld computing device that incorporates the motion sickness compensation system of the present invention, in accordance with an illustrative embodiment of the present invention; and
[0021] Figures 9A-9E are schematic diagrams of sample display screens generated by the motion sickness compensation system of the present invention, in accordance with an illustrative embodiment of the present invention.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
[0022] In the following detailed description of various embodiments of the system and method of the present invention, numerous specific details are set forth in order to provide a thorough understanding of various aspects of one or more embodiments. However, the one or more embodiments may be practiced without some or all of these specific details. In other instances, well-known methods, procedures, and/or components have not been described in detail so as not to unnecessarily obscure aspects of embodiments.
[0023] Articles "a" and "an" are used herein to refer to one or to more than one (i.e. at least one) of the grammatical object of the article. By way of example, "an element" means
at least one element and can include more than one element. Unless otherwise defined, all technical terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs.
[0024] The drawing figures are not necessarily to scale and certain features of the invention may be shown exaggerated in scale or in somewhat schematic form in the interest of clarity and conciseness. In this description, relative terms such as "horizontal," "vertical," "up," "down," "top," "bottom," as well as derivatives thereof (e.g., "horizontally," "downwardly," "upwardly," etc.) should be construed to refer to the orientation as then described or as shown in the drawing figure under discussion. These relative terms are for convenience of description and normally are not intended to require a particular orientation.
[0025] Terms including "inwardly" versus "outwardly," "longitudinal" versus "lateral" and the like are to be interpreted relative to one another or relative to an axis of elongation, or an axis or center of rotation, as appropriate. Terms concerning electrical attachments, coupling and the like, such as "electrically connected," "electrically coupled," or "in signal communication" refer to a relationship wherein elements are electrically coupled to one another either directly or indirectly through intervening elements and through any combination of wired or wireless communication channels. The terms "user," "subject" and "passenger" are used interchangeably and refer to individuals that utilize the present invention for reducing or preventing motion sickness. The terms "movement" and "motion" are also used interchangeably.
[0026] While preferred embodiments are disclosed, still other embodiments of the system and method of the present invention will become apparent to those skilled in the art from the following detailed description, which shows and describes illustrative embodiments. As will be realized, the following disclosure is capable of modifications in various obvious aspects, all without departing from the spirit and scope of the present invention. Also, the reference or non-reference to a particular embodiment of the invention shall not be interpreted to limit the scope of the present invention.
[0027] Figure 1A is a block diagram of a system 2 for preventing or reducing motion sickness in a subject, and Figure 1B is a schematic diagram showing the system 2 positioned within and/or on a vehicle 6, in accordance with an illustrative embodiment of the present invention. The system 2 includes a motion sensor 8, a processor 14, an input device 9 and a visual output device 12.
[0028] The motion sensor 8 includes one or more sensors that sense inertial motion. For example, the motion sensor 8 can include one or more of the following sensor types: accelerometer(s); global positioning system (GPS); gyroscope(s); mechanical sensor(s); inclinometer(s); vibration sensor(s); altimeter(s); optical-based sensor(s); and image-based sensor(s). The motion sensor 8 is configured to detect movement (e.g., acceleration) in one or more directions (e.g., degrees of movement). The one or more directions can correspond to one or more of six degrees of movement, including translational movement 15 (e.g., up/down, left/right, forward/backward) and/or rotational movement 16 (e.g., pitch, yaw, roll). In some embodiments, the motion sensor 8 includes a plurality of sensors each
configured to measure one or more degrees of movement. The motion M measured by the motion sensor 8 is the same movement experienced by a user 4 of the system 2.
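By way of a non-limiting sketch, the six degrees of movement and a plurality-of-sensors arrangement described above could be modeled as follows; the field names, units, and the simple per-axis averaging are illustrative assumptions, not features recited by the disclosure:

```python
from dataclasses import dataclass

@dataclass
class MotionMeasurement:
    """Hypothetical 6-DOF sample: translational movement 15 and rotational movement 16."""
    x: float = 0.0      # left/right (assumed m/s^2)
    y: float = 0.0      # up/down
    z: float = 0.0      # forward/backward
    pitch: float = 0.0  # rotational components (assumed rad/s)
    yaw: float = 0.0
    roll: float = 0.0

def fuse(samples):
    """Combine readings from a plurality of sensors into one measurement M
    (simple per-axis average; a real system might weight or filter)."""
    n = len(samples)
    return MotionMeasurement(
        x=sum(s.x for s in samples) / n,
        y=sum(s.y for s in samples) / n,
        z=sum(s.z for s in samples) / n,
        pitch=sum(s.pitch for s in samples) / n,
        yaw=sum(s.yaw for s in samples) / n,
        roll=sum(s.roll for s in samples) / n,
    )
```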
[0029] As illustrated in Fig. 1B, the system 2 is positioned within and/or on a vehicle 6 such as, for example, a ground-based vehicle (e.g., car, bus, truck, train, snowmobile, etc.), a water-based vehicle (e.g., boat, ship, submarine, jet-ski, etc.), a flight-based vehicle (e.g., airplane, helicopter, glider, rocket, etc.), and/or any other vehicle 6. A user 4 is positioned within and/or on the vehicle 6 and experiences movement (e.g., acceleration and/or deceleration) of the vehicle 6.
[0030] In some cases, movement of the vehicle 6 can cause motion sickness due to a disconnect between the motion of the vehicle 6 and the motion perceived by the user 4. The system 2 is configured to compensate for the movement of the vehicle 6 to prevent or reduce motion sickness in the user 4.
[0031] The processor 14 is configured to receive a motion measurement M from the motion sensor 8 and to generate a compensatory output. The motion sensor 8 can be integrated into system 2, as shown in Figs. 1A and 1B, or can be positioned remotely from system 2, as depicted by motion sensor 8' in Fig. 1B. In addition, motion sensors can be both integrated into system 2 and also located remotely from system 2.
[0032] The compensatory output generated by processor 14 can be any suitable output configured to provide motion compensation to a user 4, such as, for example, a visual output displayed by the visual output device 12. The processor 14 preferably generates a visual output and displays it via the visual output device 12 in response to
motion M sensed by the motion sensor 8. The visual output generated by processor 14 and displayed by the visual output device 12 is adapted to encourage a user 4 to provide a compensatory input to the system 2 via input device 9, such that the compensatory input corresponds to motion experienced by the user 4.
[0033] The visual output device 12 can be any type of visual display known in the art, including, but not limited to a virtual reality display, a liquid crystal display, an organic light-emitting diode (OLED) display, a cathode ray tube display, a head-mounted display, a projection type display, a holographic display or any other display known in the art.
[0034] The visual output generated by the processor 14 and displayed by the visual output device 12 can be, for example, a game that encourages a user 4 to provide an input via the input device 9 to move an object on a path corresponding to the direction of the measured movement. The input device 9 can be any type of input device known in the art, including, but not limited to, a button, keypad, keyboard, click wheel, touchscreen, or a motion sensor that detects tilt, lateral movement and/or shaking, etc. of the visual output device 12. In general, any input device known in the art that is capable of allowing the user 4 to enter the recommended input into the system 2 may be used. Further, the input device 9 described herein may be implemented as physical mechanical components, virtual elements, and/or combinations thereof.
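As a rough illustration of how such a game might derive its prompt, the dominant sensed direction could be mapped to an on-screen path; the function below and its direction labels are purely hypothetical and not recited by the disclosure:

```python
def suggest_path_direction(ax, ay):
    """Map sensed lateral (ax) and vertical (ay) acceleration to the screen
    direction along which the game asks the user 4 to move an object
    (assumed convention: the prompt follows the measured movement)."""
    if abs(ax) >= abs(ay):
        return "right" if ax >= 0 else "left"
    return "up" if ay >= 0 else "down"
```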
[0035] The visual output is generated by the processor 14 so as to synchronize movements of the user 4 with movement of the vehicle 6, such that information generated by an inner ear of the user 4 correlates to actions taken by the user 4 thereby preventing or
reducing motion sickness. The visual output generated by the processor 14 and displayed by the visual output device 12 can include any suitable visual output, such as terrain, targets, tasks, and/or any other suitable visual output. The compensatory input suggested by the processor 14 can relate to game actions, such as, for example, shooting/tracking of a target, movement along a path, collection of one or more objects, placing of one or more objects, etc.
[0036] As discussed above, the visual output generated by the processor 14 and displayed by the visual output device 12 is based on a compensatory movement calculated by the processor 14 in response to a motion measurement M received from the motion sensor 8. The compensatory movement calculated by the processor 14 can include a similar, opposite, and/or composite movement based on the sensed movement M. For example, in some embodiments, the compensatory movement can be movement in a similar direction to the movement M but at a smaller amplitude. As another example, the compensatory movement can be movement in a single direction derived from sensed movement M in multiple directions. It will be appreciated that the compensatory movement calculated by the processor 14 and the corresponding visual output generated by the processor 14 can be any suitable compensatory movement adapted to elicit an appropriate compensatory input from a user 4 via the corresponding visual output displayed by the visual output device 12.
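The similar-but-smaller and composite variants described in this paragraph might be sketched as below; the gain value and the choice of dominant axis are illustrative assumptions, not recited formulas:

```python
import math

def similar_scaled(motion, gain=0.5):
    """Compensatory movement in the same direction as sensed movement M
    but at a smaller amplitude (gain < 1 is an assumed value)."""
    return [gain * component for component in motion]

def composite_single_direction(motion):
    """Collapse sensed movement M in multiple directions into movement in a
    single direction: carry the full magnitude on the dominant axis."""
    magnitude = math.sqrt(sum(c * c for c in motion))
    dominant = max(range(len(motion)), key=lambda i: abs(motion[i]))
    out = [0.0] * len(motion)
    out[dominant] = math.copysign(magnitude, motion[dominant])
    return out
```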
[0037] The system 2 can be incorporated into and/or used in conjunction with any suitable computing device, vehicle, and/or other system. For example, the system 2 can be wholly and/or partially incorporated into a mobile computing device (e.g., a mobile phone,
tablet computer, laptop computer, etc.), a computing device formed integrally with a vehicle (e.g., an in-flight entertainment system, a vehicle entertainment system, a vehicle display, etc.), a user wearable device, and/or any other suitable system. Although specific embodiments are discussed herein, including specific devices and/or vehicles, it will be appreciated that the system 2 can be incorporated into any suitable computing device and/or vehicle while still falling within the scope of the present invention.
[0038] For example, if the system 2 is incorporated into a tablet computer, the motion sensor 8 is preferably incorporated into the tablet computer. The tablet computer's main processor can be adapted to function as the processor 14 of system 2 or, alternatively, a separate processor can be incorporated into the tablet computer to perform the functions of processor 14. The visual output device 12 would be the tablet computer's display, and the input device can be the tablet computer's touchscreen and/or motion sensors in the tablet computer that detect movement of the tablet computer by a user 4.
[0039] Figure 2 is a flowchart of a method for reducing or preventing motion sickness, in accordance with an illustrative embodiment of the present invention. The method 100 is discussed with reference to Figs. 1A, 1B and 2.
[0040] At step 102, movement M (e.g., acceleration) is measured by the motion sensor 8 of system 2. The movement M is generated by a vehicle 6, such as a ground-based vehicle, water-based vehicle, flight-based vehicle, and/or other vehicle. The measured motion M can include motion in one or more degrees of movement, such as, for example, six degrees of movement. The motion sensor 8 can be integrated into system 2, the vehicle
6, a user's 4 clothing, and/or any other suitable device configured to detect motion experienced by the user 4.
[0041] At step 104, a compensatory output is generated by the processor 14 based on the measured motion M. The compensatory output can be any suitable output adapted to provide motion compensation to a user 4, and is preferably a visual output. The compensatory output is adapted to elicit a compensatory input from a user 4.
[0042] At step 106, a compensatory input is received by the processor 14. The compensatory input is generated by a user 4 via input device 9. The compensatory input is based on sensed movement M and is adapted to synchronize movement of the user 4 with experienced movement M. For example, in some embodiments, the compensatory input is adapted to mimic the movement of an operator of the vehicle 6. The compensatory input is adapted to reduce or prevent motion sickness in the user 4.
[0043] At step 108, the visual output of the system 2, via the visual output device 12, is updated in response to the compensatory input by the user 4. For example, in some embodiments the visual output is a game or other interactive visual output and the compensatory input by the user 4 via the input device 9 is translated to movement of one or more elements within the game.
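Steps 102-108 of method 100 can be summarized as one pass of a compensation loop; the function names and the scaled compensatory target below are assumptions for illustration only:

```python
def compensation_step(measure_motion, elicit_input, update_display, gain=0.5):
    """One pass of method 100 (hypothetical sketch)."""
    motion = measure_motion()              # step 102: measure movement M
    target = [gain * c for c in motion]    # step 104: generate compensatory output
    user_input = elicit_input(target)      # step 106: receive compensatory input
    update_display(user_input)             # step 108: update the visual output
    return target, user_input
```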
[0044] Figure 3 is a schematic diagram of a motion sickness compensation system for preventing or reducing motion sickness in a subject, in accordance with another illustrative embodiment of the present invention. The motion sickness compensation system 2a is a representative device and may comprise a processor subsystem 204, an input/output
subsystem 206, a memory subsystem 208, a communications interface 210, and a system bus 212. In some embodiments, one or more of the motion sickness compensation system 2a components may be combined or omitted such as, for example, not including the communications interface 210. In some embodiments, the motion sickness compensation system 2a may comprise other components not combined or included in the components shown in Fig. 3.
[0045] For example, the motion sickness compensation system 2a may also comprise a power subsystem (not shown). In other embodiments, the motion sickness compensation system 2a may comprise multiple iterations of the components shown in Fig. 3. For example, the motion sickness compensation system 2a may comprise multiple memory subsystems 208. For the sake of conciseness and clarity, and not limitation, one of each of the listed components is shown in Fig. 3.
[0046] Processor subsystem 204 has the same functionality as the processor 14 described above in connection with Fig. 1A, and thus the following description of processor subsystem 204 applies equally to the processor 14 of Fig. 1 A. The processor subsystem 204 may comprise any processing circuitry operative to control the operations and performance of the motion sickness compensation system 2a. In various aspects, the processor subsystem 204 may be implemented as a general purpose processor, a chip multiprocessor (CMP), a dedicated processor, an embedded processor, a digital signal processor (DSP), a network processor, an input/output (I/O) processor, a media access control (MAC) processor, a radio baseband processor, a co-processor, a microprocessor such as a complex
instruction set computer (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, and/or a very long instruction word (VLIW) microprocessor, or other processing device. The processor subsystem 204 also may be implemented by a controller, a microcontroller, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a programmable logic device (PLD), and so forth.
[0047] In various aspects, the processor subsystem 204 may be arranged to run an operating system (OS) and various applications. Examples of an OS comprise, for example, operating systems generally known under the trade names of Apple OS, Microsoft Windows OS, Android OS, and any other proprietary or open source OS. Examples of applications comprise, for example, a telephone application, a camera (e.g., digital camera, video camera) application, a browser application, a multimedia player application, a gaming application, a messaging application (e.g., email, short message, multimedia), a viewer application, and so forth.
[0048] In some embodiments, the motion sickness compensation system 2a may comprise a system bus 212 that couples various system components including the processing subsystem 204, the input/output subsystem 206, the memory subsystem 208, and/or the communications subsystem 210. The system bus 212 can be any of several types of bus structure(s) including a memory bus or memory controller, a peripheral bus or external bus, and/or a local bus using any variety of available bus architectures including, but not limited to, 8-bit bus, Industrial Standard Architecture (ISA), Micro-Channel Architecture (MSA), Extended ISA (EISA), Intelligent Drive Electronics (IDE), VESA Local Bus (VLB), Peripheral Component Interconnect Card International Association Bus (PCMCIA), Small Computer System Interface (SCSI) or other proprietary bus, or any custom bus suitable for computing device applications.
[0049] Figure 4 is a schematic diagram of the input/output subsystem 206 of Figure 3, in accordance with an illustrative embodiment of the present invention. The input/output subsystem 206 may comprise any suitable mechanism or component to at least enable a user to provide input to the motion sickness compensation system 2a and to enable the motion sickness compensation system 2a to provide output to the user. For example, the input/output subsystem 206 may comprise any suitable input mechanism, including but not limited to, a button, keypad, keyboard, click wheel, touchscreen, or motion sensor. In some embodiments, the input/output subsystem 206 may comprise a capacitive sensing mechanism, or a multi-touch capacitive sensing mechanism. It will be appreciated that any of the input mechanisms described herein may be implemented as physical mechanical components, virtual elements, and/or combinations thereof.
[0050] The input/output subsystem 206 preferably comprises a visual peripheral output device 214 for providing a display visible to the user. Visual peripheral output device 214 has the same functionality as the visual output device 12 described above in connection with Fig. 1A, and thus the following description of the visual peripheral output device 214 applies equally to the visual output device 12 of Fig. 1A. The visual peripheral output device 214 may comprise a screen such as, for example, a Liquid Crystal Display (LCD) screen, incorporated into the motion sickness compensation system 2a. As another example, the
visual peripheral output device 214 may comprise a movable display or projecting system for providing a display of content on a surface remote from the motion sickness compensation system 2a. In some embodiments, the visual peripheral output device 214 can comprise a coder/decoder, also known as a Codec, to convert digital media data into analog signals. For example, the visual peripheral output device 214 may comprise video Codecs, audio Codecs, or any other suitable type of Codec.
[0051] The visual peripheral output device 214 also may comprise display drivers, circuitry for driving display drivers, or both. The visual peripheral output device 214 may be operative to display content under the direction of the processor subsystem 204. For example, the visual peripheral output device 214 may be able to play media playback information, application screens for an application implemented on the motion sickness compensation system 2a, information regarding ongoing communications operations, information regarding incoming communications requests, or device operation screens, to name only a few.
[0052]The input/output subsystem 206 preferably comprises a motion sensor 216. Motion sensor 216 has the same functionality as the motion sensor 8 described above in connection with Fig. 1A, and thus the following description of the motion sensor 216 applies equally to the motion sensor 8 of Fig. 1A. The motion sensor 216 may comprise any suitable motion sensor operative to detect movement of the motion sickness compensation system 2a. For example, the motion sensor 216 may be operative to detect acceleration or
deceleration of the motion sickness compensation system 2a as manipulated by a user and/or as caused by a vehicle 6.
[0053] In some embodiments, the motion sensor 216 may comprise one or more three-axis acceleration motion sensors (e.g., an accelerometer) operative to detect linear acceleration in three directions (i.e., the x (or left/right) direction, the y (or up/down) direction, and the z (or forward/backward) direction). As another example, the motion sensor 216 may comprise one or more two-axis acceleration motion sensors which may be operative to detect linear acceleration only along each of the x (or left/right) and y (or up/down) directions (or any other pair of directions). In some embodiments, the motion sensor 216 may comprise an electrostatic capacitance (capacitance-coupling) accelerometer that is based on silicon micro-machined MEMS (Micro Electro Mechanical Systems) technology, a piezoelectric type accelerometer, a piezoresistance type accelerometer, or any other suitable accelerometer.
[0054] In some embodiments, the motion sensor 216 may not be operative to directly detect rotation, rotational movement, angular displacement, tilt, position, orientation, motion along a non-linear (e.g., arcuate) path, or any other non-linear motions. For example, when the motion sensor 216 is a linear motion sensor, additional processing may be used to indirectly detect some or all of the non-linear motions. By comparing the linear output of the motion sensor 216 with a gravity vector (i.e., a static acceleration), the motion sensor 216 may be operative to calculate the tilt of the motion sickness compensation system 2a with respect to the y-axis. In some embodiments, the motion sensor 216 may instead or in addition comprise one or more gyro-motion sensors or gyroscopes for detecting rotational movement. For example, the motion sensor 216 may comprise a rotating or vibrating element.
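The tilt calculation described above, comparing the linear sensor output against the static gravity vector, could be outlined as follows; treating the sensed acceleration as pure gravity is an assumption of this sketch:

```python
import math

def tilt_about_y_axis(ax, ay, az):
    """Tilt (radians) of the device with respect to the y-axis, i.e. the
    angle between the sensed static acceleration and the y-axis,
    assuming gravity is the only acceleration present."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    return math.acos(ay / g)
```

Under this assumption, a device whose y-axis is aligned with gravity reports zero tilt, and one whose x-axis is aligned with gravity reports a quarter turn.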
[0055] In some embodiments, the motion sensor 216 may comprise one or more controllers (not shown) coupled to the accelerometers or gyroscopes. The controllers may be used to calculate a moving vector of the motion sickness compensation system 2a. The moving vector may be determined according to one or more predetermined formulas based on the movement data (e.g., x, y, and z axis movement information) provided by the accelerometers or gyroscopes.
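The "predetermined formulas" by which such a controller derives a moving vector are not specified in the disclosure; one simple hypothetical stand-in is dead-reckoning integration of the per-axis movement data:

```python
def moving_vector(samples, dt):
    """Integrate x, y, z acceleration samples (assumed m/s^2) over a fixed
    interval dt to estimate a velocity vector -- a hypothetical formula
    standing in for the disclosure's 'predetermined formulas'."""
    vx = vy = vz = 0.0
    for ax, ay, az in samples:
        vx += ax * dt
        vy += ay * dt
        vz += az * dt
    return (vx, vy, vz)
```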
[0056] In some embodiments, the input/output subsystem 206 may comprise a virtual input/output system 218. The virtual input/output system 218 is capable of providing input/output options by combining one or more input/output components to create a virtual input type. For example, the virtual input/output system 218 may enable a user to input information through an on-screen keyboard which utilizes the touchscreen and mimics the operation of a physical keyboard, or using the motion sensor 216 to control a pointer on the screen instead of utilizing the touchscreen. As another example, the virtual input/output system 218 may enable alternative methods of input and output to enable use of the device by persons having various disabilities. For example, the virtual input/ output system 218 may convert on-screen text to spoken words to enable reading-impaired persons to operate the device.
[0057] In some embodiments, the input/output subsystem 206 may comprise specialized output circuitry associated with output devices such as, for example, an audio peripheral output device 220. The audio peripheral output device 220 may comprise an audio output including one or more speakers integrated into the motion sickness compensation system 2a. The speakers may be, for example, mono or stereo speakers. The audio peripheral output device 220 also may comprise an audio component remotely coupled to the audio peripheral output device 220 such as, for example, a headset, headphones, and/or ear buds which may be coupled to the audio peripheral output device 220 through the communications subsystem 210.
[0058] Figure 5 is a schematic diagram of the communications interface 210 of Figure 3, in accordance with an illustrative embodiment of the present invention. The communications interface 210 may comprise any suitable hardware, software, or combination of hardware and software that is capable of coupling the motion sickness compensation system 2a to one or more networks and/or devices. The communications interface 210 may be arranged to operate with any suitable technique for controlling information signals using a desired set of communications protocols, services or operating procedures. The communications interface 210 may comprise the appropriate physical connectors to connect with a corresponding communications medium, whether wired or wireless.
[0059] In various aspects, a network may comprise local area networks (LAN) as well as wide area networks (WAN) including without limitation Internet, wired channels, wireless
channels, communication devices including telephones, computers, wire, radio, optical or other electromagnetic channels, and combinations thereof, including other devices and/or components capable of/associated with communicating data. For example, the communication environments comprise in-body communications, various devices, and various modes of communications such as wireless communications, wired communications, and combinations of the same.
[0060] Wireless communication modes comprise any mode of communication between points (e.g., nodes) that utilize, at least in part, wireless technology including various protocols and combinations of protocols associated with wireless transmission, data, and devices. The points comprise, for example, wireless devices such as wireless headsets, audio and multimedia devices and equipment, such as audio players and multimedia players, telephones, including mobile telephones and cordless telephones, and computers and computer-related devices and components, such as printers.
[0061] Wired communication modes comprise any mode of communication between points that utilize wired technology including various protocols and combinations of protocols associated with wired transmission, data, and devices. The points comprise, for example, devices such as audio and multimedia devices and equipment, such as audio players and multimedia players, telephones, including mobile telephones and cordless telephones, and computers and computer-related devices and components, such as printers. In various implementations, the wired communication modules may communicate in accordance with a number of wired protocols. Examples of wired protocols may comprise Universal Serial Bus
(USB) communication, RS-232, RS-422, RS-423, RS-485 serial protocols, FireWire, Ethernet, Fibre Channel, MIDI, ATA, Serial ATA, PCI Express, T-1 (and variants), Industry Standard Architecture (ISA) parallel communication, Small Computer System Interface (SCSI) communication, or Peripheral Component Interconnect (PCI) communication, to name only a few examples.
[0062] Accordingly, in various aspects, the communications interface 210 may comprise one or more interfaces such as, for example, a wireless communications interface 222, a wired communications interface 224, a network interface, a transmit interface, a receive interface, a media interface, a system interface 226, a component interface, a switching interface, a chip interface, a controller, and so forth. When implemented by a wireless device or within a wireless system, for example, the communications interface 210 may comprise a wireless interface 222 comprising one or more antennas 228, transmitters, receivers, transceivers, amplifiers, filters, control logic, and so forth.
[0063] In various aspects, the communications interface 210 may provide voice and/or data communications functionality in accordance with different types of cellular radiotelephone systems. In various implementations, the described aspects may communicate over wireless shared media in accordance with a number of wireless protocols. Examples of wireless protocols may comprise various wireless local area network (WLAN) protocols, including the Institute of Electrical and Electronics Engineers (IEEE) 802.xx series of protocols, such as IEEE 802.11a/b/g/n, IEEE 802.16, IEEE 802.20, and so forth. Other examples of wireless protocols may comprise various wireless wide area network
(WWAN) protocols, such as GSM cellular radiotelephone system protocols with GPRS, CDMA cellular radiotelephone communication systems with 1xRTT, EDGE systems, EV-DO systems, EV-DV systems, HSDPA systems, and so forth. Further examples of wireless protocols may comprise wireless personal area network (PAN) protocols, such as an Infrared protocol, a protocol from the Bluetooth Special Interest Group (SIG) series of protocols, including Bluetooth Specification versions v1.0, v1.1, v1.2, v2.0, v2.0 with Enhanced Data Rate (EDR), as well as one or more Bluetooth Profiles, and so forth. Yet another example of wireless protocols may comprise near-field communication techniques and protocols, such as electro-magnetic induction (EMI) techniques. An example of EMI techniques may comprise passive or active radio-frequency identification (RFID) protocols and devices. Other suitable protocols may comprise Ultra Wide Band (UWB), Digital Office (DO), Digital Home, Trusted Platform Module (TPM), ZigBee, and so forth.
[0064] In various implementations, the described aspects may comprise part of a cellular communication system. Examples of cellular communication systems may comprise CDMA cellular radiotelephone communication systems, GSM cellular radiotelephone systems, North American Digital Cellular (NADC) cellular radiotelephone systems, Time Division Multiple Access (TDMA) cellular radiotelephone systems, Extended-TDMA (E-TDMA) cellular radiotelephone systems, Narrowband Advanced Mobile Phone Service (NAMPS) cellular radiotelephone systems, third generation (3G) wireless standards systems such as WCDMA, CDMA-2000, UMTS cellular radiotelephone systems compliant with the
Third-Generation Partnership Project (3GPP), fourth generation (4G) wireless standards, and so forth.
[0065] Figure 6 is a schematic diagram of the memory subsystem 208 of Figure 3, in accordance with an illustrative embodiment of the present invention. The memory subsystem 208 may comprise any machine-readable or computer-readable media capable of storing data, including both volatile/non-volatile memory and removable/non-removable memory. The memory subsystem 208 may comprise at least one non-volatile memory unit 230 and a local bus 234. The non-volatile memory unit 230 is capable of storing one or more software programs 232_1-232_n. The software programs 232_1-232_n may contain, for example, applications, user data, device data, and/or configuration data, or combinations thereof, to name only a few. The software programs 232_1-232_n may contain instructions executable by the various components of the motion sickness compensation system 2a.
[0066] In various aspects, the memory subsystem 208 may comprise any machine-readable or computer-readable media capable of storing data, including both volatile/non-volatile memory and removable/non-removable memory. For example, memory may comprise read-only memory (ROM), random-access memory (RAM), dynamic RAM (DRAM), Double-Data-Rate DRAM (DDR-RAM), synchronous DRAM (SDRAM), static RAM (SRAM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory (e.g., NOR or NAND flash memory), content addressable memory (CAM), polymer memory (e.g., ferroelectric
polymer memory), phase-change memory (e.g., ovonic memory), ferroelectric memory, silicon-oxide-nitride-oxide-silicon (SONOS) memory, disk memory (e.g., floppy disk, hard drive, optical disk, magnetic disk), or card (e.g., magnetic card, optical card), or any other type of media suitable for storing information.
[0067] In some embodiments, the memory subsystem 208 may contain a software program for motion sickness compensation using the capabilities of the motion sickness compensation system 2a and the motion sensor 216, as discussed in connection with Figs. 1-2. In one embodiment, the memory subsystem 208 may contain an instruction set, in the form of a file 232_n, for executing a method of motion sickness compensation. The instruction set may be stored in any acceptable form of machine readable instructions, including source code written in various appropriate programming languages. Some examples of programming languages that may be used to store the instruction set comprise, but are not limited to: Java, C, C++, C#, Python, Objective-C, Visual Basic, or .NET programming. In some embodiments a compiler or interpreter is included to convert the instruction set into machine executable code for execution by the processing subsystem 204.
[0068] Figure 7 is a schematic diagram of the motion sickness compensation system of the present invention deployed in a vehicle and utilizing a handheld input device, in accordance with an illustrative embodiment of the present invention. The system 2b is similar to systems 2 and 2a described above, and detailed descriptions of common components are not repeated herein. The system 2b includes a handheld input device 9 configured to receive one or more inputs from a user 4. The handheld input device 9 can include any suitable inputs, such as one or more buttons, d-pads, joysticks, motion sensors, toggles, triggers, and/or any other suitable input device. Visual output devices 12 are positioned in-line with a user's 4 line of sight.
[0069] In some embodiments, local processors 14 are configured to receive one or more inputs from respective motion sensors, such as a local motion sensor 8a and/or a central motion sensor 8b. The motion sensors 8a, 8b are configured to detect motion in one or more degrees of freedom of a vehicle 6 containing the user 4 and the system 2b. The central motion sensor 8b can provide a signal indicative of the one or more degrees of freedom of the vehicle 6 to local processors 14. The local processors 14 generate images on the visual output devices 12 (e.g., displays) configured to elicit compensatory movement and/or inputs from a respective user 4.
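The local-processor pipeline described above (one motion-sensor sample in, one compensatory display cue out) can be sketched as follows. This is a minimal sketch: the sample field names, the gain, and the pixel scale factor are illustrative assumptions, not values from the disclosure.

```python
import math

# Illustrative sketch of the local-processor loop of paragraph [0069].
# Field names ("roll_rate", "linear_accel") and constants are assumptions.

def compensatory_tilt(roll_rate_rad_s, gain=1.0):
    """Map a sensed angular rate (rad/s) to a horizon tilt cue (degrees)."""
    return gain * math.degrees(roll_rate_rad_s)

def render_frame(sample):
    """Turn one motion-sensor sample into simulated-horizon display parameters."""
    return {
        "horizon_roll_deg": compensatory_tilt(sample["roll_rate"]),
        "horizon_offset_px": 20.0 * sample["linear_accel"],  # assumed px per m/s^2
    }

frame = render_frame({"roll_rate": 0.1, "linear_accel": 0.5})
```

In a real system the sample would come from the central motion sensor 8b and the returned parameters would drive the visual output device 12; here both ends are stubbed so the mapping itself is visible.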
[0070] In some embodiments, the system 2b includes an audio output system 36, such as, for example, headphones. The audio output system 36 is configured to provide an audio component to the user 4 to further reduce motion sickness. For example, in some embodiments, the audio output system 36 is configured to provide an auditory distraction, such as game sounds, music, etc., in conjunction with the visual output from the visual output device 12. The audio output can be matched to the visual output of the visual output device 12, can be user-selected, and/or can be generated by the system 2b.
[0071] Figure 8 is a schematic diagram of a handheld computing device that incorporates the motion sickness compensation system of the present invention, in
accordance with an illustrative embodiment of the present invention. The system 2c is similar to systems 2, 2a and 2b discussed above, and detailed descriptions of common components are not repeated herein. The system 2c includes a handheld computing device 40 containing one or more of the elements of the system 2c. For example, in some embodiments, the handheld computing device 40 includes a processor 44, a motion sensor 8, a visual output device 12, one or more input devices, and/or any other suitable elements of the system 2. In some embodiments, the visual output device 12 includes a touchscreen or other input device formed integrally with, above, and/or below the visual output device 12 to receive input from a user 4.
[0072] Figures 9A-9E are schematic diagrams of sample display screens generated by the motion sickness compensation system 2c of Fig. 8, in accordance with an illustrative embodiment of the present invention. Fig. 9A illustrates an initial or starting state of a visual output device 12. The visual output device 12 is configured to display a simulated vehicle or target 50 aligned with a simulated horizon 52.
[0073] Fig. 9B illustrates the visual output device 12 during an angular acceleration (e.g., a turn) of the vehicle 6 containing system 2c. The visual output device 12 shows the simulated horizon 52 tilted in response to the angular acceleration. For example, in the illustrated embodiment, the vehicle 6 has a leftward angular acceleration (e.g., a left turn), which corresponds to a lowered left side and a raised right side of the simulated horizon 52. It will be appreciated that tilting or rotation of the simulated horizon 52 can be in any suitable direction corresponding to movement, such as angular acceleration, of the vehicle 6.
[0074] The tilted simulated horizon 52 encourages a user 4 to provide an input to align the simulated vehicle 50 with the simulated horizon 52. For example, in Fig. 9C, the handheld computing device 40 includes the visual output device 12 and one or more inputs (not shown). A user 4 tilts the handheld computing device 40 to rotate the simulated vehicle 50 into alignment with the simulated horizon 52. The movement of the handheld computing device 40 (or other input) provides simulated movement control that corresponds to the movement detected by one or more biological systems of the user 4. For example, in some embodiments, the input device is configured to mimic a control of the vehicle 6 (such as a steering wheel) to compensate for motion experienced by a passenger in the vehicle.
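The alignment interaction of Figs. 9B-9C can be sketched as a simple tolerance check between the horizon tilt produced by the vehicle's turn and the compensatory tilt the user applies to the handheld device. The sign convention, function name, and tolerance value are assumptions for illustration only, not from the disclosure.

```python
# Sketch of the alignment check implied by Figs. 9B-9C. By the assumed
# convention, a left turn gives the horizon a negative roll; the user tilts
# the handheld device the same way to bring the simulated vehicle into line.

def is_aligned(horizon_roll_deg, device_roll_deg, tolerance_deg=3.0):
    """True when the user's compensatory tilt matches the horizon tilt."""
    return abs(horizon_roll_deg - device_roll_deg) <= tolerance_deg

within = is_aligned(-12.0, -11.0)   # user has compensated for the left turn
without = is_aligned(-12.0, 4.0)    # user has tilted the wrong way
```

A game built on this check would reward `within` (the simulated vehicle 50 sits on the simulated horizon 52) and prompt further input on `without`, which is what makes the compensatory movement self-reinforcing.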
[0075] Fig. 9D illustrates the visual output device 12 during a linear acceleration (e.g., increasing/decreasing speed). The simulated horizon 52 is raised relative to a center line of the visual output device 12. The simulated horizon 52 can be moved at a rate proportional to the movement/acceleration of the vehicle 6. In other embodiments, the simulated horizon 52 is moved at a fixed rate unrelated to the magnitude of the movement/acceleration of the vehicle 6. Movement of the simulated horizon 52 encourages compensatory input from the user 4.
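The two horizon-motion policies just described, a rate proportional to the vehicle's acceleration or a fixed rate independent of its magnitude, can be sketched in a few lines. The function name and the constants `k` and `fixed` are illustrative assumptions.

```python
# Sketch of the two horizon-motion policies of paragraph [0075]:
# "proportional" scales with the sensed acceleration, "fixed" moves at a
# constant speed in the direction of the acceleration.

def horizon_rate(accel_mps2, mode="proportional", k=4.0, fixed=10.0):
    """Return the simulated horizon's vertical speed (assumed px/s) for a
    sensed linear acceleration; the sign follows the acceleration's sign."""
    if mode == "proportional":
        return k * accel_mps2
    sign = (accel_mps2 > 0) - (accel_mps2 < 0)  # -1, 0, or +1
    return fixed * sign
```

The proportional mode conveys how hard the vehicle is accelerating; the fixed mode only conveys direction, which may be preferable when sensor noise would otherwise make the horizon jitter.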
[0076] For example, as shown in Fig. 9E, a user 4 can tilt and/or raise the handheld computing device 40 to compensate for movement of the simulated horizon 52. The movement of the handheld computing device 40 (or other input) provides simulated movement control that corresponds to the movement detected by one or more biological
systems of the user 4. Although the embodiments discussed illustrate movement of a handheld computing device 40, it will be appreciated that the compensatory input can be provided by any suitable input device, such as a controller, a handheld device, a head-mounted device, a gesture tracking device, and/or any other suitable input.
[0077] The foregoing embodiments and advantages are merely exemplary, and are not to be construed as limiting the present invention. The description of the present invention is intended to be illustrative, and not to limit the scope of the claims. Many alternatives, modifications, and variations will be apparent to those skilled in the art. Various changes may be made without departing from the spirit and scope of the invention.
Claims
1. A system for preventing or reducing motion sickness in a subject, comprising: a motion sensor adapted to measure movement experienced by the subject and the system;
a visual output device;
an input device; and
a processor in signal communication with the motion sensor and the visual output device, wherein the processor is configured to:
receive a motion measurement from the motion sensor,
calculate a compensatory movement in response to the motion measurement,
generate an interactive visual image based on the calculated compensatory movement, wherein the interactive visual image encourages the subject to initiate the compensatory movement via the input device, and
display the interactive visual image on the visual output device, wherein the calculated compensatory movement and the displayed interactive visual image are adapted to synchronize inner ear information, visual information and body information of the subject when the subject initiates the compensatory movement.
2. The system of claim 1, wherein the motion sensor is adapted to measure at least one degree of movement.
3. The system of claim 2, wherein the motion sensor comprises an accelerometer.
4. The system of claim 1, wherein the visual output device comprises at least one of a virtual reality display, a liquid crystal display, an organic light-emitting diode (OLED) display, a cathode ray tube display, a head-mounted display, a projection type display and a holographic display.
5. The system of claim 1, wherein the interactive visual image corresponds to gaming images and the compensatory movement corresponds to gaming actions.
6. The system of claim 1, wherein the input device comprises at least one of a button, a keypad, a keyboard, a click wheel, a touchscreen, a controller, a head-mounted device, a gesture tracking device and a motion sensor.
7. The system of claim 1, wherein the processor is configured to update the interactive visual image displayed on the visual output device based on an input from the subject via the input device.
8. The system of claim 1, further comprising an audio output system in signal communication with the processor, and wherein the processor is configured to provide an audio signal to the subject.
9. The system of claim 8, wherein the audio signal is correlated with the interactive visual image.
10. A mobile computing device incorporating the system of claim 1.
11. A vehicle entertainment system incorporating the system of claim 1.
12. A method of preventing or reducing motion sickness in a subject, comprising:
measuring motion experienced by the subject;
calculating a compensatory movement in response to the measured motion; generating an interactive visual image based on the calculated compensatory movement, wherein the interactive visual image encourages the subject to initiate the compensatory movement via an input device; and
displaying the interactive visual image to the subject;
wherein the calculated compensatory movement and the displayed interactive visual image are adapted to synchronize inner ear information, visual information and body information of the subject when the subject initiates the compensatory movement.
13. The method of claim 12, wherein at least one degree of motion experienced by the subject is measured.
14. The method of claim 13, wherein at least one degree of motion experienced by the subject is measured with at least one accelerometer.
15. The method of claim 12, wherein the interactive visual image is displayed on at least one of a virtual reality display, a liquid crystal display, an organic light-emitting diode (OLED) display, a cathode ray tube display, a head-mounted display, a projection type display and a holographic display.
16. The method of claim 12, wherein the interactive visual image corresponds to gaming images and the compensatory movement corresponds to gaming actions.
17. The method of claim 12, wherein the input device comprises at least one of a button, a keypad, a keyboard, a click wheel, a touchscreen, a controller, a head-mounted device, a gesture tracking device and a motion sensor.
18. The method of claim 12, further comprising updating the interactive visual image displayed to the subject based on an input from the subject via the input device.
19. The method of claim 12, further comprising providing an audio signal to the subject.
20. The method of claim 19, wherein the audio signal is correlated with the interactive visual image.
Applications Claiming Priority (2)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201762570289P | 2017-10-10 | 2017-10-10 | |
| US62/570,289 | 2017-10-10 | | |
Publications (1)

| Publication Number | Publication Date |
|---|---|
| WO2019074758A1 (en) | 2019-04-18 |
Family
ID=66101649
Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2018/054381 | System and method for prevention or reduction of motion sickness | 2017-10-10 | 2018-10-04 |

Country Status (1)

| Country | Link |
|---|---|
| WO | WO2019074758A1 (en) |
Cited By (1)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN114187998A | 2021-12-06 | 2022-03-15 | 上海瑾盛通信科技有限公司 | Anti-motion sickness training method and related device |
Citations (4)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5829446A | 1996-12-03 | 1998-11-03 | Raytheon Company | Competing opposing stimulus simulator sickness reduction technique |
| US20020099257A1 | 2001-01-21 | 2002-07-25 | Parker Donald E. | Alleviating motion, simulator, and virtual environmental sickness by presenting visual scene components matched to inner ear vestibular sensations |
| US20140176296A1 | 2012-12-19 | 2014-06-26 | HeadsUp Technologies, Inc. | Methods and systems for managing motion sickness |
| WO2016126522A1 | 2015-02-05 | 2016-08-11 | Sony Computer Entertainment Inc. | Motion sickness monitoring and application of supplemental sound to counteract sickness |
Legal Events

| Code | Title | Description |
|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18866520; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 18866520; Country of ref document: EP; Kind code of ref document: A1 |