WO2014031342A1 - Adaptive visual output based on motion compensation of a mobile device - Google Patents
- Publication number
- WO2014031342A1 (PCT application no. PCT/US2013/054000)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- mobile device
- data
- motion
- motion compensation
- visual output
- Prior art date
Classifications
- G—PHYSICS; G09G—Arrangements or circuits for control of indicating devices using static means to present variable information:
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G2320/0261—Improving the quality of display appearance in the context of movement of objects on the screen or movement of the observer relative to the screen
- G09G2340/0464—Changes in size, position or resolution of an image: positioning
- G09G2340/14—Solving problems related to the presentation of information to be displayed
- G09G2354/00—Aspects of interface with display user
- G09G3/007—Use of pixel shift techniques, e.g. by mechanical shift of the physical pixels or by optical shift of the perceived pixels
Definitions
- Visual output may refer to any visual information rendered or displayed by a mobile device and may include user interfaces, movies, applications, games, system utilities, documents, productivity tools, buttons, dialog windows, and the like.
- Motion compensation may refer to compensation for motion caused by, among other things, the environment in which the mobile device is operated. This motion may be referred to as environmental motion.
- Remote computing device 106 may include storage 128, processor 130, network interface 132, and peripheral interface 134.
- Storage 128 may be volatile memory, non-volatile memory, and/or a combination of volatile memory and non-volatile memory. Storage 128 may also include optical, electro-magnetic and/or solid state storage. Storage 128 may store a plurality of instructions which, when executed by one or more processors 130, may cause remote computing device 106 to perform various functions related to supporting motion compensation in mobile device 102.
- processor 130 may be configured to execute the plurality of instructions stored in storage 128.
- Processor 130 may be any one of a number of single or multi-core processors.
- processor 130 may be configured to enable remote computing device 106 to perform any one or more of the various motion compensation-related functions disclosed herein.
- Processor 130 may be configured to cause remote computing device 106 to transfer data related to environmental motion and/or motion compensation to and/or from mobile device 102 via network 108.
- Network interface 132 may be configured to couple remote computing device 106 to mobile device 102 through network 108.
- Network interface 132 may include a wireless wide area network interface, such as a 3G or 4G telecommunication interface. (3G and 4G refer to the 3rd and 4th Generation of Mobile Telecommunication Standards as defined by the International Telecommunication Union.)
- Peripheral interface 134 may enable a variety of user interfaces, such as mice, keyboards, monitors, and/or audio commands.
- peripheral interface 134 may enable USB ports, PS/2 ports, Firewire® ports, Bluetooth®, and the like, according to various embodiments.
- FIG. 2 illustrates an operational diagram 200 for motion compensation of mobile device 102, according to various embodiments.
- Diagram 200 includes a motion characteristics collection module 202, a display frame motion compensation application module 204, and a historic motion compensation learning module 206, communicatively coupled with each other as shown.
- Motion characteristics collection module 202 may be configured to collect various data and/or information about the environmental motion experienced by mobile device 102. Motion characteristics collection module 202 may be configured to provide motion compensation calculations to display frame motion compensation application module 204. Motion characteristics collection module 202 may be configured to receive patterns or data related to historic motion compensation calculations from historic motion compensation learning module 206. Motion characteristics collection module 202 may include eye tracking module 208, position tracking module 210, and motion compensation calculation module 212. Eye tracking module 208 and position tracking module 210 may be configured to provide information to motion compensation calculation module 212.
- Eye tracking module 208 may be a service or application configured to use images from user-facing image capture device 112 to track a face or eyes of a user, e.g., user 104. Eye tracking module 208 may be configured to track an orientation of the user's eyes or face relative to an orientation of display 110 to enhance the calculations of motion compensation. Eye tracking module 208 may be configured to determine where on display 110 of mobile device 102 a user's eyes are focused and/or to determine a reading speed of the user.
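The disclosure does not give a formula for deriving a reading speed from gaze data. As a minimal illustrative sketch only (the function name, the gaze-trace representation, and the 55-pixel average word width are all assumptions), a horizontal gaze trace could be converted to words per minute like this:

```python
def reading_speed_wpm(gaze_x_trace, sample_rate_hz, avg_word_px=55.0):
    """Rough reading-speed estimate from a horizontal gaze trace.

    Sums rightward gaze travel (ignoring leftward return sweeps at line
    breaks) and divides by an assumed average word width in pixels.
    """
    rightward_px = sum(max(b - a, 0.0)
                       for a, b in zip(gaze_x_trace, gaze_x_trace[1:]))
    minutes = len(gaze_x_trace) / sample_rate_hz / 60.0
    return (rightward_px / avg_word_px) / minutes if minutes > 0 else 0.0
```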
- Position tracking module 210 may be a service or application configured to track the specific GPS or location coordinates, rate of movement, accelerometer data, and the like, of mobile device 102.
- Motion compensation calculation module 212 may receive input data from eye tracking module 208 and position tracking module 210 and may be configured to calculate motion compensation based on the received input data. Motion compensation calculation module 212 may be a service or an application configured to calculate the real-time motion compensation, for example, by generating real-time motion compensation vectors. According to embodiments, motion compensation calculation module 212 may receive and use information from eye tracking module 208, position tracking module 210, and historic motion compensation learning module 206 to predictively calculate real-time motion compensation. The calculated motion compensation may be used to adjust visual output of display 110.
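As an illustration of what generating a real-time compensation vector from sensor input might involve, the sketch below double-integrates recent accelerometer samples into a displacement estimate and negates it. This is not the patent's algorithm; the function, the mean subtraction used to suppress gravity and sustained vehicle acceleration, and the pixels-per-metre scale are assumptions for illustration.

```python
import numpy as np

PIXELS_PER_METRE = 12000.0  # roughly a 300 dpi display; device-specific assumption

def compensation_vector(accel_xy, dt):
    """Estimate a 2D screen-space offset (pixels) that counteracts device shake.

    accel_xy: (N, 2) array of recent lateral/vertical accelerations (m/s^2),
              expressed in the plane of the display.
    dt:       sampling interval in seconds.
    """
    # Subtract the window mean so gravity and sustained vehicle acceleration
    # do not accumulate in the integration; only the oscillation remains.
    shake = accel_xy - accel_xy.mean(axis=0)
    velocity = np.cumsum(shake, axis=0) * dt         # integrate once: m/s
    displacement = np.cumsum(velocity, axis=0) * dt  # integrate twice: m
    # Render the frame at the opposite of the latest displacement.
    return -displacement[-1] * PIXELS_PER_METRE
```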
- Display frame motion compensation application module 204 may be configured to receive motion compensation calculations from motion characteristics collection module 202.
- Display frame motion compensation application module 204 may be a service or application that may be integrated into display 110 or other portions of mobile device 102 to apply motion compensation to visual output.
- Display frame motion compensation application module 204 may be configured to adjust a portion of visual output on display 110 independent of the overall movement of mobile device 102. Such adjustment may result in visual output that is stable and independent of environmental motion.
- Display frame motion compensation application module 204 may apply motion compensation to all or to a portion of display 110. For example, motion compensation may be applied just to a portion of display 110 upon which the eyes of user 104 are focused.
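A minimal sketch of how a compensated region might be offset each frame, assuming the renderer reserves a small margin around the content so the shift never exposes undrawn pixels (the 24-pixel limit and all names are hypothetical):

```python
def clamp(v, lo, hi):
    return max(lo, min(v, hi))

def shifted_origin(rest_xy, comp_xy, max_shift_px=24):
    """Offset the compensated region by the compensation vector, clamped so
    the content never drifts more than max_shift_px from its rest position."""
    return (clamp(rest_xy[0] + comp_xy[0],
                  rest_xy[0] - max_shift_px, rest_xy[0] + max_shift_px),
            clamp(rest_xy[1] + comp_xy[1],
                  rest_xy[1] - max_shift_px, rest_xy[1] + max_shift_px))
```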
- Display frame motion compensation application module 204 may be selectively enabled by user 104.
- mobile device 102 may continuously acquire environmental motion data and may continuously update patterns generated by historic motion compensation learning module 206 for a specific geographic location, mode of travel, and rate of travel.
- user 104 may selectively enable or disable display frame motion compensation application module 204, for example, via a switch, button, and/or user interface included in visual output of display 110.
- Historic motion compensation learning module 206 may be a service or application configured to provide historical and cumulative learning or information about the motion characteristics, i.e., environmental motion, and motion compensation data collected and/or calculated for mobile device 102 while having a particular location and travel rate. For example, mobile device 102 operated while traveling over a freeway, e.g., US 10 in Phoenix, at 65 mph may experience specific environmental motion or motion characteristics from which a particular set of motion compensation data or vectors may be generated. By contrast, mobile device 102 operated while traveling on a two-lane road in the suburbs may experience environmental motion or motion characteristics from which an entirely different set of motion compensation data or vectors may be generated.
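One way to make "a particular location and travel rate" addressable is to bucket readings into coarse keys. The sketch below is an assumption-laden illustration; the cell size and speed buckets are arbitrary choices, not taken from the disclosure.

```python
def history_key(lat, lon, travel_mode, speed_mph):
    """Bucket a reading so repeated trips over the same stretch of road, in
    the same mode and at a similar speed, map to the same historic record.

    Rounding to 3 decimal places groups locations into roughly 100 m cells;
    speeds are grouped into 5 mph buckets. Both granularities are assumptions.
    """
    return (round(lat, 3), round(lon, 3), travel_mode, int(speed_mph // 5) * 5)
```

A reading taken on the freeway at 65 mph in a car would then map to, e.g., `history_key(33.448, -112.074, "car", 65)`, landing in the same record as earlier trips over the same stretch.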
- Historic motion compensation learning module 206 may receive motion compensation calculations from display frame motion compensation application module 204. Alternatively, historic motion compensation learning module 206 may receive motion compensation calculations directly from motion characteristics collection module 202. According to some embodiments, historic motion compensation learning module 206 may be enabled to operate independent of whether display frame motion compensation application module 204 is enabled.
- Historic motion compensation learning module 206 may include a pattern recognition engine or other learning algorithm, such as the Hidden Markov Model, etc., as a core of the recognition engine or learning system. Other pattern recognition techniques, such as those known by one of ordinary skill in the art, may be applied.
- the pattern recognition engine or learning algorithm may be configured to analyze patterns in motion compensation calculations. The patterns may be used to estimate future environmental motion to support predictive motion compensation calculations.
- the pattern recognition engine or learning algorithm may be configured to analyze patterns in environmental motion that are caused in part by a user. For example, some environmental motion may be caused by a user's inability to hold a mobile device 102 steady while traveling in a vehicle, or during some other movement.
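The disclosure leaves the learning core open, naming the Hidden Markov Model as one option. Purely to make the pattern-engine idea concrete, the sketch below uses a simpler, fully observed Markov chain over quantized shake intensities; every name and parameter here is hypothetical.

```python
import numpy as np

class MotionPatternModel:
    """Toy first-order Markov chain over quantized shake intensities.

    Counts observed transitions between acceleration-magnitude levels and
    predicts the most likely next level, which is enough to show how a
    pattern engine can anticipate the next instant of environmental motion.
    """

    def __init__(self, n_levels=8, max_mag=5.0):
        self.n = n_levels
        self.max_mag = max_mag
        self.counts = np.ones((n_levels, n_levels))  # Laplace-smoothed counts

    def _level(self, magnitude):
        return min(int(magnitude / self.max_mag * self.n), self.n - 1)

    def observe(self, prev_mag, next_mag):
        """Record one observed transition between acceleration magnitudes."""
        self.counts[self._level(prev_mag), self._level(next_mag)] += 1

    def predict_next_level(self, current_mag):
        """Most likely next intensity level given the current one."""
        return int(np.argmax(self.counts[self._level(current_mag)]))
```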
- instructions for historic motion compensation learning module 206 may be stored locally on mobile device 102, e.g., in storage 122. According to other embodiments, instructions for historic motion compensation learning module 206 may be stored locally on mobile device 102 and/or remotely from mobile device 102. For example, instructions for historic motion compensation learning module 206 may be stored on remote computing device 106, e.g. in storage 128.
- Figure 3 illustrates a method 300 of operating mobile device 102 to compensate for environmental motion, according to various embodiments.
- mobile device 102 may acquire data associated with motion of mobile device 102.
- mobile device 102 may be configured to acquire data associated with motion of an environment in which mobile device 102 is situated and/or operated.
- the data associated with the motion of mobile device 102 may include accelerometer data, GPS data, and/or mode of travel data.
- Mode of travel data may indicate a mode of travel selected from a travel mode group.
- the travel mode group may include at least one of a car mode, a truck mode, a train mode, an airplane mode, a boat mode, a ship mode, a walking mode, a jogging mode, a running mode, a mobile chair mode, a bicycle mode, a horse carriage mode, and a motorcycle mode.
- mobile device 102 may calculate motion compensation based on the data acquired at block 302. In particular, mobile device 102 may calculate motion compensation for at least a portion of visual output of mobile device 102, and may use the calculated motion compensation to adapt a portion of visual output generated by an application.
- Mobile device 102 may be configured to provide user 104 with the option of selecting an automatic mode or a manual mode.
- in automatic mode, mobile device 102 may automatically acquire environmental motion data and use a default setting for mode of travel based upon speed and geographic location of mobile device 102.
- in manual mode, mobile device 102 may request input from user 104.
- mobile device 102 may receive input from user 104 related to geographic location, mode of travel, terrain, and the like.
- mobile device 102 may be configured to request feedback from user 104 regarding whether user 104 perceives improvement in the visual output. Based on the feedback from user 104, mobile device 102 may re-acquire environmental motion data and/or re-calculate motion compensation.
- mobile device 102 may be a tablet displaying visual output associated with an electronic book (ebook).
- User 104 may be reading visual output on mobile device 102 while sitting in the passenger seat of an automobile driving on a highway. As the car is moving, mobile device 102 and user 104 may be physically moving from environmental motion of the car. This environmental motion may be considered a reference movement or motion to be compensated for. Reading the visual output on mobile device 102 may be difficult while riding in the car because of the constant movement of display 110, which may be difficult to focus on. The constant movement of display 110 may also cause user 104 to become nauseous while trying to focus on the moving mobile device 102.
- Mobile device 102 may begin collecting motion data from the current motion environment.
- the collected motion data may include accelerometer data, GPS coordinate movement, eye and facial movement of user 104 as measured by user-facing image capture device 112, and the like.
- the collected motion data may also include surrounding environmental movement as measured by facing-away image capture device 114.
- Mobile device 102 may also access previously learned motion compensation data, i.e., historic motion compensation, for user 104 at the current GPS location and for the terrain associated with the acquired GPS location.
- Mobile device 102 may then calculate motion compensation in real-time, e.g., by calculating predictive motion compensation vectors. Mobile device 102 may use the calculated motion compensation to adjust visual output on display 110 (independent of a housing of mobile device 102). The adjusted visual output may offset, reduce, or minimize the overall image movement that user 104 perceives. User 104 may then be able to read in the car without becoming nauseous.
- mobile device 102 may use the calculated motion compensation to adjust the visual output to increase or decrease the frequency of motion, avoiding frequencies that may increase motion sickness or nausea experienced by user 104.
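As a sketch of how such frequency-aware adjustment might be expressed in code: the 0.1 to 0.5 Hz band below is an assumption, chosen because low-frequency oscillation is commonly associated with motion sickness, and the gain values are arbitrary illustration.

```python
# Assumed nausea-prone band (Hz); low-frequency oscillation is commonly
# associated with motion sickness, but the exact limits are illustrative.
NAUSEA_BAND_HZ = (0.1, 0.5)

def compensation_gain(dominant_hz, base_gain=1.0, band_boost=1.5):
    """Strengthen compensation when the residual shake falls inside the
    nausea-prone band, pushing perceived content motion out of that band."""
    lo, hi = NAUSEA_BAND_HZ
    return base_gain * band_boost if lo <= dominant_hz <= hi else base_gain
```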
- user 104 may be reading visual output on mobile device 102 while walking on a sidewalk. While user 104 is walking, user 104 and mobile device 102 may both be moving, but out of synchronization, as user 104 attempts to hold mobile device 102 steady. The difference between the walking movement and the movement of mobile device 102 may be considered the reference movement to be compensated. Generally, walking may create enough environmental motion to make it difficult to focus on mobile device 102. User 104 may then switch ON a motion compensation option of mobile device 102 and select a "walking motion" setting. According to method 300 and other disclosed embodiments, mobile device 102 may begin collecting motion data from the current motion environment. Mobile device 102 may access historic motion compensation data, locally or remotely.
- Mobile device 102 may then calculate motion compensation in real-time, such as real-time predictive motion compensation vectors. Based on the calculated motion compensation, mobile device 102 may adjust the visual output on display 110 (independent of a housing of mobile device 102) to offset, reduce, or minimize the overall image movement that user 104 perceives. User 104 may then be able to read visual output from mobile device 102 with little or no reference movement or motion, i.e., little or no differential movement between the eyes of user 104 and the visual output of mobile device 102.
- mobile device 102 may adjust a portion of the visual output of display 110 rather than all of the visual output. For example, mobile device 102 may determine, based on images captured from user-facing image capture device 112, that the eyes of user 104 are focused on a particular portion of the visual output, e.g., an upper left-hand corner of display 110. Mobile device 102 may be configured to adjust the particular portion of the visual output that is focused on without adjusting the remaining portion of the visual output. Selectively compensating portions of the visual output may decrease the processing power consumed by mobile device 102 and may extend the life of a battery or power supply of mobile device 102.
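A minimal sketch of selecting a gaze-anchored region that would receive compensation; the names and the 200-pixel radius are assumptions:

```python
def focus_region(gaze_x, gaze_y, screen_w, screen_h, radius_px=200):
    """Square region around the gaze point, clamped to the screen bounds;
    only pixels inside this region receive per-frame motion compensation."""
    return (max(gaze_x - radius_px, 0), max(gaze_y - radius_px, 0),
            min(gaze_x + radius_px, screen_w), min(gaze_y + radius_px, screen_h))
```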
- mobile device 102 may use eye-tracking features to improve motion compensation performance of mobile device 102.
- eye-tracking features may enable mobile device 102 to improve characterization of environmental motion based on movement of the eyes of user 104 while attempting to focus on the visual output.
- eye or face tracking features may be used to identify the user and associate the acquired environmental motion and the calculated motion compensation with a particular user.
- Each user of mobile device 102 may select an account from which individual motion compensation settings and historic motion compensation and patterns may be retrieved.
- Figure 4 illustrates a method 400 of operating the remote computing device 106 to support motion compensation for mobile device 102.
- remote computing device 106 may receive data associated with motion.
- remote computing device 106 may receive data associated with motion in one or more environments from one or more computing devices that are located remotely from remote computing device 106, e.g., mobile device 102.
- Remote computing device 106 may receive environmental motion data from mobile device 102 and/or from other similar mobile devices operated by one or more users.
- Remote computing device 106 may receive environmental motion data through one or more networks, such as network 108.
- Remote computing device 106 may accumulate and store environmental motion data in a relational data structure, such as a database, and may associate environmental motion data with corresponding geographical locations, terrain, and/or modes of travel.
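As one hypothetical shape for such a store (the schema, column names, and bucketing granularity are illustrative assumptions, not taken from the disclosure), using SQLite:

```python
import sqlite3

conn = sqlite3.connect("motion_history.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS motion_samples (
        lat_cell     REAL,     -- latitude quantized to ~100 m cells
        lon_cell     REAL,     -- longitude quantized to ~100 m cells
        travel_mode  TEXT,     -- 'car', 'train', 'walking', ...
        terrain      TEXT,     -- e.g. 'rocky road', 'freeway'
        speed_bucket INTEGER,  -- mph, grouped into 5 mph buckets
        accel_trace  BLOB      -- serialized window of acceleration samples
    )
""")
conn.execute("""
    CREATE INDEX IF NOT EXISTS idx_motion_location
    ON motion_samples (lat_cell, lon_cell, travel_mode)
""")
conn.commit()
```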
- remote computing device 106 may provide the received data to the one or more computing devices.
- remote computing device 106 may provide the received data to one or more computing devices to enable the one or more computing devices to calculate motion compensation.
- the one or more computing devices may be configured to use the provided data to calculate motion compensation for all or a part of visual output rendered by the one or more computing devices.
- remote computing device 106 may be configured to calculate motion compensation based on the received data. Remote computing device 106 may then provide the calculated motion compensation to the one or more computing devices to enable the one or more computing devices to adjust the respective visual outputs to the respective environmental motion in which each of the one or more computing devices is operated. According to other embodiments, remote computing device 106 may be configured to determine and/or generate patterns based on the calculated motion compensation, the received data, or both. Remote computing device 106 may transmit or provide the determined patterns to the one or more computing devices to support motion compensation calculations by the one or more computing devices.
- FIG. 5 illustrates a computing device 500 in accordance with one implementation of an embodiment of the invention.
- computing device 500 may be suitable for use as mobile device 102 or remote computing device 106 of Figure 1.
- computing device 500 may house a motherboard 502.
- Motherboard 502 may include a number of components, including but not limited to a processor 504 and at least one communication chip 506.
- Processor 504 may be physically and electrically coupled to motherboard 502.
- the at least one communication chip 506 may also be physically and electrically coupled to motherboard 502.
- the communication chip 506 may be part of the processor 504.
- the above enumerated components may be coupled together in alternate manners without employment of motherboard 502.
- computing device 500 may include other components that may or may not be physically and electrically coupled to motherboard 502. These other components include, but are not limited to, volatile memory (e.g., DRAM 508), non-volatile memory (e.g., ROM 510), flash memory 511, a graphics processor 512, a digital signal processor 513, a crypto processor (not shown), a chipset 514, an antenna 516, a display (not shown), a touchscreen display 518, a touchscreen controller 520, a battery 522, an audio codec (not shown), a video codec (not shown), a power amplifier 524, a global positioning system (GPS) device 526, a compass 528, an accelerometer, a gyroscope, a speaker 530, user and away facing image capture devices 532, and a mass storage device (such as hard disk drive, compact disk (CD), digital versatile disk (DVD), and so forth).
- Flash memory 511 may include instructions to be executed by processor 504, graphics processor 512, digital signal processor 513, and/or crypto processor, to practice various aspects of the methods and apparatuses described earlier with reference to Figures 2-4 on mobile device 102 and/or computing device 500.
- the communication chip 506 may enable wired and/or wireless communications for the transfer of data to and from the computing device 500 through one or more networks.
- the term "wireless" and its derivatives may be used to describe circuits, devices, systems, methods, techniques, communications channels, etc., that may communicate data through the use of modulated electromagnetic radiation through a non-solid medium. The term does not imply that the associated devices do not contain any wires, although in some embodiments they might not.
- the communication chip 506 may implement any of a number of wireless standards or protocols, including but not limited to Wi-Fi (IEEE 802.11 family), WiMAX (IEEE 802.16 family), IEEE 802.20, long term evolution (LTE), Ev-DO, HSPA+, HSDPA+, HSUPA+, EDGE, GSM, GPRS, CDMA, TDMA, DECT, Bluetooth, derivatives thereof, as well as any other wireless protocols that are designated as 3G, 4G, 5G, and beyond.
- the computing device 500 may include a plurality of communication chips 506.
- a first communication chip 506 may be dedicated to shorter range wireless communications such as Wi-Fi and Bluetooth and a second communication chip 506 may be dedicated to longer range wireless communications such as GPS, EDGE, GPRS, CDMA, WiMAX, LTE, Ev-DO, and others.
- the processor 504 of the computing device 500 may include an integrated circuit die packaged within the processor 504.
- the term "processor” may refer to any device or portion of a device (e.g., a processor core) that processes electronic data from registers and/or memory to transform that electronic data into other electronic data that may be stored in registers and/or memory.
- the communication chip 506 also includes an integrated circuit die packaged within the communication chip 506.
- another component housed within the computing device 500 may contain an integrated circuit die that includes one or more devices, such as processor cores, cache and one or more memory controllers.
- the computing device 500 may be a laptop, a netbook, a notebook, an ultrabook, a smartphone, a tablet, a personal digital assistant (PDA), an ultra mobile PC, a mobile phone, a desktop computer, a server, a printer, a scanner, a monitor, an entertainment control unit, a digital camera, a portable music player, or a digital video recorder.
- the computing device 500 may be any other electronic device that processes data.
- one or more computer-readable media may have instructions that may be configured to, in response to execution of the instructions by a mobile device, enable the mobile device to acquire data associated with motion of an environment in which the mobile device is situated, and calculate motion compensation for at least a portion of visual output of an application of the mobile device.
- the motion compensation calculation may be based at least in part on the data and may be for use by the application to adapt at least the portion of visual output of the application.
- the data associated with motion may include one or more of accelerometer data, global positioning system (GPS) data, and mode of travel data.
- the mode of travel data may indicate a mode of travel selected from a travel mode group having at least one of a car mode, a truck mode, a train mode, an airplane mode, a boat mode, a ship mode, a walking mode, a jogging mode, a running mode, a mobile chair mode, a bicycle mode, a horse carriage mode, and a motorcycle mode.
- the instructions may be further configured to, in response to execution by the mobile device, enable the mobile device to determine a current focus of a user of the mobile device, and identify the portion of visual output of the application, based at least in part on the current focus.
- the instructions to enable the mobile device to determine may include instructions to receive real-time captured images of the user, and determine the current focus based at least in part on the real-time captured images.
- the instructions may be configured, in response to execution of the instructions by the mobile device, to enable the mobile device to acquire at least some of the data associated with motion from the user.
- the instructions may be configured, in response to execution of the instructions by the mobile device, to enable the mobile device to adapt at least the portion of visual output of the application based at least in part on the motion compensation.
- the instructions to enable the mobile device to calculate the motion compensation may include instructions to accumulate at least part of the data associated with motion in memory of the mobile device, and determine patterns for the motion compensation based on the data associated with motion that has been accumulated.
- the instructions may further be configured to, in response to execution by the mobile device, enable the mobile device to adapt at least the portion of visual output of the application based on the patterns.
- the instructions may be further configured to, in response to execution by the mobile device, enable the mobile device to acquire at least part of the data associated with motion of the environment in which the mobile device is situated from a remote computing device.
- the environment in which the mobile device is situated may be one of a plurality of environments in which the mobile device may be operable.
- the remote computing device may be configured to store the data associated with motion within the plurality of environments.
- a method may include receiving, by sensors of a mobile device, data that is characteristic of an environment in which the mobile device is operated.
- the method may include determining, by the mobile device, motion compensation based on the data for a visual output displayed by the mobile device.
- the method may include accumulating the data as historic motion data in memory of the mobile device, and determining the motion compensation with a combination of the data that is received in real-time and the historic motion data.
- Determining the motion compensation may include determining, with the mobile device, patterns based on the data received in real-time and the historic motion data. The patterns approximate motion of the mobile device that may occur in the environment during operation of the mobile device.
- the method may include determining the motion compensation based on the patterns.
- the method may include receiving, from a user, inputs to enable the mobile device to determine which of the historical motion data to use while determining the predictive motion compensation.
- the inputs may include one or more of a geographical location of operation of the mobile device, a rate of motion of the mobile device, and a type of vehicle within which the mobile device is operated.
- the data may include eye-tracking data of a user.
- the method may further include monitoring eye movements of a user based on images captured by an image capture device of the mobile device, and determining motion differences between one or more eyes of a user and the mobile device.
- a mobile system may include a housing configured to carry one or more electronic circuits, a display coupled to the housing and configured to display visual output of an application of the mobile system, and a user-oriented image capture device carried by the housing.
- the mobile system may include memory carried by the housing and configured to store a number of instructions.
- the mobile system may include one or more processors configured, in response to execution of the instructions, to acquire data associated with motion of an environment in which the mobile device is situated, and to calculate motion compensation for at least a portion of the visual output, based at least in part on the data associated with motion, for use by the application to adapt at least the portion of the visual output of the application.
- the one or more processors may be further configured to, in response to execution of the instructions, monitor eye movement of a user, and identify at least the portion of the visual output based on said monitoring of the eye movement.
- the mobile system may be one of a smart phone, tablet computing device, laptop, netbook, and personal digital assistant.
- the display may be a touch screen display.
- one or more computer-readable media may have instructions that may be configured to, in response to execution of the instructions by a computing device, enable the computing device to receive data associated with motion in one or more environments from one or more remote computing devices, and to provide the data to the one or more remote computing devices to enable the one or more remote computing devices to calculate motion compensation for at least a portion of visual output of an application of the one or more remote computing devices, for use by the application to adapt at least the portion of the visual output of the application.
- the instructions may be further configured to, in response to execution by the computing device, enable the computing device to accumulate the data, determine patterns from the data to support calculations of motion compensation by the one or more remote computing devices, and provide the patterns with the data to the remote computing devices.
- a computing device may include a network interface to communicate with one or more remote computing devices, memory to store the data and to store instructions, and one or more processors configured to, in response to execution of the instructions, receive data associated with motion in one or more environments from one or more remote computing devices.
- the one or more processors may be configured to provide the data to the one or more remote computing devices to enable the one or more remote computing devices to calculate motion compensation for at least a portion of visual output of an application of the one or more remote computing devices, for use by the application to adapt at least the portion of the visual output of the application.
- the one or more processors may be configured to, in response to execution of the instructions, provide the data in response to a request for the data by the one or more remote computing devices.
- the one or more processors may be further configured to, in response to execution of the instructions, determine patterns of motion for each of the one or more environments based on the data received, and provide the patterns to the one or more remote computing devices.
- each of the features described for each of the computer-readable media, methods, and apparatuses may be combined with other features of each of the computer-readable media, methods, and apparatuses.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
Systems, storage media, and methods associated with motion compensation of visual output on a mobile device are disclosed herein. In embodiments, a storage medium may have instructions to enable the mobile device to acquire data associated with motion of an environment in which the mobile device may be situated. The instructions may also enable the mobile device to calculate motion compensation for at least a portion of visual output of an application of the mobile device. The instructions may enable the mobile device to calculate motion compensation based at least in part on the data associated with motion, for use by the application to adapt at least the portion of visual output of the application. Other embodiments may be disclosed or claimed.
Description
ADAPTIVE VISUAL OUTPUT BASED ON MOTION COMPENSATION OF A MOBILE DEVICE
Cross Reference to Related Application
The present application claims priority to U.S. Patent Application No. 13/592,298, filed August 22, 2012, entitled "ADAPTIVE VISUAL OUTPUT BASED ON MOTION COMPENSATION OF A MOBILE DEVICE."
Technical Field
This application relates to the technical field of data processing, more specifically to methods and apparatuses associated with adaptive visual output of a mobile device based on motion compensation.
Background
The background description provided herein is for the purpose of generally presenting the context of the disclosure. Unless otherwise indicated herein, the materials described in this section are not prior art to the claims in this application and are not admitted to be prior art by inclusion in this section.
A handheld device such as a tablet, smartphone, or other mobile device may be very difficult to read or watch when the device is in motion. This includes when the user is reading or watching content on the device while traveling in a car, while walking, or while engaging in other human motion. Environmental motion caused by a car or the user is transferred to the mobile device, and the user's eyes have to try to follow the changing position of the content, e.g., reading material. For many people, focusing on moving words or images may cause nausea, especially while trying to read in a car.
Brief Description of the Drawings
Embodiments of the present invention will be described by way of exemplary embodiments, but not limitations, illustrated in the accompanying drawings in which like references denote similar elements, and in which:
Figure 1 illustrates an arrangement for adaptive visual output of a mobile device based on motion compensation;
Figure 2 illustrates an operational diagram for motion compensation by the mobile device of Figure 1;
Figure 3 illustrates a method of performing motion compensation by the mobile device of Figure 1;
Figure 4 illustrates a method of operating the remote computing device of Figure 1; and
Figure 5 illustrates an example of a mobile device of Figure 1; all arranged in accordance with embodiments of the present disclosure.
Detailed Description
Various aspects of the illustrative embodiments will be described using terms commonly employed by those skilled in the art to convey the substance of their work to others skilled in the art. However, it will be apparent to those skilled in the art that alternate embodiments may be practiced with only some of the described aspects. For purposes of explanation, specific numbers, materials, and configurations are set forth in order to provide a thorough understanding of the illustrative embodiments. However, it will be apparent to one skilled in the art that alternate embodiments may be practiced without the specific details. In other instances, well-known features are omitted or simplified in order not to obscure the illustrative embodiments.
Various operations will be described as multiple discrete operations, in turn, in a manner that is most helpful in understanding the illustrative embodiments; however, the order of description should not be construed as to imply that these operations are necessarily order dependent. In particular, these operations need not be performed in the order of presentation. Further, descriptions of operations as separate operations should not be construed as requiring that the operations be necessarily performed independently and/or by separate entities. Descriptions of entities and/or modules as separate modules should likewise not be construed as requiring that the modules be separate and/or perform separate operations. In various embodiments, illustrated and/or described operations, entities, data, and/or modules may be merged, broken into further sub-parts, and/or omitted.
The phrase "in one embodiment" or "in an embodiment" is used repeatedly. The phrase generally does not refer to the same embodiment; however, it may. The terms "comprising," "having," and "including" are synonymous, unless the context dictates otherwise. The phrase "A/B" means "A or B". The phrase "A and/or B" means "(A), (B), or (A and B)". The phrase "at least one of A, B and C" means "(A), (B), (C), (A and B), (A and C), (B and C) or (A, B and C)".
Embodiments of the present disclosure may enable a mobile device such as a smartphone, tablet, notebook, or the like, to compensate for the environmental motion arising while the user of the mobile device is traveling in a vehicle, is walking, or is moving with the mobile device in some other manner.
Figure 1 illustrates a perspective view of a system 100 in which a mobile device is configured to adjust visual output of an application using motion compensation, and illustrates an example arrangement of the mobile device, in accordance with various embodiments. Visual output, as used herein, may refer to any visual information rendered or displayed by a mobile device and may include user-interfaces, movies, applications, games, system utilities, documents, productivity tools, buttons, dialog windows, and the like. Motion compensation, as used herein, may refer to compensation of motion that is caused by or due to (but not limited to) the environment in which the mobile device is operated. This motion may be referred to as environmental motion. As illustrated, system 100 may include a mobile device 102 which may be operated and/or manipulated by a user 104. Mobile device 102 may be communicatively coupled to remote computing device 106 via one or more networks 108.
According to embodiments, mobile device 102 may be configured to measure, in real-time, various characteristics of the environment in which mobile device 102 is being operated and acquire data associated with environmental motion. For example, mobile device 102 may be configured to measure acceleration, speed, vertical displacements (i.e., roughness of terrain), and lateral displacements by using one or more of accelerometer data, global positioning system (GPS) data, images of the surrounding landscape, and eye or face movement of user 104.
Mobile device 102 may be configured to use the various data associated with motion of an environment to calculate motion compensation for visual output rendered by mobile device 102. For example, mobile device 102 may be configured to use accelerometer data, GPS data, and image data to determine a pattern or frequency of movement or other environmental motion of mobile device 102. Mobile device 102 may be configured to compensate for movement or other environmental motion by adjusting visual output of mobile device 102. According to some embodiments, mobile device 102 may be configured to reduce or eliminate a frequency of movement or environmental motion.
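As a sketch of how a dominant frequency of movement might be extracted from accelerometer data (the function name and its windowed-magnitude input are assumptions for illustration, not the patent's method):

```python
import numpy as np

def dominant_shake_frequency(accel_mag, sample_rate_hz):
    """Dominant frequency (Hz) in a window of acceleration magnitudes,
    found as the strongest non-DC bin of the spectrum."""
    centered = np.asarray(accel_mag) - np.mean(accel_mag)
    spectrum = np.abs(np.fft.rfft(centered))
    freqs = np.fft.rfftfreq(len(centered), d=1.0 / sample_rate_hz)
    return freqs[1:][np.argmax(spectrum[1:])]  # skip the DC bin
```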
Mobile device 102 may be configured to generate patterns based on accumulated data associated with environmental motion, i.e., historic motion characteristics, and use the patterns to estimate future environmental motion. Based on the estimated future environmental motion, mobile device 102 may be configured to proactively decrease the effects expected by the future environmental motion. For example, mobile device 102 may be configured to recall environmental motion data acquired during previous travels through a specific geographic terrain and implement a motion compensation scheme particular to the specific geographic terrain, e.g., to compensate for motion associated with traveling over a rocky road in a car.
Mobile device 102 may also be configured to use face-tracking or eye-tracking information while calculating motion compensation. For example, mobile device 102 may be configured to use face-tracking or eye-tracking information to determine motion of mobile device 102 relative to user 104 and compensate for the relative motion. According to other embodiments, mobile device 102 may use eye-tracking information to determine a current reading speed of user 104 and select between multiple alternate compensation calculations based on the determined reading speed. Once mobile device 102 calculates or determines environmental motion, mobile device 102 may adjust visual output of mobile device 102 to improve the reading and/or interactive experience of user 104.
Mobile device 102 may include display device 110, user-facing image capture device 112, away-facing image capture device 114, motion-related sensors 116, a network interface 118, a peripheral interface 120, storage 122, and one or more processors 124, coupled with each other via, e.g., one or more communication buses 126.
Display 110 may be any one of a number of display technologies suitable for use on a mobile device. For example, display 110 may be a liquid crystal display (LCD), a thin-film transistor LCD, a plasma display, or the like. According to various embodiments, display 110 may be a touch sensitive display, i.e., a touchscreen. As a touchscreen, display 110 may be one of a number of types of touch screen, such as acoustic, capacitive, resistive, infrared, or the like.
User-facing image capture device 112 may be disposed on the mobile device 102 and oriented to face user 104. User-facing image capture device 112 may be configured to capture images from a user-facing direction. User-facing image capture device 112 may be a complementary metal oxide semiconductor (CMOS) image sensor, a charge-coupled device (CCD) image sensor, or one or more antennas configured to construct or create images in response to received electromagnetic signals.
Away-facing image capture device 114 may be oriented towards a direction opposite to user 104. Image capture device 114 may be configured to optically capture images from an outward-facing direction. Image capture device 114 may be a complementary metal oxide semiconductor (CMOS) image sensor, a charge-coupled device (CCD) image sensor, or one or more antennas configured to construct or create images in response to received electromagnetic signals. According to embodiments, mobile device 102 may use away-facing image capture device 114 to capture images of portions of the environment facing away from user 104 and use the captured images to calculate motion compensation.
Motion-related sensors 1 16 may be configured to capture data related to environmental motion in various types of vehicles or modes of travel. As described above, briefly, motion-related sensors 1 16 may include accelerometer, a GPS unit, and the like. According to embodiments, antennas on mobile device 102 may be used to determine motion of mobile device 102 using triangulation techniques based on wirelessly received data. According to various embodiments, motion-related sensors 116 may be configured to acquire data associated with environmental motion caused by operation of mobile device 102 within any one of a variety of vehicles or transportation techniques. For example, motion-related sensors 1 16 may be configured to acquire data associated with environmental motion caused by traveling in a car, a truck, a train, an airplane, a boat, a ship, a mobile chair, a bicycle, a horse carriage, a motorcycle, and the like.
According to other embodiments, motion-related sensors 116 may be configured to capture data related to environmental motion caused by walking, jogging, or running by user 104.
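As a rough sketch of how such sensor data might be gathered into a rolling window for later compensation calculations, assuming hypothetical read_accelerometer() and read_gps() platform callbacks (the sample schema is an illustrative choice):

```python
import time
from collections import deque

class MotionCollector:
    """Keeps a rolling window of recent motion samples."""

    def __init__(self, window_seconds=5.0, rate_hz=50):
        self.samples = deque(maxlen=int(window_seconds * rate_hz))

    def poll_once(self, read_accelerometer, read_gps):
        ax, ay, az = read_accelerometer()   # m/s^2, device axes
        lat, lon, speed = read_gps()        # degrees, degrees, m/s
        self.samples.append({
            "t": time.monotonic(),
            "accel": (ax, ay, az),
            "lat": lat, "lon": lon, "speed": speed,
        })
```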
Network interface 118 may be configured to couple mobile device 102 to remote computing device 106 through one or more networks 108, hereinafter network 108.
Network interface 118 may be a wireless local area network interface, such as a WiFi® interface in compliance with one of the IEEE 802.11 standards (IEEE = Institute of Electrical and Electronics Engineers). Network interface 118 may include a wireless wide area network interface, such as a 3G or 4G telecommunication interface (3G and 4G refer to the 3rd and 4th Generations of Mobile Telecommunication Standards as defined by the International Telecommunication Union).
Peripheral interface 120 may enable a variety of user interface devices, such as mice, keyboards, monitors, and/or audio command input. For example, peripheral interface 120 may enable USB ports, PS/2 ports, Firewire® ports, Bluetooth®, and the like, according to various embodiments.
Storage 122 may be volatile memory, non-volatile memory, and/or a combination of volatile memory and non-volatile memory. Storage 122 may also include optical, electro-magnetic and/or solid state storage. Storage 122 may store a plurality of instructions which, when executed by processor 124, may cause mobile device 102 to perform various functions related to motion compensation, as discussed above and as will
be discussed below in further detail.
One or more processors 124 (hereinafter processor 124) may be configured to execute the plurality of instructions stored in storage 122. Processor 124 may be any one of a number of single or multi-core processors. In response to execution of the plurality of instructions, processor 124 may be configured to enable mobile device 102 to perform any one or more of the various functions disclosed herein. For example, processor 124 may be configured to cause mobile device 102 to acquire data associated with environmental motion using one or more of user-facing image capture device 112, away-facing image capture device 114, and motion-related sensors 116. Processor 124 may also be configured to calculate motion compensation and adjust visual output on display 110 to reduce or eliminate the effects of environmental motion and/or motion relative to user 104.
According to various embodiments, remote computing device 106 may be communicatively coupled to mobile device 102 via network 108. Remote computing device 106 may be located remotely from mobile device 102, such that remote computing device 106 and mobile device 102 are remote with respect to each other. Remote computing device 106 may be configured to receive and store various data related to environmental motion and/or motion compensation from mobile device 102 and/or one or more devices that may be similar to mobile device 102. Remote computing device 106 may be configured to receive, via network 108, data from motion-related sensors 116, data from user-facing image capture device 112, and/or data from away-facing image capture device 114. According to embodiments, remote computing device 106 may calculate motion compensation for visual output of mobile device 102 and may transmit the calculated motion compensation to mobile device 102. According to other embodiments, remote computing device 106 may accumulate various environmental motion data specific to various geographical locations and related to various modes of travel. Remote computing device 106 may then be configured to selectively provide environmental motion data to mobile device 102 or similar devices, in response to receiving a request for such information via network 108.
Remote computing device 106 may include storage 128, processor 130, network interface 132, and peripheral interface 134.
Storage 128 may be volatile memory, non-volatile memory, and/or a combination of volatile memory and non-volatile memory. Storage 128 may also include optical, electro-magnetic and/or solid state storage. Storage 128 may store a plurality of instructions which, when executed by one or more processors 130, may cause remote computing device 106 to perform various functions related to supporting motion compensation in mobile device 102.
One or more processors 130 (hereinafter processor 130) may be configured to execute the plurality of instructions stored in storage 128. Processor 130 may be any one of a number of single or multi-core processors. In response to execution of the plurality of instructions, processor 130 may be configured to enable remote computing device 106 to perform any one or more of the various motion compensation-related functions disclosed herein. Processor 130 may be configured to cause remote computing device 106 to transfer data related to environmental motion and/or motion compensation to and/or from mobile device 102 via network 108.
Network interface 132 may be configured to couple remote computing device 106 to mobile device 102 through network 108. Network interface 132 may be a wireless local area network interface, such as a WiFi® interface in compliance with one of the IEEE 802.11 standards (IEEE = Institute of Electrical and Electronics Engineers). Network interface 132 may include a wireless wide area network interface, such as a 3G or 4G telecommunication interface (3G and 4G refer to the 3rd and 4th Generations of Mobile Telecommunication Standards as defined by the International Telecommunication Union).
Peripheral interface 134 may enable a variety of user interface devices, such as mice, keyboards, monitors, and/or audio command input. For example, peripheral interface 134 may enable USB ports, PS/2 ports, Firewire® ports, Bluetooth®, and the like, according to various embodiments.
Figure 2 illustrates an operational diagram 200 for motion compensation of mobile device 102, according to various embodiments. Diagram 200 includes a motion characteristics collection module 202, display frame motion compensation application module 204, and a historic motion compensation learning module 206, communicatively coupled with each other as shown.
Motion characteristics collection module 202 may be configured to collect various data and/or information about environmental motion in which mobile device 102 is operated. Motion characteristics collection module 202 may be configured to provide motion compensation calculations to display frame motion compensation application module 204. Motion characteristics collection module 202 may be configured to receive patterns or data related to historic motion compensation calculations from historic motion compensation learning module 206. Motion characteristics collection module 202 may include eye tracking module 208, position tracking module 210, and motion compensation
calculation module 212. Eye tracking module 208 and position tracking module 210 may be configured to provide information to motion compensation calculation module 212.
Eye tracking module 208 may be a service or application configured to use images from user-facing image capture device 112 to track a face or eyes of a user, e.g., user 104. Eye tracking module 208 may be configured to track an orientation of the user's eyes or face relative to an orientation of display 110 to enhance the calculations of motion compensation. Eye tracking module 208 may be configured to determine where on display 110 of mobile device 102 a user's eyes are focused and/or to determine a reading speed of user 104.
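One hedged way such a reading-speed estimate could be computed is to count return sweeps, i.e., large leftward gaze jumps that end a line of left-to-right text; the threshold and the (timestamp, gaze_x) sample format are assumptions, not anything the application specifies:

```python
def lines_per_minute(gaze_samples, return_sweep_px=200):
    """Estimate reading speed from (timestamp_s, gaze_x_px) samples."""
    if len(gaze_samples) < 2:
        return 0.0
    lines, prev_x = 0, None
    for _, x in gaze_samples:
        if prev_x is not None and prev_x - x > return_sweep_px:
            lines += 1  # large leftward jump = sweep to the next line
        prev_x = x
    duration_min = (gaze_samples[-1][0] - gaze_samples[0][0]) / 60.0
    return lines / duration_min if duration_min > 0 else 0.0
```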
Position tracking module 210 may be a service or application configured to track the specific GPS or location coordinates, rate of movement, accelerometer data, and the like, of mobile device 102.
Motion compensation calculation module 212 may receive input data from eye tracking module 208 and position tracking module 210 and may be configured to calculate motion compensation based on the received input data. Motion compensation calculation module 212 may be a service or an application configured to calculate real-time motion compensation, for example, by generating real-time motion compensation vectors. According to embodiments, motion compensation calculation module 212 may receive and use information from eye tracking module 208, position tracking module 210, and historic motion compensation learning module 206 to predictively calculate real-time motion compensation. The calculated motion compensation may be used to adjust visual output of display 110.
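The application does not specify how the vectors are computed; one plausible single-axis sketch high-pass filters accelerometer samples to remove gravity and slow drift, double-integrates to a displacement estimate, and negates it so the content opposes the device motion (all constants are illustrative):

```python
class CompensationCalculator:
    """Single-axis compensation from raw accelerometer samples."""

    def __init__(self, dt=0.02, alpha=0.95, leak=0.98):
        self.dt = dt            # sample period in seconds
        self.alpha = alpha      # smoothing for the slow/gravity estimate
        self.leak = leak        # integration leak to bound drift
        self.baseline = 0.0
        self.vel = 0.0
        self.disp = 0.0

    def update(self, accel):
        # Track the slow component, keep only the fast residual.
        self.baseline = self.alpha * self.baseline + (1 - self.alpha) * accel
        residual = accel - self.baseline
        self.vel = (self.vel + residual * self.dt) * self.leak
        self.disp = (self.disp + self.vel * self.dt) * self.leak
        return -self.disp       # oppose the estimated device motion

calc = CompensationCalculator()
offset_m = calc.update(0.3)     # one accelerometer sample, m/s^2
# A real implementation would convert meters to pixels and run one
# instance per screen axis.
```

Run once per axis, the pair of outputs forms a two-dimensional compensation vector of the kind the module could hand to the display.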
Display frame motion compensation application module 204 may be configured to receive motion compensation calculations from motion characteristics collection module 202. Display frame motion compensation application module 204 may be a service or application that may be integrated into display 110 or other portions of mobile device 102 to apply motion compensation to visual output. Display frame motion compensation application module 204 may be configured to adjust a portion of the visual output of display 110 independent of the overall movement of mobile device 102. Such adjustment may result in visual output that is stable and independent of environmental motion. According to some embodiments, display frame motion compensation application module 204 may apply motion compensation to all or to a portion of display 110. For example, motion compensation may be applied just to a portion of display 110 upon which the eyes of user 104 are focused.
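A minimal sketch of the application step, under the assumption that the content is rendered with a small overscan margin so it can be shifted without exposing bare screen edges (the margin size is an arbitrary choice):

```python
def apply_compensation(offset, margin=24):
    """Clamp a (dx, dy) compensation offset to the overscan margin."""
    dx = max(-margin, min(margin, round(offset[0])))
    dy = max(-margin, min(margin, round(offset[1])))
    # A real implementation would hand (dx, dy) to the compositor, or
    # draw the content texture at (-margin + dx, -margin + dy).
    return dx, dy
```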
Display frame motion compensation application module 204 may be selectively enabled by user 104. For example, mobile device 102 may continuously acquire environmental motion data and may continuously update patterns generated by historic motion compensation learning module 206 for a specific geographic location, mode of travel, and rate of travel. However, user 104 may selectively enable or disable display frame motion compensation application module 204, for example, via a switch, button, and/or user interface included in visual output of display 110.
Historic motion compensation learning module 206 may be a service or application configured to provide historical and cumulative learning or information about the motion characteristics, i.e., environmental motion, and motion compensation data collected and/or calculated for mobile device 102 while at a particular location and rate of travel. For example, mobile device 102 may be operated while traveling over a freeway, e.g., US 10 in Phoenix, at 65 mph, and experience specific environmental motion or motion characteristics from which a particular set of motion compensation data or vectors may be generated. By contrast, mobile device 102 may be operated while traveling on a two-lane road in the suburbs and experience environmental motion or motion characteristics from which an entirely different set of motion compensation data or vectors may be generated. Historic motion compensation learning module 206 may receive motion compensation calculations from display frame motion compensation application module 204. Alternatively, historic motion compensation learning module 206 may receive motion compensation calculations directly from motion characteristics collection module 202. According to some embodiments, historic motion compensation learning module 206 may be enabled to operate independent of whether display frame motion compensation application module 204 is enabled.
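One way such location- and speed-specific history might be organized is a table of running-average compensation vectors keyed by a coarse location bucket, mode of travel, and speed band; the key design below is an illustrative assumption, not anything the application prescribes:

```python
class HistoricCompensation:
    """Running averages of compensation vectors per travel context."""

    def __init__(self):
        self.table = {}  # key -> (mean_vector, sample_count)

    @staticmethod
    def make_key(lat, lon, mode, speed_mps):
        # ~100 m location bucket, travel mode, 5 m/s speed band.
        return (round(lat, 3), round(lon, 3), mode, int(speed_mps // 5))

    def record(self, key, vector):
        mean, n = self.table.get(key, ((0.0, 0.0), 0))
        mean = tuple((m * n + v) / (n + 1) for m, v in zip(mean, vector))
        self.table[key] = (mean, n + 1)

    def lookup(self, key):
        entry = self.table.get(key)
        return entry[0] if entry else None
```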
Historic motion compensation learning module 206 may include a pattern recognition engine or other learning algorithm, such as a Hidden Markov Model, as the core of the recognition engine or learning system. Other pattern recognition techniques, such as those known by one of ordinary skill in the art, may be applied. The pattern recognition engine or learning algorithm may be configured to analyze patterns in motion compensation calculations. The patterns may be used to estimate future environmental motion to support predictive motion compensation calculations. The pattern recognition engine or learning algorithm may be configured to analyze patterns in environmental motion that are caused in part by a user. For example, some environmental motion may be caused by a user's inability to hold mobile device 102 steady while traveling in a vehicle, or during some other movement.
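As a stand-in for the Hidden Markov Model the text mentions, a first-order Markov chain over quantized motion states illustrates the pattern-learning idea in a few lines (the state names are hypothetical):

```python
from collections import defaultdict

class MotionPatternModel:
    """Counts transitions between quantized motion states and
    predicts the most likely next state."""

    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))

    def observe(self, prev_state, next_state):
        self.counts[prev_state][next_state] += 1

    def predict(self, state):
        nexts = self.counts.get(state)
        return max(nexts, key=nexts.get) if nexts else None

model = MotionPatternModel()
model.observe("smooth", "smooth")
model.observe("smooth", "bump")
model.observe("smooth", "smooth")
print(model.predict("smooth"))  # -> "smooth"
```

The predicted state could then select which historic compensation vectors to pre-load for upcoming display frames.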
According to some embodiments, instructions for historic motion compensation learning module 206 may be stored locally on mobile device 102, e.g., in storage 122. According to other embodiments, instructions for historic motion compensation learning module 206 may be stored locally on mobile device 102 and/or remotely from mobile device 102. For example, instructions for historic motion compensation learning module 206 may be stored on remote computing device 106, e.g. in storage 128.
Figure 3 illustrates a method 300 of operating mobile device 102 to compensate for environmental motion, according to various embodiments.
At block 302, mobile device 102 may acquire data associated with motion of mobile device 102. In particular, mobile device 102 may be configured to acquire data associated with motion of an environment in which mobile device 102 is situated and/or operated. The data associated with the motion of mobile device 102 may include accelerometer data, GPS data, and/or mode of travel data. Mode of travel data may indicate a mode of travel selected from a travel mode group. The travel mode group may include at least one of a car mode, a truck mode, a train mode, an airplane mode, a boat mode, a ship mode, a walking mode, a jogging mode, a running mode, a mobile chair mode, a bicycle mode, a horse carriage mode, and a motorcycle mode.
At block 304, mobile device 102 may calculate motion compensation based on the data acquired at block 302. In particular, mobile device 102 may calculate motion compensation for at least a portion of visual output of mobile device 102, and may use the calculated motion compensation to adapt a portion of visual output generated by an application.
Mobile device 102 may be configured to provide user 104 with the option of selecting an automatic mode or a manual mode. In automatic mode, mobile device 102 may automatically acquire environmental motion data and use a default setting for mode of travel based upon the speed and geographic location of mobile device 102. In manual mode, mobile device 102 may request input from user 104. For example, mobile device 102 may receive input from user 104 related to geographic location, mode of travel, terrain, and the like. According to some embodiments, mobile device 102 may be configured to request feedback from user 104 regarding whether user 104 perceives improvement in the visual output. Based on the feedback from user 104, mobile device 102 may re-acquire environmental motion data and/or re-calculate motion compensation.
According to one use scenario, mobile device 102 may be a tablet displaying visual output associated with an electronic book (ebook). User 104 may be reading visual output on mobile device 102 while sitting in the passenger seat of an automobile driving on a highway. As the car is moving, mobile device 102 and user 104 may be physically moving from environmental motion of the car. This environmental motion may be considered a reference movement or motion to be compensated for. Reading the visual output on mobile device 102 may be difficult while riding in the car because of the constant movement of display 110, which may be difficult to focus on. The constant movement of display 110 may also cause user 104 to become nauseous while trying to focus on the moving mobile device 102. User 104 may switch ON a motion compensation option of mobile device 102, in accordance with embodiments disclosed herein, and select a "vehicle motion" setting. According to method 300 and other disclosed embodiments, mobile device 102 may begin collecting motion data from the current motion environment. The collected motion data may include accelerometer data, GPS coordinate movement, eye and facial movement of user 104 as measured by user-facing image capture device 112, and the like. The collected motion data may also include surrounding environmental movement as measured by away-facing image capture device 114. Mobile device 102 may also access previously learned motion compensation data, i.e., historic motion compensation, for user 104 at the current GPS location and for the terrain associated with the acquired GPS location. Mobile device 102 may then calculate motion compensation in real-time, e.g., by calculating predictive motion compensation vectors. Mobile device 102 may use the calculated motion compensation to adjust visual output on display 110 (independent of a housing of mobile device 102). The adjusted visual output may offset, reduce, or minimize the overall image movement that user 104 perceives. User 104 may then be able to read in the car without becoming nauseous.
People may become more or less nauseous at particular frequencies of motion. For example, a vehicle and/or a mobile device moving at approximately 0.2 hertz may increase the discomfort experienced by user 104 while viewing mobile device 102. According to some embodiments, mobile device 102 may use the calculated motion compensation to adjust the visual output to increase or decrease a frequency of motion, so as to avoid a frequency of motion that may increase motion sickness or nausea experienced by user 104.
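A hedged sketch of how the dominant motion frequency might be estimated from recent displacement samples, using a dependency-free discrete Fourier transform (a real implementation would likely use an FFT library; the band limits are illustrative):

```python
import math

def dominant_frequency(samples, rate_hz):
    """Return the strongest non-DC frequency (Hz) in the samples."""
    n = len(samples)
    if n < 4:
        return 0.0
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):  # skip the DC bin
        re = sum(s * math.cos(2 * math.pi * k * i / n)
                 for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * k * i / n)
                 for i, s in enumerate(samples))
        mag = re * re + im * im
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * rate_hz / n

NAUSEA_BAND_HZ = (0.1, 0.3)  # illustrative band around 0.2 Hz

def needs_frequency_shift(samples, rate_hz):
    f = dominant_frequency(samples, rate_hz)
    return NAUSEA_BAND_HZ[0] <= f <= NAUSEA_BAND_HZ[1]
```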
According to another use scenario, user 104 may be reading visual output on mobile device 102 while walking on a sidewalk. While user 104 is walking, user 104 and mobile device 102 may both be moving, but out of synchronization with each other, as user 104 attempts to hold mobile device 102 steady. The difference between the walking movement and the movement of mobile device 102 may be considered the reference movement to be compensated. Generally, walking may create enough environmental motion to make it difficult to focus on mobile device 102. User 104 may then switch ON a motion compensation option of mobile device 102 and select a "walking motion" setting. According to method 300 and other disclosed embodiments, mobile device 102 may begin collecting motion data from the current motion environment. Mobile device 102 may access historic motion compensation data, locally or remotely. Mobile device 102 may then calculate motion compensation in real-time, such as real-time predictive motion compensation vectors. Based on the calculated motion compensation, mobile device 102 may adjust the visual output on display 110 (independent of a housing of mobile device 102) to offset, reduce, or minimize the overall image movement that user 104 perceives. User 104 may then be able to read visual output from mobile device 102 with little or no reference movement or motion, i.e., little or no differential movement between the eyes of user 104 and the visual output of mobile device 102.
According to some embodiments, mobile device 102 may adjust a portion of the visual output of display 110 rather than all of the visual output. For example, mobile device 102 may determine, based on images captured from user-facing image capture device 112, that the eyes of user 104 are focused on a particular portion of the visual output, e.g., an upper left-hand corner of display 110. Mobile device 102 may be configured to adjust the particular portion of the visual output that is focused on without adjusting the remaining portion of the visual output. Selectively compensating portions of the visual output may decrease the processing power consumed by mobile device 102 and may extend the life of a battery or power supply of mobile device 102.
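A small sketch of restricting compensation to a gaze-centered region; the gaze point would come from the eye-tracking module, and the region size is an arbitrary choice:

```python
def compensated_region(gaze_xy, screen_w, screen_h, region=300):
    """Return an (x, y, w, h) rectangle around the gaze point,
    clamped to the screen; only these pixels are shifted."""
    x0 = max(0, min(screen_w - region, gaze_xy[0] - region // 2))
    y0 = max(0, min(screen_h - region, gaze_xy[1] - region // 2))
    return (x0, y0, region, region)
```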
According to other embodiments, mobile device 102 may use eye-tracking features to improve motion compensation performance of mobile device 102. For example, eye-tracking features may enable mobile device 102 to improve characterization of environmental motion based on movement of the eyes of user 104 while attempting to focus on the visual output. As another example, eye or face tracking features may be used to identify the user and associate the acquired environmental motion and the calculated motion compensation with a particular user. Each user of mobile device 102 may select an account from which individual motion compensation settings, historic motion compensation, and patterns may be retrieved.
Figure 4 illustrates a method 400 of operating the remote computing device 106 to
support motion compensation for mobile device 102.
At block 402, remote computing device 106 may receive data associated with motion. In particular, remote computing device 106 may receive data associated with motion in one or more environments from one or more computing devices that are located remotely from remote computing device 106, e.g., mobile device 102. Remote computing device 106 may receive environmental motion data from mobile device 102 and/or from other similar mobile devices operated by one or more users. Remote computing device 106 may receive environmental motion data through one or more networks, such as network 108. Remote computing device 106 may accumulate and store environmental motion data in a relational data structure, such as a database, and may associate the environmental motion data with corresponding geographical locations, terrain, and/or modes of travel.
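On the server side, the accumulation step might look like a keyed store of motion reports, sketched below with an illustrative schema (a production system would use the relational database the text mentions):

```python
class MotionDataService:
    """Accumulates environmental-motion reports and serves them back."""

    def __init__(self):
        self.store = {}  # (location_bucket, terrain, mode) -> [reports]

    def submit(self, location_bucket, terrain, mode, report):
        key = (location_bucket, terrain, mode)
        self.store.setdefault(key, []).append(report)

    def query(self, location_bucket, terrain, mode):
        return self.store.get((location_bucket, terrain, mode), [])
```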
At block 404, remote computing device 106 may provide the received data to the one or more computing devices. In particular, remote computing device 106 may provide the received data to one or more computing devices to enable the one or more computing devices to calculate motion compensation. The one or more computing devices may be configured to use the provided data to calculate motion compensation for all or a part of visual output rendered by the one or more computing devices.
In some embodiments, remote computing device 106 may be configured to calculate motion compensation based on the received data. Remote computing device 106 may then provide the calculated motion compensation to the one or more computing devices to enable the one or more computing devices to adjust the respective visual outputs to the respective environmental motion in which each of the one or more computing devices is operated. According to other embodiments, remote computing device 106 may be configured to determine and/or generate patterns based on the calculated motion compensation, the received data, or both. Remote computing device 106 may transmit or provide the determined patterns to the one or more computing devices to support motion compensation calculations by the one or more computing devices.
Figure 5 illustrates a computing device 500 in accordance with one implementation of an embodiment of the invention. Depending on the actual components included, computing device 500 may be suitable for use as mobile device 102 or remote computing device 106 of Figure 1. In embodiments, computing device 500 may house a motherboard 502. Motherboard 502 may include a number of components, including but not limited to a processor 504 and at least one communication chip 506. Processor 504 may be
physically and electrically coupled to motherboard 502. In some implementations the at least one communication chip 506 may also be physically and electrically coupled to motherboard 502. In further implementations, the communication chip 506 may be part of the processor 504. In alternate embodiments, the components enumerated above may be coupled together in alternate manners without employment of motherboard 502.
Depending on its applications, computing device 500 may include other components that may or may not be physically and electrically coupled to motherboard 502. These other components include, but are not limited to, volatile memory (e.g., DRAM 508), non-volatile memory (e.g., ROM 510), flash memory 511, a graphics processor 512, a digital signal processor 513, a crypto processor (not shown), a chipset 514, an antenna 516, a display (not shown), a touchscreen display 518, a touchscreen controller 520, a battery 522, an audio codec (not shown), a video codec (not shown), a power amplifier 524, a global positioning system (GPS) device 526, a compass 528, an accelerometer, a gyroscope, a speaker 530, user and away facing image capture devices 532, and a mass storage device (such as a hard disk drive, compact disk (CD), digital versatile disk (DVD), and so forth).
In various embodiments, volatile memory (e.g., DRAM 508), non-volatile memory (e.g., ROM 510), and/or flash memory 511 may include instructions to be executed by processor 504, graphics processor 512, digital signal processor 513, and/or the crypto processor, to practice various aspects of the methods and apparatuses described earlier with reference to Figures 2-4, on mobile device 102 and/or computing device 500.
The communication chip 506 may enable wired and/or wireless communications for the transfer of data to and from the computing device 500 through one or more networks. The term "wireless" and its derivatives may be used to describe circuits, devices, systems, methods, techniques, communications channels, etc., that may communicate data through the use of modulated electromagnetic radiation through a non-solid medium. The term does not imply that the associated devices do not contain any wires, although in some embodiments they might not. The communication chip 506 may implement any of a number of wireless standards or protocols, including but not limited to Wi-Fi (IEEE 802.11 family), WiMAX (IEEE 802.16 family), IEEE 802.20, long term evolution (LTE), Ev-DO, HSPA+, HSDPA+, HSUPA+, EDGE, GSM, GPRS, CDMA, TDMA, DECT, Bluetooth, derivatives thereof, as well as any other wireless protocols that are designated as 3G, 4G, 5G, and beyond. The computing device 500 may include a plurality of communication chips 506. For instance, a first communication chip 506 may
be dedicated to shorter range wireless communications such as Wi-Fi and Bluetooth and a second communication chip 506 may be dedicated to longer range wireless communications such as GPS, EDGE, GPRS, CDMA, WiMAX, LTE, Ev-DO, and others.
The processor 504 of the computing device 500 may include an integrated circuit die packaged within the processor 504. The term "processor" may refer to any device or portion of a device (e.g., a processor core) that processes electronic data from registers and/or memory to transform that electronic data into other electronic data that may be stored in registers and/or memory.
The communication chip 506 also includes an integrated circuit die packaged within the communication chip 506.
In further implementations, another component housed within the computing device 500 may contain an integrated circuit die that includes one or more devices, such as processor cores, cache and one or more memory controllers.
In various implementations, the computing device 500 may be a laptop, a netbook, a notebook, an ultrabook, a smartphone, a tablet, a personal digital assistant (PDA), an ultra mobile PC, a mobile phone, a desktop computer, a server, a printer, a scanner, a monitor, an entertainment control unit, a digital camera, a portable music player, or a digital video recorder. In further implementations, the computing device 500 may be any other electronic device that processes data.
According to various embodiments, one or more computer-readable media may have instructions that may be configured to, in response to execution of the instructions by a mobile device, enable the mobile device to acquire data associated with motion of an environment in which the mobile device is situated, and calculate motion compensation for at least a portion of visual output of an application of the mobile device. The motion compensation calculation may be based at least in part on the data and may be for use by the application to adapt at least the portion of visual output of the application. The data associated with motion may include one or more of accelerometer data, global positioning system (GPS) data, and mode of travel data. The mode of travel data may indicate a mode of travel selected from a travel mode group having at least one of a car mode, a truck mode, a train mode, an airplane mode, a boat mode, a ship mode, a walking mode, a jogging mode, a running mode, a mobile chair mode, a bicycle mode, a horse carriage mode, and a motorcycle mode.
In embodiments, the instructions may be further configured to, in response to execution by the mobile device, enable the mobile device to determine a current focus of a
user of the mobile device, and identify the portion of visual output of the application based at least in part on the current focus. The instructions to enable the mobile device to determine may include instructions to receive real-time captured images of the user and determine the current focus based at least in part on the real-time captured images. The instructions may be configured, in response to execution of the instructions by the mobile device, to enable the mobile device to acquire at least some of the data associated with motion from the user. The instructions may be configured, in response to execution of the instructions by the mobile device, to enable the mobile device to adapt at least the portion of visual output of the application based at least in part on the motion compensation.
In embodiments, the instructions to enable the mobile device to calculate the motion compensation may include instructions to accumulate at least part of the data associated with motion in memory of the mobile device, and determine patterns for the motion compensation based on the data associated with motion that has been accumulated. The instructions may further be configured to, in response to execution by the mobile device, enable the mobile device to adapt at least the portion of visual output of the application based on the patterns.
According to embodiments, the instructions may be further configured to, in response to execution by the mobile device, enable the mobile device to acquire at least part of the data associated with motion of the environment in which the mobile device is situated from a remote computing device. The environment in which the mobile device is situated may be one of a plurality of environments in which the mobile device may be operable. The remote computing device may be configured to store the data associated with motion within the plurality of environments.
According to various embodiments, a method may include receiving, by sensors of a mobile device, data that is characteristic of an environment in which the mobile device is operated. The method may include determining, by the mobile device, motion compensation based on the data for a visual output displayed by the mobile device. The method may include accumulating the data as historic motion data in memory of the mobile device, and determining the motion compensation with a combination of the data that is received in real-time and the historic motion data. Determining the motion compensation may include determining, with the mobile device, patterns based on the data received in real-time and the historic motion data. The patterns approximate motion of the mobile device that may occur in the environment during operation of the mobile device. The method may include determining the motion compensation based on the patterns.
In embodiments, the method may include receiving, from a user, inputs to enable the mobile device to determine which of the historical motion data to use while determining the predictive motion compensation. The inputs may include one or more of a geographical location of operation of the mobile device, a rate of motion of the mobile device, and a type of vehicle within which the mobile device is operated. The data may include eye-tracking data of a user. The method may further include monitoring eye movements of a user based on images captured by an image capture device of the mobile device, and determining motion differences between one or more eyes of a user and the mobile device.
According to various embodiments, a mobile system may include a housing configured to carry one or more electronic circuits, a display coupled to the housing and configured to display visual output of an application of the mobile system, and a user- oriented image capture device carried by the housing. The mobile system may include memory carried by the housing and configured to store a number of instructions. The mobile system may include one or more processors configured, in response to execution of the instructions, to acquire data associated with motion of an environment in which the mobile device is situated, and to calculate motion compensation for at least a portion of the visual output, based at least in part on the data associated with motion, for use by the application to adapt at least the portion of the visual output of the application. The one or more processors may be further configured to, in response to execution of the instructions, monitor eye movement of a user, and identify at least the portion of the visual output based on said monitoring of the eye movement. The mobile system may be one of a smart phone, tablet computing device, laptop, netbook, and personal digital assistant. The display may be a touch screen display.
According to various embodiments, one or more computer-readable media may have instructions that may be configured to, in response to execution of the instructions by a computing device, enable the computing device to receive data associated with motion in one or more environments from one or more remote computing devices, and to provide the data to the one or more remote computing devices to enable the one or more remote computing devices to calculate motion compensation for at least a portion of visual output of an application of the one or more remote computing devices, for use by the application to adapt at least the portion of the visual output of the application.
In embodiments, the instructions may be further configured to, in response to execution by the computing device, enable the computing device to accumulate the data,
determine patterns from the data to support calculations of motion compensation by the one or more remote computing devices, and provide the patterns with the data to the remote computing devices.
According to various embodiments, a computing device may include a network interface to communicate with one or more remote computing devices, memory to store data and instructions, and one or more processors configured to, in response to execution of the instructions, receive data associated with motion in one or more environments from the one or more remote computing devices. The one or more processors may be configured to provide the data to the one or more remote computing devices to enable the one or more remote computing devices to calculate motion compensation for at least a portion of visual output of an application of the one or more remote computing devices, for use by the application to adapt at least the portion of the visual output of the application. The one or more processors may be configured to, in response to execution of the instructions, provide the data in response to a request for the data by the one or more remote computing devices. The one or more processors may be further configured to, in response to execution of the instructions, determine patterns of motion for each of the one or more environments based on the data received, and provide the patterns to the one or more remote computing devices.
According to various embodiments, each of the features described for each of the computer readable media, methods, and apparatus may be combined with other features of each of the computer readable media, methods, and apparatuses.
Although specific embodiments have been illustrated and described herein, it will be appreciated by those of ordinary skill in the art that a wide variety of alternate and/or equivalent implementations may be substituted for the specific embodiments shown and described, without departing from the scope of the embodiments of the present disclosure. This application is intended to cover any adaptations or variations of the embodiments discussed herein. Therefore, it is manifestly intended that the embodiments of the present disclosure be limited only by the claims.
Claims
1. A mobile system, comprising:
a housing configured to carry one or more electronic circuits;
a display coupled to the housing and configured to display visual output of an application of the mobile system;
a user-oriented image capture device carried by the housing;
memory carried by the housing and configured to store a plurality of instructions; and
one or more processors configured, in response to execution of the instructions, to:
acquire data associated with motion of an environment in which the mobile system is situated; and
calculate motion compensation for at least a portion of the visual output, based at least in part on the data associated with motion, for use by the application to adapt at least the portion of the visual output of the application.
2. The mobile system of claim 1, wherein the one or more processors are further configured to, in response to execution of the instructions:
monitor eye movement of a user; and
identify at least the portion of the visual output based on said monitoring of the eye movement.
3. The mobile system of claim 1, wherein the mobile system is one of a smart phone, tablet computing device, laptop, netbook, and personal digital assistant.
4. The mobile system of any one of claims 1 - 3, wherein the display is a touch screen display.
5. A computing device, comprising:
a network interface to communicate with one or more remote computing devices;
memory to store the data and to store instructions; and
one or more processors configured to, in response to execution of the instructions:
receive data associated with motion in one or more environments from one or more remote computing devices; and
provide the data to the one or more remote computing devices to enable the one or more remote computing devices to calculate motion compensation for at least a portion of visual output of an application of the one or more remote computing devices, for use by
the application to adapt at least the portion of the visual output of the application.
6. The computing device of claim 5, wherein the one or more processors are configured to, in response to execution of the instructions, provide the data in response to a request for the data by the one or more remote computing devices.
7. The computing device of claim 5 or 6, wherein the one or more processors are further configured to, in response to execution of the instructions:
determine patterns of motion for each of the one or more environments based on the data received; and
provide the patterns to the one or more remote computing devices.
8. A method, comprising:
receiving, by sensors of a mobile device, data that is characteristic of an environment in which the mobile device is operated; and
determining, by the mobile device, motion compensation based on the data for a visual output displayed by the mobile device.
9. The method of claim 8, further comprising:
accumulating the data as historic motion data in memory of the mobile device; and
determining the motion compensation with a combination of the data that is received in real-time and the historic motion data.
10. The method of claim 9, wherein determining the motion compensation includes:
determining, with the mobile device, patterns based on the data received in real-time and the historic motion data, wherein the patterns approximate motion of the mobile device that may occur in the environment during operation of the mobile device; and
determining the motion compensation based on the patterns.
11. The method of claim 8, further comprising:
receiving, from a user, inputs to enable the mobile device to determine which of the historical motion data to use while determining the predictive motion compensation.
12. The method of claim 11, wherein the inputs include one or more of a geographical location of operation of the mobile device, a rate of motion of the mobile device, and a type of vehicle within which the mobile device is operated.
13. The method of claim 8, wherein the data includes eye-tracking data of a user, wherein the method further comprises:
monitoring eye movements of a user based on images captured by an image capture device of the mobile device; and
determining motion differences between one or more eyes of a user and the mobile device.
14. One or more computer-readable media having instructions configured to cause a mobile device, in response to execution of the instructions by a mobile device, to perform any one of the methods of claims 8 - 13.
15. A method comprising:
receiving, by a computing device, data associated with motion in one or more environments from one or more remote computing devices; and
providing the data to the one or more remote computing devices to enable the one or more remote computing devices to calculate motion compensation for at least a portion of visual output of an application of the one or more remote computing devices, for use by the application to adapt at least the portion of the visual output of the application.
16. The method of claim 15, further comprising:
accumulating the data by the computing device;
determining, by the computing device, patterns from the data to support calculations of motion compensation by the one or more remote computing devices; and
providing, by the computing device, the patterns with the data to the remote computing devices.
17. One or more computer-readable media having instructions configured to cause a computing device, in response to execution of the instructions by the computing device, to perform any one of the methods of claims 15 - 16.
18. A mobile apparatus, comprising:
means for acquiring data associated with motion of an environment in which the mobile apparatus is situated; and
means for calculating motion compensation for at least a portion of the visual output, based at least in part on the data associated with motion, for use by the application to adapt at least the portion of the visual output of the application.
19. The mobile apparatus of claim 18, wherein the means for acquiring comprises means for monitoring eye movement of a user; and wherein the apparatus further comprises means for identifying at least the portion of the visual output based on said monitoring of the eye movement.
20. A computing apparatus, comprising:
means for receiving data associated with motion in one or more environments from one or more remote computing devices; and
means for providing the data to the one or more remote computing devices to enable the one or more remote computing devices to calculate motion compensation for at least a portion of visual output of an application of the one or more remote computing devices, for use by the application to adapt at least the portion of the visual output of the application.
21. The computing apparatus of claim 20, further comprising means for providing the data in response to a request for the data by the one or more remote computing devices.
22. The computing apparatus of claim 20, further comprising:
means for determining patterns of motion for each of the one or more
environments based on the data received; and
means for providing the patterns to the one or more remote computing devices.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/592,298 | 2012-08-22 | ||
US13/592,298 US20140055339A1 (en) | 2012-08-22 | 2012-08-22 | Adaptive visual output based on motion compensation of a mobile device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014031342A1 true WO2014031342A1 (en) | 2014-02-27 |
Family
ID=50147525
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2013/054000 WO2014031342A1 (en) | 2012-08-22 | 2013-08-07 | Adaptive visual output based on motion compensation of a mobile device |
Country Status (2)
Country | Link |
---|---|
US (1) | US20140055339A1 (en) |
WO (1) | WO2014031342A1 (en) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9544204B1 (en) * | 2012-09-17 | 2017-01-10 | Amazon Technologies, Inc. | Determining the average reading speed of a user |
FR3034244B1 (en) * | 2015-03-25 | 2018-02-02 | Atos Se | STABILIZED DISPLAY ON A SCREEN OF A MOBILE DEVICE BY MOTION DETECTION |
CN106155288B (en) * | 2015-04-10 | 2019-02-12 | 北京智谷睿拓技术服务有限公司 | Information acquisition method, information acquisition device and user equipment |
CN106293045B (en) * | 2015-06-30 | 2019-09-10 | 北京智谷睿拓技术服务有限公司 | Display control method, display control unit and user equipment |
CN106331464B (en) * | 2015-06-30 | 2019-10-15 | 北京智谷睿拓技术服务有限公司 | Filming control method, imaging control device and user equipment |
CN106293046B (en) * | 2015-06-30 | 2020-03-17 | 北京智谷睿拓技术服务有限公司 | Information processing method, information processing device and user equipment |
US10621698B2 (en) * | 2017-06-14 | 2020-04-14 | Hadal, Inc. | Systems and methods for virtual reality motion sickness prevention |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060171360A1 (en) * | 2005-01-31 | 2006-08-03 | Samsung Electronics Co., Ltd. | Apparatus and method for displaying data using afterimage effect in mobile communication terminal |
US20090186659A1 (en) * | 2008-01-17 | 2009-07-23 | Platzer Kasper | Active display readability enhancement for mobile devices depending on movement |
US20090201246A1 (en) * | 2008-02-11 | 2009-08-13 | Apple Inc. | Motion Compensation for Screens |
US20100079485A1 (en) * | 2008-09-26 | 2010-04-01 | Microsoft Corporation | Compensating for anticipated movement of a device |
US20100188331A1 (en) * | 2000-10-02 | 2010-07-29 | Apple Inc. | Methods and apparatuses for operating a portable device based on an accelerometer |
Family Cites Families (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6317114B1 (en) * | 1999-01-29 | 2001-11-13 | International Business Machines Corporation | Method and apparatus for image stabilization in display device |
JP3773902B2 (en) * | 2003-01-08 | 2006-05-10 | パイオニア株式会社 | Navigation device, navigation map data acquisition method, navigation map data acquisition program, and recording medium recording the same |
FR2869752A1 (en) * | 2004-04-28 | 2005-11-04 | Thomson Licensing Sa | APPARATUS AND METHOD FOR DISPLAYING IMAGES |
JP4855654B2 (en) * | 2004-05-31 | 2012-01-18 | ソニー株式会社 | On-vehicle device, on-vehicle device information providing method, on-vehicle device information providing method program, and on-vehicle device information providing method program |
EP1681538A1 (en) * | 2005-01-18 | 2006-07-19 | Harman Becker Automotive Systems (Becker Division) GmbH | Junction view with 3-dimensional landmarks for a navigation system for a vehicle |
EP1681537A1 (en) * | 2005-01-18 | 2006-07-19 | Harman Becker Automotive Systems (Becker Division) GmbH | Navigation system with animated junction view |
US8775975B2 (en) * | 2005-09-21 | 2014-07-08 | Buckyball Mobile, Inc. | Expectation assisted text messaging |
EP1768387B1 (en) * | 2005-09-22 | 2014-11-05 | Samsung Electronics Co., Ltd. | Image capturing apparatus with image compensation and method therefor |
EP1990786B1 (en) * | 2006-02-28 | 2021-05-19 | Toyota Jidosha Kabushiki Kaisha | Object path prediction method and apparatus |
JP4050309B2 (en) * | 2006-05-16 | 2008-02-20 | 松下電器産業株式会社 | Traffic information providing apparatus, method, and program |
TWI345193B (en) * | 2006-06-15 | 2011-07-11 | Chimei Innolux Corp | Eye tracking compensated method and device thereof and hold-type display |
JP4746514B2 (en) * | 2006-10-27 | 2011-08-10 | シャープ株式会社 | Image display apparatus and method, image processing apparatus and method |
JP4303745B2 (en) * | 2006-11-07 | 2009-07-29 | シャープ株式会社 | Image display apparatus and method, image processing apparatus and method |
US7903166B2 (en) * | 2007-02-21 | 2011-03-08 | Sharp Laboratories Of America, Inc. | Methods and systems for display viewer motion compensation based on user image data |
US8099124B2 (en) * | 2007-04-12 | 2012-01-17 | Symbol Technologies, Inc. | Method and system for correlating user/device activity with spatial orientation sensors |
JPWO2009016693A1 (en) * | 2007-07-27 | 2010-10-07 | 株式会社ナビタイムジャパン | Map display system, map display device, and map display method |
JP4770858B2 (en) * | 2008-03-28 | 2011-09-14 | アイシン・エィ・ダブリュ株式会社 | Signalized intersection information acquisition apparatus, signalized intersection information acquisition method, and signalized intersection information acquisition program |
US8310556B2 (en) * | 2008-04-22 | 2012-11-13 | Sony Corporation | Offloading processing of images from a portable digital camera |
WO2010117762A2 (en) * | 2009-03-30 | 2010-10-14 | Lord Corporation | Land vehicles and systems with controllable suspension systems |
DE102009019561A1 (en) * | 2009-04-30 | 2010-11-04 | Volkswagen Ag | Method for displaying information in a motor vehicle and display device |
KR101630688B1 (en) * | 2010-02-17 | 2016-06-16 | 삼성전자주식회사 | Apparatus for motion estimation and method thereof and image processing apparatus |
US20120242698A1 (en) * | 2010-02-28 | 2012-09-27 | Osterhout Group, Inc. | See-through near-eye display glasses with a multi-segment processor-controlled optical layer |
US8739019B1 (en) * | 2011-07-01 | 2014-05-27 | Joel Nevins | Computer-implemented methods and computer program products for integrating and synchronizing multimedia content, including content displayed via interactive televisions, smartphones, electronic book readers, holographic imagery projectors, and other computerized devices |
US8903521B2 (en) * | 2010-08-26 | 2014-12-02 | Blast Motion Inc. | Motion capture element |
US8593558B2 (en) * | 2010-09-08 | 2013-11-26 | Apple Inc. | Camera-based orientation fix from portrait to landscape |
JP5776255B2 (en) * | 2011-03-25 | 2015-09-09 | ソニー株式会社 | Terminal device, object identification method, program, and object identification system |
KR101773845B1 (en) * | 2011-05-16 | 2017-09-01 | 삼성전자주식회사 | Method of processing input signal in portable terminal and apparatus teereof |
KR101818573B1 (en) * | 2011-07-07 | 2018-01-15 | 삼성전자 주식회사 | Method and apparatus for displaying of view mode using face recognition |
US9214128B2 (en) * | 2011-08-10 | 2015-12-15 | Panasonic Intellectual Property Corporation Of America | Information display device |
US8638344B2 (en) * | 2012-03-09 | 2014-01-28 | International Business Machines Corporation | Automatically modifying presentation of mobile-device content |
US9417666B2 (en) * | 2012-10-19 | 2016-08-16 | Microsoft Technology Licesning, LLC | User and device movement based display compensation |
- 2012-08-22: US application US13/592,298 filed; published as US20140055339A1; status: Abandoned
- 2013-08-07: PCT application PCT/US2013/054000 filed; published as WO2014031342A1; status: Application Filing
Also Published As
Publication number | Publication date |
---|---|
US20140055339A1 (en) | 2014-02-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20140055339A1 (en) | Adaptive visual output based on motion compensation of a mobile device | |
US10916063B1 (en) | Dockable billboards for labeling objects in a display having a three-dimensional perspective of a virtual or real environment | |
US20180240220A1 (en) | Information processing apparatus, information processing method, and program | |
EP3965060A1 (en) | Method for determining motion information of image feature point, task execution method, and device | |
US9706086B2 (en) | Camera-assisted display motion compensation | |
US20190075224A1 (en) | Systems and methods involving edge camera assemblies in handheld devices | |
WO2016157677A1 (en) | Information processing device, information processing method, and program | |
US9958938B2 (en) | Gaze tracking for a mobile device | |
US20130181892A1 (en) | Image Adjusting | |
US10641865B2 (en) | Computer-readable recording medium, display control method and display control device | |
WO2021103841A1 (en) | Control vehicle | |
JP6501743B2 (en) | Image stabilization method and electronic device | |
CN105741314A (en) | Methods And Apparatuses For Directional View In Panoramic Content | |
JP5861667B2 (en) | Information processing apparatus, imaging system, imaging apparatus, information processing method, and program | |
EP4206788A1 (en) | Systems and methods to correct a vehicle induced change of direction | |
US20210014539A1 (en) | Dynamic video encoding and view adaptation in wireless computing environments | |
JPWO2013132886A1 (en) | Information processing apparatus, information processing method, and program | |
US20180121711A1 (en) | Display control method and apparatus | |
US9478045B1 (en) | Vibration sensing and canceling for displays | |
US20170097985A1 (en) | Information processing apparatus, information processing method, and program | |
CN112947474A (en) | Method and device for adjusting transverse control parameters of automatic driving vehicle | |
JP2017158169A (en) | Image display system, display device, and program | |
US11240482B2 (en) | Information processing device, information processing method, and computer program | |
US10021317B2 (en) | Image display apparatus capable of regulating focusing using a camera module having a single lens and method of operating the same | |
US9160923B1 (en) | Method and system for dynamic information display using optical data |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 13831538; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | EP: PCT application non-entry in European phase | Ref document number: 13831538; Country of ref document: EP; Kind code of ref document: A1 |