US20140123507A1 - Reference coordinate system determination - Google Patents
- Publication number
- US20140123507A1 (application US 13/787,525)
- Authority
- US
- United States
- Prior art keywords
- coordinate system
- reference coordinate
- orientation
- mobile device
- origin
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/002—Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
- Augmented Reality provides a view of a real-world environment that is augmented with computer-generated audio and/or visual content.
- the audio and/or visual content can be overlaid over or integrated into an image or video of the real-world environment captured using a camera of a mobile device, or displayed on a transparent or semi-transparent screen through which a user is viewing the real-world environment.
- an augmented reality application may be implemented on a mobile phone or tablet computer that includes a camera that can be used to capture images or video of a view of the real-world environment and a display that can be used to display an augmented view of the real-world environment, and/or on a head-mounted display (HMD).
- the device can include one or more sensors that collect data that can be used to determine the position, orientation, speed, and/or direction of movement of the device. This information can be used to assist the device in generating augmentation content.
- the sensors can also be used to collect input information from a user, such as touchscreen selections or other input information that can be used to allow the user to navigate the augmented content displayed on the device.
- SLAM (simultaneous localization and mapping) tracking may rely on a known reference target in the environment, e.g., in a field of view of a camera.
- a coordinate system for the SLAM may be defined arbitrarily.
- the pose (position and orientation) of the camera when tracking begins may be used to define the coordinate system for the SLAM.
- An example method of determining a reference coordinate system includes: obtaining information indicative of a direction of gravity relative to a device; and converting an orientation of a device coordinate system using the direction of gravity relative to the device to produce the reference coordinate system.
- Implementations of such a method may include one or more of the following features.
- the method further includes obtaining information indicative of a direction perpendicular to gravity and wherein the converting comprises converting the orientation of the device coordinate system using the direction perpendicular to gravity.
- the direction perpendicular to gravity is one of magnetic north or a projection of a viewing direction of a camera of the device onto a plane perpendicular to gravity.
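As a minimal sketch of the conversion described above: given a gravity vector and a direction roughly perpendicular to gravity (e.g., magnetic north, or a view direction projected onto the horizontal plane), an orthonormal set of reference axes can be built by projection and a cross product. The function name and the use of NumPy are illustrative assumptions, not from the patent.

```python
import numpy as np

def reference_rotation(gravity, north):
    """Columns of the returned matrix are hypothetical reference axes
    (x, y, z) expressed in the device coordinate system."""
    z = -np.asarray(gravity, dtype=float)      # "up" opposes gravity
    z /= np.linalg.norm(z)
    y = np.asarray(north, dtype=float)
    y = y - np.dot(y, z) * z                   # project onto the plane perpendicular to gravity
    y /= np.linalg.norm(y)
    x = np.cross(y, z)                         # right-handed third axis
    return np.column_stack([x, y, z])
```

With gravity straight down and north along the device y-axis, this yields the identity rotation; in general it returns an orthonormal, right-handed basis.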
- the method further includes setting an origin of the reference coordinate system. In some implementations, setting the origin comprises: obtaining a point cloud; and determining a geometric center of a substantially planar portion of the point cloud. In other implementations, setting the origin comprises: obtaining a point cloud; and determining an intersection of a view direction of a camera of the device and a plane corresponding to a substantially planar portion of the point cloud.
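The two origin-setting alternatives above can be sketched as follows, assuming the planar portion is already described by a unit normal n and offset d (with n·p = d for points p on the plane). Function names and the plane parameterization are illustrative assumptions.

```python
import numpy as np

def origin_from_plane_center(points, normal, d, tol=0.01):
    """Geometric center of the points lying near the plane n·p = d
    (the 'substantially planar portion' of the point cloud)."""
    pts = np.asarray(points, dtype=float)
    near = np.abs(pts @ np.asarray(normal, dtype=float) - d) < tol
    return pts[near].mean(axis=0)

def origin_from_view_ray(cam_pos, view_dir, normal, d):
    """Intersection of the camera view ray with the plane n·p = d."""
    cam_pos = np.asarray(cam_pos, dtype=float)
    view_dir = np.asarray(view_dir, dtype=float)
    t = (d - np.dot(normal, cam_pos)) / np.dot(normal, view_dir)
    return cam_pos + t * view_dir
```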
- implementations of the method may include one or more of the following features.
- the method further includes calculating a scale value and producing the reference coordinate system using the scale value.
- Calculating the scale value comprises: obtaining a point cloud; and comparing a dimension of the point cloud to a fixed size.
- Calculating the scale value comprises calculating the scale value such that an origin of the reference coordinate system will have a predetermined distance from the device.
- Calculating the scale value comprises using absolute measurements from one or more input sensors of the device.
- the one or more input sensors comprise an accelerometer or a plurality of cameras.
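Two of the scale-value alternatives above can be sketched as follows; the fixed size, predetermined distance, and function names are illustrative assumptions rather than values given by the patent.

```python
import numpy as np

def scale_from_extent(points, fixed_size=1.0):
    """Scale so the largest bounding-box dimension of the (arbitrarily
    scaled) point cloud corresponds to a fixed size."""
    pts = np.asarray(points, dtype=float)
    extent = (pts.max(axis=0) - pts.min(axis=0)).max()
    return fixed_size / extent

def scale_from_distance(device_pos, origin, distance=0.5):
    """Scale so the reference origin ends up a predetermined distance
    from the device."""
    gap = np.linalg.norm(np.asarray(origin, dtype=float) -
                         np.asarray(device_pos, dtype=float))
    return distance / gap
```

The third alternative, absolute measurements from sensors (e.g., accelerometer integration or stereo baselines), would supply a metric length directly instead of a ratio like these.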
- the method further includes refining the reference coordinate system. Refining the reference coordinate system comprises using information from the reference coordinate system and at least one previously-determined coordinate system to generate a combination of the reference coordinate system and the at least one previously-determined coordinate system.
- An example device for determining a reference coordinate system includes: means for obtaining information indicative of a direction of gravity relative to the device; and means for converting an orientation of a device coordinate system using the direction of gravity relative to the device to produce the reference coordinate system.
- the device further includes means for obtaining information indicative of a direction perpendicular to gravity and wherein the means for converting are for converting the orientation of the device coordinate system using the direction perpendicular to gravity.
- the direction perpendicular to gravity is one of magnetic north or a projection of a viewing direction of a camera of the device onto a plane perpendicular to gravity.
- the device further includes means for setting an origin of the reference coordinate system.
- the means for setting the origin comprise: means for obtaining a point cloud; and means for determining a geometric center of a substantially planar portion of the point cloud.
- the means for setting the origin comprise: means for obtaining a point cloud; and means for determining an intersection of a view direction of a camera of the device and a plane corresponding to a substantially planar portion of the point cloud.
- implementations of the device may include one or more of the following features.
- the device further includes means for calculating a scale value.
- the means for calculating the scale value comprise: means for obtaining a point cloud; and means for comparing a dimension of the point cloud to a fixed size.
- the means for calculating the scale value comprise means for calculating the scale value such that an origin of the reference coordinate system will have a predetermined distance from the device.
- the means for calculating the scale value comprise means for using absolute measurements from one or more input sensors of the device.
- the one or more input sensors comprise an accelerometer or a plurality of cameras.
- An example mobile device includes: a sensor configured to determine a direction of gravity and to provide an indication of the direction of gravity relative to the mobile device; and an orientation module communicatively coupled to the sensor and configured to convert an orientation of a device coordinate system, of the mobile device, using the indication of the direction of gravity relative to the device to produce a reference coordinate system.
- the orientation module is further configured to convert the orientation of the device coordinate system using a direction perpendicular to gravity.
- the mobile device further includes an origin module communicatively coupled to the orientation module and configured to set an origin of the reference coordinate system by obtaining a point cloud and at least one of: determining a geometric center of a substantially planar portion of the point cloud; or determining an intersection of a view direction of a camera of the mobile device and a plane corresponding to a substantially planar portion of the point cloud.
- the mobile device further includes a scale module communicatively coupled to the orientation module and configured to set a scale value for the reference coordinate system relative to the device coordinate system by: (1) obtaining a point cloud and comparing a dimension of the point cloud to a fixed size; or (2) calculating the scale value such that an origin of the reference coordinate system will have a predetermined distance from the mobile device; or (3) using absolute measurements from one or more input sensors of the mobile device.
- the orientation module is configured to produce a refined reference coordinate system using information from the reference coordinate system and at least one previously determined coordinate system.
- An example processor-readable storage medium of a mobile device includes processor-readable instructions configured to cause a processor to: obtain an indication of a direction of gravity relative to the mobile device; and convert an orientation of a device coordinate system, of the mobile device, using the indication of the direction of gravity relative to the device to produce a reference coordinate system.
- Implementations of such a storage medium may include one or more of the following features.
- the instructions configured to cause the processor to convert the orientation of the device coordinate system include instructions configured to use a direction perpendicular to gravity.
- the storage medium further includes instructions configured to cause the processor to set an origin of the reference coordinate system by obtaining a point cloud and at least one of: determining a geometric center of a substantially planar portion of the point cloud; or determining an intersection of a view direction of a camera of the mobile device and a plane corresponding to a substantially planar portion of the point cloud.
- the storage medium further includes instructions configured to cause the processor to set a scale value for the reference coordinate system relative to the device coordinate system by: (1) obtaining a point cloud and comparing a dimension of the point cloud to a fixed size; or (2) calculating the scale value such that an origin of the reference coordinate system will have a predetermined distance from the mobile device; or (3) using absolute measurements from one or more input sensors of the mobile device.
- the storage medium further includes instructions configured to cause the processor to produce a refined reference coordinate system using information from the reference coordinate system and at least one previously determined coordinate system.
- a coordinate axis may be aligned parallel to the ground or another surface such as a table
- a coordinate axis may be aligned perpendicular to the ground, for example parallel to a wall or other structure.
- Coordinate systems may be provided where there is a known object in view or no known object in view. For cases with one or more objects in view, a more meaningful coordinate system may be established compared to prior techniques. For example, with multiple targets in view, the targets may be tracked and a coordinate system established.
- a coordinate system may be aligned with an object, but not one of the tracked targets, thus providing a global coordinate system for all the targets.
- Other capabilities may be provided and not every implementation according to the disclosure must provide any particular capability, let alone all of the capabilities, discussed. Further, it may be possible for an effect noted above to be achieved by means other than that noted, and a noted item/technique may not necessarily yield the noted effect.
- FIG. 1 is a simplified perspective view of an augmented reality system.
- FIG. 2 is a block diagram of a computer system that incorporates components of a mobile device shown in FIG. 1 .
- FIG. 3 is a functional block diagram of the computer system shown in FIG. 2 .
- FIGS. 4-5 are simplified plan views of the mobile device shown in FIG. 1 .
- FIG. 6 is a block flow diagram of a process of calculating a reference coordinate system.
- FIG. 7 is a perspective view of the system shown in FIG. 1 showing techniques for setting an origin of the reference coordinate system.
- FIGS. 8-10 are perspective views of the system shown in FIG. 1 showing techniques for setting an orientation of the reference coordinate system.
- FIGS. 11-13 are block flow diagrams of processes for determining reference coordinate systems.
- Techniques are provided to define an origin and/or an orientation, and optionally a scale, of a coordinate system.
- the coordinate system may be for a simultaneous localization and mapping system.
- the techniques may be used in a variety of situations, but at least some of the techniques may be particularly useful in situations where the origin and orientation are not given a priori by a known reference.
- a device such as a mobile device may use as inputs a camera pose (position and orientation), a point cloud, and sensor data (e.g., measured from inertial sensors such as an accelerometer and a magnetometer). Using these inputs, the mobile device can determine a three-dimensional point in space for the origin of the desired coordinate system, an orientation of the desired coordinate system (e.g., three orthogonal axes), and a scale. The mobile device can use this information to determine a translation from a coordinate system of the camera to the desired coordinate system.
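The mapping from the camera coordinate system to the desired reference coordinate system, once an orientation, origin, and scale have been determined, can be sketched as a similarity transform. The function name and argument conventions are assumptions for illustration.

```python
import numpy as np

def camera_to_reference(p_cam, R, origin, scale=1.0):
    """Map a point from the camera/SLAM coordinate system into the
    reference coordinate system. R holds the reference axes as columns
    expressed in camera coordinates, origin is the reference origin in
    camera coordinates, and scale converts arbitrary SLAM units."""
    p = np.asarray(p_cam, dtype=float)
    R = np.asarray(R, dtype=float)
    o = np.asarray(origin, dtype=float)
    return scale * (R.T @ (p - o))
```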
- Selecting a meaningful origin, orientation (and scale) for the coordinate system may be independent tasks. Different techniques are provided for each of these tasks. Some techniques make use of a three-dimensional (3D) point cloud. Some techniques use sensors. The defined coordinate system may be refined over time. Different techniques for determining the origin and/or different techniques for determining the orientation for the coordinate system may be used, e.g., depending on a particular application used.
- a system for determining a coordinate system, here an augmented reality (AR) system 10 , which may comprise a SLAM system, includes a device, here an AR-enabled mobile device 12 , and an object 14 .
- the mobile device 12 may be any computing device with an input sensory unit, such as a camera, and a display.
- the mobile device 12 is a smart phone although the functionality described herein is not limited to smart phones.
- the mobile device 12 may be a digital camera, a camcorder, a tablet computer, a personal digital assistant, a video game console, an HMD or other wearable display, a projector device, or other device.
- the mobile device 12 includes a camera for capturing images of objects, here the object 14 , in a field of view 18 of the camera.
- a point cloud is a collection of points in three-dimensional space corresponding to at least a portion of an object, here the object 14 , that is visible in the field of view 18 by the camera of the mobile device 12 .
- the point cloud may be determined by one or more techniques performed by a processor of the mobile device 12 , for example based on image data from a camera 24 (described below) of the mobile device 12
- the mobile device 12 is configured to augment reality by capturing images of its environment, here capturing images of the object 14 , and displaying the additional imagery on a transparent or semi-transparent display through which the object 14 is visible or displaying an image of the object 14 supplemented with additional imagery, here a drawing 16 partially superimposed on and partially disposed above the object 14 , e.g., here for use in a graffiti AR application.
- the mobile device 12 includes sensors 20 , a sensor processor 22 , one or more cameras 24 , a camera processor 26 , a display 28 , a graphics processor 30 , a touch sensor 32 , a touch sensor processor 34 , a communication module 36 , a processor 38 , and memory 40 .
- the processors 22 , 26 , 30 , 34 , the communication module 36 , the processor 38 , and the memory 40 are communicatively coupled through a bus 42 , as illustrated, or may be directly coupled or coupled in another way.
- the processors 22 , 26 , 30 , 34 may be portions of a single processor, or implemented by different portions of software code stored in the memory 40 and executed by the processor 38 , or separate dedicated processors, or combinations of these (e.g., with one or more of the processors 22 , 26 , 30 , 34 being a dedicated processor or processors and others being part of the processor 38 ).
- the processors 22 , 26 , 30 , 34 , 38 may be one or more general-purpose processors and/or one or more special-purpose processors (such as digital signal processing chips, graphics acceleration processors, and/or the like).
- the sensors 20 can provide an indication and/or indications of various sensed parameters, e.g., an indication of a direction of gravity, e.g., relative to an orientation of the mobile device 12 (relative to a device coordinate system of the mobile device 12 ).
- the sensors 20 and the sensor processor 22 are configured to determine an orientation of the mobile device 12 .
- the sensors 20 are orientation sensors configured to sense information for use in determining an orientation of the mobile device 12 .
- the sensors 20 may include one or more inertial sensors such as gyroscopes, one or more accelerometers, an inertial measurement unit (IMU), and/or another type and/or other types of sensors.
- the sensor processor 22 is configured to process data measured/collected by the sensors 20 to determine the orientation of the mobile device 12 .
- the camera(s) 24 and the camera processor 26 are configured to capture and produce visual information.
- the camera(s) 24 is(are) configured to capture images and/or video of a real-world scene that can be augmented (with augmentation, e.g., text or designs placed on a real-world surface) using augmentation logic.
- the camera processor 26 is configured to process the data collected by the camera(s) 24 to convert the data into a format that can be used by the augmentation logic.
- the camera processor 26 is configured to perform various types of image or video processing on the data collected from the camera(s) 24 to prepare the content for display on display 28 .
- the display 28 and the graphics processor 30 are configured to provide visual information based on the data captured by the camera(s) 24 and processed by the camera processor 26 , and to provide visual information based on information produced by the augmentation logic.
- the display 28 can be a touch screen interface that includes the touch sensor 32 .
- the graphics processor 30 is configured to generate graphical data for display on the display 28 .
- the graphics processor 30 is configured to use information provided by the augmentation logic to display augmented image or video content.
- the touch sensor processor 34 can process data output by the touch sensor 32 to identify when a user touches the touch screen.
- the touch sensor processor 34 can be configured to identify various touch gestures, including multi-finger touches of the touch screen.
- the augmentation logic can use the gesture information determined by the touch sensor processor 34 to determine, at least in part, how the augmentation should react in response to user input.
- the communication module 36 is configured to enable the mobile device 12 to communicate using one or more wireless protocols.
- the communication module 36 is configured to allow the mobile device 12 to send and receive data from nearby wireless devices, including wireless access points and other AR-enabled devices.
- the communication module 36 may include a modem, a wireless network card, an infrared communication device, a wireless communication device and/or chipset (such as a short-range wireless device, e.g., a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, cellular communication facilities, etc.), or the like.
- the communication module 36 may permit data to be exchanged with a network, other computer systems, and/or other devices.
- the processor 38 is configured to control one or more of the sensor processor 22 , the camera processor 26 , the graphics processor 30 , or the touch sensor processor 34 .
- One or more of the sensor processor 22 , camera processor 26 , the graphics processor 30 , or the touch sensor processor 34 may also be implemented by the processor 38 .
- the memory 40 includes volatile and/or persistent, non-transitory memory for storing data used by various components of the AR-enabled mobile device 12 .
- the memory 40 may include local and/or network accessible storage, a disk drive, a drive array, an optical storage device, a solid-state storage device such as a random access memory (“RAM”) and/or a read-only memory (“ROM”), which can be programmable, flash-updateable and/or the like.
- Such storage devices may be configured to implement any appropriate data storage, including without limitation, various file systems, database structures, and/or the like.
- the memory 40 stores processor-readable, processor-executable software program code 44 for one or more of the processors included in the mobile device 12 .
- the software code 44 contains instructions for controlling the processor(s) to perform functions described below (although the description may read that the software or a processor performs the function(s)).
- augmentation logic can be implemented as processor-executable instructions stored in the memory 40 .
- the software 44 includes an operating system, device drivers, executable libraries, and/or other software code instructions, such as one or more application programs, that may implement methods described herein. For example, one or more procedures described herein might be implemented as software code instructions executed by a processor. Such instructions can cause a general purpose computer (or other device) to perform one or more operations as described herein.
- the software 44 may be stored on a non-removable portion of the memory 40 incorporated within the mobile device 12 or may be stored on a removable medium, such as a compact disc, and/or provided in an installation package.
- the instructions may take the form of executable code, which is directly executable by a processor, or alternatively the instructions may take the form of source and/or installable code, which, upon compilation and/or installation (e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc.) then takes the form of executable code.
- One or more of the elements described above with respect to the mobile device 12 may be omitted.
- the communication module 36 and/or the touch sensor 32 and touch sensor processor 34 may be omitted.
- processors 22 , 26 , 30 , 34 may be combined with, or implemented in, the processor 38 , and/or some or all functionality of one or more of the processors 22 , 26 , 30 , 34 may be implemented by instructions stored in the memory 40 .
- the mobile device 12 includes an origin module (means for setting a coordinate system origin) 46 , an orientation module (orientation means or means for setting a coordinate system orientation) 48 , a scale module (means for setting a coordinate system scale) 50 , and a refining module (means for refining a coordinate system) 52 .
- the origin module 46 , the orientation module 48 , the scale module 50 , and the refining module 52 are communicatively coupled to each other.
- the modules 46 , 48 , 50 , 52 are functional modules that may be implemented by the processors 22 , 26 , 38 and/or the software 44 stored in the memory 40 , although the modules 46 , 48 , 50 , 52 could be implemented in hardware, firmware, or software, or combinations of these. Reference to the modules 46 , 48 , 50 , 52 performing or being configured to perform a function is shorthand for one or more of the processors 22 , 26 , 38 performing or being configured to perform the function in accordance with the software 44 (and/or firmware, and/or hardware of the processors 22 , 26 , 38 ).
- determining an origin of a coordinate system, or determining an orientation of a coordinate system, or setting a scale of a coordinate system is equivalent to the origin module 46 , or the orientation module 48 , or the scale module 50 , respectively, performing the function.
- the origin module 46 , or means for setting a coordinate system origin, is configured to set an origin of a coordinate system, e.g., an AR coordinate system, based on information captured by the camera(s) 24 . Examples of operation of the origin module 46 are discussed below with respect to FIG. 6 , although the origin module 46 may operate in one or more other ways as well.
- the orientation module 48 is configured to set an orientation of a coordinate system, e.g., an AR coordinate system using information captured by the camera(s) 24 and/or information obtained from the sensor(s) 22 .
- the orientation module 48 , or means for setting a coordinate system orientation, includes converting means for converting an orientation of a device coordinate system of the mobile device 12 to the orientation of a reference coordinate system. Examples of operation of the orientation module 48 are discussed below with respect to FIG. 6 , although the orientation module 48 may operate in one or more other ways as well.
- the scale module 50 is configured to set (including determine) a scale value for a coordinate system, e.g., an AR coordinate system, using techniques discussed below or other appropriate techniques.
- the refining module 52 is configured to use previously-determined information, e.g., origin and/or orientation and/or scale value, to refine the reference coordinate system, e.g., change the origin and/or orientation and/or scale value of the reference coordinate system.
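One simple way the "combination" of a newly determined coordinate system with a previously-determined one could be realized is an exponential moving average of the origin (and similarly of the scale value); this is an illustrative assumption, and rotations would need, e.g., quaternion interpolation instead of a linear blend.

```python
import numpy as np

def refine_origin(prev_origin, new_origin, alpha=0.2):
    """Blend a newly determined origin with the previous estimate.
    alpha controls how strongly the new measurement is weighted."""
    prev = np.asarray(prev_origin, dtype=float)
    new = np.asarray(new_origin, dtype=float)
    return (1.0 - alpha) * prev + alpha * new
```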
- the modules 46 , 48 , 50 , 52 are configured to establish a reference coordinate system to enhance performance of the device 12 .
- the modules 46 , 48 , 50 , 52 help with the provision of AR images so that the AR images comport with user expectations, e.g., so that a gaming experience is more realistic and/or believable.
- the modules 46 , 48 , 50 , 52 can help ensure that a painting application provides a vertical easel, a board-game application provides a horizontal gameboard, a sports application is coordinated with sports equipment (e.g., a basketball hoop) in view of the camera 24 , etc.
- these modules may cause AR images displayed by the device 12 to appear at an appropriate size, in an expected orientation, and/or in a proper augmentation with respect to a surface, for example appearing to be walking on or anchored to a surface instead of floating above it. Further, these modules may be used to establish a reference system by the device 12 for uses other than AR, for example in other applications where tracking and/or mapping is utilized, such as in a different type of SLAM system.
- the mobile device 12 may include two cameras 24 1 , 24 2 in at least some configurations.
- the camera 24 1 is disposed on a front side 54 of the mobile device 12 , along with the display 28 , and the camera 24 2 is disposed on a back side or rear side 56 of the mobile device 12 .
- the camera 24 1 is referred to as a front-facing camera and the camera 24 2 is referred to herein as a rear-facing camera.
- the mobile device 12 is typically held such that the front-facing camera 24 1 faces a user of the mobile device 12 and the rear-facing camera 24 2 faces away from the user of the mobile device 12 . Alternatively, the converse may be true depending on how the mobile device 12 is held by the user.
- either of the cameras 24 1 , 24 2 may have the field of view (FOV) 18 .
- the camera 24 2 is described below as having the FOV 18 , but the camera 24 1 or another camera may have the FOV 18 in other situations or implementations.
- a process 60 of establishing a reference coordinate system includes the stages shown.
- the process 60 is, however, an example only and not limiting.
- the process 60 can be altered, e.g., by having stages added, removed, rearranged, combined, and/or performed concurrently.
- stages 70 and 74 discussed below could be eliminated, or two or more of the stages 64 , 66 , 68 could be performed in a different order than illustrated.
- the reference coordinate system may be a SLAM coordinate system such as an AR coordinate system.
- the mobile device 12 obtains an image frame and in some embodiments also obtains sensor data.
- in response to a triggering event (such as power up, activation of an AR application, a user selection, or another triggering event), the mobile device 12 captures an image from one of the cameras 24 , e.g., the rear-facing camera 24 2 , as processed by the camera processor 26 .
- the mobile device 12 may further obtain information from the sensors 20 as processed by the sensor processor 22 .
- the mobile device 12 obtains information regarding the direction of gravity relative to the coordinate system of the mobile device 12 .
- e.g., using an accelerometer and a magnetometer, the mobile device 12 obtains the direction of magnetic north relative to the coordinate system of the mobile device 12 .
- two non-collinear vectors are used.
- the two non-collinear vectors may comprise a gravity vector (pointed downward) and a vector of magnetic north (pointed toward magnetic north), or a combination of one of these vectors and another vector, or a combination of two other non-collinear vectors.
- for example, the gravity vector and the vector of magnetic north (i.e., a magnetic north vector) may be used.
- the mobile device 12 determines the reference coordinate system origin.
- the origin module 46 may set the origin in one of a variety of ways depending upon the circumstances. In a first technique, the origin module 46 sets the origin at a present 3D position 80 of the camera 24 2 as shown in FIG. 7 . In a second technique, the origin module 46 sets the origin 82 at a geometric center associated with a point cloud, e.g. a geometric center of the object 14 as shown in FIG. 7 . This technique may be used, for example, when the point cloud is known or determined to have a planar surface (e.g., a table, a poster on a wall, etc.). The geometric center associated with the point cloud may be determined as the geometric center of the planar surface.
- a planar surface e.g., a table, a poster on a wall, etc.
- the origin module 46 sets the origin 84 at a point of the point cloud whose two-dimensional (2D) projection lies close to a center of the camera image as shown in FIG. 7 .
- This technique affixes or assigns the origin to a physical object, because any point of the point cloud corresponds to a physical structure in the environment (around the mobile device 12 ).
- This technique may be used in various situations, but may be particularly useful where variance in the depth in the physical environment is small, with the points of the point cloud being at similar distances from the camera 24 .
- the origin module 46 sets the origin 86 of the reference coordinate system at an intersection between the camera view direction 88 (i.e., a ray through the center of the camera frame, at the center of the field of view of the camera 24 ) and a plane fit to the point cloud.
- the plane can be fit to a large or small portion of the point cloud depending upon whether the point cloud is planar (represents a planar surface) or does not represent a substantial planar surface.
- the origin module 46 can fit the plane to some points of the point cloud whose 2D projections lie close to the camera center.
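The ray-plane intersection at the heart of this fourth technique could be sketched as below, for illustration only. The plane is given here as a point and a normal; the plane-fitting step itself is not shown, and all names and values are hypothetical:

```python
def ray_plane_intersection(ray_origin, ray_dir, plane_point, plane_normal):
    """Intersect the camera view ray with a plane fit to the point cloud."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(plane_normal, ray_dir)
    if abs(denom) < 1e-9:
        return None  # view direction parallel to the plane: no intersection
    t = dot(plane_normal,
            tuple(p - o for p, o in zip(plane_point, ray_origin))) / denom
    return tuple(o + t * d for o, d in zip(ray_origin, ray_dir))
```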
- the origins 82 , 84 , 86 are in different locations but close to each other. This is an example only, and in other examples, one or more of the origins 82 , 84 , 86 may be co-located, and/or one or more of the origins 82 , 84 , 86 may be displaced from one or more of the other origins 82 , 84 , 86 more or less than shown.
- the mobile device 12 determines the reference coordinate system orientation, for example using two non-parallel vectors.
- the orientation of the reference coordinate system defines the directions for elevation (up/down, +/ ⁇ z-direction), azimuth (left/right, +/ ⁇ x-direction), and forward/backward (+/ ⁇ y-direction), assuming an x-y-z orthogonal coordinate system. Because the reference coordinate system is orthogonal, it is enough for the orientation module 48 to define the z-axis and the y-axis, with the final axis resulting from the orthogonality between the axes (and sidedness such as right-handedness). In some techniques, a different combination of two axes is used. Several techniques are available for determining the orientation.
- the orientation module 48 sets the orientation of the reference coordinate system as the orientation of the present camera image frame, as shown in FIG. 8 .
- This technique is applied to the origin 86 , but the technique could be applied to any origin, selected using any technique.
- This technique is independent of sensor measurements and can be used when the sensor information is not available (e.g., the sensors 20 are inoperative or it is desirable to conserve power).
- the resulting reference coordinate system will be oriented without regard to gravity and may be askew relative to the physical environment (e.g., with azimuth being non-parallel to the floor).
- a second technique as shown in FIG.
- the orientation module 48 sets the orientation with respect to gravity (shown as vector g) and to a direction perpendicular to gravity or within a tolerance, e.g., 10°, of perpendicular to gravity (i.e., substantially horizontal orientation with respect to Earth or a surface of Earth), such as magnetic north (or, alternatively, true north).
- the orientation module 48 may use information from the sensor processor 22 to set the positive z-axis (Z R ) of the reference coordinate system as the inverse direction of gravity or within a tolerance, e.g., 10°, of this direction (i.e., substantially vertical orientation with respect to Earth or a surface of Earth) based on information from the accelerometer, and set the positive y-axis (Y R ) of the reference coordinate system as the direction of magnetic north (based on information from the magnetometer and/or accelerometer).
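For illustration only, this second orientation-setting technique could be sketched as follows: +z opposes gravity, +y is the magnetic north reading projected onto the plane perpendicular to +z, and +x completes a right-handed frame. Function names and sensor values are hypothetical:

```python
import math

def _norm(v):
    m = math.sqrt(sum(c * c for c in v))
    return tuple(c / m for c in v)

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def reference_frame_from_gravity_and_north(gravity, north):
    z = _norm(tuple(-c for c in gravity))            # up = inverse of gravity
    y = _norm(tuple(n - _dot(north, z) * zc          # north, made horizontal
                    for n, zc in zip(north, z)))
    x = _cross(y, z)                                 # completes right-handed frame
    return x, y, z
```

Note that the raw magnetometer vector generally dips out of the horizontal plane, which is why it is projected perpendicular to the up axis before use.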
- This technique may be used in various situations, and may be particularly useful in situations where alignment with geographic directions (e.g., for a navigation system) is useful or important.
- a third technique as shown in FIG.
- the orientation module 48 sets the orientation of the reference coordinate system with respect to gravity and a present view direction of the camera 24 .
- the orientation module 48 sets the positive z-axis as the inverse direction of gravity (using the accelerometer) and sets the positive y-axis as the current view direction of the rear-facing camera 24 2 , projected onto a plane perpendicular to the z-axis (here the top surface of the table 14 ).
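The projection step of this third technique could be sketched as below, for illustration only. The name is hypothetical, and z_axis is assumed to be a unit vector (e.g., the inverse-gravity direction):

```python
import math

def project_view_onto_horizontal(view_dir, z_axis):
    """Project the camera view direction onto the plane perpendicular to
    the z-axis and normalize, yielding the +y axis of the reference frame."""
    d = sum(v * z for v, z in zip(view_dir, z_axis))
    proj = tuple(v - d * z for v, z in zip(view_dir, z_axis))
    m = math.sqrt(sum(c * c for c in proj))
    return tuple(c / m for c in proj)
```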
- This technique may be used in various situations, and may be particularly useful in situations where alignment with geographic directions is of little or no importance or use and alignment with the camera viewing direction is useful or important.
- a process 110 of determining a reference coordinate system using a direction of gravity includes the stages shown.
- the process 110 is, however, an example only and not limiting.
- the process 110 can be altered, e.g., by having stages added, removed, rearranged, combined, and/or performed concurrently. While x-y-z coordinate systems are discussed below, other coordinate systems could be used.
- the process 110 includes techniques shown in both FIG. 9 and FIG. 10 and discussed above for orienting a reference coordinate system.
- the process 110 includes obtaining information indicative of a direction of gravity relative to a device.
- the orientation module 48 obtains information regarding a direction of gravity from the sensor processor 22 , that the sensor processor 22 computed from sensor data from one or more of the sensors 20 .
- the information from the sensor processor 22 regarding the direction of gravity may be relative to the device 12 , e.g., relative to a device coordinate system (X D -Y D -Z D ) or may be converted by the orientation module 48 to be relative to the device coordinate system.
- the process 110 includes converting an orientation of the device coordinate system (X D -Y D -Z D ) using the direction of gravity relative to the device to produce the reference coordinate system (X R -Y R -Z R ).
- Known techniques can be used to convert (translate the origin and/or rotate the orientation) of the device coordinate system to yield the reference coordinate system.
- “Translate” is used here to indicate linear motion, although the term “translating” (and its conjugations) was used in U.S. Provisional Application No. 61/722,023 to mean “converting” as used herein.
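One known way to realize such a conversion (shown for illustration only, not as the claimed method) is to build the rotation taking the device's z-axis onto the measured "up" direction via the Rodrigues formula. Names are hypothetical; the formula assumes a and b are unit vectors with a != -b:

```python
def rotation_aligning(a, b):
    """3x3 row-major rotation matrix R such that R applied to a yields b."""
    ax, ay, az = a
    bx, by, bz = b
    vx, vy, vz = (ay * bz - az * by, az * bx - ax * bz, ax * by - ay * bx)
    c = ax * bx + ay * by + az * bz      # cosine of the angle between a and b
    k = 1.0 / (1.0 + c)
    return [
        [c + k * vx * vx,  k * vx * vy - vz, k * vx * vz + vy],
        [k * vy * vx + vz, c + k * vy * vy,  k * vy * vz - vx],
        [k * vz * vx - vy, k * vz * vy + vx, c + k * vz * vz],
    ]

def apply(R, v):
    """Apply a row-major 3x3 matrix to a 3-vector."""
    return tuple(sum(R[i][j] * v[j] for j in range(3)) for i in range(3))
```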
- a process 120 of determining a reference coordinate system based on an application that will use the reference coordinate system includes the stages shown.
- the process 120 is, however, an example only and not limiting.
- the process 120 can be altered, e.g., by having stages added, removed, rearranged, combined, and/or performed concurrently.
- the process 120 includes techniques for orienting the reference coordinate system.
- the process 120 includes obtaining information indicative of an orientation of a device.
- the orientation module 48 obtains information from the sensor processor 22 , that the sensor processor 22 computed from sensor data from one or more of the sensors 20 .
- the information indicates (either directly or after processing by the orientation module 48 ) an orientation of the device coordinate system, e.g., here the X D -Y D -Z D coordinate system shown in FIGS. 9-10 .
- the process 120 includes determining an orientation of the reference coordinate system based on a selected application for the reference coordinate system.
- the reference coordinate system may be initially defined and/or set to align with a surface determined or hypothesized by the mobile device 12 , e.g., via an AR application executed on the mobile device 12 , to be useful or important.
- two axes of an x-y-z coordinate system may be defined to lie in a plane of a surface of a table (e.g., a tabletop) for a gaming application, a wall for a graffiti application, or other surface or an arbitrary orientation, e.g., as discussed below with respect to provided examples.
- the processor 38 can analyze information from one or more camera images to determine a plane of a point cloud and assign two axes to be orthogonal to each other and to lie in the plane, with the third axis being orthogonal to both of the other two axes.
- Determining the orientation of the reference coordinate system may comprise receiving the orientation of the reference coordinate system, e.g., from the AR application and/or from user input, and a means for determining the orientation may thus include means for receiving the orientation of the reference coordinate system, e.g., from the AR application and/or from user input.
- the orientation of the reference coordinate system may be determined based on a type of augmentation associated with the AR application.
- the type of augmentation may comprise text and/or designs placed on a real-world surface, and/or a character moving on a real-world surface.
- Such augmentations could indicate a plane that should be occupied, e.g., by an x-y plane of the reference coordinate system.
- the orientation module 48 may determine which technique or techniques to use to determine the orientation of the reference coordinate system based on the application. For example, the orientation module 48 may select one or more of the techniques discussed above with respect to stage 66 based on the application. If more than one technique is selected, then results may be combined, e.g., averaged (including weighted averaging or non-weighted averaging) to determine the reference coordinate system orientation if appropriate, or one of the techniques selected otherwise. The orientation module 48 may determine (or select) the orientation technique(s) based on one or more properties or benefits of the techniques relative to the application.
- for one application, it may be desirable for the orientation to be relative to gravity, while for a first-person shooter game in a spaceship, it may be desirable for the orientation to be relative to the camera image frame at the start of the application.
- the benefit(s) and/or property(ies) of the application can be matched to the orientation technique, e.g., by analysis by the orientation module 48 , by user input, by predetermined settings (e.g., by an application developer), etc.
- the orientation module 48 may combine the orientations if practical, or may select one of the techniques based on a priority for the particular application (e.g., as predefined), or may allow a user to select the orientation technique, etc.
- the process 120 includes converting an orientation of a device coordinate system to the determined orientation to produce the reference coordinate system.
- the orientation module 48 converts the orientation of the device coordinate system to the orientation of the determined orientation from stage 124 to produce the reference coordinate system, e.g., using known techniques to rotate the device coordinate system to the determined orientation.
- the process 120 may include stages for obtaining information indicative of an initial origin, determining a desired origin of the reference coordinate system, and converting the initial origin to the desired origin as the origin of the reference coordinate system. These stages may be done as part of the process 120 , or as a separate process, independent of the process 120 .
- the origin module 46 may determine which technique or techniques to use to determine the origin of the reference coordinate system based on the application. For example, the origin module 46 may select one or more of the techniques discussed above with respect to stage 64 based on the application. The origin module 46 may determine (or select) the origin technique(s) based on one or more properties or benefits of the techniques relative to the application.
- for one application, it may be desirable for the origin to be associated with sports equipment, e.g., a basketball basket; for a painting application, for the origin to be the center of a planar surface such as a wall or a canvas on an easel; and for a guidance application, for the origin to be the origin of the camera 24 .
- the benefit(s) and/or property(ies) of the application can be matched to the origin technique, e.g., by analysis by the origin module 46 , by user input, by predetermined settings (e.g., by an application developer), etc.
- results may be combined, e.g., averaged (including weighted averaging or non-weighted averaging) to determine the reference coordinate system origin if appropriate, or one of the techniques selected otherwise. That is, in situations where multiple origin techniques would each be desirable, the origin module 46 may combine the origins if practical, or may select one of the techniques based on a priority for the particular application (e.g., as predefined), or may allow a user to select the origin technique, etc.
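The combining of results from multiple origin techniques could be sketched as a weighted average, for illustration only; the candidate values and weights are hypothetical:

```python
def combine_origins(candidates, weights):
    """Weighted average of candidate origin points from several
    origin-setting techniques."""
    total = sum(weights)
    return tuple(sum(w * p[i] for p, w in zip(candidates, weights)) / total
                 for i in range(3))
```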
- the mobile device 12 determines the scale of the reference coordinate system, although doing so is optional as a scale value may be predetermined.
- the scale value may be determined using a variety of techniques.
- the scale module 50 sets a scale value by comparing and scaling the point cloud to a fixed size, e.g., one unit. To do so, the scale module 50 may scale any of a variety of dimensions of the point cloud such as a height, a width, or a longest dimension seen from the camera 24 .
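This first scale-setting technique could be sketched as follows, for illustration only: a scale value is chosen so that the point cloud's longest axis-aligned extent maps to a fixed size (one unit by default). Names and values are hypothetical:

```python
def scale_from_extent(points, target_units=1.0):
    """Scale factor mapping the cloud's longest axis-aligned dimension
    to a fixed number of units."""
    extents = [max(p[i] for p in points) - min(p[i] for p in points)
               for i in range(3)]
    return target_units / max(extents)
```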
- the scale module 50 sets the scale value such that the AR coordinate origin has a specific predetermined distance, e.g., one unit, to the present camera position, e.g., the origin of the camera coordinate system.
- the scale module 50 sets the scale value based on absolute measurements, e.g., as provided by an accelerometer or a stereo camera system (e.g., two rear-view cameras 24 displaced on the back 56 of the mobile device 12 ).
- the absolute measurements can be directly used to define the scale of the AR coordinate system. For example, a measurement such as 1 cm from the absolute measurements could be used as 1 unit in the AR coordinate system.
- the mobile device 12 refines the present reference coordinate system determined in stages 64 , 66 , 68 .
- the reference coordinate system may be refined over time such that the reference coordinate system can be improved (e.g., corrected if the initial hypothesis is incorrect).
- the mobile device 12 refines the present reference coordinate system (origin, orientation, and possibly scale) using information from a previously-determined reference coordinate system, here the most-recently determined reference coordinate system, if any.
- the processor 38 may iterate the reference coordinate system, e.g., origin and/or orientation and/or scale, over time using a Kalman filter by using one or more of the values and/or sensor data from previous iterations. This stage is optional, as is the stage 74 discussed below.
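As a simplified stand-in for such filtering (the text mentions a Kalman filter; a constant-gain blend is the degenerate fixed-gain case, shown for illustration only), the origin refinement could look like:

```python
def refine_origin(prev_origin, new_origin, gain=0.2):
    """Blend a newly determined origin toward the previous estimate.
    The gain value here is an arbitrary illustrative choice."""
    return tuple(p + gain * (n - p) for p, n in zip(prev_origin, new_origin))
```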
- the mobile device 12 outputs the present reference coordinate system.
- the present reference coordinate system may be output to an AR application such as a drawing application, a board game application, a first-person shooter application, etc.
- the present reference coordinate system may be a refined reference coordinate system if a prior reference coordinate system was available at stage 70 and induced modification to the system determined at stages 64 , 66 , 68 ; it will be an unrefined coordinate system if no prior reference coordinate system was available at stage 70 , or if the prior reference coordinate system induced no modification to the system determined at stages 64 , 66 , 68 .
- at stage 74 , the present reference coordinate system is stored.
- the present reference coordinate system is stored in the memory 40 for use as the prior reference coordinate system in stage 70 .
- stage 74 is optional.
- the process 60 returns to stage 62 after stage 72 .
- the process 60 continues until the mobile device 12 is powered down or the AR application being run on the mobile device 12 is closed.
- the process 60 can be initiated or repeated based on or in response to one or more of various criteria. For example, the process 60 could be performed in response to: each image being captured; a threshold number of images having been captured; an application request; a user request; passage of a threshold amount of time; determination of a threshold amount of drift in origin and/or orientation having occurred; etc.
- a process 130 of determining a reference coordinate system by refining a determined reference coordinate system includes the stages shown.
- the process 130 is, however, an example only and not limiting.
- the process 130 can be altered, e.g., by having stages added, removed, rearranged, combined, and/or performed concurrently.
- the process 130 includes obtaining first information indicative of an orientation of a device at a first time.
- the processor 38 may obtain the information from the sensor processor in accordance with measurements taken by the sensors 20 , and/or may obtain the information from the memory 40 , etc.
- stage 132 may comprise stage 62 discussed above.
- the process 130 includes determining the reference coordinate system based on the first information.
- the processor 38 determines the reference coordinate system, e.g., as discussed above with respect to stages 64 , 66 , and, optionally, stage 68 .
- the process 130 includes refining the reference coordinate system based on second information indicative of the orientation of the device at a second time after the first time (i.e., the second information is obtained, e.g., sensed, after the first information is obtained, e.g., sensed).
- the processor 38 can refine the reference coordinate system as discussed above with respect to stage 70 using stored information regarding previously-determined reference coordinate system information associated with at least one previously-determined coordinate system. Movement of the mobile device 12 between the first time and the second time may improve the refinement.
- the preferred choice of a reference coordinate system may be affected by the application for which the reference coordinate system is to be used.
- the origin module 46 and/or the orientation module 48 may select a technique or techniques for determining the origin and/or orientation of the reference coordinate system based on the application to be used.
- an example of a meaningful coordinate system is one that is upright and aligned with a horizontal surface, e.g., of a table.
- the processor 38 can assume that the point cloud is reasonably planar for calculating the origin of the reference coordinate system, e.g., using the first or fourth origin-setting techniques discussed above.
- the orientation of the reference coordinate system can be calculated by the processor 38 and/or the orientation module 48 with respect to gravity and to the view direction of the camera 24 , e.g., using the third orientation-setting technique discussed above.
- a meaningful coordinate system is typically aligned with a wall and facing the outer side of the wall.
- the processor 38 can calculate the origin using the second or fourth origin-setting techniques discussed above, with the point cloud being substantially/reasonably planar (e.g., having a substantially planar portion, e.g., planar within an allowed tolerance, e.g., 10% deviation of height versus length or width of the plane).
- the processor 38 can calculate the orientation of the reference coordinate system using the third orientation-setting technique discussed above and then rotating the coordinate system by 90 degrees around the x-axis so that the z-axis points horizontally out of the wall rather than upwards.
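For illustration only, the 90-degree rotation about the x-axis for wall alignment amounts to re-labeling the axes: the former up axis becomes the new +y (running up along the wall) and the negated former +y becomes the new +z (pointing horizontally out of the wall). Names are hypothetical:

```python
def rotate_axes_90_about_x(x_axis, y_axis, z_axis):
    """Rotate a right-handed frame 90 degrees about its own x-axis."""
    new_y = z_axis                        # old up -> up along the wall
    new_z = tuple(-c for c in y_axis)     # out of the wall, toward the camera
    return x_axis, new_y, new_z
```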
- an example of a meaningful coordinate system is one that is centered on the robot's initial position and aligned to gravity and to north.
- the processor 38 and/or the origin module 46 may use the first origin-setting technique discussed above to calculate the origin of the reference coordinate system, and may use the second orientation-setting technique discussed above to orient the reference coordinate system correctly with respect to gravity and north.
- such a reference coordinate system may be used for SLAM, and the robot navigates the unknown environment using SLAM.
- an example of a meaningful reference coordinate system is one that is aligned to the initial camera position and orientation, and that is independent of gravity.
- the processor 38 may use the first origin-setting technique and the first orientation-setting technique discussed above.
- Proper setting of a reference coordinate system may help improve a user experience.
- proper setting of the reference coordinate system may help ensure that augmentations are in appropriate proportions (e.g., a basketball being smaller than a basketball basket), in appropriate locations (e.g., heavy objects not floating in mid-air, fish not flying, etc.), and at appropriate orientations (e.g., people standing upright, cars driving horizontally, etc.).
- “or” as used in a list of items prefaced by “at least one of” indicates a disjunctive list such that, for example, a list of “at least one of A, B, or C” means A or B or C or AB or AC or BC or ABC (i.e., A and B and C), or combinations with more than one feature (e.g., AA, AAB, ABBC, etc.).
- a statement that a function or operation is “based on” an item or condition means that the function or operation is based on the stated item or condition and may be based on one or more items and/or conditions in addition to the stated item or condition.
- machine-readable medium and “computer-readable medium,” as used herein, refer to any medium that participates in providing data that causes a machine to operate in a specific fashion.
- various computer-readable media might be involved in providing instructions/code to processor(s) for execution and/or might be used to store and/or carry such instructions/code (e.g., as signals).
- a computer-readable medium is a physical and/or tangible storage medium.
- Such a medium may take many forms, including but not limited to, non-volatile media and volatile media.
- Non-volatile media include, for example, optical and/or magnetic disks.
- Volatile media include, without limitation, dynamic memory.
- Common forms of physical and/or tangible computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read instructions and/or code.
- Various forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to one or more processors for execution.
- the instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer.
- a remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by a computer system.
- configurations may be described as a process which is depicted as a flow diagram or block diagram. Although each may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process may have additional stages or functions not included in the figure.
- examples of the methods may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware, or microcode, the program code or code segments to perform the tasks may be stored in a non-transitory computer-readable medium such as a storage medium. Processors may perform the described tasks.
- a method of determining a reference coordinate system comprising:
- the reference coordinate system comprises a coordinate system for use with an application of the device
- the determining comprises converting an orientation of a coordinate system of the device to the reference coordinate system for use with the application of the device.
- the reference coordinate system for use with an application of the device comprises a SLAM coordinate system.
- a first of the two non-linear vectors comprises a gravity vector derived from one or more accelerometer measurements.
- a second of the two non-linear vectors comprises a vector pointed in a direction of magnetic north, the second of the two vectors being derived from at least a magnetometer, or wherein the second of the two non-linear vectors comprises a projection of a viewing direction of a camera of the device onto a plane perpendicular to gravity.
- determining comprises setting an origin of the reference coordinate system and/or determining a scale value of the reference coordinate system.
- a mobile device comprising:
- the mobile device of claim 10 wherein the means for determining are configured to convert an orientation of a coordinate system of the mobile device to the reference coordinate system for use with an application of the mobile device.
- the mobile device of claim 10 wherein the means for obtaining are configured to obtain two non-linear vectors as the first information, at least one of the two non-linear vectors being a gravity vector or a vector of magnetic north.
- the mobile device of claim 13 wherein the means for determining are configured to set an origin of the reference coordinate system and/or determine a scale value of the reference coordinate system.
- a processor-readable storage medium of a mobile device comprising processor-readable instructions configured to cause a processor to:
- instructions configured to cause the processor to determine the reference coordinate system include instructions configured to cause the processor to convert an orientation of a coordinate system of the mobile device to the reference coordinate system for use with an application of the mobile device.
- instructions configured to cause the processor to obtain the first information include instructions configured to cause the processor to obtain two non-linear vectors as the first information, at least one of the two non-linear vectors being a gravity vector or a vector of magnetic north.
- instructions configured to cause the processor to determine the reference coordinate system include instructions configured to cause the processor to set an origin of the reference coordinate system and/or determine a scale value of the reference coordinate system.
- a mobile device comprising:
- a processor communicatively coupled to the memory and configured to:
- a method of determining a reference coordinate system comprising:
- the method of claim 22 further comprising refining the reference coordinate system based on subsequently obtained information indicative of an orientation of the device.
- orientation of the reference coordinate system comprises a substantially horizontal orientation with respect to Earth.
- orientation of the reference coordinate system comprises a substantially vertical orientation with respect to Earth.
- a mobile device comprising:
- orientation means communicatively coupled to the obtaining means, for determining an orientation of a reference coordinate system based on a selected application for the reference coordinate system, the orientation means including converting means for converting an orientation of a device coordinate system of the mobile device to the orientation of the reference coordinate system to produce the reference coordinate system.
- the converting means comprise refining means for refining the reference coordinate system based on subsequently obtained information indicative of an orientation of the device.
- the reference coordinate system comprises a SLAM coordinate system.
- orientation means comprise means for receiving an orientation from the AR application.
- the mobile device of claim 38 wherein the type of augmentation comprises text or designs placed on a real-world surface.
- the mobile device of claim 38 wherein the type of augmentation comprises a character moving on a real-world surface.
- the mobile device of claim 33 wherein the orientation of the reference coordinate system comprises a substantially horizontal orientation with respect to Earth.
- the mobile device of claim 33 wherein the orientation of the reference coordinate system comprises a substantially vertical orientation with respect to Earth.
- the mobile device of claim 33 further comprising means for setting an origin of the reference coordinate system based on the selected application.
- a mobile device comprising:
- a processor communicatively coupled to the memory and configured to:
- the mobile device of claim 44 wherein the processor is further configured to refine the reference coordinate system based on subsequently obtained information indicative of an orientation of the device.
- the mobile device of claim 44 wherein the reference coordinate system comprises a SLAM coordinate system.
- the mobile device of claim 47 wherein the processor is configured to determine the orientation of the reference coordinate system based on a type of augmentation associated with the AR application.
- the mobile device of claim 49 wherein the type of augmentation comprises text or designs placed on a real-world surface.
- the mobile device of claim 49 wherein the type of augmentation comprises a character moving on a real-world surface.
- the mobile device of claim 44 wherein the orientation of the reference coordinate system comprises a substantially horizontal orientation with respect to Earth.
- the mobile device of claim 44 wherein the orientation of the reference coordinate system comprises a substantially vertical orientation with respect to Earth.
- the mobile device of claim 44 wherein the processor is configured to set an origin of the reference coordinate system based on the selected application.
- a processor-readable storage medium of a mobile device comprising processor-readable instructions configured to cause a processor to:
- the storage medium of claim 55 further comprising instructions configured to cause the processor to refine the reference coordinate system based on subsequently obtained information indicative of an orientation of the device.
- the storage medium of claim 58 further comprising instructions configured to cause the processor to receive an orientation from the AR application.
- the storage medium of claim 58 wherein the instructions configured to cause the processor to determine the orientation of the reference coordinate system include instructions configured to cause the processor to determine the orientation of the reference coordinate system based on a type of augmentation associated with the AR application.
- the storage medium of claim 60 wherein the type of augmentation comprises text or designs placed on a real-world surface.
- the storage medium of claim 60 wherein the type of augmentation comprises a character moving on a real-world surface.
- orientation of the reference coordinate system comprises a substantially horizontal orientation with respect to Earth.
- orientation of the reference coordinate system comprises a substantially vertical orientation with respect to Earth.
- instructions configured to cause the processor to determine the orientation of the reference coordinate system include instructions configured to cause the processor to set an origin of the reference coordinate system based on the selected application.
Abstract
A method of determining a reference coordinate system includes: obtaining information indicative of a direction of gravity relative to a device; and converting an orientation of a device coordinate system using the direction of gravity relative to the device to produce the reference coordinate system. The method may also include setting an origin of the reference coordinate system and/or determining a scale value of the reference coordinate system. The method may also include refining the reference coordinate system.
Description
- This application claims the benefit of U.S. Provisional Application No. 61/722,023, filed Nov. 2, 2012, entitled “Simultaneous Localization and Mapping Coordinate Systems,” which is incorporated by reference herein for all purposes.
- Augmented Reality (AR) provides a view of a real-world environment that is augmented with computer-generated audio and/or visual content. The audio and/or visual content can be overlaid over or integrated into an image or video of the real-world environment captured using a camera of a mobile device, or displayed on a transparent or semi-transparent screen through which a user is viewing the real-world environment. For example, an augmented reality application may be implemented on a mobile phone or tablet computer that includes a camera that can be used to capture images or video of a view of the real-world environment and a display that can be used to display an augmented view of the real-world environment, and/or on a head-mounted display (HMD).
- The device can include one or more sensors that collect data that can be used to determine the position, orientation, speed, and/or direction of movement of the device. This information can be used to assist the device in generating augmentation content. The sensors can also be used to collect input information from a user, such as touchscreen selections or other input information that can be used to allow the user to navigate the augmented content displayed on the device.
- In a simultaneous localization and mapping (SLAM) system (e.g., AR, robotics, etc.), typically there is a known reference target in the environment, e.g., in a field of view of a camera. If no known reference target is in the environment, a coordinate system for the SLAM may be defined arbitrarily. For example, the pose (position and orientation) of the camera when tracking begins may be used to define the coordinate system for the SLAM.
- An example method of determining a reference coordinate system includes: obtaining information indicative of a direction of gravity relative to a device; and converting an orientation of a device coordinate system using the direction of gravity relative to the device to produce the reference coordinate system.
- Implementations of such a method may include one or more of the following features. The method further includes obtaining information indicative of a direction perpendicular to gravity and wherein the converting comprises converting the orientation of the device coordinate system using the direction perpendicular to gravity. The direction perpendicular to gravity is one of magnetic north or a projection of a viewing direction of a camera of the device onto a plane perpendicular to gravity. The method further includes setting an origin of the reference coordinate system. Setting the origin comprises: obtaining a point cloud; and determining a geometric center of a substantially planar portion of the point cloud. Setting the origin comprises: obtaining a point cloud; and determining an intersection of a view direction of a camera of the device and a plane corresponding to a substantially planar portion of the point cloud.
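The two point-cloud origin options above — the geometric center of a substantially planar portion, and the intersection of the camera view direction with a plane fit to the point cloud — reduce to simple geometry. A minimal illustrative sketch follows; the function names and the point-plus-normal plane representation are assumptions, and the plane-fitting step itself is omitted:

```python
# Illustrative sketch only; not the patent's implementation.

def planar_centroid(points):
    """Geometric center of a (substantially planar) portion of a point cloud."""
    n = len(points)
    return [sum(p[i] for p in points) / n for i in range(3)]

def ray_plane_intersection(ray_origin, ray_dir, plane_point, plane_normal):
    """Intersect the camera view ray with a plane fit to the point cloud."""
    denom = sum(d * n for d, n in zip(ray_dir, plane_normal))
    if abs(denom) < 1e-9:
        return None  # view direction parallel to the plane; no usable origin
    t = sum((p - o) * n for p, o, n in zip(plane_point, ray_origin, plane_normal)) / denom
    return [o + t * d for o, d in zip(ray_origin, ray_dir)]
```

For example, a camera at (0, 0, 2) looking straight down along (0, 0, −1) at the plane z = 0 yields an origin at (0, 0, 0).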
- Also or alternatively, implementations of the method may include one or more of the following features. The method further includes calculating a scale value and producing the reference coordinate system using the scale value. Calculating the scale value comprises: obtaining a point cloud; and comparing a dimension of the point cloud to a fixed size. Calculating the scale value comprises calculating the scale value such that an origin of the reference coordinate system will have a predetermined distance from the device. Calculating the scale value comprises using absolute measurements from one or more input sensors of the device. The one or more input sensors comprise an accelerometer or a plurality of cameras. The method further includes refining the reference coordinate system. Refining the reference coordinate system comprises using information from the reference coordinate system and at least one previously-determined coordinate system to generate a combination of the reference coordinate system and the at least one previously-determined coordinate system.
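Two of the scale options above can likewise be sketched: comparing a dimension of the point cloud to a fixed size, and choosing the scale so that the origin lies at a predetermined distance from the device. The helper names below are hypothetical, not from this disclosure:

```python
# Illustrative sketch of two scale-value options; names are assumptions.

def scale_from_fixed_size(points, fixed_size):
    """Scale value so the point cloud's largest axis-aligned extent equals fixed_size."""
    mins = [min(p[i] for p in points) for i in range(3)]
    maxs = [max(p[i] for p in points) for i in range(3)]
    extent = max(maxs[i] - mins[i] for i in range(3))
    return fixed_size / extent

def scale_from_predetermined_distance(origin, device_position, distance):
    """Scale value so the reference origin lies at the given distance from the device."""
    current = sum((o - d) ** 2 for o, d in zip(origin, device_position)) ** 0.5
    return distance / current
```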
- An example device for determining a reference coordinate system includes: means for obtaining information indicative of a direction of gravity relative to the device; and means for converting an orientation of a device coordinate system using the direction of gravity relative to the device to produce the reference coordinate system.
- Implementations of such a device may include one or more of the following features. The device further includes means for obtaining information indicative of a direction perpendicular to gravity and wherein the means for converting are for converting the orientation of the device coordinate system using the direction perpendicular to gravity. The direction perpendicular to gravity is one of magnetic north or a projection of a viewing direction of a camera of the device onto a plane perpendicular to gravity. The device further includes means for setting an origin of the reference coordinate system. The means for setting the origin comprise: means for obtaining a point cloud; and means for determining a geometric center of a substantially planar portion of the point cloud. The means for setting the origin comprise: means for obtaining a point cloud; and means for determining an intersection of a view direction of a camera of the device and a plane corresponding to a substantially planar portion of the point cloud.
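The conversion described above — orienting the reference coordinate system using gravity plus a direction perpendicular to gravity — can be viewed as constructing a right-handed orthonormal basis. A sketch under assumed conventions (row-major axes, z up, y along the horizontal reference); this is an illustration, not the claimed implementation:

```python
import math

# Illustrative only; conventions and names are assumptions.

def _normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def _cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def reference_basis(gravity, horizontal_ref):
    """Rows are the reference axes in device coordinates: z is up (inverse
    gravity), y is the reference direction (e.g., magnetic north, or the
    camera view direction) projected onto the horizontal plane, and x
    completes the right-handed set."""
    z = _normalize([-g for g in gravity])
    d = sum(r * zi for r, zi in zip(horizontal_ref, z))
    y = _normalize([r - d * zi for r, zi in zip(horizontal_ref, z)])
    x = _cross(y, z)
    return [x, y, z]
```

With gravity pointing down the device z-axis and a slightly tilted reference direction, the projection recovers the expected axis-aligned basis.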
- Also or alternatively, implementations of the device may include one or more of the following features. The device further includes means for calculating a scale value. The means for calculating the scale value comprise: means for obtaining a point cloud; and means for comparing a dimension of the point cloud to a fixed size. The means for calculating the scale value comprise means for calculating the scale value such that an origin of the reference coordinate system will have a predetermined distance from the device. The means for calculating the scale value comprise means for using absolute measurements from one or more input sensors of the device. The one or more input sensors comprise an accelerometer or a plurality of cameras.
- An example mobile device includes: a sensor configured to determine a direction of gravity and to provide an indication of the direction of gravity relative to the mobile device; and an orientation module communicatively coupled to the sensor and configured to convert an orientation of a device coordinate system, of the mobile device, using the indication of the direction of gravity relative to the device to produce a reference coordinate system.
- Implementations of such a mobile device may include one or more of the following features. The orientation module is further configured to convert the orientation of the device coordinate system using a direction perpendicular to gravity. The mobile device further includes an origin module communicatively coupled to the orientation module and configured to set an origin of the reference coordinate system by obtaining a point cloud and at least one of: determining a geometric center of a substantially planar portion of the point cloud; or determining an intersection of a view direction of a camera of the mobile device and a plane corresponding to a substantially planar portion of the point cloud. The mobile device further includes a scale module communicatively coupled to the orientation module and configured to set a scale value for the reference coordinate system relative to the device coordinate system by: (1) obtaining a point cloud and comparing a dimension of the point cloud to a fixed size; or (2) calculating the scale value such that an origin of the reference coordinate system will have a predetermined distance from the mobile device; or (3) using absolute measurements from one or more input sensors of the mobile device. The orientation module is configured to produce a refined reference coordinate system using information from the reference coordinate system and at least one previously determined coordinate system.
- An example processor-readable storage medium of a mobile device includes processor-readable instructions configured to cause a processor to: obtain an indication of a direction of gravity relative to the mobile device; and convert an orientation of a device coordinate system, of the mobile device, using the indication of the direction of gravity relative to the device to produce a reference coordinate system.
- Implementations of such a storage medium may include one or more of the following features. The instructions configured to cause the processor to convert the orientation of the device coordinate system include instructions configured to use a direction perpendicular to gravity. The storage medium further includes instructions configured to cause the processor to set an origin of the reference coordinate system by obtaining a point cloud and at least one of: determining a geometric center of a substantially planar portion of the point cloud; or determining an intersection of a view direction of a camera of the mobile device and a plane corresponding to a substantially planar portion of the point cloud. The storage medium further includes instructions configured to cause the processor to set a scale value for the reference coordinate system relative to the device coordinate system by: (1) obtaining a point cloud and comparing a dimension of the point cloud to a fixed size; or (2) calculating the scale value such that an origin of the reference coordinate system will have a predetermined distance from the mobile device; or (3) using absolute measurements from one or more input sensors of the mobile device. The storage medium further includes instructions configured to cause the processor to produce a refined reference coordinate system using information from the reference coordinate system and at least one previously determined coordinate system.
- Items and/or techniques described herein may provide one or more of the following capabilities, as well as other capabilities not mentioned. Meaningful ways are provided to define a coordinate system for augmented reality applications. For example, in a board game augmented reality application, a coordinate axis may be aligned parallel to the ground or another surface such as a table, and in a graffiti augmented reality application a coordinate axis may be aligned perpendicular to the ground, for example parallel to a wall or other structure. Coordinate systems may be provided where there is a known object in view or no known object in view. For cases with one or more objects in view, a more meaningful coordinate system may be established compared to prior techniques. For example, with multiple targets in view, the targets may be tracked and a coordinate system established. Alternatively, with multiple targets in view, a coordinate system may be aligned with an object, but not one of the tracked targets, thus providing a global coordinate system for all the targets. Other capabilities may be provided and not every implementation according to the disclosure must provide any particular capability, let alone all of the capabilities, discussed. Further, it may be possible for an effect noted above to be achieved by means other than that noted, and a noted item/technique may not necessarily yield the noted effect.
- FIG. 1 is a simplified perspective view of an augmented reality system.
- FIG. 2 is a block diagram of a computer system that incorporates components of a mobile device shown in FIG. 1.
- FIG. 3 is a functional block diagram of the computer system shown in FIG. 2.
- FIGS. 4-5 are simplified plan views of the mobile device shown in FIG. 1.
- FIG. 6 is a block flow diagram of a process of calculating a reference coordinate system.
- FIG. 7 is a perspective view of the system shown in FIG. 1 showing techniques for setting an origin of the reference coordinate system.
- FIGS. 8-10 are perspective views of the system shown in FIG. 1 showing techniques for setting an orientation of the reference coordinate system.
- FIGS. 11-13 are block flow diagrams of processes for determining reference coordinate systems.
- Techniques are provided to define an origin and/or an orientation, and optionally a scale, of a coordinate system. The coordinate system may be for a simultaneous localization and mapping system. The techniques may be used in a variety of situations, but at least some of the techniques may be particularly useful in situations where the origin and orientation are not given a priori by a known reference.
- Techniques are provided for defining a reference coordinate system, e.g., for a simultaneous localization and mapping (SLAM) system, e.g., for augmented reality (AR) applications. Techniques are provided for determining the origin and/or orientation, and optionally the scale, of a coordinate system. For example, a device such as a mobile device may use as inputs a camera pose (position and orientation), a point cloud, and sensor data (e.g., measured from inertial sensors such as an accelerometer and a magnetometer). Using these inputs, the mobile device can determine a three-dimensional point in space for the origin of the desired coordinate system, an orientation of the desired coordinate system (e.g., three orthogonal axes), and a scale. The mobile device can use this information to determine a translation from a coordinate system of the camera to the desired coordinate system.
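Before the sensor inputs mentioned above are combined, the two measured vectors (e.g., gravity and magnetic north) can be normalized and checked for non-parallelism, since two nearly parallel vectors cannot define an orientation. A hypothetical sketch of that validation step (the 10-degree threshold and function name are assumptions):

```python
import math

def prepare_vector_pair(gravity, north, min_angle_deg=10.0):
    """Normalize the gravity and magnetic-north vectors and reject
    near-parallel pairs; threshold is an illustrative assumption."""
    def normalize(v):
        n = math.sqrt(sum(x * x for x in v))
        return [x / n for x in v]
    g, m = normalize(gravity), normalize(north)
    cos_angle = abs(sum(a * b for a, b in zip(g, m)))
    if cos_angle > math.cos(math.radians(min_angle_deg)):
        raise ValueError("vectors are nearly parallel; cannot define an orientation")
    return g, m
```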
- Selecting a meaningful origin, orientation (and scale) for the coordinate system may be independent tasks. Different techniques are provided for each of these tasks. Some techniques make use of a three-dimensional (3D) point cloud. Some techniques use sensors. The defined coordinate system may be refined over time. Different techniques for determining the origin and/or different techniques for determining the orientation for the coordinate system may be used, e.g., depending on a particular application used.
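Once the three independent tasks are done, their results compose into a single map from the device (camera) coordinate system to the reference coordinate system. A hedged sketch of one common convention, p_ref = s · R · (p_device − t); the convention itself is an assumption, not stated in this disclosure:

```python
# Illustrative similarity transform; convention and names are assumptions.

def device_to_reference(p, rotation, origin, scale):
    """Map a point from device coordinates into the reference coordinate
    system. rotation is a 3x3 row-major matrix; origin is the reference
    origin expressed in device coordinates; scale is the scale value."""
    d = [p[i] - origin[i] for i in range(3)]
    rotated = [sum(rotation[r][c] * d[c] for c in range(3)) for r in range(3)]
    return [scale * v for v in rotated]
```

With an identity rotation, an origin at (1, 1, 1), and a scale of 2, the device point (2, 1, 1) maps to (2, 0, 0) in the reference system.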
- Referring to FIG. 1, a system for determining a coordinate system, here an augmented reality (AR) system 10 which may comprise a SLAM system, includes a device, here an AR-enabled mobile device 12, and an object 14. The mobile device 12 may be any computing device with an input sensory unit, such as a camera, and a display. Here, the mobile device 12 is a smart phone although the functionality described herein is not limited to smart phones. For example, the mobile device 12 may be a digital camera, a camcorder, a tablet computer, a personal digital assistant, a video game console, an HMD or other wearable display, a projector device, or other device. Further, instead of the mobile device 12, a device such as a personal computer (e.g., desktop computer), or other non-hand-held device or device not typically labeled a mobile device, could be used. The mobile device 12 includes a camera for capturing images of objects, here the object 14, in a field of view 18 of the camera. As used herein, a point cloud is a collection of points in three-dimensional space corresponding to at least a portion of an object, here the object 14, that is visible in the field of view 18 by the camera of the mobile device 12. The point cloud may be determined by one or more techniques performed by a processor of the mobile device 12, for example based on image data from a camera 24 (described below) of the mobile device 12. The mobile device 12 is configured to augment reality by capturing images of its environment, here capturing images of the object 14, and displaying the additional imagery on a transparent or semi-transparent display through which the object 14 is visible or displaying an image of the object 14 supplemented with additional imagery, here a drawing 16 partially superimposed on and partially disposed above the object 14, e.g., here for use in a graffiti AR application. - Referring to
FIG. 2, the mobile device 12 includes sensors 20, a sensor processor 22, one or more cameras 24, a camera processor 26, a display 28, a graphics processor 30, a touch sensor 32, a touch sensor processor 34, a communication module 36, a processor 38, and memory 40. The processors 22, 26, 30, 34, the communication module 36, the processor 38, and the memory 40 are communicatively coupled through a bus 42, as illustrated, or may be directly coupled or coupled in another way. The processors 22, 26, 30, 34 may be implemented in software stored in the memory 40 and executed by the processor 38, or separate dedicated processors, or combinations of these (e.g., with one or more of the processors implemented in software and one or more implemented as dedicated hardware). The sensors 20 can provide an indication and/or indications of various sensed parameters, e.g., an indication of a direction of gravity, e.g., relative to an orientation of the mobile device 12 (relative to a device coordinate system of the mobile device 12). - The
sensors 20 and the sensor processor 22 are configured to determine an orientation of the mobile device 12. The sensors 20 are orientation sensors configured to sense information for use in determining an orientation of the mobile device 12. For example, the sensors 20 may include one or more inertial sensors such as gyroscopes, one or more accelerometers, an inertial measurement unit (IMU), and/or another type and/or other types of sensors. The sensor processor 22 is configured to process data measured/collected by the sensors 20 to determine the orientation of the mobile device 12. - The camera(s) 24 and the
camera processor 26 are configured to capture and produce visual information. The camera(s) 24 is(are) configured to capture images and/or video of a real-world scene that can be augmented (with augmentation, e.g., text or designs placed on a real-world surface) using augmentation logic. The camera processor 26 is configured to process the data collected by the camera(s) 24 to convert the data into a format that can be used by the augmentation logic. The camera processor 26 is configured to perform various types of image or video processing on the data collected from the camera(s) 24 to prepare the content for display on the display 28. - The
display 28 and the graphics processor 30 are configured to provide visual information based on the data captured by the camera(s) 24 and processed by the camera processor 26, and to provide visual information based on information produced by the augmentation logic. The display 28 can be a touch screen interface that includes the touch sensor 32. The graphics processor 30 is configured to generate graphical data for display on the display 28. The graphics processor 30 is configured to use information provided by the augmentation logic to display augmented image or video content. - The
touch sensor processor 34 can process data output by the touch sensor 32 to identify when a user touches the touch screen. The touch sensor processor 34 can be configured to identify various touch gestures, including multi-finger touches of the touch screen. The augmentation logic can use the gesture information determined by the touch sensor processor 34 to determine, at least in part, how the augmentation should react in response to user input. - The
communication module 36 is configured to enable the mobile device 12 to communicate using one or more wireless protocols. The communication module 36 is configured to allow the mobile device 12 to send and receive data from nearby wireless devices, including wireless access points and other AR-enabled devices. The communication module 36 may include a modem, a wireless network card, an infrared communication device, a wireless communication device and/or chipset (such as a short-range wireless device such as a Bluetooth™ device, or an 802.11 device, a WiFi device, a WiMax device, cellular communication facilities, etc.), or the like. The communication module 36 may permit data to be exchanged with a network, other computer systems, and/or other devices. - The
processor 38 is configured to control one or more of the sensor processor 22, the camera processor 26, the graphics processor 30, or the touch sensor processor 34. One or more of the sensor processor 22, the camera processor 26, the graphics processor 30, or the touch sensor processor 34 may also be implemented by the processor 38. - The
memory 40 includes volatile and/or persistent, non-transitory memory for storing data used by various components of the AR-enabled mobile device 12. The memory 40 may include local and/or network accessible storage, a disk drive, a drive array, an optical storage device, a solid-state storage device such as a random access memory (“RAM”) and/or a read-only memory (“ROM”), which can be programmable, flash-updateable and/or the like. Such storage devices may be configured to implement any appropriate data storage, including without limitation, various file systems, database structures, and/or the like. - The
memory 40 stores processor-readable, processor-executable software program code 44 for one or more of the processors included in the mobile device 12. The software code 44 contains instructions for controlling the processor(s) to perform functions described below (although the description may read that the software or a processor performs the function(s)). In some instances, augmentation logic can be implemented as processor-executable instructions stored in the memory 40. The software 44 includes an operating system, device drivers, executable libraries, and/or other software code instructions, such as one or more application programs, that may implement methods described herein. For example, one or more procedures described herein might be implemented as software code instructions executed by a processor. Such instructions can cause a general purpose computer (or other device) to perform one or more operations as described herein. The software 44 may be stored on a non-removable portion of the memory 40 incorporated within the mobile device 12 or may be stored on a removable medium, such as a compact disc, and/or provided in an installation package. The instructions may take the form of executable code, which is directly executable by a processor, or alternatively the instructions may take the form of source and/or installable code, which, upon compilation and/or installation (e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc.) then takes the form of executable code. One or more of the elements described above with respect to the mobile device 12 may be omitted. For example, the communication module 36 and/or the touch sensor 32 and touch sensor processor 34 may be omitted. Further, additional elements may be included in some embodiments. One or more of the processors may be combined with the processor 38, and/or some or all functionality of one or more of the processors may be implemented by instructions stored in the memory 40. - Referring to
FIG. 3, the mobile device 12 includes an origin module (means for setting a coordinate system origin) 46, an orientation module (orientation means or means for setting a coordinate system orientation) 48, a scale module (means for setting a coordinate system scale) 50, and a refining module (means for refining a coordinate system) 52. The origin module 46, the orientation module 48, the scale module 50, and the refining module 52 are communicatively coupled to each other. The modules 46, 48, 50, 52 may be implemented by one or more processors executing the software 44 stored in the memory 40, although the modules may also be implemented in hardware, firmware, or combinations of hardware, firmware, and software. Reference to a processor determining an origin, an orientation, or a scale value is equivalent to the origin module 46, or the orientation module 48, or the scale module 50, respectively, performing the function. The origin module 46, or means for setting a coordinate system origin, is configured to set an origin of a coordinate system, e.g., an AR coordinate system based on information captured by the camera(s) 24. Examples of operation of the origin module 46 are discussed below with respect to FIG. 6, although the origin module 46 may operate in one or more other ways as well. The orientation module 48, or means for setting a coordinate system orientation, is configured to set an orientation of a coordinate system, e.g., an AR coordinate system using information captured by the camera(s) 24 and/or information obtained from the sensor processor 22. The orientation module 48, or means for setting a coordinate system orientation, include converting means for converting an orientation of a device coordinate system of the mobile device 12 to the orientation of a reference coordinate system. Examples of operation of the orientation module 48 are discussed below with respect to FIG. 6, although the orientation module 48 may operate in one or more other ways as well. The scale module 50 is configured to set (including determine) a scale value for a coordinate system, e.g., an AR coordinate system, using techniques discussed below or other appropriate techniques.
The refining module 52 is configured to use previously-determined information, e.g., origin and/or orientation and/or scale value, to refine the reference coordinate system, e.g., change the origin and/or orientation and/or scale value of the reference coordinate system. - The
modules 46, 48, 50, 52 may be used to provide a meaningful reference coordinate system for the device 12. For example, with the reference coordinate system being an AR coordinate system, the modules may establish the AR coordinate system using, e.g., information from the camera 24, etc. Use of these modules may cause AR images to appear to be displayed by the device 12 at an appropriate size, in an expected orientation, and/or in a proper augmentation with respect to a surface, for example to appear to be walking on or anchored to a surface instead of floating above it. Further, these modules may be used to establish a reference system by the device 12 for uses other than AR, for example in other applications where tracking and/or mapping is utilized, such as in a different type of SLAM system. - Referring to
FIGS. 4-5, the mobile device 12 may include two cameras 24 1, 24 2. The camera 24 1 is disposed on a front side 54 of the mobile device 12, along with the display 28, and the camera 24 2 is disposed on a back side or rear side 56 of the mobile device 12. The camera 24 1 is referred to as a front-facing camera and the camera 24 2 is referred to herein as a rear-facing camera. The mobile device 12 is typically held such that the front-facing camera 24 1 faces a user of the mobile device 12 and the rear-facing camera 24 2 faces away from the user of the mobile device 12. Alternatively, the converse may be true depending on how the mobile device 12 is held by the user. Referring also to FIG. 1, either of the cameras 24 1, 24 2 may have the field of view (FOV) 18. The camera 24 2 is described below as having the FOV 18, but the camera 24 1 or another camera may have the FOV 18 in other situations or implementations. - Referring to
FIG. 6, with further reference to FIGS. 1-5, a process 60 of establishing a reference coordinate system includes the stages shown. The process 60 is, however, an example only and not limiting. The process 60 can be altered, e.g., by having stages added, removed, rearranged, combined, and/or performed concurrently. For example, stages 70 and 74 discussed below could be eliminated, or two or more of the stages could be combined. - At
stage 62, the mobile device 12 obtains an image frame and in some embodiments also obtains sensor data. At a triggering event such as at power up, activation of an AR application, user selection, or other triggering event, the mobile device 12 captures an image from one of the cameras 24, e.g., the rear-facing camera 24 2, as processed by the camera processor 26. The mobile device 12 may further obtain information from the sensors 20 as processed by the sensor processor 22. In this example, using an accelerometer, the mobile device 12 obtains information regarding the direction of gravity relative to the coordinate system of the mobile device 12. Further, using an accelerometer and a magnetometer, the mobile device 12 obtains the direction of magnetic north relative to the coordinate system of the mobile device 12. In some embodiments, two non-linear vectors are used. The two non-linear vectors may comprise a gravity vector (pointed downward) and a vector of magnetic north (pointed toward magnetic north), or a combination of one of these vectors and another vector, or a combination of two other non-linear vectors. The gravity vector and the vector of magnetic north (i.e., a magnetic north vector) may be derived from the accelerometer and magnetometer measurements discussed above. - At
stage 64, the mobile device 12 determines the reference coordinate system origin. The origin module 46 may set the origin in one of a variety of ways depending upon the circumstances. In a first technique, the origin module 46 sets the origin at a present 3D position 80 of the camera 24 2 as shown in FIG. 7. In a second technique, the origin module 46 sets the origin 82 at a geometric center associated with a point cloud, e.g., a geometric center of the object 14 as shown in FIG. 7. This technique may be used, for example, when the point cloud is known or determined to have a planar surface (e.g., a table, a poster on a wall, etc.). The geometric center associated with the point cloud may be determined as the geometric center of the planar surface. In a third technique, the origin module 46 sets the origin 84 at a point of the point cloud whose two-dimensional (2D) projection lies close to a center of the camera image as shown in FIG. 7. This technique affixes or assigns the origin to a physical object, because any point of the point cloud corresponds to a physical structure in the environment (around the mobile device 12). This technique may be used in various situations, but may be particularly useful where variance in the depth in the physical environment is small, with the points of the point cloud being at similar distances from the camera 24. In a fourth technique, the origin module 46 sets the origin 86 of the reference coordinate system at an intersection between the camera view direction 88 (i.e., a ray through the center of the camera frame, i.e., in the center of the field of view of the camera 24) and a plane fit to the point cloud. The plane can be fit to a large or small portion of the point cloud depending upon whether the point cloud is planar (represents a planar surface) or does not represent a substantial planar surface. For example, the origin module 46 can fit the plane to some points of the point cloud whose 2D projections lie close to the camera center.
This technique helps ensure that the origin will be at or near the center of the camera's view and also associated with a physical geometry in the environment around the mobile device 12. As shown in FIG. 7, the origins 80, 82, 84, 86 set by these techniques may differ from one another; these origins are examples only, and other origins may be used. - At
stage 66, the mobile device 12 determines the reference coordinate system orientation, for example using two non-parallel vectors. The orientation of the reference coordinate system defines the directions for elevation (up/down, +/−z-direction), azimuth (left/right, +/−x-direction), and forward/backward (+/−y-direction), assuming an x-y-z orthogonal coordinate system. Because the reference coordinate system is orthogonal, it is enough for the orientation module 48 to define the z-axis and the y-axis, with the final axis resulting from the orthogonality between the axes (and sidedness such as right-handedness). In some techniques, a different combination of two axes is used. Several techniques are available for determining the orientation. In a first technique, the orientation module 48 sets the orientation of the reference coordinate system as the orientation of the present camera image frame, as shown in FIG. 8. This technique, as shown in FIG. 8, is applied to the origin 86, but the technique could be applied to any origin, selected using any technique. This technique is independent of sensor measurements and can be used when the sensor information is not available (e.g., the sensors 20 are inoperative or it is desirable to conserve power). The resulting reference coordinate system will be oriented without regard to gravity and may be askew relative to the physical environment (e.g., with azimuth being non-parallel to the floor). In a second technique, as shown in FIG. 9, the orientation module 48 sets the orientation with respect to gravity (shown as vector g) and to a direction perpendicular to gravity or within a tolerance, e.g., 10°, of perpendicular to gravity (i.e., substantially horizontal orientation with respect to Earth or a surface of Earth), such as magnetic north (or, alternatively, true north).
The orientation module 48 may use information from the sensor processor 22 to set the positive z-axis (ZR) of the reference coordinate system as the inverse direction of gravity or within a tolerance, e.g., 10°, of this direction (i.e., substantially vertical orientation with respect to Earth or a surface of Earth) based on information from the accelerometer, and set the positive y-axis (YR) of the reference coordinate system as the direction of magnetic north (based on information from the magnetometer and/or accelerometer). This technique may be used in various situations, and may be particularly useful in situations where alignment with geographic directions (e.g., for a navigation system) is useful or important. In a third technique, as shown in FIG. 10, the orientation module 48 sets the orientation of the reference coordinate system with respect to gravity and a present view direction of the camera 24. For example, the orientation module 48 sets the positive z-axis as the inverse direction of gravity (using the accelerometer) and sets the positive y-axis as the current view direction of the rear-facing camera 24-2, projected onto a plane perpendicular to the z-axis (here the top surface of the table 14). This technique may be used in various situations, and may be particularly useful in situations where alignment with geographic directions is of little or no importance or use and alignment with the camera viewing direction is useful or important. - Referring also to
FIG. 11, a process 110 of determining a reference coordinate system using a direction of gravity includes the stages shown. The process 110 is, however, an example only and not limiting. The process 110 can be altered, e.g., by having stages added, removed, rearranged, combined, and/or performed concurrently. While x-y-z coordinate systems are discussed below, other coordinate systems could be used. The process 110 includes techniques shown in both FIG. 9 and FIG. 10 and discussed above for orienting a reference coordinate system. - At
stage 112, the process 110 includes obtaining information indicative of a direction of gravity relative to a device. The orientation module 48 obtains information regarding a direction of gravity from the sensor processor 22, which the sensor processor 22 computed from sensor data from one or more of the sensors 20. The information from the sensor processor 22 regarding the direction of gravity may be relative to the device 12, e.g., relative to a device coordinate system (XD-YD-ZD), or may be converted by the orientation module 48 to be relative to the device coordinate system. - At
stage 114, the process 110 includes converting an orientation of the device coordinate system (XD-YD-ZD) using the direction of gravity relative to the device to produce the reference coordinate system (XR-YR-ZR). Known techniques can be used to convert the device coordinate system (by translating the origin and/or rotating the orientation) to yield the reference coordinate system. Translate is used here to indicate linear motion, although the term translating (and its conjugations) was used in U.S. Provisional Application No. 61/722,023 to mean converting as used herein. - Referring further to
FIG. 12, a process 120 of determining a reference coordinate system based on an application that will use the reference coordinate system includes the stages shown. The process 120 is, however, an example only and not limiting. The process 120 can be altered, e.g., by having stages added, removed, rearranged, combined, and/or performed concurrently. The process 120 includes techniques for orienting the reference coordinate system. - At
stage 122, the process 120 includes obtaining information indicative of an orientation of a device. The orientation module 48 obtains information from the sensor processor 22, which the sensor processor 22 computed from sensor data from one or more of the sensors 20. The information indicates (either directly or after processing by the orientation module 48) an orientation of the device coordinate system, e.g., here the XD-YD-ZD coordinate system shown in FIGS. 9-10. - At
stage 124, the process 120 includes determining an orientation of the reference coordinate system based on a selected application for the reference coordinate system. The reference coordinate system may be initially defined and/or set to align with a surface determined or hypothesized by the mobile device 12, e.g., via an AR application executed on the mobile device 12, to be useful or important. For instance, two axes of an x-y-z coordinate system may be defined to lie in a plane of a surface of a table (e.g., a tabletop) for a gaming application, a wall for a graffiti application, or another surface or an arbitrary orientation, e.g., as discussed below with respect to provided examples. For example, the processor 38 can analyze information from one or more camera images to determine a plane of a point cloud and assign two axes to be orthogonal to each other and to lie in the plane, with the third axis being orthogonal to both of the other two axes. Determining the orientation of the reference coordinate system may comprise receiving the orientation of the reference coordinate system, e.g., from the AR application and/or from user input, and a means for determining the orientation may thus include means for receiving the orientation of the reference coordinate system, e.g., from the AR application and/or from user input. Also or alternatively, the orientation of the reference coordinate system may be determined based on a type of augmentation associated with the AR application. For example, the type of augmentation may comprise text and/or designs placed on a real-world surface, and/or a character moving on a real-world surface. Such augmentations could indicate a plane that should be occupied, e.g., by an x-y plane of the reference coordinate system. - The
orientation module 48 may determine which technique or techniques to use to determine the orientation of the reference coordinate system based on the application. For example, the orientation module 48 may select one or more of the techniques discussed above with respect to stage 66 based on the application. If more than one technique is selected, then results may be combined, e.g., averaged (including weighted averaging or non-weighted averaging), to determine the reference coordinate system orientation if appropriate, or one of the techniques selected otherwise. The orientation module 48 may determine (or select) the orientation technique(s) based on one or more properties or benefits of the techniques relative to the application. For example, for a sports application (e.g., a basketball game), the orientation may be desirable to be relative to gravity, while for a first-person shooter game in a spaceship, the orientation may be desirable to be relative to the camera image frame at the start of the application. The benefit(s) and/or property(ies) of the application can be matched to the orientation technique, e.g., by analysis by the orientation module 48, by user input, by predetermined settings (e.g., by an application developer), etc. That is, in situations where multiple orientation techniques would each be desirable, the orientation module 48 may combine the orientations if practical, or may select one of the techniques based on a priority for the particular application (e.g., as predefined), or may allow a user to select the orientation technique, etc. - At
stage 126, the process 120 includes converting an orientation of a device coordinate system to the determined orientation to produce the reference coordinate system. The orientation module 48 converts the orientation of the device coordinate system to the orientation determined at stage 124 to produce the reference coordinate system, e.g., using known techniques to rotate the device coordinate system to the determined orientation. - Further, the
process 120 may include stages for obtaining information indicative of an initial origin, determining a desired origin of the reference coordinate system, and converting the initial origin to the desired origin as the origin of the reference coordinate system. These stages may be done as part of the process 120, or as a separate process, independent of the process 120. The origin module 46 may determine which technique or techniques to use to determine the origin of the reference coordinate system based on the application. For example, the origin module 46 may select one or more of the techniques discussed above with respect to stage 64 based on the application. The origin module 46 may determine (or select) the origin technique(s) based on one or more properties or benefits of the techniques relative to the application. For example, for a sports application (e.g., a basketball game), the origin may be desirable to be associated with sports equipment, e.g., a basketball basket; for a painting application, the origin may be desirable to be a center of a planar surface such as a wall or a canvas on an easel; while for a guidance application, the origin may be desirable to be the origin of the camera 24. The benefit(s) and/or property(ies) of the application can be matched to the origin technique, e.g., by analysis by the origin module 46, by user input, by predetermined settings (e.g., by an application developer), etc. If more than one technique is selected, then results may be combined, e.g., averaged (including weighted averaging or non-weighted averaging), to determine the reference coordinate system origin if appropriate, or one of the techniques selected otherwise. That is, in situations where multiple origin techniques would each be desirable, the origin module 46 may combine the origins if practical, or may select one of the techniques based on a priority for the particular application (e.g., as predefined), or may allow a user to select the origin technique, etc. 
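The combining of results from multiple techniques described above might be sketched as follows. This is a hypothetical illustration, not the claimed method: averaging the axis columns of candidate frames and re-orthonormalizing with a Gram-Schmidt step is one simple stand-in for more principled rotation averaging (e.g., quaternion-based methods).

```python
import numpy as np

def combine_orientations(frames, weights):
    """Weighted combination of candidate orientation frames.

    frames: 3x3 matrices (columns = x, y, z axes) produced by different
    orientation techniques; weights: relative confidence per technique.
    Averages the z and y columns, then re-orthonormalizes.
    """
    w = np.asarray(weights, dtype=float)
    w /= w.sum()
    z = sum(wi * f[:, 2] for wi, f in zip(w, frames))
    z /= np.linalg.norm(z)
    y = sum(wi * f[:, 1] for wi, f in zip(w, frames))
    y -= np.dot(y, z) * z             # Gram-Schmidt against z
    y /= np.linalg.norm(y)
    x = np.cross(y, z)                # restore a right-handed frame
    return np.column_stack((x, y, z))
```

The same weighted-averaging idea applies to combining candidate origins, where a simple weighted mean of the candidate points suffices.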
- Returning to
FIG. 6, at stage 68, the mobile device 12 determines the scale of the reference coordinate system, although doing so is optional as a scale value may be predetermined. The scale value may be determined using a variety of techniques. In a first technique, the scale module 50 sets a scale value by comparing and scaling the point cloud to a fixed size, e.g., one unit. To do so, the scale module 50 may scale any of a variety of dimensions of the point cloud such as a height, a width, or a longest dimension seen from the camera 24. In a second technique, the scale module 50 sets the scale value such that the AR coordinate origin has a specific predetermined distance, e.g., one unit, to the present camera position, e.g., the origin of the camera coordinate system. In a third technique, the scale module 50 sets the scale value based on absolute measurements, e.g., as provided by an accelerometer or a stereo camera system (e.g., two rear-view cameras 24 displaced on the back 56 of the mobile device 12). The absolute measurements can be directly used to define the scale of the AR coordinate system. For example, a measurement such as 1 cm from the absolute measurements could be used as 1 unit in the AR coordinate system. - At
stage 70, the mobile device 12 refines the present reference coordinate system determined in stages 64, 66, and 68. The mobile device 12 refines the present reference coordinate system (origin, orientation, and possibly scale) using information from a previously-determined reference coordinate system, here the most-recently determined reference coordinate system, if any. For example, the processor 38 may iterate the reference coordinate system, e.g., origin and/or orientation and/or scale, over time using a Kalman filter by using one or more of the values and/or sensor data from previous iterations. This stage is optional, as is the stage 74 discussed below. - At
stage 72, the mobile device 12 outputs the present reference coordinate system. The present reference coordinate system may be output to an AR application such as a drawing application, a board game application, a first-person shooter application, etc. The present reference coordinate system may be a refined reference coordinate system if a prior reference coordinate system was available at stage 70 and induced modification to the system determined in the preceding stages, or may be an unrefined reference coordinate system if no prior reference coordinate system was available at stage 70 or if the prior reference coordinate system induces no modification to the system determined in the preceding stages. - At
stage 74, the present reference coordinate system is stored. The present reference coordinate system is stored in the memory 40 for use as the prior reference coordinate system in stage 70. As with stage 70, stage 74 is optional. - The
process 60 returns to stage 62 after stage 72. The process 60 continues until the mobile device 12 is powered down or the AR application being run on the mobile device 12 is closed. The process 60 can be initiated or repeated based on or in response to one or more of various criteria. For example, the process 60 could be performed in response to: each image being captured; a threshold number of images having been captured; an application request; a user request; passage of a threshold amount of time; determination of a threshold amount of drift in origin and/or orientation having occurred; etc. - Referring to
FIG. 13, a process 130 of determining a reference coordinate system by refining a determined reference coordinate system includes the stages shown. The process 130 is, however, an example only and not limiting. The process 130 can be altered, e.g., by having stages added, removed, rearranged, combined, and/or performed concurrently. - At
stage 132, the process 130 includes obtaining first information indicative of an orientation of a device at a first time. The processor 38 may obtain the information from the sensor processor in accordance with measurements taken by the sensors 20, and/or may obtain the information from the memory 40, etc. For example, stage 132 may comprise stage 62 discussed above. - At
stage 134, the process 130 includes determining the reference coordinate system based on the first information. The processor 38 determines the reference coordinate system, e.g., as discussed above with respect to stages 64 and 66, and possibly stage 68. - At
stage 136, the process 130 includes refining the reference coordinate system based on second information indicative of the orientation of the device at a second time after the first time (i.e., the second information is obtained, e.g., sensed, subsequently to the first information being obtained, e.g., sensed). For example, the processor 38 can refine the reference coordinate system as discussed above with respect to stage 70, using stored reference coordinate system information associated with at least one previously-determined reference coordinate system. Movement of the mobile device 12 between the first time and the second time may improve the refinement. - The preferred choice of a reference coordinate system may be affected by the application for which the reference coordinate system is to be used. The
origin module 46 and/or the orientation module 48 may select a technique or techniques for determining the origin and/or orientation of the reference coordinate system based on the application to be used. The following are examples of how the processor 38 can choose between various coordinate selection techniques for some applications, but other techniques for setting the coordinate system origins and/or coordinate system orientations may be used for the applications discussed or other applications. - In a game-board AR game, an example of a meaningful coordinate system is one that is upright and aligned with a horizontal surface, e.g., of a table. In this case, the
processor 38 can assume that the point cloud is reasonably planar for calculating the origin of the reference coordinate system, e.g., using the first or fourth origin-setting techniques discussed above. The orientation of the reference coordinate system can be calculated by the processor 38 and/or the orientation module 48 with respect to gravity and to the view direction of the camera 24, e.g., using the third orientation-setting technique discussed above. Similarly, in a graffiti AR application, a meaningful coordinate system is typically aligned with a wall and facing the outer side of the wall. In this case, for example, the processor 38 can calculate the origin using the second or fourth origin-setting techniques discussed above, with the point cloud being substantially/reasonably planar (e.g., having a substantially planar portion, e.g., planar within an allowed tolerance, e.g., 10% deviation of height versus length or width of the plane). In this case, for example, the processor 38 can calculate the orientation of the reference coordinate system using the third orientation-setting technique discussed above and then rotating the coordinate system by 90 degrees around the x-axis so that the z-axis points horizontally out of the wall rather than upwards. - In an application with a robot navigating an unknown environment, an example of a meaningful coordinate system is one that is centered on the robot's initial position and aligned to gravity and to north. In this case, for example, the
processor 38 and/or the origin module 46 may use the first origin-setting technique discussed above to calculate the origin of the reference coordinate system, and may use the second orientation-setting technique discussed above to orient the reference coordinate system correctly with respect to gravity and north. In some embodiments, such a reference coordinate system may be used for SLAM, and the robot navigates the unknown environment using SLAM. - In a first-person shooting game in outer space, an example of a meaningful reference coordinate system is one that is aligned to the initial camera position and orientation, and that is independent of gravity. In this case, for example, the
processor 38 may use the first origin-setting technique and the first orientation-setting technique discussed above. - Proper setting of a reference coordinate system may help improve a user experience. For example, proper setting of the reference coordinate system may help ensure that augmentations are in appropriate proportions (e.g., a basketball being smaller than a basketball basket), in appropriate locations (e.g., heavy objects not floating in mid-air, fish not flying, etc.), and at appropriate orientations (e.g., people standing upright, cars driving horizontally, etc.).
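As one concrete illustration of the graffiti example above, the 90-degree rotation around the x-axis can be expressed as a fixed rotation matrix applied to an upright, gravity-aligned frame (a sketch with assumed names, not part of the disclosure):

```python
import numpy as np

# Rotation by 90 degrees about the x-axis: maps y to z and z to -y,
# so an upright frame's z-axis becomes horizontal.
ROT_X_90 = np.array([[1.0, 0.0,  0.0],
                     [0.0, 0.0, -1.0],
                     [0.0, 1.0,  0.0]])

def wall_frame(upright_frame):
    """Rotate a frame (columns = x, y, z axes) 90 degrees about its own
    x-axis, so that z points horizontally out of a wall instead of up."""
    return upright_frame @ ROT_X_90
```

Under this sketch, if the upright frame's y-axis was the horizontal direction toward the wall, the rotated frame's z-axis points back out of the wall toward the camera, matching the intent of facing the outer side of the wall.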
- The discussion gave examples using x-y-z coordinate systems, but the principles involved, and/or the specific techniques discussed, could with little modification be applied to other coordinate systems.
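The refinement over time mentioned at stage 70 and in the process 130 can be illustrated with a minimal scalar Kalman-style update (hypothetical sketch; a real implementation would track a full state vector and covariance matrix rather than one coordinate at a time):

```python
def kalman_refine(estimate, variance, measurement, meas_variance):
    """One scalar Kalman update: blend a prior estimate (e.g., one origin
    coordinate from the previous iteration) with a new measurement.

    Returns (refined_estimate, refined_variance). Applied per coordinate
    at each iteration, this refines the reference coordinate system over
    time while shrinking the uncertainty of the estimate.
    """
    gain = variance / (variance + meas_variance)
    refined = estimate + gain * (measurement - estimate)
    return refined, (1.0 - gain) * variance
```

With equal prior and measurement variances, the update splits the difference; as the prior variance shrinks over iterations, new measurements move the estimate less, which is the stabilizing behavior desired when refining an already well-established reference coordinate system.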
- Other examples and implementations are within the scope and spirit of the disclosure and appended claims. For example, while the discussion above focused on augmented reality systems and/or SLAM systems, the techniques discussed may be applied to non-augmented reality systems. Further, due to the nature of software, functions described above can be implemented using software executed by a processor, hardware, firmware, hardwiring, or combinations of any of these. Features implementing functions may also be physically located at various positions, including being distributed such that portions of functions are implemented at different physical locations. Also, as used herein, including in the claims, “or” as used in a list of items prefaced by “at least one of” indicates a disjunctive list such that, for example, a list of “at least one of A, B, or C” means A or B or C or AB or AC or BC or ABC (i.e., A and B and C), or combinations with more than one feature (e.g., AA, AAB, ABBC, etc.).
- As used herein, including in the claims, unless otherwise stated, a statement that a function or operation is “based on” an item or condition means that the function or operation is based on the stated item or condition and may be based on one or more items and/or conditions in addition to the stated item or condition.
- Substantial variations may be made in accordance with specific requirements. For example, customized hardware might also be used, and/or particular elements might be implemented in hardware, software (including portable software, such as applets, etc.), or both. Further, connection to other computing devices such as network input/output devices may be employed.
- The terms “machine-readable medium” and “computer-readable medium,” as used herein, refer to any medium that participates in providing data that causes a machine to operate in a specific fashion. Using a computer system, various computer-readable media might be involved in providing instructions/code to processor(s) for execution and/or might be used to store and/or carry such instructions/code (e.g., as signals). In many implementations, a computer-readable medium is a physical and/or tangible storage medium. Such a medium may take many forms, including but not limited to, non-volatile media and volatile media. Non-volatile media include, for example, optical and/or magnetic disks. Volatile media include, without limitation, dynamic memory.
- Common forms of physical and/or tangible computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punchcards, papertape, any other physical medium with patterns of holes, a RAM, a PROM, EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read instructions and/or code.
- Various forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to one or more processors for execution. Merely by way of example, the instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer. A remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by a computer system.
- The methods, systems, and devices discussed above are examples. Various configurations may omit, substitute, or add various procedures or components as appropriate. For instance, in alternative configurations, the methods may be performed in an order different from that described, and that various steps may be added, omitted, or combined. Also, features described with respect to certain configurations may be combined in various other configurations. Different aspects and elements of the configurations may be combined in a similar manner. Also, technology evolves and, thus, many of the elements are examples and do not limit the scope of the disclosure or claims.
- Specific details are given in the description to provide a thorough understanding of example configurations (including implementations). However, configurations may be practiced without these specific details. For example, well-known circuits, processes, algorithms, structures, and techniques have been shown without unnecessary detail in order to avoid obscuring the configurations. This description provides example configurations only, and does not limit the scope, applicability, or configurations of the claims. Rather, the preceding description of the configurations provides a description for implementing described techniques. Various changes may be made in the function and arrangement of elements without departing from the spirit or scope of the disclosure.
- Also, configurations may be described as a process which is depicted as a flow diagram or block diagram. Although each may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process may have additional stages or functions not included in the figure. Furthermore, examples of the methods may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware, or microcode, the program code or code segments to perform the tasks may be stored in a non-transitory computer-readable medium such as a storage medium. Processors may perform the described tasks.
- Having described several example configurations, various modifications, alternative constructions, and equivalents may be used without departing from the spirit of the disclosure. For example, the above elements may be components of a larger system, wherein other rules may take precedence over or otherwise modify the application of the invention. Also, a number of operations may be undertaken before, during, or after the above elements are considered. Accordingly, the above description does not bound the scope of the claims.
- Further, the following are example claims regarding various aspects of the discussion above.
- 1. A method of determining a reference coordinate system, the method comprising:
- obtaining first information indicative of an orientation of a device at a first time;
- determining the reference coordinate system based on the first information; and
- refining the reference coordinate system based on second information indicative of the orientation of the device at a second time after the first time.
- 2. The method of
claim 1 wherein the device has been moved between the first time and the second time. - 3. The method of
claim 1 wherein the reference coordinate system comprises a coordinate system for use with an application of the device, and wherein the determining comprises converting an orientation of a coordinate system of the device to the reference coordinate system for use with the application of the device. - 4. The method of claim 3 wherein the reference coordinate system for use with an application of the device comprises a SLAM coordinate system.
- 5. The method of claim 3 wherein the application comprises an AR application.
- 6. The method of
claim 1 wherein the first information comprises two non-linear vectors. - 7. The method of claim 6, wherein a first of the two non-linear vectors comprises a gravity vector derived from one or more accelerometer measurements.
- 8. The method of claim 7 wherein a second of the two non-linear vectors comprises a vector pointed in a direction of magnetic north, the second of the two vectors being derived from at least a magnetometer, or wherein the second of the two non-linear vectors comprises a projection of a viewing direction of a camera of the device onto a plane perpendicular to gravity.
- 9. The method of
claim 1 wherein the determining comprises setting an origin of the reference coordinate system and/or determining a scale value of the reference coordinate system. - 10. A mobile device comprising:
- means for obtaining first information indicative of an orientation of the mobile device at a first time;
- means for determining a reference coordinate system based on the first information; and
- means for refining the reference coordinate system based on second information indicative of the orientation of the mobile device at a second time after the first time.
- 11. The mobile device of
claim 10 wherein the means for determining are configured to convert an orientation of a coordinate system of the mobile device to the reference coordinate system for use with an application of the mobile device. - 12. The mobile device of
claim 10 wherein the means for obtaining are configured to obtain two non-linear vectors as the first information, at least one of the two non-linear vectors being a gravity vector or a vector of magnetic north. - 13. The mobile device of
claim 10 wherein the means for determining are configured to set an origin of the reference coordinate system and/or determine a scale value of the reference coordinate system. - 14. A processor-readable storage medium of a mobile device, the storage medium comprising processor-readable instructions configured to cause a processor to:
- obtain first information indicative of an orientation of the mobile device at a first time;
- determine a reference coordinate system based on the first information; and
- refine the reference coordinate system based on second information indicative of the orientation of the mobile device at a second time after the first time.
- 15. The storage medium of
claim 14 wherein the instructions configured to cause the processor to determine the reference coordinate system include instructions configured to cause the processor to convert an orientation of a coordinate system of the mobile device to the reference coordinate system for use with an application of the mobile device. - 16. The storage medium of
claim 14 wherein the instructions configured to cause the processor to obtain the first information include instructions configured to cause the processor to obtain two non-linear vectors as the first information, at least one of the two non-linear vectors being a gravity vector or a vector of magnetic north. - 17. The storage medium of
claim 14 wherein the instructions configured to cause the processor to determine the reference coordinate system include instructions configured to cause the processor to set an origin of the reference coordinate system and/or determine a scale value of the reference coordinate system. - 18. A mobile device comprising:
- a memory storing processor-readable instructions; and
- a processor communicatively coupled to the memory and configured to:
-
- obtain first information indicative of an orientation of the mobile device at a first time;
- determine a reference coordinate system based on the first information; and
- refine the reference coordinate system based on second information indicative of the orientation of the mobile device at a second time after the first time.
- 19. The mobile device of
claim 18 wherein the processor is configured to convert an orientation of a coordinate system of the mobile device to the reference coordinate system for use with an application of the mobile device. - 20. The mobile device of
claim 18 wherein the processor is configured to obtain two non-linear vectors as the first information, at least one of the two non-linear vectors being a gravity vector or a vector of magnetic north. - 21. The mobile device of
claim 18 wherein to determine the reference coordinate system the processor is configured to set an origin of the reference coordinate system and/or determine a scale value of the reference coordinate system. - 22. A method of determining a reference coordinate system, the method comprising:
- obtaining information indicative of an orientation of a device;
- determining an orientation of the reference coordinate system based on a selected application for the reference coordinate system; and
- converting an orientation of a device coordinate system to the orientation of the reference coordinate system to produce the reference coordinate system.
- 23. The method of
claim 22 further comprising refining the reference coordinate system based on subsequently obtained information indicative of an orientation of the device. - 24. The method of
claim 22 wherein the reference coordinate system comprises a SLAM coordinate system. - 25. The method of
claim 22 wherein the application comprises an AR application. - 26. The method of claim 25 wherein the determining comprises receiving an orientation from the AR application.
- 27. The method of claim 25 wherein the determining is based on a type of augmentation associated with the AR application.
- 28. The method of claim 27 wherein the type of augmentation comprises text or designs placed on a real-world surface.
- 29. The method of claim 27 wherein the type of augmentation comprises a character moving on a real-world surface.
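Claims 26-29 above make the chosen orientation depend on the AR application and its augmentation type (text or designs placed on a surface versus a character moving on a surface). As a purely illustrative sketch, not the claimed implementation (the type names and the mapping are hypothetical assumptions), such a selection could be a small dispatch:

```python
def orientation_for_augmentation(aug_type):
    """Map an augmentation type to a reference-frame orientation.
    Type names and the mapping below are illustrative assumptions,
    not taken from the patent."""
    table = {
        "text_on_wall": "vertical",          # text/designs on a vertical surface
        "character_on_floor": "horizontal",  # character moving on the ground
    }
    # default to a horizontal (ground-aligned) frame for unknown types
    return table.get(aug_type, "horizontal")
```

An AR application could consult such a table once at session start and hand the resulting orientation to the coordinate-system setup step.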
- 30. The method of claim 22 wherein the orientation of the reference coordinate system comprises a substantially horizontal orientation with respect to Earth.
- 31. The method of claim 22 wherein the orientation of the reference coordinate system comprises a substantially vertical orientation with respect to Earth.
- 32. The method of claim 22 further comprising setting an origin of the reference coordinate system based on the selected application.
- 33. A mobile device comprising:
- obtaining means for obtaining information indicative of an orientation of a device; and
- orientation means, communicatively coupled to the obtaining means, for determining an orientation of a reference coordinate system based on a selected application for the reference coordinate system, the orientation means including converting means for converting an orientation of a device coordinate system of the mobile device to the orientation of the reference coordinate system to produce the reference coordinate system.
- 34. The mobile device of claim 33 wherein the converting means comprise refining means for refining the reference coordinate system based on subsequently obtained information indicative of an orientation of the device.
- 35. The mobile device of claim 33 wherein the reference coordinate system comprises a SLAM coordinate system.
- 36. The mobile device of claim 33 wherein the application comprises an AR application.
- 37. The mobile device of claim 36 wherein the orientation means comprise means for receiving an orientation from the AR application.
- 38. The mobile device of claim 36 wherein the determining is based on a type of augmentation associated with the AR application.
- 39. The mobile device of claim 38 wherein the type of augmentation comprises text or designs placed on a real-world surface.
- 40. The mobile device of claim 38 wherein the type of augmentation comprises a character moving on a real-world surface.
- 41. The mobile device of claim 33 wherein the orientation of the reference coordinate system comprises a substantially horizontal orientation with respect to Earth.
- 42. The mobile device of claim 33 wherein the orientation of the reference coordinate system comprises a substantially vertical orientation with respect to Earth.
- 43. The mobile device of claim 33 further comprising means for setting an origin of the reference coordinate system based on the selected application.
- 44. A mobile device comprising:
- a memory storing processor-readable instructions; and
- a processor communicatively coupled to the memory and configured to:
- obtain information indicative of an orientation of a device;
- determine an orientation of a reference coordinate system based on a selected application for the reference coordinate system; and
- convert an orientation of a device coordinate system of the mobile device to the orientation of the reference coordinate system to produce the reference coordinate system.
- 45. The mobile device of claim 44 wherein the processor is further configured to refine the reference coordinate system based on subsequently obtained information indicative of an orientation of the device.
- 46. The mobile device of claim 44 wherein the reference coordinate system comprises a SLAM coordinate system.
- 47. The mobile device of claim 44 wherein the application comprises an AR application.
- 48. The mobile device of claim 47 wherein the processor is configured to receive an orientation from the AR application.
- 49. The mobile device of claim 47 wherein the processor is configured to determine the orientation of the reference coordinate system based on a type of augmentation associated with the AR application.
- 50. The mobile device of claim 49 wherein the type of augmentation comprises text or designs placed on a real-world surface.
- 51. The mobile device of claim 49 wherein the type of augmentation comprises a character moving on a real-world surface.
- 52. The mobile device of claim 44 wherein the orientation of the reference coordinate system comprises a substantially horizontal orientation with respect to Earth.
- 53. The mobile device of claim 44 wherein the orientation of the reference coordinate system comprises a substantially vertical orientation with respect to Earth.
- 54. The mobile device of claim 44 wherein the processor is configured to set an origin of the reference coordinate system based on the selected application.
- 55. A processor-readable storage medium of a mobile device, the storage medium comprising processor-readable instructions configured to cause a processor to:
- obtain information indicative of an orientation of a device;
- determine an orientation of a reference coordinate system based on a selected application for the reference coordinate system; and
- convert an orientation of a device coordinate system of the mobile device to the orientation of the reference coordinate system to produce the reference coordinate system.
- 56. The storage medium of claim 55 further comprising instructions configured to cause the processor to refine the reference coordinate system based on subsequently obtained information indicative of an orientation of the device.
- 57. The storage medium of claim 55 wherein the reference coordinate system comprises a SLAM coordinate system.
- 58. The storage medium of claim 55 wherein the application comprises an AR application.
- 59. The storage medium of claim 58 further comprising instructions configured to cause the processor to receive an orientation from the AR application.
- 60. The storage medium of claim 58 wherein the instructions configured to cause the processor to determine the orientation of the reference coordinate system include instructions configured to cause the processor to determine the orientation of the reference coordinate system based on a type of augmentation associated with the AR application.
- 61. The storage medium of claim 60 wherein the type of augmentation comprises text or designs placed on a real-world surface.
- 62. The storage medium of claim 60 wherein the type of augmentation comprises a character moving on a real-world surface.
- 63. The storage medium of claim 55 wherein the orientation of the reference coordinate system comprises a substantially horizontal orientation with respect to Earth.
- 64. The storage medium of claim 55 wherein the orientation of the reference coordinate system comprises a substantially vertical orientation with respect to Earth.
- 65. The storage medium of claim 55 wherein the instructions configured to cause the processor to determine the orientation of the reference coordinate system include instructions configured to cause the processor to set an origin of the reference coordinate system based on the selected application.
Claims (35)
1. A method of determining a reference coordinate system, the method comprising:
obtaining information indicative of a direction of gravity relative to a device; and
converting an orientation of a device coordinate system using the direction of gravity relative to the device to produce the reference coordinate system.
2. The method of claim 1 further comprising obtaining information indicative of a direction perpendicular to gravity and wherein the converting comprises converting the orientation of the device coordinate system using the direction perpendicular to gravity.
3. The method of claim 2 wherein the direction perpendicular to gravity is one of magnetic north or a projection of a viewing direction of a camera of the device onto a plane perpendicular to gravity.
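Claims 1-3 describe deriving the reference coordinate system from the direction of gravity and a direction perpendicular to gravity (e.g., magnetic north). As an illustrative sketch only, not the claimed implementation (the function names are hypothetical), an orthonormal east-north-up basis, expressed in device coordinates, can be built from those two measurements:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return [c / n for c in v]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def reference_frame(gravity, north):
    """East-north-up basis in device coordinates, from a measured
    gravity vector and a direction toward magnetic north."""
    up = normalize([-c for c in gravity])  # up is opposite of gravity
    # project the north direction onto the plane perpendicular to gravity
    north_axis = normalize([n - dot(north, up) * u
                            for n, u in zip(north, up)])
    east_axis = cross(north_axis, up)      # completes the right-handed basis
    return east_axis, north_axis, up
```

Stacking the three axes as the rows of a rotation matrix converts device-frame coordinates into this gravity-aligned reference frame.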
4. The method of claim 1 further comprising setting an origin of the reference coordinate system.
5. The method of claim 4 wherein setting the origin comprises:
obtaining a point cloud; and
determining a geometric center of a substantially planar portion of the point cloud.
6. The method of claim 4 wherein setting the origin comprises:
obtaining a point cloud; and
determining an intersection of a view direction of a camera of the device and a plane corresponding to a substantially planar portion of the point cloud.
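Claims 5 and 6 give two alternatives for placing the origin from a point cloud: the geometric center of a substantially planar subset, or the intersection of the camera's view ray with the plane corresponding to that subset. A minimal sketch of both, under the assumption that the planar subset and the plane fit are already available (names are illustrative, not from the patent):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def centroid(points):
    """Geometric center of a (roughly planar) portion of a point cloud."""
    n = len(points)
    return [sum(p[i] for p in points) / n for i in range(3)]

def ray_plane_intersection(cam_pos, view_dir, plane_point, plane_normal):
    """Intersection of the camera's view ray with the plane fitted to the
    planar portion of the cloud; returns None when the ray is parallel."""
    denom = dot(view_dir, plane_normal)
    if abs(denom) < 1e-9:
        return None
    t = dot([p - c for p, c in zip(plane_point, cam_pos)], plane_normal) / denom
    return [c + t * d for c, d in zip(cam_pos, view_dir)]
```

Either result can then serve directly as the origin of the reference coordinate system.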
7. The method of claim 1 further comprising calculating a scale value and producing the reference coordinate system using the scale value.
8. The method of claim 7 wherein calculating the scale value comprises:
obtaining a point cloud; and
comparing a dimension of the point cloud to a fixed size.
9. The method of claim 7 wherein calculating the scale value comprises calculating the scale value such that an origin of the reference coordinate system will have a predetermined distance from the device.
10. The method of claim 7 wherein calculating the scale value comprises using absolute measurements from one or more input sensors of the device.
11. The method of claim 10 wherein the one or more input sensors comprise an accelerometer or a plurality of cameras.
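Claims 8-10 list alternative scale computations: comparing a dimension of the point cloud against an object of known, fixed size, or choosing the scale so the origin ends up at a predetermined distance from the device. Hedged sketches of the two, with hypothetical names and under the assumption that distances in the cloud are in arbitrary SLAM units:

```python
import math

def scale_from_fixed_size(points, known_size, axis=0):
    """Scale mapping point-cloud units to real units by comparing one
    dimension of the cloud against an object of known real-world size."""
    coords = [p[axis] for p in points]
    return known_size / (max(coords) - min(coords))

def scale_from_distance(device_pos, origin, desired_distance):
    """Scale chosen so the reference origin lies a predetermined
    distance from the device."""
    d = math.sqrt(sum((a - b) ** 2 for a, b in zip(device_pos, origin)))
    return desired_distance / d
```

Claim 10's variant, absolute measurements from sensors such as an accelerometer or a stereo camera pair, would supply the metric distances directly instead of requiring a known-size object.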
12. The method of claim 1 further comprising refining the reference coordinate system using information from the reference coordinate system and at least one previously-determined coordinate system to generate a combination of the reference coordinate system and the at least one previously-determined coordinate system.
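Claim 12 refines the reference coordinate system by combining it with at least one previously determined coordinate system. One deliberately simple stand-in for such a combination (illustrative only; the patent does not specify this formula) is to average successive unit up-vector estimates, expressed in a common frame, and renormalize:

```python
import math

def refine_up_vector(estimates):
    """Fuse successive unit-length up-vector estimates (all expressed in
    one common frame) by averaging and renormalizing -- a simple
    illustrative stand-in for combining a new reference frame with
    previously determined ones."""
    avg = [sum(e[i] for e in estimates) / len(estimates) for i in range(3)]
    n = math.sqrt(sum(c * c for c in avg))
    return [c / n for c in avg]
```

A production system would more likely weight estimates by sensor confidence or run a filter, but the averaging above captures the idea of the claimed refinement.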
13. The method of claim 1 further comprising determining at least one of a first technique for converting the orientation or a second technique for setting an origin of the reference coordinate system based on an application for which the reference coordinate system is used.
14. The method of claim 1 wherein the reference coordinate system is an augmented reality coordinate system, the method further comprising displaying an augmented reality image disposed and directed in accordance with the reference coordinate system.
15. A device for determining a reference coordinate system, the device comprising:
means for obtaining information indicative of a direction of gravity relative to the device; and
means for converting an orientation of a device coordinate system using the direction of gravity relative to the device to produce the reference coordinate system.
16. The device of claim 15 further comprising means for obtaining information indicative of a direction perpendicular to gravity and wherein the means for converting are for converting the orientation of the device coordinate system using the direction perpendicular to gravity.
17. The device of claim 16 wherein the direction perpendicular to gravity is one of magnetic north or a projection of a viewing direction of a camera of the device onto a plane perpendicular to gravity.
18. The device of claim 15 further comprising means for setting an origin of the reference coordinate system.
19. The device of claim 18 wherein the means for setting the origin comprise:
means for obtaining a point cloud; and
means for determining a geometric center of a substantially planar portion of the point cloud.
20. The device of claim 18 wherein the means for setting the origin comprise:
means for obtaining a point cloud; and
means for determining an intersection of a view direction of a camera of the device and a plane corresponding to a substantially planar portion of the point cloud.
21. The device of claim 15 further comprising means for calculating a scale value such that an origin of the reference coordinate system will have a predetermined distance from the device.
22. The device of claim 21 wherein the means for calculating the scale value comprise:
means for obtaining a point cloud; and
means for comparing a dimension of the point cloud to a fixed size.
23. The device of claim 21 wherein the means for calculating the scale value comprise means for calculating the scale value such that an origin of the reference coordinate system will have a predetermined distance from the device.
24. The device of claim 21 wherein the means for calculating the scale value comprise means for using absolute measurements from one or more input sensors of the device.
25. The device of claim 24 wherein the one or more input sensors comprise an accelerometer or a plurality of cameras.
26. The device of claim 15 further comprising means for refining the reference coordinate system using information from the reference coordinate system and at least one previously-determined coordinate system to generate a combination of the reference coordinate system and the at least one previously-determined coordinate system.
27. The device of claim 15 further comprising means for determining at least one of a first technique for converting the orientation or a second technique for setting an origin of the reference coordinate system based on an application for which the reference coordinate system is used.
28. The device of claim 15 wherein the reference coordinate system is an augmented reality coordinate system, the device further comprising means for displaying an augmented reality image disposed and directed in accordance with the reference coordinate system.
29. A mobile device comprising:
a sensor configured to determine a direction of gravity and to provide an indication of the direction of gravity relative to the mobile device; and
an orientation module communicatively coupled to the sensor and configured to convert an orientation of a device coordinate system, of the mobile device, using the indication of the direction of gravity relative to the mobile device to produce a reference coordinate system.
30. The mobile device of claim 29 wherein the orientation module is further configured to convert the orientation of the device coordinate system using a direction perpendicular to gravity.
31. The mobile device of claim 29 further comprising an origin module communicatively coupled to the orientation module and configured to set an origin of the reference coordinate system by obtaining a point cloud and at least one of:
determining a geometric center of a substantially planar portion of the point cloud; or
determining an intersection of a view direction of a camera of the mobile device and a plane corresponding to a substantially planar portion of the point cloud.
32. The mobile device of claim 29 further comprising a scale module communicatively coupled to the orientation module and configured to set a scale value for the reference coordinate system relative to the device coordinate system by:
(1) obtaining a point cloud and comparing a dimension of the point cloud to a fixed size; or
(2) calculating the scale value such that an origin of the reference coordinate system will have a predetermined distance from the mobile device; or
(3) using absolute measurements from one or more input sensors of the mobile device.
33. The mobile device of claim 29 wherein the orientation module is configured to produce a refined reference coordinate system using information from the reference coordinate system and at least one previously determined coordinate system.
34. The mobile device of claim 29 wherein the reference coordinate system is an augmented reality coordinate system, the mobile device further comprising a display and a processor configured to cause the display to display an augmented reality image disposed and directed in accordance with the reference coordinate system.
35. A processor-readable storage medium of a mobile device, the storage medium comprising processor-readable instructions configured to cause a processor to:
obtain an indication of a direction of gravity relative to the mobile device; and
convert an orientation of a device coordinate system, of the mobile device, using the indication of the direction of gravity relative to the mobile device to produce a reference coordinate system.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/787,525 US20140123507A1 (en) | 2012-11-02 | 2013-03-06 | Reference coordinate system determination |
CN201380055115.4A CN104756154A (en) | 2012-11-02 | 2013-09-11 | Reference coordinate system determination |
EP13770560.4A EP2915136A2 (en) | 2012-11-02 | 2013-09-11 | Reference coordinate system determination |
JP2015540668A JP2016507793A (en) | 2012-11-02 | 2013-09-11 | Determining the reference coordinate system |
PCT/US2013/059148 WO2014070312A2 (en) | 2012-11-02 | 2013-09-11 | Reference coordinate system determination |
KR1020157013924A KR20150082358A (en) | 2012-11-02 | 2013-09-11 | Reference coordinate system determination |
US15/182,750 US10309762B2 (en) | 2012-11-02 | 2016-06-15 | Reference coordinate system determination |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261722023P | 2012-11-02 | 2012-11-02 | |
US13/787,525 US20140123507A1 (en) | 2012-11-02 | 2013-03-06 | Reference coordinate system determination |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/182,750 Division US10309762B2 (en) | 2012-11-02 | 2016-06-15 | Reference coordinate system determination |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140123507A1 true US20140123507A1 (en) | 2014-05-08 |
Family
ID=50621027
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/787,525 Abandoned US20140123507A1 (en) | 2012-11-02 | 2013-03-06 | Reference coordinate system determination |
US15/182,750 Active US10309762B2 (en) | 2012-11-02 | 2016-06-15 | Reference coordinate system determination |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/182,750 Active US10309762B2 (en) | 2012-11-02 | 2016-06-15 | Reference coordinate system determination |
Country Status (6)
Country | Link |
---|---|
US (2) | US20140123507A1 (en) |
EP (1) | EP2915136A2 (en) |
JP (1) | JP2016507793A (en) |
KR (1) | KR20150082358A (en) |
CN (1) | CN104756154A (en) |
WO (1) | WO2014070312A2 (en) |
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150105123A1 (en) * | 2013-10-11 | 2015-04-16 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
US20150211847A1 (en) * | 2014-01-29 | 2015-07-30 | Mitutoyo Corporation | Manual measuring system |
US20150235425A1 (en) * | 2014-02-14 | 2015-08-20 | Fujitsu Limited | Terminal device, information processing device, and display control method |
US20170177203A1 (en) * | 2015-12-18 | 2017-06-22 | Facebook, Inc. | Systems and methods for identifying dominant hands for users based on usage patterns |
US9773313B1 (en) * | 2014-01-03 | 2017-09-26 | Google Inc. | Image registration with device data |
CN107665506A (en) * | 2016-07-29 | 2018-02-06 | 成都理想境界科技有限公司 | Method and system for realizing augmented reality |
CN107665508A (en) * | 2016-07-29 | 2018-02-06 | 成都理想境界科技有限公司 | Method and system for realizing augmented reality |
CN107665507A (en) * | 2016-07-29 | 2018-02-06 | 成都理想境界科技有限公司 | Method and device for realizing augmented reality based on plane detection |
US9990773B2 (en) | 2014-02-06 | 2018-06-05 | Fujitsu Limited | Terminal, information processing apparatus, display control method, and storage medium |
US20180302161A1 (en) * | 2014-03-25 | 2018-10-18 | Osram Sylvania Inc. | Light-based communication (lcom) visual hotspots |
US10147398B2 (en) | 2013-04-22 | 2018-12-04 | Fujitsu Limited | Display control method and device |
GB2563731A (en) * | 2017-05-09 | 2018-12-26 | A9 Com Inc | Markerless image analysis for augmented reality |
US10217231B2 (en) * | 2016-05-31 | 2019-02-26 | Microsoft Technology Licensing, Llc | Systems and methods for utilizing anchor graphs in mixed reality environments |
US10309762B2 (en) | 2012-11-02 | 2019-06-04 | Qualcomm Incorporated | Reference coordinate system determination |
US10843068B2 (en) * | 2017-01-18 | 2020-11-24 | Xvisio Technology Corp. | 6DoF inside-out tracking game controller |
US20210407205A1 (en) * | 2020-06-30 | 2021-12-30 | Snap Inc. | Augmented reality eyewear with speech bubbles and translation |
US11226201B2 (en) * | 2015-06-11 | 2022-01-18 | Queen's University At Kingston | Automated mobile geotechnical mapping |
US11315321B2 (en) * | 2018-09-07 | 2022-04-26 | Intel Corporation | View dependent 3D reconstruction mechanism |
US11321921B2 (en) * | 2012-12-10 | 2022-05-03 | Sony Corporation | Display control apparatus, display control method, and program |
US11378376B2 (en) * | 2019-11-08 | 2022-07-05 | Kevin P. Terry | System and method for using a digital measuring device to install a structure |
US20230133168A1 (en) * | 2021-10-31 | 2023-05-04 | Hongfujin Precision Electrons (Yantai) Co., Ltd. | Method for identifying human postures and gestures for interaction purposes and portable hand-held device |
US20230219578A1 (en) * | 2022-01-07 | 2023-07-13 | Ford Global Technologies, Llc | Vehicle occupant classification using radar point cloud |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI672675B (en) * | 2017-04-10 | 2019-09-21 | 鈺立微電子股份有限公司 | Depth information processing device |
CN109550246B (en) * | 2017-09-25 | 2022-03-25 | 腾讯科技(深圳)有限公司 | Control method and device for game client, storage medium and electronic device |
US10565719B2 (en) | 2017-10-13 | 2020-02-18 | Microsoft Technology Licensing, Llc | Floor detection in virtual and augmented reality devices using stereo images |
CN108282651A (en) * | 2017-12-18 | 2018-07-13 | 北京小鸟看看科技有限公司 | Correction method and device for camera parameters, and virtual reality device |
KR102031331B1 (en) * | 2018-03-09 | 2019-10-11 | 주식회사 케이티 | Method for identifying moving object within image, apparatus and computer readable medium |
JP7254943B2 (en) * | 2019-09-20 | 2023-04-10 | マクセル株式会社 | Information terminal device and location recognition sharing method |
KR102158316B1 (en) * | 2020-04-14 | 2020-09-21 | 주식회사 맥스트 | Apparatus and method for processing point cloud |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030080979A1 (en) * | 2001-10-26 | 2003-05-01 | Canon Kabushiki Kaisha | Image display apparatus and method, and storage medium |
US20100208057A1 (en) * | 2009-02-13 | 2010-08-19 | Peter Meier | Methods and systems for determining the pose of a camera with respect to at least one object of a real environment |
Family Cites Families (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3805231B2 (en) * | 2001-10-26 | 2006-08-02 | キヤノン株式会社 | Image display apparatus and method, and storage medium |
US7317456B1 (en) * | 2002-12-02 | 2008-01-08 | Ngrain (Canada) Corporation | Method and apparatus for transforming point cloud data to volumetric data |
JP4218952B2 (en) * | 2003-09-30 | 2009-02-04 | キヤノン株式会社 | Data conversion method and apparatus |
JP2005326275A (en) * | 2004-05-14 | 2005-11-24 | Canon Inc | Information processing method and device |
US20060078215A1 (en) | 2004-10-12 | 2006-04-13 | Eastman Kodak Company | Image processing based on direction of gravity |
WO2008099915A1 (en) * | 2007-02-16 | 2008-08-21 | Mitsubishi Electric Corporation | Road/feature measuring device, feature identifying device, road/feature measuring method, road/feature measuring program, measuring device, measuring method, measuring program, measured position data, measuring terminal, measuring server device, drawing device, drawing method, drawing program, and drawing data |
JP2008261755A (en) * | 2007-04-12 | 2008-10-30 | Canon Inc | Information processing apparatus and information processing method |
US8855819B2 (en) | 2008-10-09 | 2014-10-07 | Samsung Electronics Co., Ltd. | Method and apparatus for simultaneous localization and mapping of robot |
US8542252B2 (en) * | 2009-05-29 | 2013-09-24 | Microsoft Corporation | Target digitization, extraction, and tracking |
CN101598540B (en) * | 2009-06-24 | 2013-02-13 | 广东威创视讯科技股份有限公司 | Three-dimensional positioning method and three-dimensional positioning system |
JP4816789B2 (en) * | 2009-11-16 | 2011-11-16 | ソニー株式会社 | Information processing apparatus, information processing method, program, and information processing system |
DE112010004767B4 (en) * | 2009-12-11 | 2024-09-12 | Kabushiki Kaisha Topcon | Point cloud data processing apparatus, point cloud data processing method and point cloud data processing program |
JP5573238B2 (en) * | 2010-03-04 | 2014-08-20 | ソニー株式会社 | Information processing apparatus, information processing method and program |
JP2011197777A (en) * | 2010-03-17 | 2011-10-06 | Sony Corp | Information processing device, information processing method and program |
WO2012037157A2 (en) * | 2010-09-13 | 2012-03-22 | Alt Software (Us) Llc | System and method for displaying data having spatial coordinates |
EP2625845B1 (en) * | 2010-10-04 | 2021-03-03 | Gerard Dirk Smits | System and method for 3-d projection and enhancements for interactivity |
US20120113223A1 (en) * | 2010-11-05 | 2012-05-10 | Microsoft Corporation | User Interaction in Augmented Reality |
JP2012128779A (en) * | 2010-12-17 | 2012-07-05 | Panasonic Corp | Virtual object display device |
CN102609550B (en) | 2011-01-19 | 2015-11-25 | 鸿富锦精密工业(深圳)有限公司 | System and method for automatically adjusting a product's three-dimensional model |
JP5799521B2 (en) * | 2011-02-15 | 2015-10-28 | ソニー株式会社 | Information processing apparatus, authoring method, and program |
US9013469B2 (en) * | 2011-03-04 | 2015-04-21 | General Electric Company | Method and device for displaying a three-dimensional view of the surface of a viewed object |
JP5702653B2 (en) * | 2011-04-08 | 2015-04-15 | 任天堂株式会社 | Information processing program, information processing apparatus, information processing system, and information processing method |
US8442307B1 (en) * | 2011-05-04 | 2013-05-14 | Google Inc. | Appearance augmented 3-D point clouds for trajectory and camera localization |
TW201248423A (en) | 2011-05-17 | 2012-12-01 | Ind Tech Res Inst | Localization device and localization method with the assistance of augmented reality |
US9600933B2 (en) * | 2011-07-01 | 2017-03-21 | Intel Corporation | Mobile augmented reality system |
WO2013034981A2 (en) * | 2011-09-08 | 2013-03-14 | Offshore Incorporations (Cayman) Limited, | System and method for visualizing synthetic objects within real-world video clip |
US8818133B2 (en) * | 2012-07-11 | 2014-08-26 | Raytheon Company | Point cloud construction with unposed camera |
US20140123507A1 (en) | 2012-11-02 | 2014-05-08 | Qualcomm Incorporated | Reference coordinate system determination |
2013
- 2013-03-06 US US13/787,525 patent/US20140123507A1/en not_active Abandoned
- 2013-09-11 EP EP13770560.4A patent/EP2915136A2/en not_active Withdrawn
- 2013-09-11 KR KR1020157013924A patent/KR20150082358A/en not_active Application Discontinuation
- 2013-09-11 WO PCT/US2013/059148 patent/WO2014070312A2/en active Application Filing
- 2013-09-11 CN CN201380055115.4A patent/CN104756154A/en active Pending
- 2013-09-11 JP JP2015540668A patent/JP2016507793A/en active Pending
2016
- 2016-06-15 US US15/182,750 patent/US10309762B2/en active Active
Cited By (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10309762B2 (en) | 2012-11-02 | 2019-06-04 | Qualcomm Incorporated | Reference coordinate system determination |
US12112443B2 (en) | 2012-12-10 | 2024-10-08 | Sony Corporation | Display control apparatus, display control method, and program |
US11321921B2 (en) * | 2012-12-10 | 2022-05-03 | Sony Corporation | Display control apparatus, display control method, and program |
US12051161B2 (en) | 2012-12-10 | 2024-07-30 | Sony Corporation | Display control apparatus, display control method, and program |
US10147398B2 (en) | 2013-04-22 | 2018-12-04 | Fujitsu Limited | Display control method and device |
US20150105123A1 (en) * | 2013-10-11 | 2015-04-16 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
US9773313B1 (en) * | 2014-01-03 | 2017-09-26 | Google Inc. | Image registration with device data |
US10282856B2 (en) | 2014-01-03 | 2019-05-07 | Google Llc | Image registration with device data |
US10066922B2 (en) | 2014-01-29 | 2018-09-04 | Mitutoyo Corporation | Manual measuring system |
US9651370B2 (en) * | 2014-01-29 | 2017-05-16 | Mitutoyo Corporation | Manual measuring system |
US20150211847A1 (en) * | 2014-01-29 | 2015-07-30 | Mitutoyo Corporation | Manual measuring system |
US9990773B2 (en) | 2014-02-06 | 2018-06-05 | Fujitsu Limited | Terminal, information processing apparatus, display control method, and storage medium |
US20150235425A1 (en) * | 2014-02-14 | 2015-08-20 | Fujitsu Limited | Terminal device, information processing device, and display control method |
US20180302161A1 (en) * | 2014-03-25 | 2018-10-18 | Osram Sylvania Inc. | Light-based communication (lcom) visual hotspots |
US11226201B2 (en) * | 2015-06-11 | 2022-01-18 | Queen's University At Kingston | Automated mobile geotechnical mapping |
US20170177203A1 (en) * | 2015-12-18 | 2017-06-22 | Facebook, Inc. | Systems and methods for identifying dominant hands for users based on usage patterns |
US10504232B2 (en) * | 2016-05-31 | 2019-12-10 | Microsoft Technology Licensing, Llc | Sharing of sparse SLAM coordinate systems |
US10217231B2 (en) * | 2016-05-31 | 2019-02-26 | Microsoft Technology Licensing, Llc | Systems and methods for utilizing anchor graphs in mixed reality environments |
CN107665507A (en) * | 2016-07-29 | 2018-02-06 | 成都理想境界科技有限公司 | Method and device for realizing augmented reality based on plane detection |
CN107665508A (en) * | 2016-07-29 | 2018-02-06 | 成都理想境界科技有限公司 | Method and system for realizing augmented reality |
CN107665506A (en) * | 2016-07-29 | 2018-02-06 | 成都理想境界科技有限公司 | Method and system for realizing augmented reality |
US10843068B2 (en) * | 2017-01-18 | 2020-11-24 | Xvisio Technology Corp. | 6DoF inside-out tracking game controller |
US11504608B2 (en) | 2017-01-18 | 2022-11-22 | Xvisio Technology Corp. | 6DoF inside-out tracking game controller |
GB2563731B (en) * | 2017-05-09 | 2021-04-14 | A9 Com Inc | Markerless image analysis for augmented reality |
US10733801B2 (en) | 2017-05-09 | 2020-08-04 | A9.Com. Inc. | Markerless image analysis for augmented reality |
US10339714B2 (en) | 2017-05-09 | 2019-07-02 | A9.Com, Inc. | Markerless image analysis for augmented reality |
GB2563731A (en) * | 2017-05-09 | 2018-12-26 | A9 Com Inc | Markerless image analysis for augmented reality |
US11315321B2 (en) * | 2018-09-07 | 2022-04-26 | Intel Corporation | View dependent 3D reconstruction mechanism |
US11378376B2 (en) * | 2019-11-08 | 2022-07-05 | Kevin P. Terry | System and method for using a digital measuring device to install a structure |
US20210407205A1 (en) * | 2020-06-30 | 2021-12-30 | Snap Inc. | Augmented reality eyewear with speech bubbles and translation |
US11869156B2 (en) * | 2020-06-30 | 2024-01-09 | Snap Inc. | Augmented reality eyewear with speech bubbles and translation |
US20230133168A1 (en) * | 2021-10-31 | 2023-05-04 | Hongfujin Precision Electrons (Yantai) Co., Ltd. | Method for identifying human postures and gestures for interaction purposes and portable hand-held device |
US12017657B2 (en) * | 2022-01-07 | 2024-06-25 | Ford Global Technologies, Llc | Vehicle occupant classification using radar point cloud |
US20230219578A1 (en) * | 2022-01-07 | 2023-07-13 | Ford Global Technologies, Llc | Vehicle occupant classification using radar point cloud |
Also Published As
Publication number | Publication date |
---|---|
US10309762B2 (en) | 2019-06-04 |
WO2014070312A3 (en) | 2014-06-26 |
EP2915136A2 (en) | 2015-09-09 |
JP2016507793A (en) | 2016-03-10 |
CN104756154A (en) | 2015-07-01 |
WO2014070312A2 (en) | 2014-05-08 |
KR20150082358A (en) | 2015-07-15 |
US20160300340A1 (en) | 2016-10-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10309762B2 (en) | Reference coordinate system determination | |
EP2915140B1 (en) | Fast initialization for monocular visual slam | |
US11481982B2 (en) | In situ creation of planar natural feature targets | |
US10175857B2 (en) | Image processing device, image processing method, and program for displaying an image in accordance with a selection from a displayed menu and based on a detection by a sensor | |
US9355451B2 (en) | Information processing device, information processing method, and program for recognizing attitude of a plane | |
US9390344B2 (en) | Sensor-based camera motion detection for unconstrained slam | |
EP2915137B1 (en) | Using a plurality of sensors for mapping and localization | |
CN111344644B (en) | Techniques for motion-based automatic image capture | |
JP6043856B2 (en) | Head pose estimation using RGBD camera | |
US9646384B2 (en) | 3D feature descriptors with camera pose information | |
KR101554797B1 (en) | Context aware augmentation interactions | |
EP3907704A1 (en) | Augmented reality (AR) capture & play |
JP6609640B2 (en) | Managing feature data for environment mapping on electronic devices | |
CN112146649A (en) | Navigation method and device in AR scene, computer equipment and storage medium | |
WO2013103410A1 (en) | Imaging surround systems for touch-free display control | |
TWI758869B (en) | Interactive object driving method, apparatus, device, and computer readable storage medium |
Marto et al. | DinofelisAR demo: augmented reality based on natural features |
CN108028904B (en) | Method and system for light field augmented reality/virtual reality on mobile devices |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: QUALCOMM INCORPORATED, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GUPTA, PRINCE;WAGNER, DANIEL;REITMAYR, GERHARD;AND OTHERS;SIGNING DATES FROM 20130412 TO 20130415;REEL/FRAME:030265/0753 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |