
US20030098954A1 - Calibration-free eye gaze tracking - Google Patents

Calibration-free eye gaze tracking

Info

Publication number
US20030098954A1
US20030098954A1 (application US09/844,682)
Authority
US
United States
Prior art keywords
image
camera
eye
glint
plane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US09/844,682
Other versions
US6578962B1
Inventor
Arnon Amir
Myron Flickner
David Koons
Gregory Russell
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tobii AB
Original Assignee
International Business Machines Corp
Application filed by International Business Machines Corp filed Critical International Business Machines Corp
Priority to US09/844,682
Assigned to INTERNATIONAL BUSINESS MACHINES CORPORATION. Assignment of assignors interest (see document for details). Assignors: AMIR, ARNON; FLICKNER, MYRON DALE; KOONS, DAVID BRUCE; RUSSELL, GREGORY FRASER
Publication of US20030098954A1
Application granted
Publication of US6578962B1
Assigned to IPG HEALTHCARE 501 LIMITED. Assignment of assignors interest (see document for details). Assignors: INTERNATIONAL BUSINESS MACHINES CORPORATION
Assigned to TOBII TECHNOLOGY AB. Assignment of assignors interest (see document for details). Assignors: IPG HEALTHCARE 501 LIMITED
Assigned to TOBII AB. Change of name (see document for details). Assignors: TOBII TECHNOLOGY AB
Legal status: Expired - Lifetime

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for determining or recording eye movement
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18: Eye characteristics, e.g. of the iris
    • G06V40/193: Preprocessing; Feature extraction

Definitions

  • Referring to FIG. 8, a diagram of the user's eye is shown in accordance with the preferred embodiment of the present invention.
  • This Figure is similar to FIG. 5, but describes the view of the user's eye as seen by second camera 404 .
  • Light from second light source 408 reflects from the user's cornea at point I back to second camera 404 , producing second glint 510 in the image plane 526 of second camera 404 .
  • Light emitted from first light source 406 reflects from the user's cornea at point H and is visible to second camera 404 as first glint 508.
  • Points H and I lie on plane ABC.
  • Second image plane 526 is a plane orthogonal to the optical axis of second camera 404.
  • Point i 528 is the image of second glint 510 in image plane 526 .
  • Point h 522 is the image of first glint 508 in image plane 526 .
  • Point p 524 is the image of pupil center 506 in image plane 526 .
  • Plane Bip 900 includes second light source 408 and the focal center of second camera 404, the image of second glint 510 in image plane 526 (point i), and the image of pupil center 506 in image plane 526 (point p).
  • Points C, I, and B are collinear.
  • Points C, P, and P′ are collinear.
  • The plane Bip spanning lines CIB and CPP′ therefore includes line PI and line BP′.
  • Plane Bip 900 can be considered to be plane ABC (which is also plane ABH) rotated around line CIB by a measurable angle β.
  • Referring to FIG. 10, a view of the user's eye as seen by second camera 404 is shown according to the preferred embodiment of the present invention.
  • The identities and locations in the image plane 526 of second camera 404 of first glint 508 (at point h) and second glint 510 (at point i) are determined from analysis of the images taken by second camera 404 when first light source 406 and second light source 408 were energized, preferably in an alternating manner as described above.
  • In other words, first glint 508 is due to first light source 406 and second glint 510 is due to second light source 408, so if the light sources are alternately energized only one glint will appear in each interlaced scan made by second camera 404.
  • Pupil center 506 (at point p) is also identified and located in image plane 526 , preferably from the difference image generated by subtraction of interlaced scan rows and subsequent processing techniques as described above.
  • Angle ⁇ separating plane ABC and Bip 900 is therefore merely the angle hip between line ih and line ip in this Figure, which is a view along the axis of plane rotation.
  • Referring to FIG. 11, a diagram of the user's eye including first plane Agp 600 and second plane Bip 900 is shown according to the preferred embodiment of the present invention.
  • Line CPP′ is the intersection of first plane Agp 600 and second plane Bip 900 .
  • Point C, the center of corneal curvature 504, need not be explicitly computed to determine either gaze vector 516 or point of regard P′ 514; point C can be indirectly determined if needed.
  • The intersection of line CP (gaze vector 516) with the pre-defined display screen 512 plane (or another observed object, whether planar or not) is point of regard P′ 514.
  • Point P′ 514 is known because the relative position of first camera 402 and second camera 404 to display screen 512 plane and to each other is known, and the relative positions of first glint 508 and second glint 510 and pupil center 506 in image planes 518 and 526 are known.
  • First camera 402 generates an image of the user's eye, and second camera 404 likewise generates an image of the user's eye. Each image may include interlaced scans and is passed to computer 400 as described above.
  • Computer 400 identifies and locates pupil center 506, first glint 508, and second glint 510 in the image planes.
  • Computer 400 computes the plane rotation angles α and β.
  • Computer 400 identifies gaze vector 516 as the intersection line of first plane 600 and second plane 900.
  • In step 1210, computer 400 identifies point of regard 514 from gaze vector 516 and data describing the spatial arrangement of first camera 402, second camera 404, and display screen 512 plane (or another observed object, whether planar or not).
  • In step 1212, computer 400 generates outputs describing gaze vector 516 and point of regard 514 and begins another cycle of the method. A runnable sketch of this cycle appears below.
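The cycle of FIG. 12 can be condensed into a short numeric sketch in Python with NumPy. Everything below is illustrative: the coordinates, the angles, and the helper names are invented, and the display screen is taken to be the plane z = 0; only the sequence of operations mirrors the method described above.

```python
import numpy as np

def rodrigues(v, k, a):
    """Rotate vector v about unit axis k by angle a."""
    k = np.asarray(k, float) / np.linalg.norm(k)
    v = np.asarray(v, float)
    return (v * np.cos(a) + np.cross(k, v) * np.sin(a)
            + k * np.dot(k, v) * (1.0 - np.cos(a)))

def line_of_planes(p1, n1, p2, n2):
    """Intersection line of two planes, each given as (point, normal)."""
    d = np.cross(n1, n2)
    A = np.vstack([n1, n2, d])
    b = np.array([np.dot(n1, p1), np.dot(n2, p2), 0.0])
    return np.linalg.solve(A, b), d / np.linalg.norm(d)

# Camera geometry from the one-time rig calibration (invented numbers).
A = np.array([0.0, 0.0, 0.0])             # cam 402 focal center / source 406
B = np.array([0.4, 0.0, 0.0])             # cam 404 focal center / source 408
glint_ray_1 = np.array([0.1, 0.0, 0.6])   # A toward its on-axis glint (line CGA)
glint_ray_2 = np.array([-0.3, 0.0, 0.6])  # B toward its on-axis glint (line CIB)

# Plane rotation angles alpha and beta, as measured in the image planes.
alpha, beta = np.radians(25.0), np.radians(-30.0)

# Rotate base plane ABC about each on-axis glint ray.
n1 = rodrigues(np.cross(B - A, glint_ray_1), glint_ray_1, alpha)
n2 = rodrigues(np.cross(A - B, glint_ray_2), glint_ray_2, beta)

# Step 1210: gaze vector 516 is the intersection of the two planes; the
# point of regard is its intersection with the screen plane (here z = 0).
pt, d = line_of_planes(A, n1, B, n2)
t = -pt[2] / d[2]
print("gaze direction:", d, "point of regard:", pt + t * d)
```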
  • Referring to FIG. 13, a diagram of a user's eye according to a second embodiment of the present invention is shown.
  • The second embodiment is identical to the preferred embodiment, except that each of the two intersecting planes is computed from different data points.
  • For each camera, the focal center Fx of the camera 1300, the position of the pupil center Px 1302 as projected onto the image plane 1304 of the camera, and the position of the glint Gx 1306 produced by that camera's own light source as projected onto the image plane 1304 of the camera define a plane FxPxGx.
  • The intersection of the first plane with display screen plane 512 defines a first line containing point of regard 514.
  • The intersection of the second plane with display screen plane 512 defines a second line containing point of regard 514.
  • Gaze vector 516 is a line defined by the intersection between the first plane and the second plane, extending from the user's eye toward an observed object.
  • Point of regard 514 is computed from the intersection of gaze vector 516 with the observed object, which corresponds to the intersection of the first line and the second line when the observed object is planar. While the invention has been described in a second embodiment employing two cameras, embodiments using more than two cameras are also included within the scope of the invention. Similarly, an embodiment employing two cameras, each of which tracks a different user eye, is also included within the scope of the invention.
  • Referring to FIG. 14, a diagram of a third embodiment of the present invention is shown.
  • This embodiment requires a one-time calibration of the radius of curvature of the user's cornea, and an estimate of the distance of the eye from display screen 512 plane or camera 402 .
  • The third embodiment's system components are identical to those of the second embodiment, except that the third embodiment omits second camera 404, second light source 408, and second frame grabber 414. Projections of first glint 508 (at point g) and pupil center 506 (at point p) are identified and located in image plane 518, and the distance between points g and p is measured.
  • Angle gAp and the distance d from camera 402 are used to compute distance PG, which is the actual distance between pupil center 506 and glint 508 on the eye. Because the radius of corneal curvature r is known, the angle ACP′ can be computed from distance PG via elementary trigonometry; a numeric sketch follows below. Point of regard 514 and gaze vector 516 are computed from the position of camera 402. Camera 402 may alternately scan each of the user's eyes to allow two computations as described above, reducing the need for the distance d.
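A numeric sketch of this single-camera computation follows. The small-angle approximation PG ≈ d · angle(gAp) and the arcsine step are assumptions made for the illustration; the patent states only that elementary trigonometry is used.

```python
import numpy as np

def third_embodiment_gaze_angle(angle_gAp_rad, d_eye_m, r_cornea_m):
    """Single-camera (FIG. 14) sketch: the small angle gAp subtended at
    the camera, scaled by the estimated eye distance d, approximates the
    real pupil-to-glint offset PG on the eye; with the calibrated corneal
    radius r, trigonometry gives the eye's rotation off the camera axis."""
    PG = d_eye_m * angle_gAp_rad            # chord approximated by arc
    return np.arcsin(np.clip(PG / r_cornea_m, -1.0, 1.0))

# Example: 0.2 degrees subtended at 60 cm, with an 8 mm corneal radius.
print(np.degrees(third_embodiment_gaze_angle(np.radians(0.2), 0.60, 0.008)))
```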
  • A general purpose computer is programmed according to the inventive steps herein.
  • The invention can also be embodied as an article of manufacture (a machine component) that is used by a digital processing apparatus to execute the present logic.
  • This invention is realized in a critical machine component that causes a digital processing apparatus to perform the inventive method steps herein.
  • The invention may be embodied by a computer program that is executed by a processor within a computer as a series of computer-executable instructions. These instructions may reside, for example, in RAM of a computer or on a hard drive or optical drive of the computer, or the instructions may be stored on a DASD array, magnetic tape, electronic read-only memory, or other appropriate data storage device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Molecular Biology (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

A system and method for calibration-free tracking of a user's eye gaze vector and point of regard even if substantial head movement or rotation occurs. The preferred embodiment includes two synchronized interlaced cameras, each viewing the user's eye and having on-axis lighting that is alternately modulated. An image difference between lighted and unlighted images of the eye is used to identify the user's pupil. A plane containing the gaze vector is defined by rotating a base plane through the angle, measured in a camera image plane, between a pupil center, a first glint, and a second glint. The intersection of two such planes (one from each camera) defines the gaze vector. The gaze position is the intersection of the gaze vector with the object being viewed by the user. Alternate embodiments are also described.

Description

    FIELD OF THE INVENTION
  • This invention relates to the determination of a user's eye gaze vector and point of regard by analysis of images taken of a user's eye. The invention relates more specifically to eye gaze tracking without the need to calibrate for specific users' eye geometries and to subsequently recalibrate for user head position. [0001]
  • BACKGROUND OF THE INVENTION
  • Eye gaze tracking technology has proven to be useful in many different fields, including human-computer interfaces for helping disabled people interact with a computer. The eye gaze tracker can be used as a mouse emulator for a personal computer, for example, helping disabled people to move a cursor on a display screen to control their environment and communicate messages. Gaze tracking can also be used for industrial control, aviation, and emergency room situations where both hands are needed for tasks other than operation of a computer but where an available computer is useful. There is also significant research interest in eye gaze tracking for babies and animals to better understand such subjects' behavior and visual processes. Commercial eye gaze tracking systems are made by ISCAN Incorporated (Burlington, Mass.), LC Technologies (Fairfax, Va.), and Applied Science Laboratories (Bedford, Mass.). [0002]
  • There are many different schemes for detecting both the direction in which a user is looking and the point upon which the user's vision is fixated. Any particular eye gaze tracking technology should be relatively inexpensive, reliable, unobtrusive, easily learned and used and generally operator-friendly to be widely accepted. The corneal reflection method of eye gaze tracking is increasing in popularity, and is well-described in the following U.S. patents, which are hereby incorporated by reference: U.S. Pat. Nos. 4,595,990, 4,836,670, 4,950,069, 4,973,149, 5,016,282, 5,231,674, 5,471,542, 5,861,940, 6,204,828. These two articles also describe corneal reflection eye gaze tracking and are also hereby incorporated by reference: “Spatially Dynamic Calibration of an Eye-Tracking System”, K. White, Jr. et al., IEEE Transactions on Systems, Man, and Cybernetics, vol. 23, no. 4, July/August 1993, p. 1162-1168, referred to hereafter as White, and “Effectiveness of Pupil Area Detection Technique”, Y. Ebisawa et al., Proceedings of the 15th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 15, October 1993, p. 1268-1269. [0003]
  • Corneal reflection eye gaze tracking systems project light toward the eye and monitor the angular difference between pupil position and the reflection of the light beam. Near-infrared light is often employed, as users cannot see this light and are therefore not distracted by it. Usually only one eye is monitored, and it isn't critical which eye is monitored. The light reflected from the eye has two major components. The first component is a ‘glint’, which is a very small and very bright virtual image of the light source reflected from the front surface of the corneal bulge of the eye. The glint position remains relatively fixed in an observer's image field as long as the user's head remains stationary and the corneal sphere rotates around a fixed point. The second component is light that has entered the eye and has been reflected back out from the retina. This light serves to illuminate the pupil of the eye from behind, causing the pupil to appear as a bright disk against a darker background. This retroreflection, or “bright eye” effect familiar to flash photographers, provides a very high contrast image. Unlike the glint, the pupil center's position in the image field moves significantly as the eye rotates. An oculometer determines the center of the pupil and the glint, and the change in the distance and direction between the two as the eye is rotated. The orientation of the eyeball can be inferred from the differential motion of the pupil center relative to the glint. The eye is often modeled as a sphere of about 13.3 mm radius having a spherical corneal bulge of about 8 mm radius; the eyes of different users will have variations from these typical values, but individual dimensional values do not generally vary significantly in the short term. [0004]
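The pupil-minus-glint measurement just described can be summarized in a few lines of code. The following Python sketch is illustrative only: the sensitivity constant and the function name are invented for the example, and real oculometers obtain such constants through the calibration procedures discussed later.

```python
import numpy as np

# Hypothetical sensitivity constant (degrees of eye rotation per pixel of
# pupil-glint separation); real systems derive this through calibration.
K_DEG_PER_PX = 0.05

def eye_rotation_estimate(pupil_center_px, glint_px):
    """Rough eye-rotation estimate from the differential motion of the
    pupil center relative to the (nearly stationary) glint."""
    dx, dy = np.subtract(pupil_center_px, glint_px)
    return K_DEG_PER_PX * dx, K_DEG_PER_PX * dy  # (yaw, pitch) in degrees

print(eye_rotation_estimate((322.0, 241.5), (310.0, 238.0)))
```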
  • As shown in prior art FIG. 1, the main components of a corneal reflection eye gaze tracking system include a video camera sensitive to near-infrared light, a near-infrared light source (often a light-emitting diode) typically mounted to shine along the optical axis of the camera, and a computer system for analyzing images captured by the camera. The on-axis light source is positioned at or near the focal center of the camera. Image processing techniques such as intensity thresholding and edge detection identify the glint and the pupil from the image captured by the camera using on-axis light, and locate the pupil center in the camera's field of view as shown in prior art FIG. 2. [0005]
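As a rough illustration of the intensity-thresholding step, the sketch below locates the glint and the bright pupil disk as centroids of pixels above two thresholds. It is a minimal NumPy-only stand-in for the image processing the patent references; the thresholds and the synthetic test frame are invented for the example.

```python
import numpy as np

def bright_blob_center(gray, thresh):
    """Centroid of pixels at or above an intensity threshold; a stand-in
    for the thresholding / edge-detection step described above."""
    ys, xs = np.nonzero(gray >= thresh)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Toy 8-bit frame: dim background, bright pupil disk, brighter glint.
img = np.full((240, 320), 30, dtype=np.uint8)
yy, xx = np.mgrid[:240, :320]
img[(xx - 160) ** 2 + (yy - 120) ** 2 < 20 ** 2] = 150  # retroreflecting pupil
img[118:122, 158:162] = 255                             # corneal glint

glint = bright_blob_center(img, 250)   # only the glint passes this threshold
pupil = bright_blob_center(img, 100)   # pupil disk dominates this centroid
print("glint:", glint, "pupil center:", pupil)
```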
  • Human eyes do not have equal resolution over the entire field of view, nor is the portion of the retina providing the most distinct vision located precisely on the optical axis. The eye directs its gaze with great accuracy because the photoreceptors of the human retina are not uniformly distributed but instead show a pronounced density peak in a small region known as the fovea centralis. In this region, which subtends a visual angle of about one degree, the receptor density increases to about ten times the average density. The nervous system thus attempts to keep the image of the region of current interest centered accurately on the fovea as this gives the highest visual acuity. A distinction is made between the optical axis of the user's eye versus the foveal axis along which the most acute vision is achieved. As shown in prior art FIG. 3, the optical axis is a line going from the center of the spherical corneal bulge through the center of the pupil. The optical axis and foveal axis are offset in each eye by an inward horizontal angle of about five degrees, with a variation of about one and one half degrees in the population. The offsets of the foveal axes with respect to the optical axes of a user's eyes enable better stereoscopic vision of nearby objects. The offsets vary from one individual to the next, but individual offsets do not vary significantly in the short term. For this application, the gaze vector is defined as the optical axis of the eye. The gaze position or point of regard is defined as the intersection point of the gaze vector with the object being viewed (e.g. a point on a display screen some distance from the eye). Adjustments for the foveal axis offsets are typically made after determination of the gaze vector; a default offset angle value may be used unless values from a onetime measurement of a particular user's offset angles are available. [0006]
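Where per-user data are available, the foveal correction amounts to rotating the optical-axis vector by the offset angle. The sketch below assumes a simple camera-style coordinate frame and rotates only about the vertical axis; per-user offsets also have a small vertical component that is ignored here, so this is a simplification rather than the patent's procedure.

```python
import numpy as np

def apply_foveal_offset(optical_axis, inward_deg=5.0, right_eye=True):
    """Rotate the optical-axis gaze vector about the vertical (y) axis by
    the population-average foveal offset (about 5 degrees; individual
    values vary by roughly 1.5 degrees). Assumes x to the user's left,
    y up, and z from the eye toward the screen (an illustrative choice)."""
    a = np.radians(inward_deg if right_eye else -inward_deg)
    c, s = np.cos(a), np.sin(a)
    R_y = np.array([[c, 0.0, s],
                    [0.0, 1.0, 0.0],
                    [-s, 0.0, c]])
    v = np.asarray(optical_axis, dtype=float)
    return R_y @ (v / np.linalg.norm(v))

print(apply_foveal_offset([0.0, 0.0, 1.0]))
```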
  • Unfortunately, calibration is required for all existing eye gaze tracking systems to establish the parameters describing the mapping of camera image coordinates to display screen coordinates. Different calibration and gaze direction calculation methods may be categorized by the actual physical measures they require. Some eye gaze tracking systems use implicit models that map directly from pupil and glint positions in the camera's image plane to the point of regard in screen coordinates. Other systems use physically-based explicit models that take into account eyeball radius, radius of curvature of the cornea, offset angle between the optical axis and the foveal axis, head and eye position in space, and distance between the center of the eyeball and the center of the pupil as measured for a particular user. During calibration, the user may be asked to fix his or her gaze upon certain “known” points in a display. At each coordinate location, a sample of corresponding gaze vectors is computed and used to adapt the system to the specific properties of the user's eye, reducing the error in the estimate of the gaze vector to an acceptable level for subsequent operation. The user may also be asked to click a mouse button after visually fixating on a target, but this approach may add synchronization problems, i.e. the user could look away from the target and then click the mouse button. Also, with this approach the system would get only one mouse click for each target, so there would be no chance to average out involuntary eye movements. Alternately, during calibration, the user may visually track a moving calibration icon on a display that traverses a discrete set of known screen coordinates. Calibration may need to be performed on a per-user or per-tracking-session basis, depending on the precision and repeatability of the tracking system. [0007]
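For contrast with the calibration-free approach of this patent, the sketch below shows the kind of implicit-model calibration such systems perform: a least-squares quadratic map from pupil-minus-glint image vectors to screen coordinates, fitted from gaze samples on known targets. All numbers are synthetic, and real systems average many samples per target.

```python
import numpy as np

def fit_screen_mapping(pg_vectors, screen_points):
    """Least-squares fit of a quadratic map from pupil-minus-glint image
    vectors to screen coordinates (the per-user calibration step that
    calibration-free tracking aims to eliminate)."""
    v = np.asarray(pg_vectors, dtype=float)
    x, y = v[:, 0], v[:, 1]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x * x, y * y])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_points, float), rcond=None)
    return coeffs  # 6x2 matrix: one column of coefficients per screen axis

def map_to_screen(coeffs, pg):
    x, y = pg
    return np.array([1.0, x, y, x * y, x * x, y * y]) @ coeffs

# Synthetic nine-target calibration grid (hypothetical numbers).
pg = [(-10, -8), (0, -8), (10, -8), (-10, 0), (0, 0), (10, 0),
      (-10, 8), (0, 8), (10, 8)]
scr = [(0, 0), (640, 0), (1280, 0), (0, 512), (640, 512), (1280, 512),
       (0, 1024), (640, 1024), (1280, 1024)]
C = fit_screen_mapping(pg, scr)
print(map_to_screen(C, (5.0, 4.0)))
```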
  • Prior art eye gaze tracking systems also require subsequent recalibration to accurately adjust for head motion. U.S. Pat. No. 5,016,282 teaches the use of three reference points on calibration glasses to create a model of the head and determine the position and orientation of the head for the eye gaze tracking system. However, it is not likely that users will generally be willing to wear special glasses merely to enable the system to account for head motion in everyday use. Other commercial eye gaze tracking systems are head mounted, and therefore have no relative head motion difficulties to resolve. However, these systems are mainly designed for military or virtual reality applications wherein the user typically also wears a head mounted display device coupled to the eye gaze tracking device. Head mounted displays are inconvenient and not generally suitable for long periods of computer work in office and home environments. Details of camera calibration and conversion of measured two-dimensional points in the image plane to three-dimensional coordinates in real space are described in “A Flexible New Technique for Camera Calibration”, Z. Zhang, IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(11): 1330-1334, 2000, (also available as Technical Report MSR-TR-98-71 at http://research.microsoft.com/˜zhang/Papers/TR98-71.pdf), which is hereby incorporated by reference. [0008]
  • White offers an improvement in remote eye gaze tracking in the presence of lateral head translations (e.g. parallel to a display screen) of up to 20 cm. White uses a second light source to passively recalibrate the system. The second light source creates a second glint. White claims that a single initial static (no head motion) calibration can be dynamically adjusted as the head moves, leading to improved accuracy under an expanded range of head motions without a significantly increased system cost. Unfortunately, White's system compensates only for lateral head displacements, i.e. not for motion to/from the gaze position, and not for rotation. Rotation of a user's head is particularly troublesome for prior art gaze tracking systems as it changes the distance from the eye to both the object under observation and to the camera generating images of the eye. [0009]
  • While the aforementioned prior art methods are useful advances in the field of eye gaze tracking, systems that do not require calibration would increase user convenience and broaden the acceptance of eye gaze tracking technology. A system for providing eye gaze tracking requiring little or no knowledge of individual users' eye geometries, and requiring no subsequent calibration for head movement is therefore needed. [0010]
  • SUMMARY OF THE INVENTION
  • It is accordingly an object of this invention to devise a system and method for eye gaze tracking wherein calibration for individual users' eye geometries is not required. [0011]
  • It is a related object of the invention to devise a system and method for eye gaze tracking wherein subsequent recalibration for head movement is not required. [0012]
  • It is a related object of the invention to determine a gaze vector and to compute a point of regard as the intersection of the gaze vector and an observed object. [0013]
  • It is a related object of the preferred embodiment of the invention that two cameras each having a co-located and co-oriented light source are used to capture images of a user's eye. It is a related object of the preferred embodiment of the invention to capture images of a user's eye such that the pupil center in each image and glints generated by each light source may be readily identified and located in the image plane of each camera. [0014]
  • It is a related object of the preferred embodiment of the invention to compute a first angle between three points in the image plane of the first camera, specifically the angle between the pupil center, the first glint (generated by the first camera's light source) and the second glint (generated by the second camera's light source). Similarly, it is a related object of the preferred embodiment of the invention to compute a second angle between three points in the image plane of the second camera, specifically the angle between the pupil center, the second glint and the first glint. [0015]
  • It is a related object of the preferred embodiment to define a base plane spanning the first camera's focal center, the second camera's focal center, and the common point in space (on the eye) at which light from one camera's light source reflects to the other camera. It is a related object of the preferred embodiment of the invention to define a first plane by rotating the base plane by the first angle about a line through the focal center of the first camera and the first glint in the first camera's image plane. The intersection of the first plane with the display screen plane defines a first line containing the point of regard. Similarly, it is a related object of the preferred embodiment of the invention to define a second plane by rotating the base plane by the second angle about a line through the focal center of the second camera and the second glint in the second camera's image plane. The intersection of the second plane with the display screen plane defines a second line containing the point of regard. [0016]
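A minimal numeric sketch of this base-plane rotation follows, using Rodrigues' rotation formula. The coordinate frame, the function names, and the sign convention for the rotation angle are assumptions made for illustration; the patent itself states the construction purely geometrically.

```python
import numpy as np

def rotate_about_axis(v, axis, angle):
    """Rodrigues rotation of vector v about a unit axis."""
    k = np.asarray(axis, float)
    k = k / np.linalg.norm(k)
    v = np.asarray(v, float)
    return (v * np.cos(angle) + np.cross(k, v) * np.sin(angle)
            + k * np.dot(k, v) * (1.0 - np.cos(angle)))

def first_plane(A, B, glint_ray, alpha):
    """Base plane through focal centers A and B and the on-axis glint ray
    of camera A, rotated by the image-measured angle alpha about that ray.
    Returns the plane as (point_on_plane, unit_normal); the plane still
    passes through A because the rotation axis does."""
    axis = np.asarray(glint_ray, float)
    base_normal = np.cross(np.subtract(B, A), axis)
    n = rotate_about_axis(base_normal, axis, alpha)
    return np.asarray(A, float), n / np.linalg.norm(n)

# Invented rig geometry: cameras 0.4 m apart, glint ray mostly along +z.
print(first_plane([0.0, 0.0, 0.0], [0.4, 0.0, 0.0],
                  [0.1, 0.0, 0.6], np.radians(25.0)))
```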
  • It is a related object of the preferred embodiment of the invention to compute the gaze vector as a line defined by the intersection between the first plane and the second plane and extending from the user's eye toward an observed object. The point of regard is computed from the intersection of the gaze vector with the observed object, which corresponds to the intersection of the first line and the second line when the observed object is planar. Correction for foveal axis offsets may be added. [0017]
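Once the two planes are known, the gaze vector and point of regard reduce to standard analytic geometry, as in the following sketch. A plane is represented by a point and a unit normal; all names and numbers are illustrative.

```python
import numpy as np

def plane_intersection_line(p1, n1, p2, n2):
    """Intersection of two planes given as (point, normal); returns
    (point_on_line, unit_direction). This line is the gaze vector when
    both planes contain the eye's optical axis."""
    n1, n2 = np.asarray(n1, float), np.asarray(n2, float)
    d = np.cross(n1, n2)
    A = np.vstack([n1, n2, d])
    b = np.array([np.dot(n1, p1), np.dot(n2, p2), 0.0])
    return np.linalg.solve(A, b), d / np.linalg.norm(d)

def point_of_regard(line_point, line_dir, screen_point, screen_normal):
    """Intersect the gaze line with the (planar) display screen."""
    line_point = np.asarray(line_point, float)
    line_dir = np.asarray(line_dir, float)
    t = (np.dot(screen_normal, np.subtract(screen_point, line_point))
         / np.dot(screen_normal, line_dir))
    return line_point + t * line_dir

# Invented plane parameters for illustration.
p, d = plane_intersection_line([0.0, 0.0, 0.0], [0.0, 1.0, 0.2],
                               [0.4, 0.0, 0.0], [0.3, 1.0, 0.0])
print(point_of_regard(p, d, [0.0, 0.0, 0.0], [0.0, 0.0, 1.0]))
```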
  • It is a related object of the second embodiment that each of the two cameras requires only light originally emitted by its own on-axis light source. It is a related object of the second embodiment of the invention to compute a first plane including a first glint position in the first camera's image plane, a pupil center position in the first camera's image plane, and the focal center of the first camera. Similarly, it is a related object of the second embodiment of the invention to compute a second plane including a second glint position in the second camera's image plane, a pupil center in the second camera's image plane, and the focal center of the second camera. The intersection of the first plane with the display screen plane defines a first line containing the point of regard. The intersection of the second plane with the display screen plane defines a second line containing the point of regard. The gaze vector is a line defined by the intersection between the first plane and the second plane and extending from the user's eye toward an observed object. The point of regard is computed from the intersection of the gaze vector with the observed object, which corresponds to the intersection of the first line and the second line when the observed object is planar. [0018]
  • It is a related object of the third embodiment of the invention to use a single camera having a co-located and co-oriented light source to capture images of a user's eye including glints and a pupil center. It is a related object of the third embodiment of the invention to determine the distance in the camera's image plane between the pupil center and the glint. Using an estimated distance between the user's eye and an observed object, and a one-time measurement of the user's corneal curvature, the gaze vector and point of regard are determined. [0019]
  • The foregoing objects are believed to be satisfied by the embodiments of the present invention as described below. [0020]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a prior art diagram of an eye gaze tracking system. [0021]
  • FIG. 2 is a prior art diagram of a user's eye as viewed by a camera. [0022]
  • FIG. 3 is a prior art diagram of the foveal and optical axes and their offset angle. [0023]
  • FIG. 4 is a diagram of the system of the preferred embodiment of the present invention. [0024]
  • FIG. 5 is a diagram of the user's eye according to the preferred embodiment of the present invention. [0025]
  • FIG. 6 is a diagram of the user's eye including a first plane Agp containing the gaze vector according to the preferred embodiment of the present invention. [0026]
  • FIG. 7 is a view of the user's eye as seen by the first camera according to the preferred embodiment of the present invention. [0027]
  • FIG. 8 is a diagram of the user's eye according to the preferred embodiment of the present invention. [0028]
  • FIG. 9 is a diagram of the user's eye including a second plane Bip containing the gaze vector according to the preferred embodiment of the present invention. [0029]
  • FIG. 10 is a view of the user's eye as seen by the second camera according to the preferred embodiment of the present invention. [0030]
  • FIG. 11 is a diagram of the user's eye including a gaze vector defined by the intersection of the first plane and the second plane, and a point of regard, according to the preferred embodiment of the present invention. [0031]
  • FIG. 12 is a flowchart of the eye gaze tracking method according to the preferred embodiment of the present invention. [0032]
  • FIG. 13 is a diagram of a second embodiment of the present invention. [0033]
  • FIG. 14 is a diagram of a third embodiment of the present invention. [0034]
  • DETAILED DESCRIPTION
  • Referring now to FIG. 4, a diagram of the system of the preferred embodiment of the present invention is shown. The system preferably includes a computer 400, a first camera 402, a second camera 404, a first light source 406, a second light source 408, a video decoder 410, a first frame grabber 412, and a second frame grabber 414. First camera 402 and second camera 404 are each video cameras, spaced apart, generating respective video signals representing repeating interlaced scans of a respective image field. In a conventional interlaced video camera, odd-numbered raster rows are typically scanned from left to right and then top to bottom, and then even-numbered raster rows are scanned in the same manner during each repetition. Vertical and horizontal synchronization signals from first camera 402 are fed into video decoder 410, which passes the synchronization signals to second camera 404, which responsively scans its image field in time with the scans of first camera 402. Alternately, each of the cameras could be driven by synchronization signals originating from computer 400, video decoder 410, or another signal source. Both cameras are aimed at and focused upon one of the user's eyes, and each is equipped with tracking mechanisms (not shown), well known to those of ordinary skill in the art, that actively keep the cameras aimed at the user's eye. These tracking mechanisms sometimes operate by rapidly adjusting the orientation of each camera to keep the brightest portion of the image centered in its respective field of view. Note that in the preferred embodiment no fixed rotational reference for either camera is required, i.e. either camera could be rolled about its optical axis without causing difficulties. [0035]
  • First light source 406 and second light source 408 are preferably light-emitting diodes (LEDs) that produce light of near-infrared wavelengths when energized. First light source 406 is positioned to emit light substantially along the optical axis of first camera 402 in the direction of its field of view. Second light source 408 is similarly positioned to emit light substantially along the optical axis of second camera 404 in the direction of its field of view. The brightness of each light source, when energized, is adjusted to keep the image brightness in the eye area of each camera's field of view substantially the same. The duty cycle of each light source can be adjusted downward to enable production of pulses of brighter light intensity. [0036]
  • One method of acquiring a clearly defined and easy to process pupil image is to generate a difference image by effectively subtracting an unlit image of the eye from a lit image of the eye. In the preferred embodiment, video decoder 410 generates an even field control signal 416 whenever even-numbered raster rows are being scanned by the cameras, and generates an odd field control signal 418 whenever odd-numbered raster rows are being scanned by the cameras. Even field control signal 416 triggers the illumination of first light source 406, and odd field control signal 418 triggers the illumination of second light source 408. The two light sources are thus alternately energized during each alternately interlaced camera scan. The result is that each camera produces images composed of two fields, each illuminated by a different light source, one on-axis and the other off-axis. Images from the cameras are captured by first frame grabber 412 and second frame grabber 414, digitized, and then forwarded to computer 400 for subsequent processing. Subtracting the rows exposed by off-axis light from the corresponding rows exposed by on-axis light in images from first camera 402 produces a difference image that very clearly identifies the pupil as seen by first camera 402. A similar subtraction performed on images from second camera 404 produces a difference image that very clearly identifies the pupil as seen by second camera 404, as described in U.S. Pat. No. 5,016,282. Alternate lighting is not an essential aspect of the invention but works particularly well. [0037]
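A minimal sketch of this field-subtraction step is shown below, assuming the even rows of an interlaced frame were exposed under the on-axis source (bright pupil) and the odd rows under the off-axis source (dark pupil). The toy frame and the simple clipping are illustrative simplifications of the technique in U.S. Pat. No. 5,016,282.

```python
import numpy as np

def pupil_difference_image(frame):
    """Subtract off-axis-lit rows from neighbouring on-axis-lit rows of an
    interlaced frame; the retroreflecting pupil dominates the result.
    Assumes even rows were exposed under the on-axis light source."""
    on_axis = frame[0::2, :].astype(np.int16)
    off_axis = frame[1::2, :].astype(np.int16)
    rows = min(len(on_axis), len(off_axis))
    diff = on_axis[:rows] - off_axis[:rows]
    return np.clip(diff, 0, 255).astype(np.uint8)

# Toy frame: pupil disk bright only on even (on-axis) rows.
frame = np.full((480, 640), 40, dtype=np.uint8)
yy, xx = np.mgrid[:480, :640]
pupil = (xx - 320) ** 2 + (yy - 240) ** 2 < 30 ** 2
frame[pupil & (yy % 2 == 0)] = 200
print(pupil_difference_image(frame).max())
```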
  • The relative positions and orientations of first camera 402, second camera 404, and the object being viewed by the user (e.g. a display screen) are known from a onetime user-independent calibration of the system of the present invention performed when the system components are first deployed. Attachment of the cameras to the display screen at known points would simplify the initial calibration, but cameras need not be positioned on the display screen or in the plane of the display screen. Similarly, the optical parameters of both cameras (e.g. focal length) and the size of the display screen are assumed to be known, and the user's cornea is assumed to be rotationally symmetric about the optical axis. [0038]
  • Referring now to FIG. 5, a diagram of a user's eye is shown in accordance with the preferred embodiment of the present invention. Point A is the position of first focal center 500 of first camera 402 and the position of first light source 406. A pinhole camera model is used with a perspective projection to the image plane. Light from first light source 406 reflects from the user's cornea at point G back to first camera 402, producing a first glint 508 in the image from first camera 402. Point B is the position of second focal center 502 of second camera 404 and the position of second light source 408. Light emitted from an off-axis light source (e.g. second light source 408) reflects from the user's cornea at point H and is visible to first camera 402 as second glint 510. Identification of which glint is due to which light source is simplified by use of alternate lighting during image capture as described above. Point C is the center of curvature 504 of the corneal bulge (the corneal bulge is usually modeled as spherical, although in reality it is not a complete sphere within the eyeball). Point P is pupil center 506. Points G and H lie on plane ABC. Point P′ is the point of regard 514 on display screen 512, i.e. the intersection point between line CP (which is the optical axis and gaze vector 516) and the display screen 512 plane. Image plane 518 is a plane orthogonal to the optical axis of first camera 402 (for clarity, image plane 518 is shown in front of first focal center 500, but in reality image plane 518 will be behind first focal center 500 and points on image plane 518 will be projections). Point g 520 is the image of (on-axis) first glint 508 in image plane 518. Point h 522 is the image of (off-axis) second glint 510 in image plane 518. Point p 524 is the image of pupil center 506 in image plane 518. [0039]
[0040] Referring now to FIG. 6, a diagram of the user's eye is shown including a first plane Agp 600 according to the preferred embodiment of the present invention. Plane Agp 600 includes the (on-axis) first light source 406 and the focal center of first camera 402 (point A), the image of first glint 508 in image plane 518 (point g), and the image of pupil center 506 in image plane 518 (point p). Points C, G, g, and A are collinear. Points C, P, and P′ are collinear. Points A, p, and P are collinear. The plane Agp spanning lines CGA and CPP′ therefore includes line PG and line AP′. Plane Agp 600 can be considered to be plane ABC (which also includes points H and h) rotated around line CGA by a measurable angle α. Line L 602 is the intersection between plane Agp and the screen plane; hence the gaze vector intersects the display screen plane at point P′ on line L. Determination of line L alone may be of particular utility, depending on the application that uses gaze information. For example, the intersection of line L with a scroll bar can determine the position of the scroll bar slider, assuming that the user is looking at the scroll bar at a specific time. Determination of partial gaze information, e.g. line L, is an object of this invention.
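Because plane Agp 600 passes through three points whose world coordinates are fixed once the one-time calibration establishes each camera's pose (focal center A and the image-plane points g and p), line L can be computed directly. A minimal NumPy sketch, assuming the screen plane is given as a point plus a unit normal and that g and p have already been expressed in world coordinates (the function name is illustrative):

```python
import numpy as np

def line_L(A, g, p, screen_point, screen_normal):
    """Intersection of plane Agp with the display screen plane.

    Returns (point, direction) for line L; the point of regard P'
    lies somewhere on this line.
    """
    n = np.cross(g - A, p - A)          # normal of plane Agp
    d = np.cross(n, screen_normal)      # direction of line L (lies in both planes)
    d = d / np.linalg.norm(d)
    # Solve for one point X on both planes: n.X = n.A and
    # screen_normal.X = screen_normal.screen_point, with d.X = 0
    # pinning the remaining degree of freedom along the line.
    M = np.vstack([n, screen_normal, d])
    rhs = np.array([n @ A, screen_normal @ screen_point, 0.0])
    return np.linalg.solve(M, rhs), d
```

For the scroll-bar example, intersecting line L with the known axis of the scroll bar yields the slider position directly, without requiring the second camera.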
[0041] Referring now to FIG. 7, a view of the user's eye as seen by first camera 402 is shown according to the preferred embodiment of the present invention. The identities and locations, in the image plane of first camera 402, of projected first glint 508 (at point g) and projected second glint 510 (at point h) are determined by analyzing the images taken by first camera 402 while first light source 406 and second light source 408 were energized, preferably in an alternating manner as described above. In other words, the image of first glint 508 is due to first light source 406 and the image of second glint 510 is due to second light source 408, so if the light sources are alternately energized, only one glint will appear in each interlaced scan made by first camera 402. Projected pupil center 506 (at point p) is also identified and located, preferably from the difference image generated by subtracting even and odd interlaced scans and subsequent processing via conventional image analysis techniques. Angle α separating planes ABC and Agp 600 is therefore simply the angle pgh between line gh and line gp in this Figure, which is a view along the axis of plane rotation.
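Measured this way, α is a purely two-dimensional computation on image coordinates. A minimal sketch follows; the same routine serves for angle β of FIG. 10 below, with the camera's own glint image as the vertex:

```python
import numpy as np

def plane_rotation_angle(own_glint, other_glint, pupil_center):
    """Angle at the camera's own glint image between the line to the other
    glint image and the line to the pupil-center image.

    For camera 1 this is angle pgh (alpha); for camera 2 it is angle hip
    (beta). All arguments are 2D image-plane coordinates.
    """
    v1 = other_glint - own_glint
    v2 = pupil_center - own_glint
    cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.arccos(np.clip(cos_a, -1.0, 1.0)))
```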
[0042] Alternatively, line gp can be determined without estimating an exact point for the location of pupil center 506 in image plane 518. Line gp can be a line that extends from the glint image through the pupil image so as to maximize the symmetry of the pupil image: if the portion of the pupil image on one side of line gp were folded over line gp onto the other portion, the overall differential pupil area would be minimized. Alternatively, line gp can be chosen to pass through the center of mass of the pupil image, i.e. the line on which a homogeneous slab of material shaped like the pupil image and of uniform thickness would balance if suspended. The pupil image will be neither circular nor elliptical if there are distortions in the corneal lens. However, it can be shown that when modeling the eye as a corneal lens attached to a spherical ball, the line of sight must lie on the plane passing through the glint and the symmetry line of the pupil as imaged via perspective projection onto a camera's image plane. Under this model, the line of sight may not pass through the measured pupil center because of the distortion the corneal lens induces on the pupil image.
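The center-of-mass alternative is straightforward to compute from a binary pupil mask such as the one produced by the field-differencing sketch above; the following is again an illustrative sketch, not a prescribed implementation:

```python
import numpy as np

def line_gp_through_centroid(pupil_mask, glint_xy):
    """Direct line gp from the glint image through the pupil blob's
    center of mass, avoiding an explicit pupil-center fit.

    pupil_mask is a boolean image; glint_xy is the glint's (x, y)
    position in the same coordinates. Returns a point and unit direction.
    """
    ys, xs = np.nonzero(pupil_mask)
    centroid = np.array([xs.mean(), ys.mean()])   # balance point of the blob
    start = np.asarray(glint_xy, dtype=float)
    direction = centroid - start
    return start, direction / np.linalg.norm(direction)
```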
[0043] Referring now to FIG. 8, a diagram of the user's eye is shown in accordance with the preferred embodiment of the present invention. This Figure is similar to FIG. 5, but describes the view of the user's eye as seen by second camera 404. Light from second light source 408 reflects from the user's cornea at point I back to second camera 404, producing second glint 510 in the image plane 526 of second camera 404. Light emitted from first light source 406 reflects from the user's cornea at point H and is visible to second camera 404 as first glint 508. Points H and I lie on plane ABC. Second image plane 526 is a plane orthogonal to the optical axis of second camera 404. Point i 528 is the image of second glint 510 in image plane 526. Point h 522 is the image of first glint 508 in image plane 526. Point p 524 is the image of pupil center 506 in image plane 526.
[0044] Referring now to FIG. 9, a diagram of the user's eye is shown including a second plane Bip 900 according to the preferred embodiment of the present invention. Plane Bip 900 includes second light source 408 and the focal center of second camera 404 (point B), the image of second glint 510 in image plane 526 (point i), and the image of pupil center 506 in image plane 526 (point p). Points C, I, and B are collinear. Points C, P, and P′ are collinear. The plane spanning lines CIB and CPP′ therefore includes line PI and line BP′. Plane Bip 900 can be considered to be plane ABC (which is also plane ABH) rotated around line CIB by a measurable angle β.
[0045] Referring now to FIG. 10, a view of the user's eye as seen by second camera 404 is shown according to the preferred embodiment of the present invention. The identities and locations, in the image plane 526 of second camera 404, of first glint 508 (at point h) and second glint 510 (at point i) are determined by analyzing the images taken by second camera 404 while first light source 406 and second light source 408 were energized, preferably in an alternating manner as described above. In other words, first glint 508 is due to first light source 406 and second glint 510 is due to second light source 408, so if the light sources are alternately energized, only one glint will appear in each interlaced scan made by second camera 404. Pupil center 506 (at point p) is also identified and located in image plane 526, preferably from the difference image generated by subtracting interlaced scan rows and subsequent processing as described above. Angle β separating planes ABC and Bip 900 is therefore simply the angle hip between line ih and line ip in this Figure, which is a view along the axis of plane rotation.
[0046] Referring now to FIG. 11, a diagram of a user's eye including first plane Agp 600 and second plane Bip 900 is shown according to the preferred embodiment of the present invention. Line CPP′ is the intersection of first plane Agp 600 and second plane Bip 900. Note that point C, the center of corneal curvature 504, need not be explicitly computed to determine either gaze vector 516 or point of regard P′ 514; point C can be determined indirectly if needed. The intersection of line CP (gaze vector 516) with the pre-defined plane of display screen 512 (or another observed object, whether planar or not) is point of regard P′ 514. Point P′ 514 can be computed because the positions of first camera 402 and second camera 404 relative to the display screen 512 plane and to each other are known, and the locations of first glint 508, second glint 510, and pupil center 506 in image planes 518 and 526 are known.
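With each plane expressed through three points known in the calibrated world frame, the gaze vector and point of regard reduce to a few cross products and one linear solve. A minimal sketch, assuming the glint and pupil-center image points of both cameras have been transformed into that common frame (the function names are illustrative):

```python
import numpy as np

def unit(v):
    return v / np.linalg.norm(v)

def gaze_and_point_of_regard(A, g, p1, B, i, p2, screen_point, screen_normal):
    """Gaze line CPP' as the intersection of planes Agp and Bip, then
    its intersection with the screen plane (point of regard P')."""
    n1 = unit(np.cross(g - A, p1 - A))   # normal of plane Agp
    n2 = unit(np.cross(i - B, p2 - B))   # normal of plane Bip
    d = unit(np.cross(n1, n2))           # direction of line CPP'
    # One point X on both planes: n1.X = n1.A and n2.X = n2.B, with
    # d.X = 0 fixing the free parameter along the line.
    M = np.vstack([n1, n2, d])
    X = np.linalg.solve(M, np.array([n1 @ A, n2 @ B, 0.0]))
    # Intersect the line X + t*d with the screen plane to obtain P'.
    t = (screen_normal @ (screen_point - X)) / (screen_normal @ d)
    return X, d, X + t * d               # point on gaze line, direction, P'
```

The sign of d is ambiguous from the cross product alone; in practice it is chosen so that the line runs from the eye toward the observed object.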
[0047] The above analysis assumes that the eye is a sphere, which is a good first approximation. However, more detailed analysis shows that it is enough to assume that the eye has rotational symmetry around the axis connecting the pupil center and the eyeball center. This is a good approximation except in cases of large astigmatism (which breaks the rotational symmetry); the invention therefore tracks eye gaze properly for near-sighted and far-sighted users. While the invention has been described in a preferred embodiment employing two cameras, embodiments using more than two cameras are also included within the scope of the invention. Similarly, embodiments in which both of the user's eyes are tracked, each by at least one camera, are included within the scope of the invention.
[0048] Referring now to FIG. 12, a flowchart of the eye gaze tracking method is shown according to the preferred embodiment of the present invention. In step 1200, first camera 402 generates an image of the user's eye. In step 1202, second camera 404 generates an image of the user's eye. Each image may include interlaced scans and is passed to computer 400 as described above. In step 1204, for each image, computer 400 identifies and locates pupil center 506, first glint 508, and second glint 510 in the image planes. In step 1206, computer 400 computes the plane rotation angles α and β. In step 1208, computer 400 identifies gaze vector 516 as the intersection line of first plane 600 and second plane 900. In step 1210, computer 400 identifies point of regard 514 from gaze vector 516 and data describing the spatial arrangement of first camera 402, second camera 404, and the display screen 512 plane (or another observed object, whether planar or not). In step 1212, computer 400 generates outputs describing gaze vector 516 and point of regard 514, and begins another cycle of the method.
[0049] Referring now to FIG. 13, a diagram of a user's eye according to a second embodiment of the present invention is shown. The second embodiment is identical to the preferred embodiment, except that each of the two intersecting planes is computed from different data points. In this embodiment, it is not necessary for either camera to view reflected light originally emitted by a light source other than its own, although this additional data can be used. However, unlike the preferred embodiment, this second embodiment requires the roll angle of each camera to be known, i.e. some "up vector" or absolute orientation reference is needed. For each camera, the focal center Fx 1300 of the camera, the position of pupil center Px 1302 as projected onto the image plane 1304 of the camera, and the position of the glint Gx 1306 produced by that camera's own light source as projected onto the image plane 1304 define a plane FxPxGx. The intersection of the first plane with display screen plane 512 defines a first line containing point of regard 514. The intersection of the second plane with display screen plane 512 defines a second line containing point of regard 514. The gaze vector 516 is the line defined by the intersection of the first plane and the second plane, extending from the user's eye toward an observed object. The point of regard 514 is computed from the intersection of gaze vector 516 with the observed object, which corresponds to the intersection of the first line and the second line when the observed object is planar. While the invention has been described in a second embodiment employing two cameras, embodiments using more than two cameras are also included within the scope of the invention. Similarly, an embodiment employing two cameras, each of which tracks a different one of the user's eyes, is also included within the scope of the invention.
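Each of the two screen lines can be computed exactly as in the line L sketch above, using Fx, the projected pupil center, and the camera's own projected glint as the three plane points. Measurement noise will keep the two lines from meeting exactly, so one reasonable choice (an assumption of this sketch, not mandated by the embodiment) is to take the point of regard as their least-squares intersection:

```python
import numpy as np

def point_of_regard_from_screen_lines(p1, d1, p2, d2):
    """Least-squares intersection of two nearly coplanar screen lines.

    Each line is given as a point and a unit direction; solves
    p1 + t1*d1 = p2 + t2*d2 for (t1, t2) in the least-squares sense.
    """
    A = np.stack([d1, -d2], axis=1)              # 3x2 coefficient matrix
    t, *_ = np.linalg.lstsq(A, p2 - p1, rcond=None)
    return p1 + t[0] * d1                        # point of regard P'
```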
[0050] Referring now to FIG. 14, a diagram of a third embodiment of the present invention is shown. This embodiment requires a one-time calibration of the radius of curvature of the user's cornea, and an estimate of the distance of the eye from the plane of display screen 512 or from camera 402. The third embodiment's system components are identical to those of the second embodiment, except that the third embodiment omits second camera 404, second light source 408, and second frame grabber 414. Projections of first glint 508 (at point g) and pupil center 506 (at point p) are identified and located in image plane 518, and the distance between points g and p is measured. If the user is looking directly at camera 402, there will be no distance between points p and g, i.e. they will coincide. Angle gAp and the distance d of the eye from camera 402 are used to compute distance PG, the actual distance between pupil center 506 and glint 508 on the eye. Because the radius of corneal curvature r is known, the angle ACP′ can be computed from distance PG via elementary trigonometry. Point of regard 514 and gaze vector 516 are then computed from the position of camera 402. Camera 402 may alternately scan each of the user's eyes to allow two such computations as described above, reducing the reliance on the distance estimate d.
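One way to make the elementary trigonometry concrete, under the additional simplifying assumption that pupil center P lies approximately on the corneal sphere of radius r (so that PG is a chord of that sphere), is sketched below; the function name and this chord approximation are illustrative:

```python
import numpy as np

def single_camera_gaze_angle(angle_gAp, d, r):
    """Third-embodiment estimate of angle ACP' from one camera.

    angle_gAp : angle (radians) at focal center A between the rays
                through glint image g and pupil image p
    d         : estimated eye-to-camera distance
    r         : one-time-calibrated radius of corneal curvature
    """
    PG = d * np.tan(angle_gAp)                   # pupil-glint separation on the eye
    # A chord of length PG on a sphere of radius r subtends an angle of
    # 2*arcsin(PG / 2r) at the sphere's center C.
    return 2.0 * np.arcsin(np.clip(PG / (2.0 * r), -1.0, 1.0))
```

When the user looks directly at camera 402, angle gAp is zero and the estimate correctly degenerates to angle ACP′ = 0, consistent with points p and g coinciding.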
[0051] A general purpose computer is programmed according to the inventive steps herein. The invention can also be embodied as an article of manufacture (a machine component) that is used by a digital processing apparatus to execute the present logic. This invention is realized in a critical machine component that causes a digital processing apparatus to perform the inventive method steps herein. The invention may be embodied by a computer program that is executed by a processor within a computer as a series of computer-executable instructions. These instructions may reside, for example, in the RAM of a computer or on a hard drive or optical drive of the computer, or the instructions may be stored on a DASD array, magnetic tape, electronic read-only memory, or another appropriate data storage device.
[0052] While the invention has been described with respect to illustrative embodiments thereof, it will be understood that various changes may be made in the apparatus and means herein described without departing from the scope and teaching of the invention. Accordingly, the described embodiments are to be considered merely exemplary, and the invention is not to be limited except as specified in the attached claims.

Claims (30)

We claim:
1. A method for eye gaze tracking, comprising the steps of:
focusing at least one camera upon at least one of a user's eyes, each said camera having a focal center, an image plane, and a co-located light source emitting light toward said eye;
identifying and locating image aspects including at least one glint and a pupil image in said image plane; and
computing a gaze vector from at least one plane generated from said image aspects and camera position and orientation data.
2. The method of claim 1 wherein said user is an animal.
3. The method of claim 1 wherein said user is a person.
4. The method of claim 1 wherein said user is a baby.
5. The method of claim 1 comprising the further step of locating a point of regard as the intersection of said gaze vector with a predetermined surface.
6. The method of claim 1 comprising the further steps of:
synchronizing scanning signals controlling said cameras; and
responsively alternately energizing said light sources to identify correspondences between said light sources and said glints.
7. The method of claim 1 comprising the further step of correcting said gaze vector for a foveal axis offset angle.
8. The method of claim 1 comprising the further steps of:
determining an angle between said glint in said image plane, said focal center, and a center of said pupil image in said image plane;
finding a separation on said eye between said glint and said pupil center using said angle and a distance estimate between said eye and a point of regard;
defining a second angle between said focal center, a corneal curvature center, and said pupil center using a radius of corneal curvature to define said gaze vector; and
locating said point of regard at the intersection of said gaze vector with a predetermined surface.
9. The method of claim 1 comprising the further steps of:
defining for each of a plurality of said cameras a particular plane spanning said glint in said image plane, said focal center, and a center of said pupil image in said image plane; and
identifying an intersection line of said particular planes as said gaze vector.
10. The method of claim 1 comprising the further steps of:
for each one of a plurality of said cameras, defining in said image plane an angle spanning a center of said pupil image, a first glint, and a second glint, wherein said first glint results from said light source on each said one camera and said second glint results from another light source;
defining a base plane spanning said focal center for each said one camera and said focal center for each said other camera and a point on said eye corresponding to said second glint;
for each one of said cameras, defining a particular plane by rotating said base plane through each said corresponding angle around an axis including said focal center for each said one camera and said first glint; and
identifying a line at an intersection of said planes as said gaze vector.
11. The method of claim 10 wherein said center of said pupil image lies on a line maximizing symmetry of said pupil image.
12. A method for eye gaze tracking, comprising the steps of:
focusing at least one camera upon at least one of a user's eyes, each said camera having a focal center, an image plane, and a co-located light source emitting light toward said eye;
identifying and locating image aspects including at least one glint and a pupil image in said image plane; and
computing a line containing a point of regard on a display screen from said image aspects and camera position and orientation data.
13. The method of claim 12 wherein a position of said line on said display screen controls a graphical user interface element.
14. The method of claim 13 wherein said graphical user interface element is a scroll bar slider.
15. A system for eye gaze tracking, comprising:
at least one camera focusing upon at least one of a user's eyes, each said camera having a focal center, an image plane, and a co-located light source emitting light toward said eye;
a computer to identify and locate image aspects including at least one glint and a pupil image in said image plane, and to compute a gaze vector from at least one plane generated from said image aspects and camera position and orientation data.
16. The system of claim 15 wherein said user is an animal.
17. The system of claim 15 wherein said user is a person.
18. The system of claim 15 wherein said user is a baby.
19. The system of claim 15 wherein said computer locates a point of regard as the intersection of said gaze vector with a predetermined surface.
20. The system of claim 15 further comprising:
a source for synchronous scan signals controlling said cameras and alternately energizing said light sources to identify correspondences between said light sources and said glints.
21. The system of claim 15 wherein said computer corrects said gaze vector for a foveal axis offset angle.
22. The system of claim 15 wherein said computer:
determines an angle between said glint in said image plane, said focal center, and a center of said pupil image in said image plane;
finds a separation on said eye between said glint and said pupil center using said angle and a distance estimate between said eye and a point of regard;
defines a second angle between said focal center, a corneal curvature center, and said pupil center using a radius of corneal curvature to define said gaze vector; and
locates said point of regard at the intersection of said gaze vector with a predetermined surface.
23. The system of claim 15 wherein said computer:
defines for each of a plurality of said cameras a particular plane spanning said glint in said image plane, said focal center, and a center of said pupil image in said image plane; and
identifies an intersection line of said planes as said gaze vector.
24. The system of claim 15 wherein said computer:
for each one of a plurality of said cameras, defines in said image plane an angle spanning a center of said pupil image, a first glint, and a second glint, wherein said first glint results from said light source on each said one camera and said second glint results from another light source;
defines a base plane spanning said focal center for each said one camera and said focal center for each said other camera and a point on said eye corresponding to said second glint;
for each one of said cameras, defines a particular plane by rotating said base plane through each said corresponding angle around an axis including said focal center for each said one camera and said first glint; and
identifies a line at an intersection of said planes as said gaze vector.
25. The system of claim 24 wherein said computer chooses said center of said pupil image to lie on a line maximizing symmetry of said pupil image.
26. A system for eye gaze tracking comprising:
at least one camera focused upon at least one of a user's eyes, each said camera having a focal center, an image plane, and a co-located light source emitting light toward said eye; and
a computer to identify and locate image aspects including at least one glint and a pupil image in said image plane, and to compute a line containing a point of regard on a display screen from said image aspects and camera position and orientation data.
27. The system of claim 26 wherein a position of said line on said display screen controls a graphical user interface element.
28. The system of claim 27 wherein said graphical user interface element is a scroll bar slider.
29. A system for eye gaze tracking comprising:
means for focusing at least one camera upon a user's eye, each said camera having a co-located light source emitting light toward said eye;
means for identifying and locating, in an image plane, image aspects including at least one glint and a pupil image;
means for computing from said image aspects a gaze vector; and
means for determining from said image aspects a point of regard.
30. A computer program product including a program storage device readable by a machine, tangibly embodying a program of instructions executable by the machine to perform method steps for eye gaze tracking, said method steps comprising:
focusing at least one camera upon a user's eye, each said camera having a co-located light source emitting light toward said eye;
identifying and locating, in an image plane, image aspects including at least one glint and a pupil image;
computing from said image aspects a gaze vector; and
determining from said image aspects a point of regard.

Family Cites Families (16)

* Cited by examiner, † Cited by third party

US4595990A (priority 1980-12-31, published 1986-06-17), International Business Machines Corporation: Eye controlled information transfer
US4568159A (priority 1982-11-26, published 1986-02-04), The United States of America as represented by the Secretary of the Navy: CCD head and eye position indicator
US4597648A (priority 1983-04-01, published 1986-07-01), Keratometer Research and Development: Keratometer
US4973149A (priority 1987-08-19, published 1990-11-27), Center for Innovative Technology: Eye movement detector
US4836670A (priority 1987-08-19, published 1989-06-06), Center for Innovative Technology: Eye movement detector
US5016282A (priority 1988-07-14, published 1991-05-14), ATR Communication Systems Research Laboratories: Eye tracking image pickup apparatus for separating noise from feature portions
US4950069A (priority 1988-11-04, published 1990-08-21), University of Virginia: Eye movement detector with improved calibration and speed
US5231674A * (priority 1989-06-09, published 1993-07-27), LC Technologies, Inc.: Eye tracking method and apparatus
JPH0761314B2 * (priority 1991-10-07, published 1995-07-05), Konami Co., Ltd.: Retinal reflected light amount measuring device and eye gaze detecting device using the device
US5471542A * (priority 1993-09-27, published 1995-11-28), Ragland, Richard R.: Point-of-gaze tracker
GB2315858A (priority 1996-08-01, published 1998-02-11), Sharp KK: System for eye detection and gaze direction determination
US6351273B1 * (priority 1997-04-30, published 2002-02-26), Lemelson, Jerome H.: System and methods for controlling automatic scrolling of information on a display or screen
DE19736995B4 (priority 1997-08-26, published 2009-05-07), Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V.: Device for determining a fixation point
US6152563A * (priority 1998-02-20, published 2000-11-28), Hutchinson, Thomas E.: Eye gaze direction tracker
US6204828B1 (priority 1998-03-31, published 2001-03-20), International Business Machines Corporation: Integrated gaze/manual cursor positioning system
DE19953835C1 (priority 1999-10-30, published 2001-05-23), Heinrich-Hertz-Institut: Computer-aided method for contactless, video-based gaze direction determination of a user's eye for eye-guided human-computer interaction, and device for carrying out the method

Cited By (136)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7306337B2 (en) * 2003-03-06 2007-12-11 Rensselaer Polytechnic Institute Calibration-free gaze tracking under natural head movement
US20040174496A1 (en) * 2003-03-06 2004-09-09 Qiang Ji Calibration-free gaze tracking under natural head movement
US8322856B2 (en) 2003-03-21 2012-12-04 Queen's University At Kingston Method and apparatus for communication between humans and devices
US20170371407A1 (en) * 2003-03-21 2017-12-28 Queen's University At Kingston Method and Apparatus for Communication Between Humans and Devices
US7762665B2 (en) 2003-03-21 2010-07-27 Queen's University At Kingston Method and apparatus for communication between humans and devices
US20130188032A1 (en) * 2003-03-21 2013-07-25 Roel Vertegaal Method and Apparatus for Communication Between Humans and Devices
US8672482B2 (en) 2003-03-21 2014-03-18 Queen's University At Kingston Method and apparatus for communication between humans and devices
US20040183749A1 (en) * 2003-03-21 2004-09-23 Roel Vertegaal Method and apparatus for communication between humans and devices
US10296084B2 (en) * 2003-03-21 2019-05-21 Queen's University At Kingston Method and apparatus for communication between humans and devices
US20060093998A1 (en) * 2003-03-21 2006-05-04 Roel Vertegaal Method and apparatus for communication between humans and devices
US8292433B2 (en) * 2003-03-21 2012-10-23 Queen's University At Kingston Method and apparatus for communication between humans and devices
US7401920B1 (en) * 2003-05-20 2008-07-22 Elbit Systems Ltd. Head mounted eye tracking and display system
WO2006032253A1 (en) * 2004-09-22 2006-03-30 Eldith Gmbh Device and method for the contactless determination of the direction of viewing
US8317327B2 (en) * 2005-03-16 2012-11-27 Lc Technologies, Inc. System and method for eyeball surface topography as a biometric discriminator
US20060210122A1 (en) * 2005-03-16 2006-09-21 Dixon Cleveland System and method for eyeball surface topography as a biometric discriminator
US10614294B1 (en) * 2006-06-16 2020-04-07 Videomining Corporation Method and system for measuring viewership of people for displayed object
US8406479B2 (en) 2006-07-14 2013-03-26 Panasonic Corporation Visual axis direction detection device and visual line direction detection method
EP2042079A4 (en) * 2006-07-14 2010-01-20 Panasonic Corp Visual axis direction detection device and visual line direction detection method
US20090304232A1 (en) * 2006-07-14 2009-12-10 Panasonic Corporation Visual axis direction detection device and visual line direction detection method
EP2042079A1 (en) * 2006-07-14 2009-04-01 Panasonic Corporation Visual axis direction detection device and visual line direction detection method
US8094965B2 (en) * 2006-12-19 2012-01-10 California Institute Of Technology Image processor
US8094169B2 (en) 2006-12-19 2012-01-10 California Institute Of Technology Imaging model and apparatus
US20080158226A1 (en) * 2006-12-19 2008-07-03 California Institute Of Technology Imaging model and apparatus
US20080143857A1 (en) * 2006-12-19 2008-06-19 California Institute Of Technology Image processor
DE102007001738B4 (en) * 2007-01-11 2016-04-14 Audi Ag Method and computer program product for eye tracking
DE102007001738A1 (en) * 2007-01-11 2008-07-17 Audi Ag Method for view detection of test person, involves identifying test person when test person directs view towards monitoring area, where spatially limited monitoring area is specified, and camera is arranged in monitoring area
US20110228975A1 (en) * 2007-05-23 2011-09-22 The University Of British Columbia Methods and apparatus for estimating point-of-gaze in three dimensions
EP2150170A4 (en) * 2007-05-23 2011-01-05 Univ British Columbia Methods and apparatus for estimating point-of-gaze in three dimensions
US8457352B2 (en) 2007-05-23 2013-06-04 The University Of British Columbia Methods and apparatus for estimating point-of-gaze in three dimensions
EP2150170A1 (en) * 2007-05-23 2010-02-10 The University of British Columbia Methods and apparatus for estimating point-of-gaze in three dimensions
US9070017B2 (en) 2007-05-23 2015-06-30 Mirametrix Inc. Methods and apparatus for estimating point-of-gaze in three dimensions
US9495684B2 (en) 2007-12-13 2016-11-15 The Invention Science Fund I, Llc Methods and systems for indicating behavior in a population cohort
US20090157625A1 (en) * 2007-12-13 2009-06-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for identifying an avatar-linked population cohort
US9775554B2 (en) 2007-12-31 2017-10-03 Invention Science Fund I, Llc Population cohort-linked avatar
US10582144B2 (en) 2009-05-21 2020-03-03 May Patents Ltd. System and method for control based on face or hand gesture detection
US20150181100A1 (en) * 2010-03-01 2015-06-25 Eyefluence, Inc. Systems and methods for spatially controlled scene illumination
US9237844B2 (en) * 2010-03-22 2016-01-19 Koninklijke Philips N.V. System and method for tracking the point of gaze of an observer
US20130002846A1 (en) * 2010-03-22 2013-01-03 Koninklijke Philips Electronics N.V. System and method for tracking the point of gaze of an observer
CN101901485A (en) * 2010-08-11 2010-12-01 华中科技大学 3D free head moving type gaze tracking system
US8510166B2 (en) * 2011-05-11 2013-08-13 Google Inc. Gaze tracking system
US8860787B1 (en) 2011-05-11 2014-10-14 Google Inc. Method and apparatus for telepresence sharing
US20120290401A1 (en) * 2011-05-11 2012-11-15 Google Inc. Gaze tracking system
US8885882B1 (en) 2011-07-14 2014-11-11 The Research Foundation For The State University Of New York Real time eye tracking for human computer interaction
US8970452B2 (en) 2011-11-02 2015-03-03 Google Inc. Imaging method
US8971570B1 (en) 2011-11-04 2015-03-03 Google Inc. Dual LED usage for glint detection
US8786953B2 (en) 2011-11-22 2014-07-22 Google Inc. User interface
US20130128364A1 (en) * 2011-11-22 2013-05-23 Google Inc. Method of Using Eye-Tracking to Center Image Content in a Display
US8611015B2 (en) * 2011-11-22 2013-12-17 Google Inc. User interface
US20130169533A1 (en) * 2011-12-29 2013-07-04 Grinbath, Llc System and Method of Cursor Position Control Based on the Vestibulo-Ocular Reflex
US8860660B2 (en) 2011-12-29 2014-10-14 Grinbath, Llc System and method of determining pupil center position
US9910490B2 (en) * 2011-12-29 2018-03-06 Eyeguide, Inc. System and method of cursor position control based on the vestibulo-ocular reflex
US20130169532A1 (en) * 2011-12-29 2013-07-04 Grinbath, Llc System and Method of Moving a Cursor Based on Changes in Pupil Position
US20130241805A1 (en) * 2012-03-15 2013-09-19 Google Inc. Using Convergence Angle to Select Among Different UI Elements
US8862764B1 (en) 2012-03-16 2014-10-14 Google Inc. Method and Apparatus for providing Media Information to Mobile Devices
US10440103B2 (en) 2012-03-16 2019-10-08 Google Llc Method and apparatus for digital media control rooms
US9628552B2 (en) 2012-03-16 2017-04-18 Google Inc. Method and apparatus for digital media control rooms
US20150154758A1 (en) * 2012-07-31 2015-06-04 Japan Science And Technology Agency Point-of-gaze detection device, point-of-gaze detecting method, personal parameter calculating device, personal parameter calculating method, program, and computer-readable storage medium
US9262680B2 (en) * 2012-07-31 2016-02-16 Japan Science And Technology Agency Point-of-gaze detection device, point-of-gaze detecting method, personal parameter calculating device, personal parameter calculating method, program, and computer-readable storage medium
EP2826414A4 (en) * 2012-07-31 2015-12-02 Japan Science & Tech Agency Point of gaze detection device, point of gaze detection method, individual parameter computation device, individual parameter computation method, program, and computer-readable recording medium
CN103631364A (en) * 2012-08-20 2014-03-12 联想(北京)有限公司 Control method and electronic device
US9292086B2 (en) 2012-09-26 2016-03-22 Grinbath, Llc Correlating pupil position to gaze location within a scene
US9265458B2 (en) 2012-12-04 2016-02-23 Sync-Think, Inc. Application of smooth pursuit cognitive testing paradigms to clinical drug development
US9619020B2 (en) 2013-03-01 2017-04-11 Tobii Ab Delay warp gaze interaction
US10545574B2 (en) 2013-03-01 2020-01-28 Tobii Ab Determining gaze target based on facial features
US9380976B2 (en) 2013-03-11 2016-07-05 Sync-Think, Inc. Optical neuroinformatics
US10534526B2 (en) 2013-03-13 2020-01-14 Tobii Ab Automatic scrolling based on gaze detection
US9864498B2 (en) 2013-03-13 2018-01-09 Tobii Ab Automatic scrolling based on gaze detection
US10129470B2 (en) 2013-04-19 2018-11-13 Gopro, Inc. Apparatus and method for generating an output video stream from a wide field video stream
CN104113680A (en) * 2013-04-19 2014-10-22 北京三星通信技术研究有限公司 Sight line tracking system and method
US10007337B2 (en) 2013-07-09 2018-06-26 Smart Eye Ab Eye gaze imaging
EP2823751A1 (en) * 2013-07-09 2015-01-14 Smart Eye AB Eye gaze imaging
WO2015003955A1 (en) * 2013-07-09 2015-01-15 Smart Eye Ab Eye gaze imaging
CN105358045A (en) * 2013-07-09 2016-02-24 斯玛特艾公司 Eye gaze imaging
US20150049013A1 (en) * 2013-08-19 2015-02-19 Qualcomm Incorporated Automatic calibration of eye tracking for optical see-through head mounted display
US10073518B2 (en) * 2013-08-19 2018-09-11 Qualcomm Incorporated Automatic calibration of eye tracking for optical see-through head mounted display
JP2015046111A (en) * 2013-08-29 2015-03-12 株式会社Jvcケンウッド Viewpoint detection device and viewpoint detection method
US10558262B2 (en) 2013-11-18 2020-02-11 Tobii Ab Component determination and gaze provoked interaction
US10317995B2 (en) 2013-11-18 2019-06-11 Tobii Ab Component determination and gaze provoked interaction
US20160262614A1 (en) * 2013-11-28 2016-09-15 JVC Kenwood Corporation Eye gaze detection supporting device and eye gaze detection supporting method
US9993154B2 (en) * 2013-11-28 2018-06-12 JVC Kenwood Corporation Eye gaze detection supporting device and eye gaze detection supporting method
EP3075304A4 (en) * 2013-11-28 2016-12-21 Jvc Kenwood Corp Line-of-sight detection assistance device and line-of-sight detection assistance method
KR102366110B1 (en) * 2014-01-07 2022-02-21 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 Mapping glints to light sources
KR20160108394A (en) * 2014-01-07 2016-09-19 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 Mapping glints to light sources
US9727136B2 (en) * 2014-05-19 2017-08-08 Microsoft Technology Licensing, Llc Gaze detection calibration
US10248199B2 (en) 2014-05-19 2019-04-02 Microsoft Technology Licensing, Llc Gaze detection calibration
US9952883B2 (en) 2014-08-05 2018-04-24 Tobii Ab Dynamic determination of hardware
US9798383B2 (en) 2014-09-19 2017-10-24 Intel Corporation Facilitating dynamic eye torsion-based eye tracking on computing devices
CN106575162A (en) * 2014-09-19 2017-04-19 英特尔公司 Facilitating dynamic eye torsion-based eye tracking on computing devices
WO2016043835A1 (en) * 2014-09-19 2016-03-24 Intel Corporation Facilitating dynamic eye torsion-based eye tracking on computing devices
US9913578B2 (en) * 2014-10-02 2018-03-13 Fujitsu Limited Eye gaze detecting device and eye gaze detection method
US20160095511A1 (en) * 2014-10-02 2016-04-07 Fujitsu Limited Eye gaze detecting device and eye gaze detection method
US11635806B2 (en) 2014-12-17 2023-04-25 Sony Corporation Information processing apparatus and information processing method
CN111493809A (en) * 2014-12-17 2020-08-07 索尼公司 Information processing apparatus and method, glasses-type terminal, and storage medium
US10234940B2 (en) * 2015-02-04 2019-03-19 Itu Business Development A/S Gaze tracker and a gaze tracking method
CN104699124A (en) * 2015-03-24 2015-06-10 天津通信广播集团有限公司 Television angle regulating method based on sight watching angle detection
US20170061251A1 (en) * 2015-08-28 2017-03-02 Beijing Kuangshi Technology Co., Ltd. Liveness detection method, liveness detection system, and liveness detection device
US10528849B2 (en) * 2015-08-28 2020-01-07 Beijing Kuangshi Technology Co., Ltd. Liveness detection method, liveness detection system, and liveness detection device
US10061383B1 (en) * 2015-09-16 2018-08-28 Mirametrix Inc. Multi-feature gaze tracking system and method
US10033928B1 (en) 2015-10-29 2018-07-24 Gopro, Inc. Apparatus and methods for rolling shutter compensation for multi-camera systems
US10999512B2 (en) 2015-10-29 2021-05-04 Gopro, Inc. Apparatus and methods for rolling shutter compensation for multi-camera systems
US10560633B2 (en) 2015-10-29 2020-02-11 Gopro, Inc. Apparatus and methods for rolling shutter compensation for multi-camera systems
US10972661B2 (en) 2015-11-23 2021-04-06 Gopro, Inc. Apparatus and methods for image alignment
US9973696B1 (en) 2015-11-23 2018-05-15 Gopro, Inc. Apparatus and methods for image alignment
US10498958B2 (en) 2015-11-23 2019-12-03 Gopro, Inc. Apparatus and methods for image alignment
US9792709B1 (en) 2015-11-23 2017-10-17 Gopro, Inc. Apparatus and methods for image alignment
US9848132B2 (en) * 2015-11-24 2017-12-19 Gopro, Inc. Multi-camera time synchronization
US20170150236A1 (en) * 2015-11-24 2017-05-25 Gopro, Inc. Multi-Camera Time Synchronization
US9973746B2 (en) 2016-02-17 2018-05-15 Gopro, Inc. System and method for presenting and viewing a spherical video segment
US10129516B2 (en) 2016-02-22 2018-11-13 Gopro, Inc. System and method for presenting and viewing a spherical video segment
US11546566B2 (en) 2016-02-22 2023-01-03 Gopro, Inc. System and method for presenting and viewing a spherical video segment
US10536683B2 (en) 2016-02-22 2020-01-14 Gopro, Inc. System and method for presenting and viewing a spherical video segment
EP3436326A4 (en) * 2016-04-01 2019-11-13 LG Electronics Inc. Vehicle control apparatus and method thereof
US20170293354A1 (en) * 2016-04-09 2017-10-12 Beihang University Calculation method of line-of-sight direction based on analysis and match of iris contour in human eye image
US10082868B2 (en) * 2016-04-09 2018-09-25 Beihang University Calculation method of line-of-sight direction based on analysis and match of iris contour in human eye image
US10432855B1 (en) 2016-05-20 2019-10-01 Gopro, Inc. Systems and methods for determining key frame moments to construct spherical images
US9934758B1 (en) 2016-09-21 2018-04-03 Gopro, Inc. Systems and methods for simulating adaptation of eyes to changes in lighting conditions
US10546555B2 (en) 2016-09-21 2020-01-28 Gopro, Inc. Systems and methods for simulating adaptation of eyes to changes in lighting conditions
US10607087B2 (en) 2016-10-05 2020-03-31 Gopro, Inc. Systems and methods for determining video highlight based on conveyance positions of video content capture
US10268896B1 (en) 2016-10-05 2019-04-23 Gopro, Inc. Systems and methods for determining video highlight based on conveyance positions of video content capture
US10915757B2 (en) 2016-10-05 2021-02-09 Gopro, Inc. Systems and methods for determining video highlight based on conveyance positions of video content capture
US10560648B2 (en) 2017-02-22 2020-02-11 Gopro, Inc. Systems and methods for rolling shutter compensation using iterative process
US10893223B2 (en) 2017-02-22 2021-01-12 Gopro, Inc. Systems and methods for rolling shutter compensation using iterative process
US10194101B1 (en) 2017-02-22 2019-01-29 Gopro, Inc. Systems and methods for rolling shutter compensation using iterative process
US10412328B2 (en) 2017-02-22 2019-09-10 Gopro, Inc. Systems and methods for rolling shutter compensation using iterative process
US10558337B2 (en) 2017-03-22 2020-02-11 International Business Machines Corporation Cognitive dashboard adjustment
US10108319B2 (en) * 2017-03-22 2018-10-23 International Business Machines Corporation Cognitive dashboard adjustment
US11150469B2 (en) * 2017-09-28 2021-10-19 Apple Inc. Method and device for eye tracking using event camera data
US11474348B2 (en) 2017-09-28 2022-10-18 Apple Inc. Method and device for eye tracking using event camera data
US12105280B2 (en) 2017-09-28 2024-10-01 Apple Inc. Method and device for eye tracking using event camera data
US10936056B2 (en) * 2018-04-16 2021-03-02 Google Llc Method and system of eye tracking with glint drift correction on wearable heads-up display
EP3680884A2 (en) * 2019-01-09 2020-07-15 Samsung Display Co., Ltd. Photo sensor, display device including the same, and driving method thereof
CN112732071A (en) * 2020-12-11 2021-04-30 浙江大学 Calibration-free eye movement tracking system and application
US20220321866A1 (en) * 2021-02-08 2022-10-06 Yuyao Sunny Optical Intelligence Technology Co., Ltd. Head-Mounted Viewable Device and Eye-Tracking System for Use in Head-Mounted Viewable Device
US11743446B2 (en) * 2021-02-08 2023-08-29 Yuyao Sunny Optical Intelligence Technology Co., Ltd. Head-mounted viewable device and eye-tracking system for use in head-mounted viewable device
SE2151198A1 (en) * 2021-09-30 2023-03-31 Tobii Ab Gaze defect compensation
US12086312B2 (en) 2021-12-20 2024-09-10 Samsung Electronics Co., Ltd. Apparatus and method of controlling light source in eye tracking using glint

Also Published As

Publication number Publication date
US6578962B1 (en) 2003-06-17

Similar Documents

Publication Publication Date Title
US6578962B1 (en) Calibration-free eye gaze tracking
US6659611B2 (en) System and method for eye gaze tracking using corneal image mapping
JP6902075B2 (en) Line-of-sight tracking using structured light
Zhu et al. Novel eye gaze tracking techniques under natural head movement
JP6963820B2 (en) Line-of-sight detector
US8077914B1 (en) Optical tracking apparatus using six degrees of freedom
JP5467303B1 (en) Gaze point detection device, gaze point detection method, personal parameter calculation device, personal parameter calculation method, program, and computer-readable recording medium
Ji et al. Eye and gaze tracking for interactive graphic display
JP6631951B2 (en) Eye gaze detection device and eye gaze detection method
US7657062B2 (en) Self-calibration for an eye tracker
Hennessey et al. Noncontact binocular eye-gaze tracking for point-of-gaze estimation in three dimensions
US11624907B2 (en) Method and device for eye metric acquisition
JP7030317B2 (en) Pupil detection device and pupil detection method
KR20190079667A (en) Method and system for eye tracking using speckle patterns
Ebisawa et al. Head-free, remote eye-gaze detection system based on pupil-corneal reflection method with easy calibration using two stereo-calibrated video cameras
US10475415B1 (en) Strobe tracking of head-mounted displays (HMDs) in virtual, augmented, and mixed reality (xR) applications
Nagamatsu et al. User-calibration-free gaze tracking with estimation of the horizontal angles between the visual and the optical axes of both eyes
WO2018164104A1 (en) Eye image processing device
JP6747172B2 (en) Diagnosis support device, diagnosis support method, and computer program
JP6430813B2 (en) Position detection apparatus, position detection method, gazing point detection apparatus, and image generation apparatus
JP2019098024A (en) Image processing device and method
JP6780161B2 (en) Line-of-sight detection device and line-of-sight detection method
Ebisawa Realtime 3D position detection of human pupil
JP7269617B2 (en) Face image processing device, image observation system, and pupil detection system
Sugimoto et al. Detecting a gazing region by visual direction and stereo cameras

Legal Events

Date Code Title Description
AS Assignment

Owner name: INTERNATIONAL BUSINESS MACHINES CORPORATION, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:AMIR, ARNON;FLICKNER, MYRON DALE;KOONS, DAVID BRUCE;AND OTHERS;REEL/FRAME:012087/0569

Effective date: 20010427

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

AS Assignment

Owner name: IPG HEALTHCARE 501 LIMITED, UNITED KINGDOM

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INTERNATIONAL BUSINESS MACHINES CORPORATION;REEL/FRAME:020083/0864

Effective date: 20070926

FPAY Fee payment

Year of fee payment: 8

FEPP Fee payment procedure

Free format text: PAT HOLDER CLAIMS SMALL ENTITY STATUS, ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: LTOS); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

AS Assignment

Owner name: TOBII TECHNOLOGY AB, SWEDEN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:IPG HEALTHCARE 501 LIMITED;REEL/FRAME:027714/0001

Effective date: 20120207

FEPP Fee payment procedure

Free format text: PAT HOLDER NO LONGER CLAIMS SMALL ENTITY STATUS, ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: STOL); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 12

AS Assignment

Owner name: TOBII AB, SWEDEN

Free format text: CHANGE OF NAME;ASSIGNOR:TOBII TECHNOLOGY AB;REEL/FRAME:042980/0766

Effective date: 20150206