VADS: Visual attention detection with a smartphone

Z Jiang, J Han, C Qian, W Xi, K Zhao… - IEEE INFOCOM 2016 - The 35th Annual IEEE International Conference on …, 2016 - ieeexplore.ieee.org
Identifying the object that attracts a user's visual attention is an essential function for automatic services in smart environments. However, existing solutions can compute the gaze direction but not the distance to the target. In addition, most of them rely on special devices or infrastructure support. This paper explores the possibility of using a smartphone to detect the visual attention of a user. With the proposed VADS system, acquiring the location of the intended object requires only one simple action: gazing at the intended object and holding up the smartphone so that the object and the user's face are simultaneously captured by the rear and front cameras. We build on recent advances in computer vision to develop efficient algorithms that obtain the distance between the camera and the user, the user's gaze direction, and the object's direction from the camera. The object's location can then be computed by solving a trigonometric problem. VADS has been prototyped on commercial off-the-shelf (COTS) devices. Extensive evaluation shows that VADS achieves low error (about 1.5° in angle and 0.15 m in distance for objects within 12 m) as well as short latency. We believe VADS enables a large variety of applications in smart environments.
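The trigonometric step the abstract mentions can be pictured as a two-ray intersection: the rear camera gives a bearing to the object, and the front camera gives the distance to the user's eye plus the gaze bearing. A minimal 2D sketch is below; it is not the paper's implementation, and the coordinate conventions (phone at the origin facing +y, angles measured from the +y axis) are assumptions for illustration.

```python
import math

def locate_object(d_user, theta_gaze, theta_obj):
    """Sketch of the two-ray triangulation in 2D.

    Phone at origin facing +y; user's eye assumed at (0, -d_user),
    i.e. directly behind the phone at the measured face distance.
    theta_obj:  bearing of the object from the rear camera (radians,
                measured from the +y axis).
    theta_gaze: bearing of the gaze ray from the user's eye, same
                convention.
    Returns (x, y) of the object, or None if the rays are parallel.
    """
    # Ray from phone:  p(t) = t * (sin(theta_obj),  cos(theta_obj))
    # Ray from eye:    q(s) = (0, -d_user) + s * (sin(theta_gaze), cos(theta_gaze))
    dx1, dy1 = math.sin(theta_obj), math.cos(theta_obj)
    dx2, dy2 = math.sin(theta_gaze), math.cos(theta_gaze)

    denom = dx1 * dy2 - dy1 * dx2
    if abs(denom) < 1e-9:
        return None  # parallel rays: no unique intersection

    # Solve t*(dx1, dy1) - s*(dx2, dy2) = (eye - phone) by Cramer's rule.
    ex, ey = 0.0, -d_user
    t = (ex * dy2 - ey * dx2) / denom
    return (t * dx1, t * dy1)
```

For example, with the eye 1 m behind the phone, a gaze bearing of `atan2(1, 2)` and an object bearing of 45° both point at the same spot, and the function returns roughly (1.0, 1.0). The real system must also handle the 3D case and the offset between the front and rear cameras, which this sketch ignores.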