US10890979B2 - Controlling system and controlling method for virtual display - Google Patents
- Publication number
- US10890979B2 (application US15/959,665)
- Authority
- US
- United States
- Prior art keywords
- axis
- display
- virtual display
- space
- location
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires 2038-07-07
Classifications
- G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F16/532 — Query formulation, e.g. graphical querying
- G06F3/013 — Eye tracking input arrangements
- G06F3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06K9/00375
- G06V20/80 — Recognising image objects characterised by unique random patterns
- G06V40/107 — Static hand or arm
- G06T2207/30196 — Human being; Person
- G06T7/13 — Edge detection
Definitions
- the disclosure relates in general to a controlling system and a controlling method for virtual display.
- interactive display technologies include virtual reality (VR), augmented reality (AR), substitutional reality (SR), and mixed reality (MR).
- the interactive display technology has been applied in some professional areas such as gaming, virtual shops, virtual offices, and virtual tours.
- the interactive display technology can also be used in areas such as education to provide a learning experience which is lively and impressive.
- Conventional interactive display technology is normally operated through a user interface (UI).
- the user's hand often affects object recognition.
- the user cannot control the object at a remote end.
- the user normally needs to physically touch the user interface, and therefore has a poor user experience.
- ideally, the virtual display should be infinitely extendable.
- an effective cursor controlling method capable of concurrently controlling an object located afar and another object located nearby is still unavailable.
- the disclosure is directed to a controlling system and a controlling method for virtual display capable of controlling each object in an infinitely extended virtual display with a cursor by using a visual line tracking technology and a space transformation technology.
- a controlling system for virtual display includes a visual line tracking unit, a space forming unit, a hand information capturing unit, a transforming unit and a controlling unit.
- the visual line tracking unit is used for tracking a visual line of a user.
- the space forming unit is used for forming a virtual display space according to the visual line.
- the hand information capturing unit is used for obtaining a hand location of the user's one hand in a real operation space.
- the transforming unit is used for transforming the hand location to be a cursor location in the virtual display space.
- the controlling unit is used for controlling the virtual display according to the cursor location.
- a controlling method for virtual display includes the following steps: tracking a visual line of a user; forming a virtual display space according to the visual line; obtaining a hand location of the user's one hand in a real operation space; transforming the hand location to be a cursor location in the virtual display space; and controlling the virtual display according to the cursor location.
- FIG. 1 is a schematic diagram of a controlling system for virtual display according to an embodiment.
- FIG. 2 is a block diagram of the controlling system.
- FIG. 3 is a flowchart of a controlling method for virtual display according to an embodiment.
- FIGS. 4A to 4C illustrate the operations of step S110.
- FIG. 5 illustrates the operations of step S140.
- FIG. 6 illustrates the operations of step S150.
- the controlling system 100 can be realized by, for example, a head-mounted display, a smartphone, or smart glasses.
- the controlling system 100 allows the user to operate in a real operation space S0.
- when a visual line VS1 of the user corresponds to an object O1, the user's operation in the real operation space S0 will correspond to a virtual display space S1 at a remote end.
- when the visual line VS2 of the user corresponds to an object O2, the user's operation in the real operation space S0 will correspond to a virtual display space S2 at the remote end.
- a hand location L0 of the user's one hand in the real operation space S0 will correspond to a cursor location L1 (or a cursor location L2) in the virtual display space S1 (or the virtual display space S2), so that the virtual display is controlled according to the cursor location L1 (or the cursor location L2).
- the controlling system 100 includes a visual line tracking unit 110, an object detection unit 120, a space forming unit 130, a hand information capturing unit 140, a transforming unit 150 and a controlling unit 160.
- the visual line tracking unit 110 is used for tracking the visual lines VS1 and VS2.
- the visual line tracking unit 110 can be formed of an infrared image capturing device 111, a pupil location detector 112 and a visual line locator 113.
- the object detection unit 120 is used for detecting the objects O1 and O2.
- the space forming unit 130 is used for forming the virtual display spaces S1 and S2.
- the hand information capturing unit 140 is used for capturing the hand location L0.
- the hand information capturing unit 140 can be realized by a combination of a depth image capturing device 141 and a hand recognizer 142.
- the transforming unit 150 is used for transforming the hand location L0 to be the cursor locations L1 and L2.
- the transforming unit 150 can be realized by a combination of a ratio calculator 151 and a mapper 152.
- the controlling unit 160 is used for controlling the virtual display.
- the visual line tracking unit 110, the object detection unit 120, the space forming unit 130, the hand information capturing unit 140, the transforming unit 150, and the controlling unit 160 can be realized by, for example, a chip, a circuit, firmware, a circuit board, an electronic device, or a recording device storing multiple program codes. The operations of each element are disclosed below with a flowchart.
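To make the data flow among these units concrete, here is a minimal, hypothetical Python sketch of one pass through steps S110 to S160; the class and method names (control_loop, tracker.track, and so on) are illustrative assumptions, not identifiers from the patent.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class VirtualDisplaySpace:
    """Virtual display space S1/S2: origin V1 plus three display axes."""
    origin: np.ndarray   # point V1
    axis_x: np.ndarray   # first display axis (V1 -> V2)
    axis_y: np.ndarray   # second display axis (V1 -> V3)
    axis_z: np.ndarray   # third display axis (V1 -> V4)


def control_loop(tracker, detector, former, hand_capturer, transformer, controller):
    """One iteration of steps S110-S160, expressed as a plain pipeline."""
    visual_line = tracker.track()                          # S110: visual line tracking unit 110
    obj = detector.detect(visual_line)                     # S120: object detection unit 120
    space = former.form(obj)                               # S130: space forming unit 130
    hand_location = hand_capturer.capture()                # S140: hand information capturing unit 140
    cursor = transformer.transform(hand_location, space)   # S150: transforming unit 150
    controller.control(cursor)                             # S160: controlling unit 160
```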
- in step S110, the visual line tracking unit 110 tracks a visual line VS1 (or a visual line VS2) of the user.
- the operations of step S110 are illustrated in FIGS. 4A to 4C.
- an infrared image capturing device 111 may be disposed in a head-mounted display 900.
- the infrared image capturing device 111 captures an image of an eye 800 (illustrated in FIG. 4B) in real time.
- the pupil location detector 112 detects a location of a pupil 810 according to the infrared images IM1 to IM4 captured by the infrared image capturing device 111.
- in the infrared images IM1 to IM4, the eye 800 of FIG. 4B is looking at the rightmost point A1, the topmost point A2, the leftmost point A3 and the bottommost point A4 of FIG. 4C, respectively.
- the visual line locator 113 recognizes the visual line of the user according to the location of the pupil 810.
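As a rough illustration of how the pupil location detector 112 and the visual line locator 113 might operate, the sketch below finds the pupil as the darkest blob in an infrared eye image and interpolates a gaze point from the four calibration images IM1 to IM4; the dark-blob threshold and the linear interpolation are assumptions, not details taken from the patent.

```python
import cv2
import numpy as np


def pupil_center(ir_image: np.ndarray) -> tuple[float, float]:
    """Estimate the pupil center: under IR illumination the pupil is the darkest blob."""
    blurred = cv2.GaussianBlur(ir_image, (7, 7), 0)
    _, mask = cv2.threshold(blurred, 40, 255, cv2.THRESH_BINARY_INV)  # keep dark pixels
    m = cv2.moments(mask)
    if m["m00"] == 0:
        raise ValueError("no pupil found")
    return m["m10"] / m["m00"], m["m01"] / m["m00"]  # centroid (x, y)


def locate_visual_line(center, calib):
    """Interpolate a normalized gaze point from the pupil centers recorded while
    the eye looked at A1 (rightmost), A2 (topmost), A3 (leftmost) and
    A4 (bottommost), i.e. the calibration images IM1 to IM4."""
    right, top, left, bottom = calib
    u = (center[0] - left[0]) / (right[0] - left[0])   # 0 at A3, 1 at A1
    v = (center[1] - top[1]) / (bottom[1] - top[1])    # 0 at A2, 1 at A4
    return u, v
```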
- in step S120, the object detection unit 120 provides an object O1 (or an object O2) at which the user is looking according to the visual line VS1 (or the visual line VS2).
- in one embodiment, the object detection unit 120 detects at least one contour line of the background by using an edge detection algorithm, and connects the at least one contour line to form the object O1 (or the object O2).
- in another embodiment, the object detection unit 120 searches a database to locate the object O1 (or the object O2) corresponding to the visual line VS1 (or the visual line VS2).
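A minimal sketch of the edge-detection variant, assuming OpenCV's Canny detector stands in for the unspecified edge detection algorithm and that the object is the closed contour containing the gaze point:

```python
import cv2
import numpy as np


def detect_object(background_bgr: np.ndarray, gaze_xy: tuple[float, float]):
    """Connect contour lines of the background and return the one the
    visual line falls inside, or None if the gaze hits no contour."""
    gray = cv2.cvtColor(background_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)  # contour lines of the background
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for contour in contours:
        if cv2.pointPolygonTest(contour, gaze_xy, measureDist=False) >= 0:
            return contour
    return None
```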
- in step S130, the space forming unit 130 forms a virtual display space S1 (or a virtual display space S2) according to the object O1 (or the object O2) corresponding to the visual line VS1 (or the visual line VS2).
- the sizes of the virtual display spaces S1 and S2 vary with the objects O1 and O2, but are independent of the distances of the objects O1 and O2.
- the object O1 is larger, so the virtual display space S1 is also larger; the object O2 is smaller, so the virtual display space S2 is smaller.
- the size of the real operation space S0 may be the same as or different from the size of the virtual display space S1 (or the virtual display space S2).
- the length/width/height ratio of the virtual display spaces S1 and S2 is not fixed but depends on the objects O1 and O2.
- in another embodiment, step S120 can be omitted, and the virtual display space S1 (or the virtual display space S2) can be directly formed according to the visual line VS1 (or the visual line VS2).
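One plausible reading of step S130, sketched below: span the display axes with points of the detected object, so the space scales with the object's size rather than its distance. The four-corner input and the reuse of the VirtualDisplaySpace container from the earlier sketch are assumptions.

```python
import numpy as np


def form_display_space(v1, v2, v3, v4) -> VirtualDisplaySpace:
    """Build a virtual display space from four points of the detected object:
    origin V1 plus the points V2, V3, V4 that end the three display axes."""
    v1 = np.asarray(v1, dtype=float)
    return VirtualDisplaySpace(origin=v1,
                               axis_x=np.asarray(v2, dtype=float) - v1,  # first display axis
                               axis_y=np.asarray(v3, dtype=float) - v1,  # second display axis
                               axis_z=np.asarray(v4, dtype=float) - v1)  # third display axis
```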
- in step S140, the hand information capturing unit 140 obtains a hand location L0 of the user's one hand 700 (illustrated in FIG. 5) in the real operation space S0.
- the depth image capturing device 141 captures a depth image DM1 of the user.
- the hand recognizer 142 recognizes the hand 700 (illustrated in FIG. 5) from the depth image DM1 and positions the hand location L0 of the user's one hand 700.
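The patent does not specify how the hand recognizer 142 works; as a crude stand-in, the sketch below takes the nearest connected depth region as the hand and back-projects its pixel centroid through a pinhole camera model. The 10 cm depth band and the intrinsics tuple are assumptions.

```python
import numpy as np


def hand_location_from_depth(depth_m: np.ndarray, intrinsics) -> np.ndarray:
    """Return a 3D hand location L0 (meters, camera frame) from depth image DM1."""
    fx, fy, cx, cy = intrinsics            # pinhole camera parameters
    valid = depth_m > 0
    # assume the hand is whatever lies within 10 cm of the closest valid point
    near = valid & (depth_m < depth_m[valid].min() + 0.10)
    ys, xs = np.nonzero(near)
    z = float(depth_m[ys, xs].mean())
    u, v = xs.mean(), ys.mean()
    # back-project the pixel centroid to camera coordinates
    return np.array([(u - cx) * z / fx, (v - cy) * z / fy, z])
```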
- in step S150, the transforming unit 150 transforms the hand location L0 to be a cursor location L1 (or a cursor location L2) in the virtual display space S1 (or the virtual display space S2).
- the cursor location L1 is used as an example below.
- the real operation space S0 has a first operation axis, a second operation axis, and a third operation axis.
- the first operation axis is a vector formed by point R1 and point R2.
- the second operation axis is a vector formed by point R1 and point R3.
- the third operation axis is a vector formed by point R1 and point R4.
- the hand location vector is a vector formed by point R1 and the hand location L0.
- the virtual display space S1 has a first display axis, a second display axis, and a third display axis.
- the first display axis is a vector formed by point V1 and point V2.
- the second display axis is a vector formed by point V1 and point V3.
- the third display axis is a vector formed by point V1 and point V4.
- the cursor location vector is a vector formed by point V1 and the cursor location L1.
- the angle relationship among the first operation axis, the second operation axis, and the third operation axis may be different from or the same as the angle relationship among the first display axis, the second display axis, and the third display axis.
- for example, the real operation space S0 may be a Cartesian coordinate system, while the virtual display space S1 may be a non-Cartesian coordinate system (that is, not every angle formed by two axes is a right angle).
- in step S150, based on formulas (1) to (3), the ratio calculator 151 calculates a first relative ratio Xrate of the hand location L0 in the first operation axis, a second relative ratio Yrate of the hand location L0 in the second operation axis, and a third relative ratio Zrate of the hand location L0 in the third operation axis.
- the first hand projection vector is the projection vector of the hand location vector on the first operation axis.
- the second hand projection vector is the projection vector of the hand location vector on the second operation axis.
- the third hand projection vector is the projection vector of the hand location vector on the third operation axis.
- $X_{rate} = \lVert\overrightarrow{R_{L0x}}\rVert / \lVert\overrightarrow{R_{x}}\rVert$  (1)
- $Y_{rate} = \lVert\overrightarrow{R_{L0y}}\rVert / \lVert\overrightarrow{R_{y}}\rVert$  (2)
- $Z_{rate} = \lVert\overrightarrow{R_{L0z}}\rVert / \lVert\overrightarrow{R_{z}}\rVert$  (3)
- here $\overrightarrow{R_{L0x}}$, $\overrightarrow{R_{L0y}}$ and $\overrightarrow{R_{L0z}}$ are the first, second and third hand projection vectors, and $\overrightarrow{R_{x}}$, $\overrightarrow{R_{y}}$ and $\overrightarrow{R_{z}}$ are the first, second and third operation axes.
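A small numeric sketch of formulas (1) to (3), using the signed scalar projection (dot product over squared axis length), which agrees with the norm ratios above whenever the hand lies inside the operation space; the function name and argument order are assumptions.

```python
import numpy as np


def relative_ratios(r1, rx, ry, rz, hand):
    """Formulas (1)-(3): fraction of each operation axis covered by the
    projection of the hand location vector (R1 -> L0) onto that axis."""
    hand_vec = np.asarray(hand, dtype=float) - np.asarray(r1, dtype=float)

    def ratio(axis):
        axis = np.asarray(axis, dtype=float)
        return float(np.dot(hand_vec, axis) / np.dot(axis, axis))

    return ratio(rx), ratio(ry), ratio(rz)  # Xrate, Yrate, Zrate
```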
- the mapper 152 calculates a first display coordinate XL1 of the hand location L0 corresponding to the first display axis according to the first relative ratio Xrate, a second display coordinate YL1 of the hand location L0 corresponding to the second display axis according to the second relative ratio Yrate, and a third display coordinate ZL1 of the hand location L0 corresponding to the third display axis according to the third relative ratio Zrate.
- the point V1 has the first display coordinate XV1, the second display coordinate YV1, and the third display coordinate ZV1.
- $(X_{L1}, Y_{L1}, Z_{L1}) = (X_{V1}, Y_{V1}, Z_{V1}) + X_{rate}\,\overrightarrow{V_{x}} + Y_{rate}\,\overrightarrow{V_{y}} + Z_{rate}\,\overrightarrow{V_{z}}$  (4), where $\overrightarrow{V_{x}}$, $\overrightarrow{V_{y}}$ and $\overrightarrow{V_{z}}$ are the first, second and third display axes.
- through formula (4), the transforming unit 150 can transform the hand location L0 to be the cursor location L1 in the virtual display space S1.
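Continuing the sketch, formula (4) maps the ratios into the display space; this reuses relative_ratios from above, and all coordinates below are made-up example values.

```python
import numpy as np


def cursor_location(v1, vx, vy, vz, rates):
    """Formula (4): offset point V1 by each relative ratio along its display axis."""
    xrate, yrate, zrate = rates
    return (np.asarray(v1, dtype=float)
            + xrate * np.asarray(vx, dtype=float)
            + yrate * np.asarray(vy, dtype=float)
            + zrate * np.asarray(vz, dtype=float))


# usage: a hand halfway along every operation axis maps to the middle of S1
rates = relative_ratios(r1=[0, 0, 0], rx=[1, 0, 0], ry=[0, 1, 0], rz=[0, 0, 1],
                        hand=[0.5, 0.5, 0.5])
L1 = cursor_location(v1=[10, 0, 0], vx=[4, 0, 0], vy=[0, 2, 0], vz=[0, 0, 2],
                     rates=rates)   # -> [12. 1. 1.]
```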
- in step S160, the controlling unit 160 controls the virtual display according to the cursor location L1 (or the cursor location L2).
- the movement of the cursor is adjusted according to the first relative ratio Xrate, the second relative ratio Yrate, and the third relative ratio Zrate.
- the same effect can therefore be generated as long as the operations performed in the real operation space S0 are of the same scale.
- for example, the user can slide his hand 700 for "a half-length" in the real operation space S0 to flip the object O2 located nearby by 90°, and can likewise slide his hand 700 for "a half-length" in the real operation space S0 to flip the object O1 located afar by 90°.
- the two operations generate the same effect, and are not affected by the distances of the objects O1 and O2.
- similarly, the user can slide his hand 700 for "a semi-circle" in the real operation space S0 to turn the smaller object O2 upside down, and can likewise slide his hand 700 for "a semi-circle" to turn the bigger object O1 upside down.
- the two operations generate the same effect, and are not affected by the size of the object O1 and the size of the object O2.
- thus, the user can control each object in an infinitely extended virtual display with a cursor by using the visual line tracking technology and the space transformation technology of the interactive display technologies, as the numeric sketch below illustrates.
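A quick check of this scale invariance, reusing the hypothetical helpers from the sketches above with made-up axis lengths:

```python
# the same half-length slide yields Xrate = 0.5 regardless of target size
slide = relative_ratios(r1=[0, 0, 0], rx=[1, 0, 0], ry=[0, 1, 0], rz=[0, 0, 1],
                        hand=[0.5, 0, 0])
far_large = cursor_location(v1=[0, 0, 0], vx=[8, 0, 0], vy=[0, 8, 0], vz=[0, 0, 8],
                            rates=slide)   # -> [4. 0. 0.]
near_small = cursor_location(v1=[0, 0, 0], vx=[1, 0, 0], vy=[0, 1, 0], vz=[0, 0, 1],
                             rates=slide)  # -> [0.5 0. 0.]
# both cursors sit at the midpoint of their first display axis, so the large
# object O1 and the small object O2 respond identically to the same gesture
```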
Claims (21)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/959,665 | 2018-04-23 | 2018-04-23 | Controlling system and controlling method for virtual display |
Publications (2)
Publication Number | Publication Date |
---|---|
US20190324550A1 | 2019-10-24 |
US10890979B2 | 2021-01-12 |
Family
ID=68237749
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/959,665 (Active, anticipated expiration 2038-07-07) | Controlling system and controlling method for virtual display | 2018-04-23 | 2018-04-23 |
Country Status (1)
Country | Link |
---|---|
US (1) | US10890979B2 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018061621A (en) * | 2016-10-11 | 2018-04-19 | オプトス ピーエルシー | Ocular fundus imaging apparatus, ocular fundus imaging method, and ocular fundus imaging program |
JP7285879B2 (en) * | 2021-05-26 | 2023-06-02 | ソフトバンク株式会社 | XR MALL GENERATING DEVICE, CONTROL METHOD OF XR MALL GENERATING DEVICE, AND XR MALL GENERATING PROGRAM |
Patent Citations (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020175897A1 (en) | 1999-08-27 | 2002-11-28 | Pelosi Michael J. | 3D cursor or joystick device |
US20050159916A1 (en) * | 2002-12-27 | 2005-07-21 | Canon Kabushiki Kaisha | Information processing apparatus, and information processing method |
US20060050069A1 (en) * | 2004-09-07 | 2006-03-09 | Canon Kabushiki Kaisha | Virtual reality presentation device and information processing method |
US20090102845A1 (en) * | 2007-10-19 | 2009-04-23 | Canon Kabushiki Kaisha | Image processing apparatus and image processing method |
US20110055720A1 (en) * | 2009-09-03 | 2011-03-03 | David Potter | Comprehensive user control system for therapeutic wellness devices |
TWI502468B (en) | 2009-11-10 | 2015-10-01 | Acer Inc | Mobile electronic device and method for controlling 3d operation interface thereof |
US20120287304A1 (en) * | 2009-12-28 | 2012-11-15 | Cyber Ai Entertainment Inc. | Image recognition system |
US20110221672A1 (en) | 2010-02-28 | 2011-09-15 | Osterhout Group, Inc. | Hand-worn control device in an augmented reality eyepiece |
US20130328925A1 (en) | 2012-06-12 | 2013-12-12 | Stephen G. Latta | Object focus in a mixed reality environment |
US20150205484A1 (en) * | 2012-07-27 | 2015-07-23 | Nec Solution Innovators, Ltd. | Three-dimensional user interface apparatus and three-dimensional operation method |
US8836768B1 (en) | 2012-09-04 | 2014-09-16 | Aquifi, Inc. | Method and system enabling natural user interface gestures with user wearable glasses |
US20140160001A1 (en) | 2012-12-06 | 2014-06-12 | Peter Tobias Kinnebrew | Mixed reality presentation |
TWI610218B (en) | 2013-03-25 | 2018-01-01 | 三星電子股份有限公司 | Apparatus and method of controlling screens in a device |
CN103955267A (en) | 2013-11-13 | 2014-07-30 | 上海大学 | Double-hand man-machine interaction method in x-ray fluoroscopy augmented reality system |
US20150178939A1 (en) | 2013-11-27 | 2015-06-25 | Magic Leap, Inc. | Virtual and augmented reality systems and methods |
US10092480B2 (en) * | 2013-12-11 | 2018-10-09 | Luraco, Inc. | Touchscreen-based control system for massage chairs |
CN104850221A (en) | 2014-02-14 | 2015-08-19 | 欧姆龙株式会社 | Gesture recognition device and method of controlling gesture recognition device |
US20170262168A1 (en) | 2014-08-29 | 2017-09-14 | Hewlett-Packard Development Company, Lp. | Touchscreen gestures |
CN107430437A (en) | 2015-02-13 | 2017-12-01 | 厉动公司 | The system and method that real crawl experience is created in virtual reality/augmented reality environment |
US20170053452A1 (en) * | 2015-08-17 | 2017-02-23 | Colopl, Inc. | Method and apparatus for providing virtual space, and non-transitory computer readable data storage medium storing program causing computer to perform method |
US20190179511A1 (en) * | 2016-08-31 | 2019-06-13 | Sony Corporation | Information processing device, information processing method, and program |
TWM537702U (en) | 2016-11-19 | 2017-03-01 | Ping-Yuan Tsai | Augmented reality learning and reference system and architecture thereof |
CN106530926A (en) | 2016-11-29 | 2017-03-22 | 东南大学 | Virtual hand prosthesis training platform and training method thereof based on Myo armband and eye tracking |
Non-Patent Citations (4)
Title |
---|
Author Unknown, "A Survey of 3D Object Selection Techniques for Virtual Environments," preprint submitted to Computers & Graphics, Nov. 20, 2012, pp. 1-18. |
Deng et al., "Understanding the impact of multimodal interaction using gaze informed mid-air gesture control in 3D virtual objects manipulation," Int. J. Human-Computer Studies, vol. 105, 2017 (available online Apr. 27, 2017), pp. 68-80. |
Kölsch et al., "Vision-Based Interfaces for Mobility," Proceedings of the First Annual International Conference on Mobile and Ubiquitous Systems: Networking and Services, 2004, 9 pages. |
Park et al., "Defining Rules Among Devices in Smart Environment Using an Augmented Reality Headset," Urb-IoT, Tokyo, Japan, May 24-25, 2016, 4 pages. |
Also Published As
Publication number | Publication date |
---|---|
US20190324550A1 (en) | 2019-10-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Memo et al. | Head-mounted gesture controlled interface for human-computer interaction | |
US10380763B2 (en) | Hybrid corner and edge-based tracking | |
EP3786892B1 (en) | Method, device and apparatus for repositioning in camera orientation tracking process, and storage medium | |
CN106920279B (en) | Three-dimensional map construction method and device | |
CN109564472B (en) | Method, medium, and system for selecting an interaction method with a virtual object | |
CN109947886B (en) | Image processing method, image processing device, electronic equipment and storage medium | |
JP5728009B2 (en) | Instruction input device, instruction input method, program, recording medium, and integrated circuit | |
JP5521727B2 (en) | Image processing system, image processing apparatus, image processing method, and program | |
US8705868B2 (en) | Computer-readable storage medium, image recognition apparatus, image recognition system, and image recognition method | |
US20240087237A1 (en) | Augmented reality guidance that generates guidance markers | |
CN110363061B (en) | Computer readable medium, method for training object detection algorithm and display device | |
US20150145762A1 (en) | Position-of-interest detection device, position-of-interest detection method, and position-of-interest detection program | |
CN106462242A (en) | User interface control using gaze tracking | |
US10234955B2 (en) | Input recognition apparatus, input recognition method using maker location, and non-transitory computer-readable storage program | |
US8718325B2 (en) | Computer-readable storage medium, image processing apparatus, image processing system, and image processing method | |
EP2492869A2 (en) | Image processing program, image processing apparatus, image processing system, and image processing method | |
CN110168615B (en) | Information processing apparatus, information processing method, and storage medium | |
US20170243052A1 (en) | Book detection apparatus and book detection method | |
US11508130B2 (en) | Augmented reality environment enhancement | |
US20120219179A1 (en) | Computer-readable storage medium, image processing apparatus, image processing system, and image processing method | |
US10890979B2 (en) | Controlling system and controlling method for virtual display | |
JP2012238293A (en) | Input device | |
CN108896035B (en) | Method and equipment for realizing navigation through image information and navigation robot | |
Piérard et al. | I-see-3d! an interactive and immersive system that dynamically adapts 2d projections to the location of a user's eyes | |
JP7335726B2 (en) | Information display device |
Legal Events
Date | Code | Title | Description
---|---|---|---
| FEPP | Fee payment procedure | ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
| AS | Assignment | Owner: INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE, TAIWAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: LIAO, DUAN-LI; LI, CHIA-CHANG; TING, WEN-HUNG; AND OTHERS. REEL/FRAME: 046331/0195. Effective date: 20180703
| STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
| STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED
| STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED
| STCF | Information on status: patent grant | PATENTED CASE
| MAFP | Maintenance fee payment | PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4