
US10890979B2 - Controlling system and controlling method for virtual display - Google Patents


Info

Publication number
US10890979B2
Authority
US
United States
Prior art keywords
axis
display
virtual display
space
location
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires 2038-07-07
Application number
US15/959,665
Other versions
US20190324550A1 (en)
Inventor
Duan-Li Liao
Chia-Chang Li
Wen-Hung Ting
Hian-Kun Tenn
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial Technology Research Institute ITRI
Original Assignee
Industrial Technology Research Institute ITRI
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial Technology Research Institute ITRI filed Critical Industrial Technology Research Institute ITRI
Priority to US15/959,665
Assigned to INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: LI, CHIA-CHANG; LIAO, DUAN-LI; TENN, HIAN-KUN; TING, WEN-HUNG
Publication of US20190324550A1
Application granted
Publication of US10890979B2
Legal status: Active (current)
Adjusted expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F 16/53 Querying
    • G06F 16/532 Query formulation, e.g. graphical querying
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06K 9/00375
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/80 Recognising image objects characterised by unique random patterns
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/107 Static hand or arm
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30196 Human being; Person
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection



Abstract

A controlling system and a controlling method for virtual display are provided. The controlling system for virtual display includes a visual line tracking unit, a space forming unit, a hand information capturing unit, a transforming unit and a controlling unit. The visual line tracking unit is used for tracking a visual line of a user. The space forming unit is used for forming a virtual display space according to the visual line. The hand information capturing unit is used for obtaining a hand location of the user's one hand in a real operation space. The transforming unit is used for transforming the hand location to be a cursor location in the virtual display space. The controlling unit is used for controlling the virtual display according to the cursor location.

Description

TECHNICAL FIELD
The disclosure relates in general to a controlling system and a controlling method for virtual display.
BACKGROUND
Along with the development in the interactive technology, various interactive display technologies such as virtual reality (VR), augmented reality (AR), substitutional reality (SR) and mixed reality (MR) are provided. The interactive display technology has been applied in some professional areas such as gaming, virtual shops, virtual offices, and virtual tour. The interactive display technology can also be used in areas such as education to provide a learning experience which is lively and impressive.
Conventional interactive display technology is normally operated through a user interface (UI). However, the user's hand often affects object recognition. In the conventional interactive display technology, the user cannot control the object at a remote end. The user normally needs to physically touch the user interface, and therefore has a poor user experience.
Moreover, according to the interactive display technology, virtual display should be infinitely extended. However, an effective cursor controlling method capable of concurrently controlling an object located afar and another object located nearby is still unavailable.
SUMMARY
The disclosure is directed to a controlling system and a controlling method for virtual display capable of controlling each object in an infinitely extended virtual display with a cursor by using a visual line tracking technology and a space transformation technology.
According to one embodiment, a controlling system for virtual display is provided. The controlling system for virtual display includes a visual line tracking unit, a space forming unit, a hand information capturing unit, a transforming unit and a controlling unit. The visual line tracking unit is used for tracking a visual line of a user. The space forming unit is used for forming a virtual display space according to the visual line. The hand information capturing unit is used for obtaining a hand location of the user's one hand in a real operation space. The transforming unit is used for transforming the hand location to be a cursor location in the virtual display space. The controlling unit is used for controlling the virtual display according to the cursor location.
According to another embodiment, a controlling method for virtual display is provided. The controlling method for virtual display includes following steps: tracking a visual line of a user; forming a virtual display space according to the visual line; obtaining a hand location of the user's one hand in a real operation space; transforming the hand location to be a cursor location in the virtual display space; and controlling the virtual display according to the cursor location.
The above and other aspects of the invention will become better understood with regard to the following detailed description of the preferred but non-limiting embodiment(s). The following description is made with reference to the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic diagram of a controlling system for virtual display according to an embodiment.
FIG. 2 is a block diagram of a controlling system.
FIG. 3 is a flowchart of a controlling method for virtual display according to an embodiment.
FIGS. 4A to 4C illustrate the operations of step S110.
FIG. 5 illustrates the operations of step S140.
FIG. 6 illustrates the operations of step S150.
In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the disclosed embodiments. It will be apparent, however, that one or more embodiments may be practiced without these specific details. In other instances, well-known structures and devices are schematically shown in order to simplify the drawing.
DETAILED DESCRIPTION
Referring to FIG. 1, a schematic diagram of a controlling system 100 for virtual display according to an embodiment is shown. The controlling system 100 can be realized by, for example, a head-mounted display, a smartphone, or smart glasses. The controlling system 100 allows the user to operate in a real operation space S0. When a visual line VS1 of a user corresponds to an object O1, the user's operation in the real operation space S0 will correspond to a virtual display space S1 at a remote end. When the visual line VS2 of the user corresponds to an object O2, the user's operation in the real operation space S0 will correspond to a virtual display space S2 at the remote end.
To put it in greater detail, a hand location L0 of the user's one hand in the real operation space S0 will correspond to a cursor location L1 (or a cursor location L2) in the virtual display space S1 (or the virtual display space S2), so that the virtual display is controlled according to the cursor location L1 (or the cursor location L2).
Referring to FIG. 2, a block diagram of a controlling system 100 is shown. The controlling system 100 includes a visual line tracking unit 110, an object detection unit 120, a space forming unit 130, a hand information capturing unit 140, a transforming unit 150 and a controlling unit 160. The visual line tracking unit 110 is used for tracking the visual lines VS1 and VS2. The visual line tracking unit 110 can be formed of an infrared image capturing device 111, a pupil location detector 112 and a visual line locator 113.
The object detection unit 120 is used for detecting the objects O1 and O2. The space forming unit 130 is used for forming the virtual display spaces S1, S2. The hand information capturing unit 140 is used for capturing the hand location L0. The hand information capturing unit 140 can be realized by a combination of a depth image capturing device 141 and a hand recognizer 142.
The transforming unit 150 is used for transforming the hand location L0 to be the cursor locations L1 and L2. The transforming unit 150 can be realized by a combination of a ratio calculator 151 and a mapper 152. The controlling unit 160 is used for controlling the virtual display.
The visual line tracking unit 110, the object detection unit 120, the space forming unit 130, the hand information capturing unit 140, the transforming unit 150, and the controlling unit 160 can be realized by, for example, a chip, a circuit, firmware, a circuit board, an electronic device, or a recording device storing multiple programming codes. The operations of each element are disclosed below with a flowchart.
Referring to FIG. 3, a flowchart of a controlling method for virtual display according to an embodiment is shown. First, in step S110, the visual line tracking unit 110 tracks a visual line VS1 (or a visual line VS2) of the user. Referring to FIGS. 4A to 4C, the operations of step S110 are illustrated. As indicated in FIG. 4A, an infrared image capturing device 111 may be disposed in a head-mounted display 900. When the user wears the head-mounted display 900, the infrared image capturing device 111 immediately captures an image of an eye 800 (illustrated in FIG. 4B). As indicated in FIG. 4B, the pupil location detector 112 detects the location of a pupil 810 according to the infrared images IM1 to IM4 captured by the infrared image capturing device 111. In the infrared images IM1 to IM4, the eye 800 of FIG. 4B is looking at the rightmost point A1, the topmost point A2, the leftmost point A3, and the bottommost point A4 of FIG. 4C, respectively. The visual line locator 113 then recognizes the visual line of the user according to the location of the pupil 810.
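The patent does not spell out how the pupil location becomes a visual line, but a minimal Python sketch of one plausible mapping is given below: it linearly interpolates the detected pupil center between the pupil positions recorded while the user looks at the calibration extremes A1 to A4. The function name, the coordinate values, and the linear model are all assumptions for illustration.

    # Minimal sketch (assumption: a linear mapping between the pupil centers
    # recorded for the calibration points A1-A4 and a normalized gaze direction).
    def pupil_to_gaze(pupil, right, top, left, bottom):
        """Map a pupil center (x, y) to a normalized gaze (gx, gy) in [-1, 1].

        right, top, left, bottom: pupil centers detected in the infrared images
        IM1 to IM4, captured while the eye looks at A1, A2, A3, and A4.
        """
        gx = 2.0 * (pupil[0] - left[0]) / (right[0] - left[0]) - 1.0
        gy = 2.0 * (pupil[1] - bottom[1]) / (top[1] - bottom[1]) - 1.0
        return gx, gy

    # A pupil halfway between the extremes corresponds to looking straight ahead.
    print(pupil_to_gaze((320, 240), right=(400, 240), top=(320, 200),
                        left=(240, 240), bottom=(320, 280)))  # (0.0, 0.0)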
Next, in step S120, the object detection unit 120 provides an object O1 (or an object O2) at which the user is looking according to the visual line VS1 (or the visual line VS2). In an embodiment, the object detection unit 120 detects at least one contour line in the background image by using an edge detection algorithm, and connects the at least one contour line to form the object O1 (or the object O2), as sketched below. Alternatively, the object detection unit 120 searches a database to locate the object O1 (or the object O2) corresponding to the visual line VS1 (or the visual line VS2).
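For the edge-detection variant, the following sketch uses OpenCV (an assumption; the patent names no library or any algorithm beyond "edge detection"): it extracts contour lines from the background image and keeps the contour that encloses the point the visual line hits.

    # Sketch only; OpenCV 4 is assumed, and the Canny thresholds are illustrative.
    import cv2

    def detect_object(gray, gaze_xy):
        """Return the bounding box of the contour enclosing the gaze point, or None."""
        edges = cv2.Canny(gray, 50, 150)  # contour lines of the background
        contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        pt = (float(gaze_xy[0]), float(gaze_xy[1]))
        for contour in contours:
            # >= 0 means the gaze point lies on or inside the contour
            if cv2.pointPolygonTest(contour, pt, False) >= 0:
                return cv2.boundingRect(contour)  # the object O1 (or O2)
        return None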
Then, in step S130, the space forming unit 130 forms a virtual display space S1 (or a virtual display space S2) according to the object O1 (or the object O2) corresponding to the visual line VS1 (or the visual line VS2). The sizes of the virtual display spaces S1 and S2 vary with the objects O1 and O2, but are independent of the distances of the objects O1 and O2. For example, the object O1 is larger, so the virtual display space S1 is also larger; the object O2 is smaller, so the virtual display space S2 is smaller. Besides, the size of the real operation space S0 may be the same as or different from the size of the virtual display space S1 (or the virtual display space S2).
Moreover, the length/width/height ratio of the virtual display spaces S1 and S2 is not fixed but depends on the objects O1 and O2. In an embodiment, the step S120 can be omitted, and the virtual display space S1 (or the virtual display space S2) can be directly formed according to the visual line VS1 (or the visual line VS2).
Then, in step S140, the hand information capturing unit 140 obtains a hand location L0 of the user's one hand 700 (illustrated in FIG. 5) in the real operation space S0. Referring to FIG. 5, operations of step S140 are illustrated. In step S140, the depth image capturing device 141 captures a depth image DM1 of the user. The hand recognizer 142 recognizes the hand 700 (illustrated in FIG. 5) from the depth image DM1 and positions the hand location L0 of the user's one hand 700.
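As an illustration of how the hand recognizer 142 might position the hand location L0 in the depth image DM1, the following sketch assumes the hand is the surface nearest to the depth camera (an assumption the patent does not make) and returns the centroid of the nearest depth band; the function name and the band width are illustrative.

    import numpy as np

    def locate_hand(depth_mm, band_mm=150):
        """Estimate the hand location L0 from a depth image (2-D array, in mm).

        Sketch assumption: the hand is the nearest valid surface, so all pixels
        within band_mm of the minimum depth are treated as the hand.
        """
        valid = depth_mm[depth_mm > 0]  # 0 means no depth reading
        if valid.size == 0:
            return None
        near = valid.min()
        mask = (depth_mm > 0) & (depth_mm < near + band_mm)
        ys, xs = np.nonzero(mask)
        # centroid of the hand pixels, in camera coordinates (x, y, depth)
        return xs.mean(), ys.mean(), depth_mm[mask].mean()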
Then, in step S150, the transforming unit 150 transforms the hand location L0 to be a cursor location L1 (or a cursor location L2) in the virtual display space S1 (or the virtual display space S2). Referring to FIG. 6, operations of step S150 are illustrated. In FIG. 6, the cursor location L1 is used as an example. As indicated in FIG. 6, the real operation space S0 has a first operation axis Rx, a second operation axis Ry, and a third operation axis Rz. The first operation axis Rx is a vector formed by point R1 and point R2. The second operation axis Ry is a vector formed by point R1 and point R3. The third operation axis Rz is a vector formed by point R1 and point R4. The hand location vector RL0 is a vector formed by point R1 and the hand location L0.
The virtual display space S1 has a first display axis Vx, a second display axis Vy, and a third display axis Vz. The first display axis Vx is a vector formed by point V1 and point V2. The second display axis Vy is a vector formed by point V1 and point V3. The third display axis Vz is a vector formed by point V1 and point V4. The cursor location vector VL1 is a vector formed by point V1 and the cursor location L1.
The angle relationship among the first operation axis Rx, the second operation axis Ry, and the third operation axis Rz may be different from or the same as the angle relationship among the first display axis Vx, the second display axis Vy, and the third display axis Vz. For example, the real operation space S0 may be a Cartesian coordinate system, and the virtual display space S1 may be a non-Cartesian coordinate system (that is, not every angle formed by two axes is a right angle).
In step S150, based on formulas (1) to (3), the ratio calculator 151 calculates a first relative ratio Xrate of the hand location L0 in the first operation axis Rx, a second relative ratio Yrate of the hand location L0 in the second operation axis Ry, and a third relative ratio Zrate of the hand location L0 in the third operation axis Rz. The first hand projection vector RL0x is the projection vector of the hand location vector RL0 in the first operation axis Rx. The second hand projection vector RL0y is the projection vector of the hand location vector RL0 in the second operation axis Ry. The third hand projection vector RL0z is the projection vector of the hand location vector RL0 in the third operation axis Rz.
Xrate = |RL0x| / |Rx|   (1)
Yrate = |RL0y| / |Ry|   (2)
Zrate = |RL0z| / |Rz|   (3)
Based on formula (4), the mapper 152 calculates a first display coordinate XL1 of the hand location L0 corresponding to the first display axis Vx according to the first relative ratio Xrate, a second display coordinate YL1 of the hand location L0 corresponding to the second display axis Vy according to the second relative ratio Yrate, and a third display coordinate ZL1 of the hand location L0 corresponding to the third display axis Vz according to the third relative ratio Zrate. The point V1 has the first display coordinate XV1, the second display coordinate YV1, and the third display coordinate ZV1.
(XL1, YL1, ZL1) = (XV1, YV1, ZV1) + Xrate*Vx + Yrate*Vy + Zrate*Vz   (4)
Thus, the transforming unit 150 can transform the hand location L0 to be the cursor location L1 in the virtual display space S1.
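Formulas (1) to (4) translate directly into code. The sketch below is one reading of the transformation (the function name is illustrative, and the ratios are computed as signed scalar projections of the hand location vector onto each operation axis):

    import numpy as np

    def hand_to_cursor(L0, R1, Rx, Ry, Rz, V1, Vx, Vy, Vz):
        """Map the hand location L0 in the real operation space S0 to a cursor
        location in the virtual display space S1, per formulas (1) to (4)."""
        RL0 = np.asarray(L0, float) - np.asarray(R1, float)  # hand location vector

        def rate(axis):
            axis = np.asarray(axis, float)
            # length of the projection of RL0 on the axis, divided by the axis
            # length, i.e. formulas (1) to (3)
            return np.dot(RL0, axis) / np.dot(axis, axis)

        Xrate, Yrate, Zrate = rate(Rx), rate(Ry), rate(Rz)
        # formula (4): start from V1 and move by the same relative ratios
        # along the display axes
        return (np.asarray(V1, float)
                + Xrate * np.asarray(Vx, float)
                + Yrate * np.asarray(Vy, float)
                + Zrate * np.asarray(Vz, float))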
Then, in step S160, the controlling unit 160 controls the virtual display according to the cursor location L1 (or the cursor location L2). During the control of the virtual display, the movement of the cursor is adjusted according to the first relative ratio Xrate, the second relative ratio Yrate, and the third relative ratio Zrate. Thus, regardless of whether the objects O1 and O2 are located afar or nearby, the same effect is generated as long as the operations performed in the real operation space S0 are of the same scale. For example, as indicated in FIG. 1, the user can slide his hand 700 for "a half-length" in the real operation space S0 to flip the object O2 located nearby by 90°. Similarly, the user can also slide his hand 700 for "a half-length" in the real operation space S0 to flip the object O1 located afar by 90°. The two operations generate the same effect, and are not affected by the distances of the objects O1 and O2.
Similarly, during the operation of the virtual display, the movement of the cursor is adjusted according to the first relative ratio Xrate, the second relative ratio Yrate, and the third relative ratio Zrate. Thus, regardless of the sizes of the objects O1 and O2, the same effect can be generated as long as the operations performed in the real operation space S0 are of the same scale. As indicated in FIG. 1, the user can slide his hand 700 for "a semi-circle" in the real operation space S0 to turn the smaller object O2 upside down. Similarly, the user can also slide his hand 700 for "a semi-circle" in the real operation space S0 to turn the bigger object O1 upside down. The two operations generate the same effect, and are not affected by the size of the object O1 and the size of the object O2.
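Using the hypothetical hand_to_cursor sketch above, this scale invariance can be checked numerically: the same half-length hand slide lands on the midpoint of the first display axis whether the virtual display space is small or large.

    R1, Rx, Ry, Rz = [0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]  # real space S0
    hand = [0.5, 0, 0]  # a half-length slide along the first operation axis
    small = hand_to_cursor(hand, R1, Rx, Ry, Rz,
                           [0, 0, 0], [2, 0, 0], [0, 2, 0], [0, 0, 2])
    large = hand_to_cursor(hand, R1, Rx, Ry, Rz,
                           [0, 0, 0], [20, 0, 0], [0, 20, 0], [0, 0, 20])
    print(small)  # [1. 0. 0.]  - halfway along the smaller display space
    print(large)  # [10.  0.  0.] - halfway along the larger display space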
Through the above steps, the user can control each object in an infinitely extended virtual display with a cursor by using the visual line tracking technology and the space transformation technology of the interactive display technologies.
It will be apparent to those skilled in the art that various modifications and variations can be made to the disclosed embodiments. It is intended that the specification and examples be considered as exemplary only, with a true scope of the disclosure being indicated by the following claims and their equivalents.

Claims (21)

What is claimed is:
1. A controlling system for virtual display, comprising:
a visual line tracking circuit configured to track a visual line of a user, wherein the visual line tracking circuit includes a pupil location detector, the pupil location detector is configured to detect a location of a pupil according to an infrared image of an eye, the visual line is recognized according to a location of the pupil;
a space forming circuit configured to form a virtual display space according to the visual line, wherein the virtual display space is changed with the visual line;
a hand information capturing circuit configured to obtain a hand location of the user's one hand in a real operation space;
a transforming circuit configured to transform the hand location to be a cursor location in the virtual display space, wherein the space forming circuit and the hand information capturing circuit are connected to the transforming circuit; and
a controlling circuit configured to control the virtual display according to the cursor location.
2. The controlling system for virtual display according to claim 1, wherein the real operation space is a three-dimensional space; the virtual display space is a three-dimensional space.
3. The controlling system for virtual display according to claim 2, wherein the real operation space has a first operation axis, a second operation axis, and a third operation axis; the virtual display space has a first display axis, a second display axis, and a third display axis; the transforming circuit comprises:
a ratio calculator configured to calculate a first relative ratio of the hand location in the first operation axis, a second relative ratio in the second operation axis, and a third relative ratio in the third operation axis; and
a mapper configured to calculate a first display coordinate of the hand location in the first display axis according to the first relative ratio, a second display coordinate of the hand location in the second display axis according to the second relative ratio, and a third display coordinate of the hand location in the third display axis according to the third relative ratio.
4. The controlling system for virtual display according to claim 3, wherein an angle relationship among the first display axis, the second display axis, and the third display axis is different from an angle relationship among the first operation axis, the second operation axis, and the third operation axis.
5. The controlling system for virtual display according to claim 3, wherein an angle relationship among the first display axis, the second display axis, and the third display axis is the same as an angle relationship among the first operation axis, the second operation axis, and the third operation axis.
6. The controlling system for virtual display according to claim 1, wherein a size of the real operation space is different from a size of the virtual display space.
7. The controlling system for virtual display according to claim 1, wherein a size of the real operation space is the same as a size of the virtual display space.
8. The controlling system for virtual display according to claim 1, further comprising:
an object detection circuit configured to provide an object at which the user is looking according to the visual line.
9. The controlling system for virtual display according to claim 8, wherein the space forming circuit forms the virtual display space according to the object.
10. The controlling system for virtual display according to claim 8, wherein the object detection circuit detects at least one contour line by using an edge detection algorithm and connects the at least one contour line to form the object.
11. The controlling system for virtual display according to claim 8, wherein the object detection circuit searches a database to provide the object corresponding to the visual line.
12. A controlling method for virtual display, comprising:
tracking a visual line of a user;
forming a virtual display space according to the visual line, wherein the virtual display space is changed with the visual line, an infrared image of an eye is captured, a location of a pupil is detected according to the infrared image, and the visual line is recognized according to a location of the pupil;
obtaining a hand location of the user's one hand in a real operation space;
transforming the hand location to be a cursor location in the virtual display space; and
controlling the virtual display according to the cursor location.
13. The controlling method for virtual display according to claim 12, wherein the real operation space is a three-dimensional space; the virtual display space is a three-dimensional space.
14. The controlling method for virtual display according to claim 13, wherein the real operation space has a first operation axis, a second operation axis and a third operation axis; the virtual display space has a first display axis, a second display axis and a third display axis; the step of transforming the hand location to be the cursor location in the virtual display space comprises:
calculating a first relative ratio of the hand location in the first operation axis, a second relative ratio in the second operation axis, and a third relative ratio in the third operation axis; and
calculating a first display coordinate of the hand location in the first display axis according to the first relative ratio, a second display coordinate of the hand location in the second display axis according to the second relative ratio, and a third display coordinate of the hand location in the third display axis according to the third relative ratio.
15. The controlling method for virtual display according to claim 14, wherein an angle relationship among the first display axis, the second display axis, and the third display axis is different from an angle relationship among the first operation axis, the second operation axis, and the third operation axis.
16. The controlling method for virtual display according to claim 14, wherein an angle relationship among the first display axis, the second display axis, and the third display axis is the same as an angle relationship among the first operation axis, the second operation axis, and the third operation axis.
17. The controlling method for virtual display according to claim 12, wherein a size of the real operation space is different from a size of the virtual display space.
18. The controlling method for virtual display according to claim 12, wherein a size of the real operation space is the same as a size of the virtual display space.
19. The controlling method for virtual display according to claim 12, further comprising:
providing an object at which the user is looking according to the visual line.
20. The controlling method for virtual display according to claim 19, wherein the step of providing the object at which the user is looking according to the visual line comprises:
detecting at least one contour line by using an edge detection algorithm; and
connecting the at least one contour line to form the object.
21. The controlling method for virtual display according to claim 19, wherein the step of providing the object at which the user is looking according to the visual line comprises:
searching a database; and
providing, from the database, the object corresponding to the visual line.

Priority Applications (1)

Application Number: US15/959,665 (published as US10890979B2)
Priority Date: 2018-04-23
Filing Date: 2018-04-23
Title: Controlling system and controlling method for virtual display

Applications Claiming Priority (1)

Application Number: US15/959,665 (published as US10890979B2)
Priority Date: 2018-04-23
Filing Date: 2018-04-23
Title: Controlling system and controlling method for virtual display

Publications (2)

Publication Number Publication Date
US20190324550A1 US20190324550A1 (en) 2019-10-24
US10890979B2 (en) 2021-01-12

Family

ID=68237749

Family Applications (1)

Application Number: US15/959,665 (published as US10890979B2)
Title: Controlling system and controlling method for virtual display
Priority Date: 2018-04-23
Filing Date: 2018-04-23
Status: Active (expires 2038-07-07)

Country Status (1)

Country Link
US (1) US10890979B2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018061621A (en) * 2016-10-11 2018-04-19 オプトス ピーエルシー Ocular fundus imaging apparatus, ocular fundus imaging method, and ocular fundus imaging program
JP7285879B2 (en) * 2021-05-26 2023-06-02 ソフトバンク株式会社 XR MALL GENERATING DEVICE, CONTROL METHOD OF XR MALL GENERATING DEVICE, AND XR MALL GENERATING PROGRAM


Patent Citations (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020175897A1 (en) 1999-08-27 2002-11-28 Pelosi Michael J. 3D cursor or joystick device
US20050159916A1 (en) * 2002-12-27 2005-07-21 Canon Kabushiki Kaisha Information processing apparatus, and information processing method
US20060050069A1 (en) * 2004-09-07 2006-03-09 Canon Kabushiki Kaisha Virtual reality presentation device and information processing method
US20090102845A1 (en) * 2007-10-19 2009-04-23 Canon Kabushiki Kaisha Image processing apparatus and image processing method
US20110055720A1 (en) * 2009-09-03 2011-03-03 David Potter Comprehensive user control system for therapeutic wellness devices
TWI502468B (en) 2009-11-10 2015-10-01 Acer Inc Mobile electronic device and method for controlling 3d operation interface thereof
US20120287304A1 (en) * 2009-12-28 2012-11-15 Cyber Ai Entertainment Inc. Image recognition system
US20110221672A1 (en) 2010-02-28 2011-09-15 Osterhout Group, Inc. Hand-worn control device in an augmented reality eyepiece
US20130328925A1 (en) 2012-06-12 2013-12-12 Stephen G. Latta Object focus in a mixed reality environment
US20150205484A1 (en) * 2012-07-27 2015-07-23 Nec Solution Innovators, Ltd. Three-dimensional user interface apparatus and three-dimensional operation method
US8836768B1 (en) 2012-09-04 2014-09-16 Aquifi, Inc. Method and system enabling natural user interface gestures with user wearable glasses
US20140160001A1 (en) 2012-12-06 2014-06-12 Peter Tobias Kinnebrew Mixed reality presentation
TWI610218B (en) 2013-03-25 2018-01-01 三星電子股份有限公司 Apparatus and method of controlling screens in a device
CN103955267A (en) 2013-11-13 2014-07-30 上海大学 Double-hand man-machine interaction method in x-ray fluoroscopy augmented reality system
US20150178939A1 (en) 2013-11-27 2015-06-25 Magic Leap, Inc. Virtual and augmented reality systems and methods
US10092480B2 (en) * 2013-12-11 2018-10-09 Luraco, Inc. Touchscreen-based control system for massage chairs
CN104850221A (en) 2014-02-14 2015-08-19 欧姆龙株式会社 Gesture recognition device and method of controlling gesture recognition device
US20170262168A1 (en) 2014-08-29 2017-09-14 Hewlett-Packard Development Company, Lp. Touchscreen gestures
CN107430437A (en) 2015-02-13 2017-12-01 厉动公司 The system and method that real crawl experience is created in virtual reality/augmented reality environment
US20170053452A1 (en) * 2015-08-17 2017-02-23 Colopl, Inc. Method and apparatus for providing virtual space, and non-transitory computer readable data storage medium storing program causing computer to perform method
US20190179511A1 (en) * 2016-08-31 2019-06-13 Sony Corporation Information processing device, information processing method, and program
TWM537702U (en) 2016-11-19 2017-03-01 Ping-Yuan Tsai Augmented reality learning and reference system and architecture thereof
CN106530926A (en) 2016-11-29 2017-03-22 东南大学 Virtual hand prosthesis training platform and training method thereof based on Myo armband and eye tracking

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Author Unknown, "A Survey of 3D Object Selection Techniques for Virtual Environments," Preprint submitted to Computer & Graphics, Nov. 20, 2012, pp. 1-18.
Deng et al., "Understanding the impact of multimodal interaction using gaze informed mid-air gesture control in 3D virtual objects manipulation," Int. J. Human-Computer Studies, vol. 105, 2017 (available online Apr. 27, 2017), pp. 68-80.
Kölsch et al., "Vision-Based Interfaces for Mobility," Proceedings of the First Annual International Conference on Mobile and Ubiquitous Systems: Networking and Services, 2004, 9 pages.
Park et al., "Defining Rules Among Devices in Smart Environment Using an Augmented Reality Headset," Urb-IoT, Tokyo, Japan, May 24-25, 2016, 4 pages.

Also Published As

Publication number Publication date
US20190324550A1 (en) 2019-10-24


Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

AS Assignment

Owner name: INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE, TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LIAO, DUAN-LI;LI, CHIA-CHANG;TING, WEN-HUNG;AND OTHERS;REEL/FRAME:046331/0195

Effective date: 20180703

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4