
WO2016068015A1 - Coordinate acquisition device and display device - Google Patents

Coordinate acquisition device and display device Download PDF

Info

Publication number
WO2016068015A1
WO2016068015A1 (PCT/JP2015/079840, JP2015079840W)
Authority
WO
WIPO (PCT)
Prior art keywords
user
display
coordinate
proximity point
unit
Prior art date
Application number
PCT/JP2015/079840
Other languages
French (fr)
Japanese (ja)
Inventor
洋一 久下
Original Assignee
シャープ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by シャープ株式会社
Publication of WO2016068015A1 publication Critical patent/WO2016068015A1/en

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means

Definitions

  • the present invention relates to a technique for specifying coordinates on a display screen of a display device based on the position of a user's hand in a three-dimensional space.
  • For example, Patent Document 1 (Japanese Patent Application Laid-Open No. 2012-215964) discloses a stereoscopic image display operation device with which coordinate information can be input via a touch panel by displaying a stereoscopic image and allowing the user to touch the imaging position of the stereoscopic image.
  • However, the above stereoscopic image display operation device is premised on input using a touch panel, so the user must move a finger or the like into a region close enough to the touch panel of the stereoscopic image display operation device to cause a capacitance change (electric field change).
  • An object of the present invention is to realize a coordinate acquisition device and a display device that can appropriately specify coordinates on the display screen of a display device based on the position of the user's hand in three-dimensional space, even from a remote position, without requiring a touch panel.
  • the first configuration is a coordinate acquisition device for acquiring coordinate information on the display surface of the display device, and includes an imaging unit, a sensor unit, and a calculation unit.
  • the imaging unit acquires a captured image by imaging a subject with a predetermined angle of view.
  • the sensor unit acquires the user proximity point, which is the position of the user's hand or the object being held by the user, and the distance to the user proximity point.
  • The calculation unit (1) detects the user's eyes in the captured image acquired by the imaging unit and, based on the detected position of both eyes on the captured image, the distance between both eyes, and the angle of view, calculates the distance from the own device to the plane containing the user's eyes; (2) based on the user proximity point acquired by the sensor unit, the distance from the own device to the user proximity point, and the horizontal and vertical widths of the display surface of the display device, calculates the overlapping view range, which is the range in which the user proximity point and the display surface of the display device appear to overlap when viewed from the user's viewpoint; and (3) when it determines that the user proximity point is within the overlapping view range, calculates and outputs the coordinate information of the user proximity point.
  • With this configuration, a coordinate acquisition device and a display device can be realized in which the coordinates on the display screen of the display device can be appropriately specified based on the position of the user's hand in the three-dimensional space, even from a remote position, without the need for a touch panel.
  • FIG. 1 is a schematic configuration diagram of a display device 1000 according to a first embodiment.
  • A flowchart of the process in which the display device 1000 (coordinate acquisition device 100) acquires the position of the user's hand in the three-dimensional space and acquires the coordinate information on the display surface Scn corresponding to the position of the hand.
  • Diagrams for explaining the method of specifying the x-coordinate position of the user's viewpoint and the methods of calculating the x-coordinates Xmin and Xmax of the lower and upper limit positions of the overlapping view range in the x-axis direction.
  • Diagrams for explaining the method of specifying the y-coordinate position of the user's viewpoint and the method of calculating the y-coordinate Ymin of the lower limit position of the overlapping view range in the y-axis direction.
  • a display device 1000 will be described as an example of a device using a coordinate acquisition device.
  • FIG. 1 is a schematic configuration diagram of the display device 1000.
  • the display device 1000 includes a coordinate acquisition device 100, a display control unit 4, and a display panel (eg, a liquid crystal display, an organic EL display, etc.) LCD.
  • the coordinate acquisition apparatus 100 includes an imaging unit 1, sensors DS1 and DS2, a sensor control unit 2, and a calculation unit 3.
  • the imaging unit 1 is installed, for example, in a part of the housing of the display panel LCD (for example, near the center of the upper peripheral portion (frame portion) of the display surface Scn) and images a subject (scene) that exists in the normal direction of the display surface Scn. That is, the imaging unit 1 captures an image of a subject facing the display surface Scn (for example, a user looking at the display surface Scn).
  • Sensor DS1 is, for example, a proximity sensor.
  • the sensor DS1 irradiates an electromagnetic wave such as infrared light, and receives the electromagnetic wave (reflected wave) reflected and returned by the object. Then, the sensor DS1 outputs a signal corresponding to the received light amount of the reflected wave (a signal converted into a predetermined physical quantity) to the sensor control unit 2.
  • the sensor DS1 and the sensor control unit 2 may capture reflected light such as infrared light emitted by the sensor DS1, and determine the distance based on the position and intensity.
  • Although this embodiment describes the case where there is one sensor as an example, the number of sensors is not limited to one and may be two or more. Furthermore, the position where the sensor is arranged may be determined according to the measurement object.
  • the sensor control unit 2 receives the signal output from the sensor DS1 (a signal corresponding to the received amount of the reflected wave) and acquires (specifies) the position of the measurement object (for example, the user's hand) based on that signal.
  • the sensor control unit 2 controls the intensity and irradiation angle of the electromagnetic waves (such as infrared light) emitted from the sensor DS1, and identifies information such as the received intensity and angle of the reflected wave from the object based on the signal output from the sensor DS1. Thereby, the sensor control unit 2 can acquire the position information of the measurement object (for example, the user's hand). The sensor control unit 2 then outputs the acquired position information of the measurement object to the calculation unit 3.
  • the calculation unit 3 receives the image (video) captured by the imaging unit 1 and the position information of the measurement object (for example, the user's hand) acquired by the sensor control unit 2. Based on these, the calculation unit 3 determines whether the position of the user's hand is at a position where it appears to overlap the display surface Scn of the display panel LCD as viewed from the user.
  • When the calculation unit 3 determines that the position of the user's hand is at a position where it appears, from the user's viewpoint, to overlap the display surface Scn of the display panel LCD, it calculates, based on the position of the user's hand, the coordinates on the display surface Scn corresponding to that position. The calculation unit 3 then outputs information on the calculated coordinates to the display control unit 4.
  • the display control unit 4 includes an input control unit 41, an application unit 42, and a display processing unit 43, as shown in FIG.
  • the input control unit 41 inputs information regarding coordinates output from the calculation unit 3.
  • the input control unit 41 outputs a control signal to the application unit 42 and/or the display processing unit 43 based on the coordinate information input from the calculation unit 3. For example, when coordinate information is input from the calculation unit 3 after a period in which no coordinate information has been input has continued for a certain time, the input control unit 41 generates the data and control signals necessary to update a predetermined display on the display surface Scn according to the coordinate information (for example, to execute update processing such as changing the color of an icon), and outputs the generated data and control signals to the application unit 42.
  • the input control unit 41 outputs a control signal for controlling the display processing unit 43 to the display processing unit 43.
  • the application unit 42 inputs data and control signals output from the input control unit 41.
  • the application unit 42 executes predetermined processing on the data output from the input control unit 41 in accordance with the control signal output from the input control unit 41.
  • For example, based on the coordinate information indicating the position of the user's hand output from the input control unit 41, the application unit 42 generates data for performing a predetermined display update (for example, an icon display update) at the corresponding coordinate position on the display surface Scn, and outputs the generated data to the display processing unit 43.
  • the display processing unit 43 inputs a control signal output from the input control unit 41 and data output from the application unit 42.
  • the display processing unit 43 controls the display panel LCD according to the control signal output from the input control unit 41 so that the display based on the data output from the application unit 42 is performed on the display surface Scn. That is, the display processing unit 43 outputs display data to be displayed on the display surface Scn of the display panel LCD. Further, the display processing unit 43 outputs a display control signal for causing the display data to be displayed on the display surface Scn of the display panel LCD to the display panel LCD.
  • the display panel LCD is realized by, for example, a liquid crystal display, an organic EL display, or the like.
  • the display panel LCD displays the display data output from the display processing unit 43 of the display control unit 4 on the display surface Scn based on the display control signal output from the display processing unit 43 of the display control unit 4.
  • The case where the display device 1000 (coordinate acquisition device 100) executes the process of acquiring the position of the user's hand (its position in the three-dimensional space) and acquiring the coordinate information on the display surface Scn corresponding to the position of the hand will now be described.
  • FIG. 2 is a flowchart of this process, in which the display device 1000 (coordinate acquisition device 100) acquires the position of the user's hand (its position in the three-dimensional space) and acquires (outputs) the coordinate information on the display surface Scn corresponding to the position of the hand.
  • In step S1, the imaging unit 1 images a subject (scene) that exists in the normal direction of the display surface Scn. That is, the imaging unit 1 captures a subject or the like facing the display surface Scn (for example, a user looking at the display surface Scn) and acquires a captured image. Note that the imaging unit 1 acquires the captured image at a horizontal angle of view α.
  • In step S2, the calculation unit 3 detects the position of the user's eyes from the captured image acquired by the imaging unit 1.
  • the computing unit 3 detects the position of the user's eyes by specifying an image region having human eye characteristics from the captured image by matching processing or the like.
  • In step S3, if the user's eyes are detected (“Yes” in step S3), the process proceeds to step S4; if the user's eyes are not detected (“No” in step S3), the process is terminated.
  • In step S4, the calculation unit 3 calculates the distance D from the imaging unit 1 to the user. This will be described with reference to FIG. 3.
  • FIG. 3 is a diagram for explaining a method of calculating the distance D from the imaging unit 1 to the user's position using the captured image Img1 acquired by the imaging unit 1 at the horizontal angle of view α.
  • It is assumed that the center of both eyes of the user is on the optical axis of the optical system of the imaging unit 1 and that the distance from the imaging unit 1 to the center point of both eyes of the user is D.
  • The captured image Img1 illustrated in FIG. 3 is an image acquired (captured) at the horizontal angle of view α by the imaging unit 1 in this situation.
  • Let Wc be the horizontal width of the captured image Img1 and d0 the distance between both eyes of the user on the captured image Img1.
  • The calculation unit 3 calculates the ratio P of the inter-eye distance d0 to the horizontal width Wc of the captured image Img1 and, by executing processing corresponding to the equation, calculates the distance D from the imaging unit 1 to the user's position (a minimal sketch of this step is given below).
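  • As a concrete illustration (not part of the patent text), the following Python sketch computes D under the assumption of a typical real-world inter-eye distance; the constant REAL_EYE_GAP_M, the function name, and its parameters are illustrative, and α is used as in the relation Wc = 2D·tan(α) that appears later in the text.

```python
import math

# Assumed average real-world distance between a user's eyes (metres);
# the excerpt does not state the value actually used.
REAL_EYE_GAP_M = 0.065

def estimate_distance_to_user(d0_px, wc_px, alpha_rad, real_eye_gap_m=REAL_EYE_GAP_M):
    """Estimate the distance D from the imaging unit to the plane of the user's eyes.

    d0_px     : distance between both eyes on the captured image (pixels)
    wc_px     : horizontal width of the captured image Img1 (pixels)
    alpha_rad : angle used in the relation Wc = 2 * D * tan(alpha)
    """
    p = d0_px / wc_px                      # ratio P of the eye gap to the image width
    # The real-world width seen at distance D is Wc = 2 * D * tan(alpha),
    # and the eyes span the fraction P of it: real_eye_gap = P * Wc.
    return real_eye_gap_m / (2.0 * p * math.tan(alpha_rad))

# Example: eyes 120 px apart in a 1920 px wide frame, alpha = 30 degrees -> D ≈ 0.90 m
print(estimate_distance_to_user(120, 1920, math.radians(30)))
```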
  • In step S5, the sensor DS1 emits an electromagnetic wave such as infrared light and receives the electromagnetic wave (reflected wave) reflected back by the object. The sensor DS1 then outputs a signal corresponding to the received amount of the reflected wave (a signal converted into a predetermined physical quantity) to the sensor control unit 2.
  • the sensor control unit 2 acquires (identifies) the position of the measurement object (for example, the user's hand) based on the signal input from the sensor DS1 (signal corresponding to the amount of reflected wave received).
  • the sensor control unit 2 controls the intensity and irradiation angle of the electromagnetic waves (such as infrared light) emitted from the sensor DS1, and identifies information such as the received intensity and angle of the reflected wave from the object based on the signal output from the sensor DS1. Thereby, the sensor control unit 2 acquires the position information of the measurement object (for example, the user's hand) and outputs it to the calculation unit 3.
  • FIG. 4 is a diagram schematically showing the display surface Scn of the display panel LCD, the sensor DS1, and the user's hand as a measurement object.
  • FIG. 4 is a view of the sensor DS1 and the user's hand as the measurement object as seen from above. As shown in FIG. 4, the x-axis, y-axis, and z-axis are set, and the x-coordinate of the position of the sensor DS1 is set to “0” (the origin).
  • the sensor DS1 receives the reflected wave from the point P0 that is the tip position of the user's hand.
  • Based on the received amount of the reflected wave from the point P0, which is the tip position of the user's hand, the sensor control unit 2 acquires (1) the distance Z0 in the z-axis direction from the sensor DS1 to the point P0 and (2) the x-coordinate X0 of the point P0.
  • the distance Z0 is the distance in the z-axis direction from the sensor DS1 to the point P0 that is the tip position of the user's hand, and is therefore equal to the distance from the display surface Scn of the display panel LCD to the point P0.
  • In step S6, the calculation unit 3 specifies the overlapping view range for the case where the user's finger (the portion closest to the display surface Scn) exists at the distance Z0 from the display surface Scn.
  • Here, the “overlapping view range” is the range in which the user's finger (the tip of the hand, which may also be the tip of a pointer, pen, or the like) appears to overlap the display surface Scn when the user views the display surface Scn.
  • FIG. 5 is a diagram for explaining a method of specifying the x coordinate position of the user's viewpoint.
  • FIG. 5 is a diagram seen from above, and schematically shows the imaging unit 1, the display surface Scn of the display panel LCD, and a plane including both eyes of the user.
  • FIG. 5 also schematically shows the captured image Img2 captured by the imaging unit 1 at the horizontal angle of view α. Note that the captured image Img2 is illustrated as an image (mirror image) in which left and right are reversed in order to match the positional relationship of the x-axis.
  • The x-axis, y-axis, and z-axis are set, and the x-coordinate of the position of the imaging unit 1 (the optical axis position of the optical system of the imaging unit 1) is set to “0” (the origin).
  • Let X1 be the x-coordinate of the center position of both eyes of the user, Xpic the horizontal width of the captured image Img2, and Xpc the length in the x-axis direction (horizontal direction) from the center point of the captured image to the center position of both eyes of the user.
  • The calculation unit 3 calculates the length Wc by a process corresponding to Wc = 2D·tan(α) and, using Xpic and Xpc, specifies the x-coordinate position of the user's viewpoint (the center point of both eyes of the user), as sketched below.
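  • The helper below is an illustrative interpretation of this step: it maps the pixel offset of the eye midpoint linearly onto the real-space width Wc. The function name and parameters are not from the patent, and the y-coordinate Y1 is obtained analogously from Hc, Hpic, and Ypc.

```python
import math

def viewpoint_x(xpc_px, xpic_px, distance_d, alpha_rad):
    """x-coordinate X1 of the user's viewpoint (midpoint of both eyes).

    xpc_px     : horizontal offset (pixels) of the eye midpoint from the image centre
    xpic_px    : horizontal width of the captured image Img2 (pixels)
    distance_d : distance D from the imaging unit to the plane of the eyes
    alpha_rad  : angle used in Wc = 2 * D * tan(alpha)
    """
    wc = 2.0 * distance_d * math.tan(alpha_rad)   # real-space width covered at distance D
    return wc * (xpc_px / xpic_px)                # linear pixel-to-metre mapping

# Example: eye midpoint 200 px right of centre in a 1920 px frame, D = 2 m, alpha = 30 deg
print(viewpoint_x(200, 1920, 2.0, math.radians(30)))   # ≈ 0.24 m
```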
  • FIG. 6 is a diagram for explaining a method of calculating the x-coordinate Xmin of the lower limit position in the x-axis direction of the overlapping view range.
  • FIG. 7 is a diagram for explaining a method of calculating the x-coordinate Xmax of the upper limit position in the x-axis direction of the overlapping view range.
  • In the following, let Wd be the length of the display surface Scn in the x-axis direction, let the x-coordinate of the center position of the display surface Scn be “0”, and let X1 be the x-coordinate of the user's viewpoint P1.
  • the calculation unit 3 calculates the x-coordinate Xmin of the lower limit position in the x-axis direction of the overlapping view range and the x-coordinate Xmax of the upper limit position of the overlapping view range in the x-axis direction by a process corresponding to the above formula.
  • the x coordinate Xmin of the lower limit position in the x-axis direction of the overlapping view range and the x coordinate Xmax of the upper limit position in the x-axis direction of the overlapping view range are determined.
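  • The formula itself is not reproduced in this excerpt, but the limits follow from the similar-triangle geometry described above (viewpoint at x = X1 in the plane z = D, display of width Wd centred at x = 0 in the plane z = 0, finger plane at z = Z0). A hedged sketch with an illustrative helper name is given below; the same function applies to the y-axis with the vertical size Ht.

```python
def overlap_range(view_coord, distance_d, z0, display_extent):
    """Limits of the overlapping view range along one axis at the finger plane z = Z0.

    view_coord     : coordinate of the user's viewpoint along this axis (X1 or Y1)
    distance_d     : distance D from the display surface to the plane of the eyes
    z0             : distance Z0 from the display surface to the user's finger
    display_extent : size of the display surface along this axis (Wd or Ht)
    Returns (lower_limit, upper_limit), e.g. (Xmin, Xmax) for the x-axis.
    """
    t = (distance_d - z0) / distance_d        # fraction of the way from the viewpoint to the display
    lower = view_coord + t * (-display_extent / 2.0 - view_coord)
    upper = view_coord + t * (+display_extent / 2.0 - view_coord)
    return lower, upper
```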
  • FIG. 8 is a diagram for explaining a method of specifying the y coordinate position of the user's viewpoint.
  • FIG. 8 is a view seen from the side (x-axis direction), and schematically shows the imaging unit 1, the display surface Scn of the display panel LCD, and a plane including both eyes of the user.
  • FIG. 8 also schematically shows the captured image Img2 captured by the imaging unit 1 at the vertical angle of view β. Note that the captured image Img2 is shown with the positional relationship of the y-axis matched. Further, the captured image Img2 is illustrated as an image (mirror image) obtained by reversing left and right, as in FIG. 5.
  • the x-axis, y-axis, and z-axis are set, and the y-coordinate of the position of the imaging unit 1 (the optical axis position of the optical system of the imaging unit 1) is set to “0” (the origin ).
  • the y coordinate of the center point of the display surface Scn is also “0”.
  • Let Y1 be the y-coordinate of the center position of both eyes of the user, Hpic the vertical width of the captured image Img2, and Ypc the length in the y-axis direction (vertical direction) from the center point of the captured image to the center position of both eyes of the user.
  • The calculation unit 3 calculates the length Hc by a process corresponding to Hc = 2D·tan(β) and, using Hpic and Ypc, specifies the y-coordinate position of the user's viewpoint (the center point of both eyes of the user).
  • Next, the calculation unit 3 obtains the lower limit position and the upper limit position, in the y-axis direction, of the range in which the user's finger and the display surface Scn appear to overlap (the overlapping view range). Let Ymin be the y-coordinate of the lower limit position and Ymax the y-coordinate of the upper limit position of the overlapping view range in the y-axis direction.
  • FIG. 9 is a diagram for explaining a method for calculating the y-coordinate Ymin of the lower limit position in the y-axis direction of the overlapping view range.
  • FIG. 10 is a diagram for explaining a method of calculating the y-coordinate Ymax of the upper limit position in the y-axis direction of the overlapping view range.
  • In the following, let Ht be the length (vertical length) of the display surface Scn in the y-axis direction, let the y-coordinate of the center position of the display surface Scn be “0”, and let Y1 be the y-coordinate of the user's viewpoint P1.
  • the calculation unit 3 calculates the y-coordinate Ymin of the lower limit position in the y-axis direction of the overlapping view range and the y-coordinate Ymax of the upper limit position in the y-axis direction of the overlapping view range by a process corresponding to the above mathematical formula.
  • the y-coordinate Ymin of the lower limit position in the y-axis direction of the overlapping view range and the y-coordinate Ymax of the upper limit position of the overlapping view range in the y-axis direction are determined.
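  • A short worked example with purely illustrative numbers (not from the patent) shows how the four limits come out for both axes; the arithmetic mirrors the per-axis sketch given earlier.

```python
# Illustrative numbers only: viewpoint (X1, Y1) = (0.10, -0.05) m, D = 2.0 m,
# finger plane at Z0 = 0.5 m, display width Wd = 1.0 m and height Ht = 0.6 m
# (display centred at the origin of the x/y axes).
X1, Y1, D, Z0, Wd, Ht = 0.10, -0.05, 2.0, 0.5, 1.0, 0.6

t = (D - Z0) / D                                   # 0.75: fraction of the way to the display
Xmin, Xmax = X1 + t * (-Wd / 2 - X1), X1 + t * (Wd / 2 - X1)
Ymin, Ymax = Y1 + t * (-Ht / 2 - Y1), Y1 + t * (Ht / 2 - Y1)

print(Xmin, Xmax)   # ≈ -0.35, 0.40  (overlapping view range along the x-axis)
print(Ymin, Ymax)   # ≈ -0.24, 0.21  (overlapping view range along the y-axis)
```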
  • The overlapping view range is thus specified by the above processing. That is, by performing the above processing for the case where the user's finger (the portion closest to the display surface Scn) exists at the distance Z0 from the display surface Scn, the calculation unit 3 specifies (calculates) the x-coordinate Xmin of the lower limit position and the x-coordinate Xmax of the upper limit position of the overlapping view range in the x-axis direction, and the y-coordinate Ymin of the lower limit position and the y-coordinate Ymax of the upper limit position of the overlapping view range in the y-axis direction.
  • In step S7, the calculation unit 3 determines whether or not the user's finger (the portion closest to the display surface Scn) is within the overlapping view range. This will be described with reference to FIGS. 11 and 12.
  • FIG. 11 is a diagram for explaining processing for determining whether or not the x-coordinate of the position P0 of the user's finger (portion closest to the display surface Scn) is within the overlapping view range.
  • FIG. 12 is a diagram for explaining processing for determining whether or not the y coordinate of the position P0 of the user's finger (portion closest to the display surface Scn) is within the overlapping view range.
  • Specifically, the calculation unit 3 acquires the x-coordinate X0 of the position P0 of the user's finger and determines whether or not Xmin ≤ X0 ≤ Xmax is satisfied.
  • Similarly, the calculation unit 3 acquires the y-coordinate Y0 of the position P0 of the user's finger and determines whether or not Ymin ≤ Y0 ≤ Ymax is satisfied.
  • If both conditions are satisfied, the calculation unit 3 determines that the position P0 of the user's finger is within the overlapping view range (“Yes” in step S7) and advances the process to step S8.
  • Otherwise, the calculation unit 3 determines that the position P0 of the user's finger is not within the overlapping view range (“No” in step S7) and ends the process.
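  • The step S7 test itself is just the conjunction of the two inequalities above; a one-line illustrative helper:

```python
def finger_in_overlap_range(x0, y0, xmin, xmax, ymin, ymax):
    """Step S7: P0 = (X0, Y0) lies in the overlapping view range only if both axis tests hold."""
    return (xmin <= x0 <= xmax) and (ymin <= y0 <= ymax)
```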
  • In step S8, the calculation unit 3 calculates the coordinates (X coordinate, Y coordinate) of the corresponding point P2 on the display surface Scn corresponding to the position P0 of the user's finger (the portion closest to the display surface Scn).
  • the calculation process of the X coordinate of the point P2 will be described with reference to FIG.
  • In the following, the X coordinate on the display surface Scn is defined as “0” at the end of the display surface Scn in the negative x-axis direction and as “Wd” at the end in the positive x-axis direction (Wd: the length of the display surface Scn in the x-axis direction), and the X coordinate of the point P2 on the display surface Scn is denoted Xout.
  • The calculation unit 3 calculates the X coordinate Xout of the corresponding point P2 on the display surface Scn corresponding to the position P0 of the user's finger (the portion closest to the display surface Scn) by performing processing corresponding to the above formula.
  • Similarly, the Y coordinate on the display surface Scn is defined as “0” at the end of the display surface Scn in the negative y-axis direction (the lower end of the display surface Scn) and as “Ht” at the end in the positive y-axis direction (the upper end of the display surface Scn; Ht: the length of the display surface Scn in the y-axis direction), and the Y coordinate of the point P2 on the display surface Scn is denoted Yout.
  • The calculation unit 3 calculates the Y coordinate Yout of the corresponding point P2 on the display surface Scn corresponding to the position P0 of the user's finger (the portion closest to the display surface Scn) by performing a process corresponding to the above mathematical formula.
  • The calculation unit 3 outputs the X coordinate Xout and the Y coordinate Yout of the corresponding point P2 on the display surface Scn, obtained by the above processing for the position P0 of the user's finger (the portion closest to the display surface Scn), to the input control unit 41 of the display control unit 4 (step S9).
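  • The patent's own formula for Xout and Yout is not reproduced in this excerpt; the sketch below derives the corresponding point from the geometry described (extend the line from the viewpoint P1 through P0 to the display plane, then shift into display coordinates whose origin is the lower-left corner). The helper name and argument order are illustrative.

```python
def project_to_display(x0, y0, z0, x1, y1, distance_d, wd, ht):
    """Steps S8-S9 (sketch): coordinates (Xout, Yout) of the corresponding point P2.

    (x0, y0, z0) : user's finger position P0 (z0 = distance from the display surface)
    (x1, y1)     : user's viewpoint P1, lying in the eye plane at z = distance_d
    wd, ht       : horizontal and vertical sizes of the display surface
    """
    s = distance_d / (distance_d - z0)     # scale factor from the viewpoint to the display plane (z = 0)
    x_plane = x1 + s * (x0 - x1)           # intersection point in display-centred coordinates
    y_plane = y1 + s * (y0 - y1)
    xout = x_plane + wd / 2.0              # 0 at the left (negative-x) edge, Wd at the right edge
    yout = y_plane + ht / 2.0              # 0 at the lower edge, Ht at the upper edge
    return xout, yout
```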
  • In this way, the display device 1000 (coordinate acquisition device 100) acquires the position of the user's hand (its position in the three-dimensional space) and acquires (outputs) the coordinate information on the display surface Scn corresponding to the position of the hand.
  • the input control unit 41 outputs a control signal to the application unit 42 and/or the display processing unit 43 based on the coordinate information input from the calculation unit 3. For example, when coordinate information is input from the calculation unit 3 after a period in which no coordinate information has been input has continued for a certain time, the input control unit 41 generates the data and control signals necessary to update a predetermined display on the display surface Scn according to the coordinate information (for example, to execute update processing such as changing the color of an icon), and outputs the generated data and control signals to the application unit 42.
  • the input control unit 41 outputs a control signal for controlling the display processing unit 43 to the display processing unit 43.
  • the application unit 42 executes predetermined processing on the data output from the input control unit 41 in accordance with the control signal output from the input control unit 41. For example, based on the coordinate information indicating the position of the user's hand output from the input control unit 41, the application unit 42 generates data for performing a predetermined display update (for example, an icon display update) at the corresponding coordinate position on the display surface Scn, and outputs the generated data to the display processing unit 43.
  • the display processing unit 43 controls the display panel LCD according to the control signal output from the input control unit 41 so that the display based on the data output from the application unit 42 is performed on the display surface Scn. That is, the display processing unit 43 outputs display data to be displayed on the display surface Scn of the display panel LCD. Further, the display processing unit 43 outputs a display control signal for causing the display data to be displayed on the display surface Scn of the display panel LCD to the display panel LCD.
  • the display panel LCD displays the display data output from the display processing unit 43 of the display control unit 4 on the display surface Scn based on the display control signal output from the display processing unit 43 of the display control unit 4.
  • As described above, in the display device 1000 (coordinate acquisition device 100), the position information of the user's viewpoint is acquired from the captured image acquired by the imaging unit 1, and furthermore, the coordinate information of the position of the user's hand (the part closest to the display surface Scn) in the three-dimensional space is acquired by the sensor DS1 and the sensor control unit 2.
  • Then, based on the position information of the user's viewpoint and the coordinate information of the hand position (the portion closest to the display surface Scn), the calculation unit 3 specifies the overlapping view range, which is the range in which the position of the user's hand (the tip of the hand, or the tip of a pointer, pen, or the like) and the display surface Scn appear to overlap when the user views the display surface Scn.
  • In the display device 1000 (coordinate acquisition device 100), when the user's hand is within the overlapping view range, the calculation unit 3 calculates and outputs the coordinates (Xout, Yout) of the point where the straight line connecting the position of the user's viewpoint and the position of the user's hand intersects the display surface Scn.
  • Therefore, the display device 1000 does not require a touch panel, and the coordinate information on the display surface Scn corresponding to the position of the user's hand or the like can be appropriately specified (acquired) based on the position of the user's hand in the three-dimensional space, even from a position away from the display surface Scn.
  • When the calculation unit 3 detects the positions of the user's hand and face in the captured image and determines that the distance between the hand and the face is equal to or less than a predetermined value, the calculation unit 3 may refrain from executing the process of outputting the coordinate information in step S9.
  • Thereby, in the display device 1000, it is possible to suppress malfunctions when, for example, a movement of the user's hand is detected as a gesture and processing is executed according to the detected gesture. That is, by performing the processing described above, the display device 1000 can appropriately prevent a case in which the user unconsciously touches his or her face from being erroneously recognized as a predetermined gesture and causing a malfunction.
  • Also, when the calculation unit 3 detects the position of the user's face in the captured image and determines that the distance from the own device to the user's face is greater than a predetermined value, the calculation unit 3 may not execute the process of outputting the coordinate information in step S9. Accordingly, it is possible to appropriately prevent the display device 1000 from erroneously recognizing the motion of the user's hand at a position farther than the predetermined distance as a predetermined gesture and malfunctioning.
  • Furthermore, when the position of the user's viewpoint does not change and the user's finger is moved (or reciprocated) from the point P0 to a point P3, the display control unit 4 may determine that a click operation has been performed, execute a predetermined process, and perform control so that a display according to the process is made on the display surface Scn.
  • When the calculation unit 3 detects the position of the user's face and determines that the position of the user's face is outside the overlapping view range, the calculation unit 3 may also refrain from executing the process of outputting the coordinate information in step S9.
  • the display control unit 4 may determine that the movement of the user's hand is a predetermined gesture based on the movement of the user's hand detected by the calculation unit 3.
  • the display control unit 4 may determine that the gesture is different when the position of the user's hand is different. That is, the display control unit 4 may execute different processes depending on the position of the user's hand even if the detected movement (gesture) of the user's hand is the same.
  • the function of the “sensor unit” is realized by the sensor DS1 and the sensor control unit 2, for example.
  • In the above embodiment, the imaging unit 1 is arranged outside the display panel LCD as shown in FIG. 1, but the present invention is not limited to this; for example, the imaging unit 1 may be arranged inside the housing of the display panel LCD.
  • The orientation of the optical axis and the angle of view of the optical system of the imaging unit 1 described above are examples given for convenience of explanation, and they may be changed; in that case, the coordinate acquisition device 100 may execute the calculation processing in consideration of the inclination of the optical axis of the optical system of the imaging unit 1.
  • part or all of the display device 1000 and the coordinate acquisition device 100 of the above embodiment may be realized as an integrated circuit (for example, an LSI, a system LSI, or the like).
  • Part or all of the processing of each functional block in the above embodiment may be realized by a program. And a part or all of the processing of each functional block of the above embodiment may be executed by a central processing unit (CPU) in the computer.
  • A program for performing each process may be stored in a storage device such as a hard disk or a ROM, and read out from the ROM or a RAM and executed by the central processing unit (CPU).
  • Each process of the above embodiment may be realized by hardware, or may be realized by software (including a case where it is realized together with an OS (operating system), middleware, or a predetermined library). Further, it may be realized by mixed processing of software and hardware.
  • execution order of the processing methods in the above embodiment is not necessarily limited to the description of the above embodiment, and the execution order can be changed without departing from the gist of the invention.
  • a computer program that causes a computer to execute the above-described method and a computer-readable recording medium that records the program are included in the scope of the present invention.
  • the computer-readable recording medium include a flexible disk, a hard disk, a CD-ROM, an MO, a DVD, a large-capacity DVD, a next-generation DVD, and a semiconductor memory.
  • the computer program is not limited to the one recorded on the recording medium, but may be transmitted via a telecommunication line, a wireless or wired communication line, a network represented by the Internet, or the like.
  • In the drawings, the dimensions of each member do not always faithfully represent the actual dimensions, dimensional ratios, and the like.
  • The first invention is a coordinate acquisition device for acquiring coordinate information on the display surface of a display device, and includes an imaging unit, a sensor unit, and a calculation unit.
  • the imaging unit acquires a captured image by imaging a subject with a predetermined angle of view.
  • the sensor unit acquires the user proximity point, which is the position of the user's hand or the object being held by the user, and the distance to the user proximity point.
  • The calculation unit (1) detects the user's eyes in the captured image acquired by the imaging unit and, based on the detected position of both eyes on the captured image, the distance between both eyes, and the angle of view, calculates the distance from the own device to the plane containing the user's eyes; (2) based on the user proximity point acquired by the sensor unit, the distance from the own device to the user proximity point, and the horizontal and vertical widths of the display surface of the display device, calculates the overlapping view range, which is the range in which the user proximity point and the display surface of the display device appear to overlap when viewed from the user's viewpoint; and (3) when it determines that the user proximity point is within the overlapping view range, calculates and outputs the coordinate information of the user proximity point.
  • In this coordinate acquisition device, the position information of the user's viewpoint (the center point of both eyes) is acquired from the captured image acquired by the imaging unit, and furthermore, the coordinate information of the user proximity point (for example, the part of the user's hand closest to the display surface) is acquired by the sensor unit.
  • Then, the overlapping view range, which is the range in which the user proximity point (for example, the tip of the user's finger, or of a pointing stick or pen held by the user) and the display surface appear to overlap, is specified.
  • When the user proximity point is within the overlapping view range, the calculation unit calculates and outputs the coordinates (Xout, Yout) of the point where the straight line connecting the user's viewpoint and the user proximity point intersects the display surface.
  • Therefore, with this coordinate acquisition device, the coordinate information on the display surface corresponding to the user proximity point (the position of the user's hand or the like) can be appropriately specified (acquired) based on the position of the user's hand in the three-dimensional space, even from a position away from the display surface.
  • the second invention is a display device comprising the coordinate acquisition device according to the first invention, a display panel, and a display control unit.
  • the display control unit is a display control unit that controls the display panel, and inputs the coordinate information of the user proximity point calculated by the calculation unit.
  • When the calculation unit detects the positions of the user's hand and face in the captured image acquired by the imaging unit and determines that the distance between the hand and the face is equal to or less than a predetermined value, it does not output the coordinate information of the user proximity point to the display control unit.
  • With this display device, it is possible to suppress malfunctions when, for example, a movement of the user's hand is detected as a gesture and processing is executed according to the detected gesture.
  • That is, the processing described above can appropriately prevent the display device from erroneously recognizing the case where the user unconsciously touches his or her face as a predetermined gesture and malfunctioning.
  • the third invention is a display device comprising the coordinate acquisition device according to the first invention, a display panel, and a display control unit.
  • the display control unit is a display control unit that controls the display panel, and inputs the coordinate information of the user proximity point calculated by the calculation unit.
  • The calculation unit detects the position of the user's face in the captured image acquired by the imaging unit, and does not output the coordinate information of the user proximity point to the display control unit if the distance from the own device to the user's face is greater than a certain value.
  • With this display device, it is possible to appropriately prevent the movement of the user's hand at a position farther than a predetermined distance from being erroneously recognized as a predetermined gesture and causing a malfunction.
  • the fourth invention is a display device comprising the coordinate acquisition device according to the first invention, a display panel, and a display control unit.
  • the display control unit is a display control unit that controls the display panel, and inputs the coordinate information of the user proximity point calculated by the calculation unit.
  • When the display control unit determines that (1) the ratio between the distance from the display surface to the user proximity point and the distance from the user proximity point to the plane containing the user's eyes has changed, and (2) the coordinates of the point where the straight line connecting the viewpoint (the center point of the user's eyes) and the user proximity point intersects the display surface have not changed, the display control unit determines that the user has performed a click operation and executes a predetermined process associated with the display at the coordinates of that point on the display surface.
  • the display control unit can determine that the operation is a click operation, execute a predetermined process, and control so that, for example, a display corresponding to the process is performed on the display surface.
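  • A minimal sketch of this click criterion is shown below; the state representation and the tolerance values are illustrative assumptions, since the excerpt states the two conditions only qualitatively.

```python
def is_click(prev, curr, coord_tol=0.01, ratio_tol=0.05):
    """Click detection sketch: condition (1) the ratio of (display -> proximity point)
    to (proximity point -> eye plane) distances has changed, and (2) the projected
    on-screen point has not moved.

    prev / curr : dicts with keys 'z0' (display -> proximity point distance),
                  'd' (display -> eye-plane distance), and 'xout', 'yout'
                  (projected coordinates on the display surface).
    """
    def ratio(state):
        return state['z0'] / (state['d'] - state['z0'])

    ratio_changed = abs(ratio(curr) - ratio(prev)) > ratio_tol
    point_unchanged = (abs(curr['xout'] - prev['xout']) <= coord_tol and
                       abs(curr['yout'] - prev['yout']) <= coord_tol)
    return ratio_changed and point_unchanged
```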
  • As described above, according to the present invention, a coordinate acquisition device and a display device can be realized in which the coordinates on the display screen of the display device can be appropriately specified based on the position of the user's hand in the three-dimensional space, even from a remote position, without the need for a touch panel. The present invention is therefore useful in the display-device-related industrial field and can be implemented in this field.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

The purpose of the present invention is to realize a coordinate acquisition device with which it is possible to cause, inter alia, coordinates on the display screen of a display device, to be appropriately specified without requiring a touch panel on the basis of, inter alia, the position of a user's hand in three-dimensional space, even from a remote position. An image-capturing unit acquires a captured image according to a prescribed angle of view. A sensor unit (sensor (DS1) and a sensor control unit (2)) acquire: a user proximity point, which is the position of, inter alia, the hand of the user; and a distance to the user proximity point. An arithmetic unit (3) calculates the distance from the device to a plane that includes both eyes of the user on the basis of the captured image, and calculates an overlapping range of visual fields, which is the range in which the user proximity point and the display face of the display device appear to be overlapping from the user's visual point, on the basis of: the user proximity point, which is acquired by the sensor unit; the distance to the user proximity point from the device; and the horizontal and vertical widths of the display face of the display device. When it is determined that the user proximity point is within the overlapping range of visual fields, the arithmetic unit (3) calculates the coordinate information of the user proximity point and outputs the result of the calculation.

Description

Coordinate acquisition device and display device
The present invention relates to a technique for specifying coordinates on a display screen of a display device based on the position of a user's hand in a three-dimensional space.
For example, Patent Document 1 (Japanese Patent Application Laid-Open No. 2012-215964) discloses a stereoscopic image display operation device with which coordinate information can be input via a touch panel by displaying a stereoscopic image and allowing the user to touch the imaging position of the stereoscopic image.
However, the above stereoscopic image display operation device is premised on input using a touch panel, so the user must move a finger or the like into a region close enough to the touch panel of the stereoscopic image display operation device to cause a capacitance change (electric field change).
That is, with the above stereoscopic image display operation device, it is difficult for the user to operate a display device having a large display screen by moving a finger or the like from a remote position.
In addition, a display device having a large display screen is usually viewed from a distance, so it is extremely inconvenient if it cannot be operated unless the user is close to the display device.
Furthermore, when a touch panel is mounted on a display device having a large display screen, the size of the touch panel also increases, resulting in higher cost.
In view of the above problems, an object of the present invention is to realize a coordinate acquisition device and a display device that can appropriately specify coordinates on the display screen of a display device based on the position of the user's hand in three-dimensional space, even from a remote position, without requiring a touch panel.
To solve the above problems, a first configuration is a coordinate acquisition device for acquiring coordinate information on the display surface of a display device, and includes an imaging unit, a sensor unit, and a calculation unit.
The imaging unit acquires a captured image by imaging a subject with a predetermined angle of view.
The sensor unit acquires a user proximity point, which is the position of the user's hand or of an object held by the user, and the distance to the user proximity point.
The calculation unit
(1) detects the user's eyes in the captured image acquired by the imaging unit and, based on the detected position of both eyes on the captured image, the distance between both eyes, and the angle of view, calculates the distance from the own device to the plane containing the user's eyes,
(2) based on the user proximity point acquired by the sensor unit, the distance from the own device to the user proximity point, and the horizontal and vertical widths of the display surface of the display device, calculates the overlapping view range, which is the range in which the user proximity point and the display surface of the display device appear to overlap when viewed from the user's viewpoint, and
(3) when it determines that the user proximity point is within the overlapping view range, calculates and outputs the coordinate information of the user proximity point.
According to the present invention, a coordinate acquisition device and a display device can be realized in which coordinates on the display screen of the display device can be appropriately specified based on the position of the user's hand in the three-dimensional space, even from a remote position, without requiring a touch panel.
FIG. 1 is a schematic configuration diagram of a display device 1000 according to the first embodiment.
FIG. 2 is a flowchart of the process in which the display device 1000 (coordinate acquisition device 100) according to the first embodiment acquires the position of the user's hand (its position in the three-dimensional space) and acquires (outputs) the coordinate information on the display surface Scn corresponding to the position of the hand.
FIG. 3 is a diagram for explaining the method of calculating the distance D from the imaging unit 1 to the user's position using the captured image Img1 acquired by the imaging unit 1 at the horizontal angle of view α.
FIG. 4 is a diagram schematically showing the display surface Scn of the display panel LCD, the sensor DS1, and the user's hand as a measurement object.
FIG. 5 is a diagram for explaining the method of specifying the x-coordinate position of the user's viewpoint.
FIG. 6 is a diagram for explaining the method of calculating the x-coordinate Xmin of the lower limit position of the overlapping view range in the x-axis direction.
FIG. 7 is a diagram for explaining the method of calculating the x-coordinate Xmax of the upper limit position of the overlapping view range in the x-axis direction.
FIG. 8 is a diagram for explaining the method of specifying the y-coordinate position of the user's viewpoint.
FIG. 9 is a diagram for explaining the method of calculating the y-coordinate Ymin of the lower limit position of the overlapping view range in the y-axis direction.
FIG. 10 is a diagram for explaining the method of calculating the y-coordinate Ymax of the upper limit position of the overlapping view range in the y-axis direction.
FIG. 11 is a diagram for explaining the process of determining whether the x-coordinate of the position P0 of the user's finger (the portion closest to the display surface Scn) is within the overlapping view range.
FIG. 12 is a diagram for explaining the process of determining whether the y-coordinate of the position P0 of the user's finger (the portion closest to the display surface Scn) is within the overlapping view range.
[First Embodiment]
The first embodiment will be described below with reference to the drawings.
Hereinafter, a display device 1000 will be described as an example of a device using the coordinate acquisition device.
<1.1: Configuration of the display device>
FIG. 1 is a schematic configuration diagram of the display device 1000.
As shown in FIG. 1, the display device 1000 includes a coordinate acquisition device 100, a display control unit 4, and a display panel LCD (for example, a liquid crystal display or an organic EL display).
As shown in FIG. 1, the coordinate acquisition device 100 includes an imaging unit 1, sensors DS1 and DS2, a sensor control unit 2, and a calculation unit 3.
The imaging unit 1 is installed, for example, in a part of the housing of the display panel LCD (for example, near the center of the upper peripheral portion (frame portion) of the display surface Scn) and images a subject (scene) that exists in the normal direction of the display surface Scn. That is, the imaging unit 1 captures an image of a subject facing the display surface Scn (for example, a user looking at the display surface Scn).
The sensor DS1 is, for example, a proximity sensor. The sensor DS1 emits an electromagnetic wave such as infrared light and receives the electromagnetic wave (reflected wave) reflected back by the object. The sensor DS1 then outputs a signal corresponding to the received amount of the reflected wave (a signal converted into a predetermined physical quantity) to the sensor control unit 2. Alternatively, the sensor DS1 and the sensor control unit 2 may capture the reflected light of the infrared light or the like emitted by the sensor DS1 and determine the distance from its position and intensity.
Although this embodiment describes the case where there is one sensor as an example, the number of sensors is not limited to one and may be two or more. Furthermore, the position where the sensor is arranged may be determined according to the measurement object.
The sensor control unit 2 receives the signal output from the sensor DS1 (the signal corresponding to the received amount of the reflected wave) and acquires (specifies) the position of the measurement object (for example, the user's hand) based on that signal. The sensor control unit 2 controls the intensity and irradiation angle of the electromagnetic waves (such as infrared light) emitted from the sensor DS1, and identifies information such as the received intensity and angle of the reflected wave from the object based on the signal output from the sensor DS1. Thereby, the sensor control unit 2 can acquire the position information of the measurement object (for example, the user's hand). The sensor control unit 2 then outputs the acquired position information of the measurement object to the calculation unit 3.
The calculation unit 3 receives the image (video) captured by the imaging unit 1 and the position information of the measurement object (for example, the user's hand) acquired by the sensor control unit 2. Based on these, the calculation unit 3 determines whether the position of the user's hand is at a position where it appears, as viewed from the user, to overlap the display surface Scn of the display panel LCD. When the calculation unit 3 determines that the position of the user's hand is at such a position, it calculates, based on the position of the user's hand, the coordinates on the display surface Scn corresponding to that position, and outputs information on the calculated coordinates to the display control unit 4.
As shown in FIG. 1, the display control unit 4 includes an input control unit 41, an application unit 42, and a display processing unit 43.
The input control unit 41 receives the coordinate information output from the calculation unit 3 and, based on it, outputs control signals to the application unit 42 and/or the display processing unit 43. For example, when coordinate information is input from the calculation unit 3 after a period of a certain length during which no coordinate information was input, the input control unit 41 generates, according to that coordinate information, the data and control signals needed to update a predetermined display on the display surface Scn (for example, to execute an update such as changing the color of an icon), and outputs the generated data and control signals to the application unit 42.
The input control unit 41 also outputs, to the display processing unit 43, a control signal for controlling the display processing unit 43.
The application unit 42 receives the data and control signals output from the input control unit 41 and executes predetermined processing on the data in accordance with the control signals. For example, based on coordinate information output from the input control unit 41 that indicates the position of the user's hand, the application unit 42 generates data for performing a predetermined display update (for example, an icon display update) at that coordinate position on the display surface Scn, and outputs the generated data to the display processing unit 43.
The display processing unit 43 receives the control signal output from the input control unit 41 and the data output from the application unit 42. In accordance with the control signal, the display processing unit 43 controls the display panel LCD so that a display based on the data output from the application unit 42 is produced on the display surface Scn. That is, the display processing unit 43 outputs display data to be displayed on the display surface Scn of the display panel LCD, and also outputs to the display panel LCD a display control signal for causing that display data to be displayed on the display surface Scn.
The display panel LCD is realized by, for example, a liquid crystal display or an organic EL display. Based on the display control signal output from the display processing unit 43 of the display control unit 4, the display panel LCD displays the display data output from the display processing unit 43 on the display surface Scn.
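For reference, the flow of coordinate information through the display control unit 4 described above can be summarized as a small sketch. The class and method names below (InputControlUnit, ApplicationUnit, on_coordinates, and so on), the 30-frame idle threshold, and the use of Python are illustrative assumptions that do not appear in the embodiment.

    # Minimal sketch of the data flow inside the display control unit 4.
    # All names and thresholds are illustrative; they are not defined in the embodiment.

    class ApplicationUnit:
        """Corresponds to the application unit 42: builds display-update data."""
        def handle(self, coord):
            x, y = coord
            # e.g., prepare data that changes the color of the icon at (x, y)
            return {"icon_at": (x, y), "color": "highlight"}

    class DisplayProcessingUnit:
        """Corresponds to the display processing unit 43: drives the display panel."""
        def render(self, data):
            print(f"render on Scn: {data}")

    class InputControlUnit:
        """Corresponds to the input control unit 41: routes coordinates from the
        calculation unit 3 to the application unit / display processing unit."""
        def __init__(self, app, disp):
            self.app, self.disp = app, disp
            self.idle_frames = 0

        def on_coordinates(self, coord):
            if coord is None:               # no coordinate information this frame
                self.idle_frames += 1
                return
            # coordinates arrived after an idle period -> trigger a display update
            if self.idle_frames > 30:       # "certain period", assumed to be 30 frames
                self.disp.render(self.app.handle(coord))
            self.idle_frames = 0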
<1.2: Operation of the display device>
The operation of the display device 1000 configured as described above will be described below.
In the following, the case is described in which the display device 1000 (coordinate acquisition device 100) executes processing that acquires the position of the user's hand (its position in three-dimensional space) and acquires coordinate information on the display surface Scn corresponding to that hand position.
FIG. 2 is a flowchart of the processing in which the display device 1000 (coordinate acquisition device 100) acquires the position of the user's hand (its position in three-dimensional space) and acquires (outputs) coordinate information on the display surface Scn corresponding to that hand position.
(S1):
In step S1, the imaging unit 1 images a subject (scene) located in the normal direction of the display surface Scn. That is, the imaging unit 1 images a subject facing the display surface Scn (for example, a user looking at the display surface Scn) and acquires a captured image. The imaging unit 1 acquires the captured image at a horizontal angle of view α.
(S2, S3):
In step S2, the calculation unit 3 detects the positions of the user's eyes from the captured image acquired by the imaging unit 1. For example, the calculation unit 3 detects the positions of the user's eyes by identifying, through matching processing or the like, an image region having the characteristics of human eyes.
If the user's eyes are detected ("Yes" in step S3), the process proceeds to step S4; if the user's eyes are not detected ("No" in step S3), the process is terminated.
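The embodiment only states that the eyes are found by matching processing or the like and does not prescribe a specific algorithm. As one concrete possibility, the eye positions could be obtained with an off-the-shelf cascade classifier; the following sketch assumes OpenCV and its bundled Haar cascade for eyes, and the cascade file and detection parameters are assumptions rather than part of the embodiment.

    import cv2

    # A possible realization of step S2: find eye regions with a Haar cascade.
    eye_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")

    def detect_eye_centers(frame_bgr):
        """Return the center pixel of each detected eye rectangle."""
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        return [(x + w // 2, y + h // 2) for (x, y, w, h) in eyes]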
(S4):
In step S4, the calculation unit 3 calculates the distance D from the imaging unit 1 to the user. This is described with reference to FIG. 3.
FIG. 3 is a diagram for explaining the method of calculating the distance D from the imaging unit 1 to the user's position using the captured image Img1 acquired by the imaging unit 1 at the horizontal angle of view α. In the following description, the center between the user's two eyes is assumed to lie on the optical axis of the optical system of the imaging unit 1, and the distance from the imaging unit 1 to the center point between the user's eyes is D. The captured image Img1 shown in FIG. 3 is the image acquired (captured) by the imaging unit 1 at the horizontal angle of view α in the situation of FIG. 3.
As shown in FIG. 3, let Wc be the horizontal width of the range captured in the image Img1 at the user's distance, let d0 be the distance between the user's two eyes, and let P be the ratio of the eye-to-eye distance d0 to the width Wc. Then
  Wc = 2 × D × tan(α/2)       (1)
  P = d0 / Wc                 (2)
and therefore the calculation unit 3 calculates the distance D from the imaging unit 1 to the user's position as
  D = d0 / (P × 2 × tan(α/2))   (3)
Since the distance between a person's two eyes is approximately 6 cm, the calculation unit 3 can calculate the distance D from the imaging unit 1 to the user's position by setting
  d0 = 6 [cm]
That is, in the captured image Img1 acquired at the horizontal angle of view α, the calculation unit 3 calculates the ratio P of the distance d0 between the user's eyes to the horizontal width Wc of the captured range, and calculates the distance D from the imaging unit 1 to the user's position by executing processing corresponding to equation (3).
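Expressed as code, the distance estimate of equations (1) to (3) reduces to a few lines. The sketch below assumes the two eye centers have already been detected as pixel x coordinates; the function and argument names are illustrative, and only the default d0 = 6 cm is taken from the text.

    import math

    def distance_to_user(eye_left_px, eye_right_px, image_width_px,
                         horizontal_fov_deg, d0_cm=6.0):
        """Estimate the camera-to-user distance D (equation (3)).

        eye_left_px, eye_right_px : x coordinates of the detected eyes [pixels]
        image_width_px            : horizontal width of the captured image [pixels]
        horizontal_fov_deg        : horizontal angle of view alpha [degrees]
        d0_cm                     : assumed interocular distance (about 6 cm)
        """
        # P = d0 / Wc, measured as a pixel ratio on the captured image (equation (2))
        p = abs(eye_right_px - eye_left_px) / image_width_px
        alpha = math.radians(horizontal_fov_deg)
        # D = d0 / (P * 2 * tan(alpha / 2))  (equation (3))
        return d0_cm / (p * 2.0 * math.tan(alpha / 2.0))

    # Example: eyes 120 px apart in a 1920 px wide image, 60-degree horizontal FOV
    # print(distance_to_user(900, 1020, 1920, 60.0))   # roughly 83 cm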
(S5):
In step S5, the sensor DS1 emits an electromagnetic wave such as infrared light and receives the wave (reflected wave) reflected back from the object. The sensor DS1 then outputs a signal corresponding to the amount of reflected light received (a signal converted into a predetermined physical quantity) to the sensor control unit 2.
Based on the signal input from the sensor DS1 (the signal corresponding to the amount of reflected light received), the sensor control unit 2 acquires (specifies) the position of the object to be measured (for example, the user's hand). The sensor control unit 2 controls the intensity, irradiation angle, and the like of the electromagnetic wave (e.g., infrared light) emitted from the sensor DS1, and specifies information such as the received intensity and angle of the wave reflected from the object based on the signal output from the sensor DS1. The sensor control unit 2 thereby acquires the position information of the object to be measured (for example, the user's hand), and outputs the acquired position information to the calculation unit 3.
FIG. 4 is a diagram schematically showing the display surface Scn of the display panel LCD, the sensor DS1, and the user's hand as the object to be measured, viewed from above. As shown in FIG. 4, an x-axis, a y-axis, and a z-axis are set, and the x coordinate of the position of the sensor DS1 is taken as "0" (the origin).
The sensor DS1 receives the reflected wave from the point P0, which is the tip position of the user's hand. Based on the amount of reflected light received from the point P0 at the tip of the user's hand, the sensor control unit 2 acquires
(1) the distance Z0 in the z-axis direction from the sensor DS1 to the point P0, which is the tip position of the user's hand, and
(2) the x coordinate X0 of the point P0.
Since the distance Z0 is the distance in the z-axis direction from the sensor DS1 to the point P0 at the tip of the user's hand, it is equal to the distance from the display surface Scn of the display panel LCD to the point P0.
(S6):
In step S6, the calculation unit 3 specifies the overlapping view range for the case where the user's finger (the part closest to the display surface Scn) is at the distance Z0 from the display surface Scn.
Here, the "overlapping view range" is the range within which, when the user looks at the display surface Scn, the user's finger (the tip of the hand, which may also be the tip of a pointer, a pen, or the like) appears to overlap the display surface Scn.
The method of specifying the overlapping view range is described below.
First, the method of specifying the x-coordinate position of the user's viewpoint will be described.
FIG. 5 is a diagram for explaining the method of specifying the x-coordinate position of the user's viewpoint.
FIG. 5 is a view from above, schematically showing the imaging unit 1, the display surface Scn of the display panel LCD, and the plane containing the user's two eyes. FIG. 5 also schematically shows the captured image Img2 captured by the imaging unit 1 at the horizontal angle of view α. The captured image Img2 is drawn left-right reversed (as a mirror image) so that its positional relationship matches the x-axis.
As shown in FIG. 5, an x-axis, a y-axis, and a z-axis are set, and the x coordinate of the position of the imaging unit 1 (the position of the optical axis of its optical system) is taken as "0" (the origin). Also as shown in FIG. 5, the x coordinate of the center position between the user's two eyes is X1, and the length in the x-axis direction of the range imaged at the horizontal angle of view α, on the plane at the distance D from the display surface Scn, is Wc.
Further, as shown in FIG. 5, the horizontal width of the captured image Img2 is Wpic, and the length in the x-axis direction (horizontal direction) from the center point of the captured image to the center position between the user's two eyes is Xpc.
With these definitions, the calculation unit 3 calculates the x coordinate X1 of the center point between the user's eyes as
  X1 = (Wc/2) × Xpc / (Wpic/2)
The calculation unit 3 calculates the length Wc as
  Wc = 2 × D × tan(α/2)
as in equation (1).
Through the above processing, the x-coordinate position of the user's viewpoint (the center point between the user's two eyes) is specified.
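A minimal sketch of this viewpoint calculation follows; it assumes that the offset Xpc of the eye midpoint from the image center is already measured in pixels, and the function and argument names are illustrative.

    import math

    def viewpoint_x(eye_center_offset_px, image_width_px, horizontal_fov_deg, dist_d):
        """x coordinate X1 of the midpoint between the user's eyes,
        with the optical axis of the imaging unit taken as x = 0."""
        # Wc: width covered by the image on the plane at the user's distance D
        wc = 2.0 * dist_d * math.tan(math.radians(horizontal_fov_deg) / 2.0)
        # X1 = (Wc / 2) * Xpc / (Wpic / 2)
        return (wc / 2.0) * eye_center_offset_px / (image_width_px / 2.0)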
Next, for the case where the user's finger (the part closest to the display surface Scn) is at the distance Z0 from the display surface Scn, the method of calculating the lower limit position in the x-axis direction (the lower limit of the overlapping view range in the x-axis direction) and the upper limit position (the upper limit of the overlapping view range in the x-axis direction) within which the user's finger appears to overlap the display surface Scn when viewed from the user's viewpoint P1 (the center point between the user's two eyes) is described. Let Xmin be the x coordinate of the lower limit position of the overlapping view range in the x-axis direction, and let Xmax be the x coordinate of its upper limit position in the x-axis direction.
FIG. 6 is a diagram for explaining the method of calculating the x coordinate Xmin of the lower limit position of the overlapping view range in the x-axis direction.
FIG. 7 is a diagram for explaining the method of calculating the x coordinate Xmax of the upper limit position of the overlapping view range in the x-axis direction.
In the following, the length of the display surface Scn in the x-axis direction (its horizontal width) is Wd, the x coordinate of the center position of the display surface Scn is "0", and the x coordinate of the user's viewpoint P1 is X1.
As shown in FIG. 6,
  D : (D − Z0) = (X1 − (−Wd/2)) : (X1 − Xmin)
so the x coordinate Xmin of the lower limit position of the overlapping view range in the x-axis direction can be calculated as
  Xmin = (Wd/2 + X1) × Z0/D − Wd/2
Also, as shown in FIG. 7,
  D : (D − Z0) = (Wd/2 − X1) : (Xmax − X1)
so the x coordinate Xmax of the upper limit position of the overlapping view range in the x-axis direction can be calculated as
  Xmax = (X1 − Wd/2) × Z0/D + Wd/2
The calculation unit 3 calculates the x coordinate Xmin of the lower limit position and the x coordinate Xmax of the upper limit position of the overlapping view range in the x-axis direction by processing corresponding to the above equations.
As described above, the x coordinate Xmin of the lower limit position and the x coordinate Xmax of the upper limit position of the overlapping view range in the x-axis direction are determined.
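As a small numerical illustration of the two proportions above (function and argument names are illustrative; the coordinate conventions follow FIGS. 6 and 7, with the display center at x = 0):

    def overlap_range_x(x1, dist_d, z0, wd):
        """x-axis extent (Xmin, Xmax) of the overlapping view range.

        x1     : x coordinate of the user's viewpoint (midpoint of the eyes)
        dist_d : distance D from the display surface to the plane of the eyes
        z0     : distance Z0 from the display surface to the user's fingertip
        wd     : horizontal width Wd of the display surface (center at x = 0)
        """
        xmin = (wd / 2.0 + x1) * z0 / dist_d - wd / 2.0
        xmax = (x1 - wd / 2.0) * z0 / dist_d + wd / 2.0
        return xmin, xmax

    # Example: viewpoint 10 cm right of center, D = 80 cm, fingertip 40 cm from
    # the screen, 100 cm wide display -> overlap range of roughly [-20, 30] cm
    # print(overlap_range_x(10, 80, 40, 100))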
Next, the method of specifying the y-coordinate position of the user's viewpoint will be described.
FIG. 8 is a diagram for explaining the method of specifying the y-coordinate position of the user's viewpoint.
FIG. 8 is a view from the side (from the x-axis direction), schematically showing the imaging unit 1, the display surface Scn of the display panel LCD, and the plane containing the user's two eyes. FIG. 8 also schematically shows the captured image Img2 captured by the imaging unit 1 at the vertical angle of view β. The captured image Img2 is shown with its y-axis positional relationship aligned and, as in FIG. 5, is drawn left-right reversed (as a mirror image).
As shown in FIG. 8, an x-axis, a y-axis, and a z-axis are set, and the y coordinate of the position of the imaging unit 1 (the position of the optical axis of its optical system) is taken as "0" (the origin). The y coordinate of the center point of the display surface Scn is also "0".
Also as shown in FIG. 8, the y coordinate of the center position between the user's two eyes is Y1, and the length in the y-axis direction of the range imaged at the vertical angle of view β, on the plane at the distance D from the display surface Scn, is Hc.
Further, as shown in FIG. 8, the vertical height of the captured image Img2 is Hpic, and the length in the y-axis direction (vertical direction) from the center point of the captured image to the center position between the user's two eyes is Ypc.
With these definitions, the calculation unit 3 calculates the y coordinate Y1 of the center point between the user's eyes as
  Y1 = (Hc/2) × Ypc / (Hpic/2)
The calculation unit 3 calculates the length Hc as
  Hc = 2 × D × tan(β/2)
analogously to equation (1).
Through the above processing, the y-coordinate position of the user's viewpoint (the center point between the user's two eyes) is specified.
Next, for the case where the user's finger (the part closest to the display surface Scn) is at the distance Z0 from the display surface Scn, the method of calculating the lower limit position in the y-axis direction (the lower limit of the overlapping view range in the y-axis direction) and the upper limit position (the upper limit of the overlapping view range in the y-axis direction) within which the user's finger appears to overlap the display surface Scn when viewed from the user's viewpoint P1 (the center point between the user's two eyes) is described. Let Ymin be the y coordinate of the lower limit position of the overlapping view range in the y-axis direction, and let Ymax be the y coordinate of its upper limit position in the y-axis direction.
FIG. 9 is a diagram for explaining the method of calculating the y coordinate Ymin of the lower limit position of the overlapping view range in the y-axis direction.
FIG. 10 is a diagram for explaining the method of calculating the y coordinate Ymax of the upper limit position of the overlapping view range in the y-axis direction.
In the following, the length of the display surface Scn in the y-axis direction (its vertical height) is Ht, the y coordinate of the center position of the display surface Scn is "0", and the y coordinate of the user's viewpoint P1 is Y1.
As shown in FIG. 9,
  D : (D − Z0) = (Y1 − (−Ht/2)) : (Y1 − Ymin)
so the y coordinate Ymin of the lower limit position of the overlapping view range in the y-axis direction can be calculated as
  Ymin = (Ht/2 + Y1) × Z0/D − Ht/2
Also, as shown in FIG. 10,
  D : (D − Z0) = (Ht/2 − Y1) : (Ymax − Y1)
so the y coordinate Ymax of the upper limit position of the overlapping view range in the y-axis direction can be calculated as
  Ymax = (Y1 − Ht/2) × Z0/D + Ht/2
The calculation unit 3 calculates the y coordinate Ymin of the lower limit position and the y coordinate Ymax of the upper limit position of the overlapping view range in the y-axis direction by processing corresponding to the above equations.
As described above, the y coordinate Ymin of the lower limit position and the y coordinate Ymax of the upper limit position of the overlapping view range in the y-axis direction are determined.
Through the above processing, the overlapping view range is specified. That is, by executing the above processing, the calculation unit 3 specifies (calculates), for the case where the user's finger (the part closest to the display surface Scn) is at the distance Z0 from the display surface Scn, (1) the x coordinate Xmin of the lower limit position and the x coordinate Xmax of the upper limit position of the overlapping view range in the x-axis direction, and (2) the y coordinate Ymin of the lower limit position and the y coordinate Ymax of the upper limit position of the overlapping view range in the y-axis direction.
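Combining the x- and y-direction results, the overlapping view range is a rectangle on the plane at the distance Z0; a minimal sketch under the same illustrative naming as above:

    def overlap_range(x1, y1, dist_d, z0, wd, ht):
        """Rectangle (xmin, xmax, ymin, ymax) of the overlapping view range at
        distance z0 from the display, for a viewpoint (x1, y1) at distance dist_d;
        wd and ht are the width and height of the display surface Scn."""
        xmin = (wd / 2.0 + x1) * z0 / dist_d - wd / 2.0
        xmax = (x1 - wd / 2.0) * z0 / dist_d + wd / 2.0
        ymin = (ht / 2.0 + y1) * z0 / dist_d - ht / 2.0
        ymax = (y1 - ht / 2.0) * z0 / dist_d + ht / 2.0
        return xmin, xmax, ymin, ymax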
(S7):
In step S7, the calculation unit 3 determines whether the user's finger (the part closest to the display surface Scn) is within the overlapping view range. This is described with reference to FIGS. 11 and 12.
FIG. 11 is a diagram for explaining the processing that determines whether the x coordinate of the position P0 of the user's finger (the part closest to the display surface Scn) is within the overlapping view range.
FIG. 12 is a diagram for explaining the processing that determines whether the y coordinate of the position P0 of the user's finger (the part closest to the display surface Scn) is within the overlapping view range.
The calculation unit 3 acquires the x coordinate X0 of the position P0 of the user's finger and determines whether it satisfies
  Xmin ≦ X0 ≦ Xmax
The calculation unit 3 also acquires the y coordinate Y0 of the position P0 of the user's finger and determines whether it satisfies
  Ymin ≦ Y0 ≦ Ymax
If both of the above inequalities are satisfied, the calculation unit 3 determines that the position P0 of the user's finger is within the overlapping view range ("Yes" in step S7) and advances the process to step S8.
Otherwise, the calculation unit 3 determines that the position P0 of the user's finger is not within the overlapping view range ("No" in step S7) and terminates the process.
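The test of step S7 is a simple containment check; a sketch under the same illustrative naming:

    def finger_in_overlap_range(x0, y0, xmin, xmax, ymin, ymax):
        """True if the fingertip position P0 = (x0, y0) lies inside the
        overlapping view range computed for its distance Z0 (step S7)."""
        return (xmin <= x0 <= xmax) and (ymin <= y0 <= ymax)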
(S8, S9):
In step S8, the calculation unit 3 calculates the coordinates (X coordinate and Y coordinate) of the corresponding point P2 on the display surface Scn that corresponds to the position P0 of the user's finger (the part closest to the display surface Scn).
First, the calculation of the X coordinate of the point P2 is described with reference to FIG. 11. As shown in FIG. 11, the X coordinate on the display surface Scn is defined so that the end of the display surface Scn on the negative x-axis side is "0" and the end on the positive x-axis side is "Wd" (Wd: the length of the display surface Scn in the x-axis direction). The X coordinate of the point P2 on the display surface Scn is denoted Xout in the following.
As shown in FIG. 11,
  D : (D − Z0) = Xout : (X0 − Xmin)
so the X coordinate Xout of the corresponding point P2 on the display surface Scn, corresponding to the position P0 of the user's finger (the part closest to the display surface Scn), is
  Xout = (X0 − Xmin) × D / (D − Z0)
That is, the calculation unit 3 calculates the X coordinate Xout of the corresponding point P2 on the display surface Scn, corresponding to the position P0 of the user's finger (the part closest to the display surface Scn), by performing processing corresponding to the above equation.
Next, the calculation of the Y coordinate of the point P2 is described with reference to FIG. 12. As shown in FIG. 12, the Y coordinate on the display surface Scn is defined so that the end of the display surface Scn on the negative y-axis side (the lower end of the display surface Scn) is "0" and the end on the positive y-axis side (the upper end of the display surface Scn) is "Ht" (Ht: the length of the display surface Scn in the y-axis direction). The Y coordinate of the point P2 on the display surface Scn is denoted Yout in the following.
As shown in FIG. 12,
  D : (D − Z0) = Yout : (Y0 − Ymin)
so the Y coordinate Yout of the corresponding point P2 on the display surface Scn, corresponding to the position P0 of the user's finger (the part closest to the display surface Scn), is
  Yout = (Y0 − Ymin) × D / (D − Z0)
That is, the calculation unit 3 calculates the Y coordinate Yout of the corresponding point P2 on the display surface Scn, corresponding to the position P0 of the user's finger (the part closest to the display surface Scn), by performing processing corresponding to the above equation.
The calculation unit 3 outputs the X coordinate Xout and the Y coordinate Yout of the corresponding point P2 on the display surface Scn, corresponding to the position P0 of the user's finger (the part closest to the display surface Scn) obtained by the above processing, to the input control unit 41 of the display control unit 4 (step S9).
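Steps S7 to S9 can be collected into a single mapping from the fingertip P0 to display coordinates; the following sketch uses illustrative names and returns None outside the overlapping view range.

    def fingertip_to_display_coords(x0, y0, z0, x1, y1, dist_d, wd, ht):
        """Map the fingertip P0 = (x0, y0) at distance z0 from the screen to
        display coordinates (Xout, Yout), given the viewpoint (x1, y1) at
        distance dist_d; wd and ht are the display width and height."""
        xmin = (wd / 2.0 + x1) * z0 / dist_d - wd / 2.0
        xmax = (x1 - wd / 2.0) * z0 / dist_d + wd / 2.0
        ymin = (ht / 2.0 + y1) * z0 / dist_d - ht / 2.0
        ymax = (y1 - ht / 2.0) * z0 / dist_d + ht / 2.0
        if not (xmin <= x0 <= xmax and ymin <= y0 <= ymax):
            return None                                  # outside the range: step S7 "No"
        x_out = (x0 - xmin) * dist_d / (dist_d - z0)     # step S8, X coordinate
        y_out = (y0 - ymin) * dist_d / (dist_d - z0)     # step S8, Y coordinate
        return x_out, y_out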
As described above, the display device 1000 (coordinate acquisition device 100) acquires the position of the user's hand (its position in three-dimensional space) and acquires (outputs) coordinate information on the display surface Scn corresponding to that hand position.
Based on the coordinate information input from the calculation unit 3, the input control unit 41 outputs control signals to the application unit 42 and/or the display processing unit 43. For example, when coordinate information is input from the calculation unit 3 after a period of a certain length during which no coordinate information was input, the input control unit 41 generates, according to that coordinate information, the data and control signals needed to update a predetermined display on the display surface Scn (for example, to execute an update such as changing the color of an icon), and outputs the generated data and control signals to the application unit 42.
The input control unit 41 also outputs, to the display processing unit 43, a control signal for controlling the display processing unit 43.
In accordance with the control signals output from the input control unit 41, the application unit 42 executes predetermined processing on the data output from the input control unit 41. For example, based on coordinate information output from the input control unit 41 that indicates the position of the user's hand, the application unit 42 generates data for performing a predetermined display update (for example, an icon display update) at that coordinate position on the display surface Scn, and outputs the generated data to the display processing unit 43.
In accordance with the control signal output from the input control unit 41, the display processing unit 43 controls the display panel LCD so that a display based on the data output from the application unit 42 is produced on the display surface Scn. That is, the display processing unit 43 outputs display data to be displayed on the display surface Scn of the display panel LCD, and also outputs to the display panel LCD a display control signal for causing that display data to be displayed on the display surface Scn.
Based on the display control signal output from the display processing unit 43 of the display control unit 4, the display panel LCD displays the display data output from the display processing unit 43 on the display surface Scn.
As described above, in the display device 1000 (coordinate acquisition device 100), the position information of the user's viewpoint is acquired from the captured image acquired by the imaging unit 1, and the coordinate information of the position of the user's hand in three-dimensional space (the part closest to the display surface Scn) is acquired by the sensor DS1 and the sensor control unit 2. Based on the position information of the user's viewpoint and the coordinate information of the hand position (the part closest to the display surface Scn), the calculation unit 3 specifies the overlapping view range, that is, the range within which the position of the user's hand (the tip of the hand, which may also be the tip of a pointer, a pen, or the like) appears to overlap the display surface Scn when the user looks at the display surface Scn. When the position of the user's hand (the part closest to the display surface Scn) is within the overlapping view range, the calculation unit 3 calculates and outputs the coordinates (Xout, Yout) of the point at which the straight line connecting the user's viewpoint and the position of the user's hand intersects the display surface Scn. Since the display device 1000 (coordinate acquisition device 100) does not require a touch panel, it can appropriately specify (acquire) the coordinate information on the display surface Scn corresponding to the position of the user's hand and the like, based on the position of the user's hand in three-dimensional space, even from a position away from the display surface Scn.
In the captured image acquired by the imaging unit 1, the positions of the user's hand and face may be detected, and when the distance between the hand and the face is equal to or less than a predetermined value, the calculation unit 3 may refrain from executing the processing of outputting the coordinate information in step S9. This makes it possible, for example, to suppress malfunctions when the display device 1000 detects the movement of the user's hand as a gesture and executes processing according to the detected gesture. That is, by processing in this way, the display device 1000 can be appropriately prevented from erroneously recognizing an unconscious movement of the user touching his or her face as a predetermined gesture and malfunctioning.
Also, in the captured image acquired by the imaging unit 1, the position of the user's face may be detected, and when the distance from the display device 1000 to the user's face is larger than a certain value (when the user's face is farther away than a predetermined distance), the calculation unit 3 may refrain from executing the processing of outputting the coordinate information in step S9. This appropriately prevents the display device 1000 from erroneously recognizing a hand movement of a user located farther away than the predetermined distance as a predetermined gesture and malfunctioning.
Furthermore, when (1) the ratio between the distance from the display surface Scn to the position of the user's hand (the distance in the z-axis direction) and the distance from the position of the user's hand to the face (the distance in the z-axis direction) has changed, and (2) the coordinates (Xout, Yout) of the point at which the straight line connecting the user's viewpoint and the position of the user's hand intersects the display surface Scn have not changed, the display control unit 4 may determine that a click operation has been performed, execute predetermined processing, and control the display so that a display corresponding to that processing is produced on the display surface Scn.
For example, as shown in FIGS. 11 and 12, when the position of the user's viewpoint does not change and the user's finger is moved (or moved back and forth) from the point P0 to the point P3, the display control unit 4 may determine that a click operation has been performed, execute predetermined processing, and control the display so that a display corresponding to that processing is produced on the display surface Scn.
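One way to realize this click determination is to track, frame by frame, the projected point (Xout, Yout) and the depth ratio, and to report a click when the ratio changes while the projected point stays put. The sketch below only illustrates that idea; the class name, tolerance values, and state handling are assumptions that do not appear in the text.

    class ClickDetector:
        """Detect a 'push' toward the screen that keeps the projected point fixed."""

        def __init__(self, point_tol=1.0, ratio_tol=0.1):
            self.point_tol = point_tol     # allowed drift of (Xout, Yout)
            self.ratio_tol = ratio_tol     # required change of the depth ratio
            self.prev = None               # (xout, yout, ratio) from the previous frame

        def update(self, xout, yout, z_screen_to_hand, z_hand_to_face):
            ratio = z_screen_to_hand / z_hand_to_face
            clicked = False
            if self.prev is not None:
                px, py, pratio = self.prev
                point_fixed = (abs(xout - px) <= self.point_tol and
                               abs(yout - py) <= self.point_tol)
                clicked = point_fixed and abs(ratio - pratio) >= self.ratio_tol
            self.prev = (xout, yout, ratio)
            return clicked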
Further, when the calculation unit 3 detects the position of the user's face and determines that the position of the user's face is outside the overlapping view range, the calculation unit 3 may refrain from executing the processing of outputting the coordinate information in step S9. In this case, based on the movement of the user's hand detected by the calculation unit 3, the display control unit 4 may determine that this hand movement constitutes a predetermined gesture.
Further, when the movement of the user's hand is the same but the position of the user's hand differs, the display control unit 4 may determine that the gestures are different. That is, even when the detected movement (gesture) of the user's hand is the same, the display control unit 4 may execute different processing depending on the position of the user's hand.
The function of the "sensor unit" is realized by, for example, the sensor DS1 and the sensor control unit 2.
[Other Embodiments]
In the above embodiment, the imaging unit 1 is arranged outside the display panel LCD as shown in FIG. 1; however, the arrangement is not limited to this. For example, the imaging unit 1 may be arranged inside the housing of the display panel LCD.
In the above embodiment, the orientation of the optical axis of the optical system of the imaging unit 1, the angle of view, and the like were described merely as examples for convenience of explanation, and they may be changed from those of the above embodiment. For example, when the optical axis of the optical system of the imaging unit 1 is inclined by a predetermined amount with respect to the z-axis, the coordinate acquisition device 100 may execute the calculation processing while taking that inclination of the optical axis into account.
Part or all of the display device 1000 and the coordinate acquisition device 100 of the above embodiment may be realized as an integrated circuit (for example, an LSI or a system LSI).
Part or all of the processing of each functional block of the above embodiment may be realized by a program, and part or all of that processing may be executed by a central processing unit (CPU) in a computer. A program for performing each processing may be stored in a storage device such as a hard disk or a ROM, and the central processing unit (CPU) may read the program from the ROM or a RAM and execute it.
Each processing of the above embodiment may be realized by hardware, or by software (including cases where it is realized together with an OS (operating system), middleware, or a predetermined library). It may also be realized by mixed processing of software and hardware.
The execution order of the processing methods in the above embodiment is not necessarily limited to the description of the above embodiment, and the execution order can be changed without departing from the gist of the invention.
A computer program that causes a computer to execute the above-described methods, and a computer-readable recording medium on which that program is recorded, fall within the scope of the present invention. Examples of such a computer-readable recording medium include a flexible disk, a hard disk, a CD-ROM, an MO, a DVD, a large-capacity DVD, a next-generation DVD, and a semiconductor memory.
The computer program is not limited to one recorded on such a recording medium, and may be transmitted via a telecommunication line, a wireless or wired communication line, a network such as the Internet, or the like.
In the above embodiment, some of the constituent members are shown in simplified form, with only the main members necessary for the embodiment depicted; therefore, arbitrary constituent members not explicitly described in the above embodiment may be provided. In the above embodiment and the drawings, the dimensions of the members do not always faithfully represent the actual dimensions, dimensional ratios, and the like.
The specific configuration of the present invention is not limited to the above-described embodiment, and various changes and modifications can be made without departing from the gist of the invention.
[Appendix]
The present invention can also be expressed as follows.
The first invention is a coordinate acquisition device for acquiring coordinate information on a display surface of a display device, and includes an imaging unit, a sensor unit, and a calculation unit.
The imaging unit acquires a captured image by imaging a subject at a predetermined angle of view.
The sensor unit acquires a user proximity point, which is the position of the user's hand or of an object held by the user, and the distance to the user proximity point.
The calculation unit
(1) detects the user's two eyes in the captured image acquired by the imaging unit and, based on the positions of the detected eyes on the captured image, the distance between the eyes, and the angle of view, calculates the distance from the device itself to the plane containing the user's two eyes,
(2) based on the user proximity point acquired by the sensor unit, the distance from the device itself to the user proximity point, and the horizontal and vertical widths of the display surface of the display device, calculates the overlapping view range, which is the range within which the user proximity point and the display surface of the display device appear to overlap as seen from the user's viewpoint, and
(3) when it determines that the user proximity point is within the overlapping view range, calculates and outputs the coordinate information of the user proximity point.
In this coordinate acquisition device, the position information of the user's viewpoint (the center point between the two eyes) is acquired from the captured image acquired by the imaging unit, and the coordinate information of the user proximity point (for example, the part closest to the display surface) is acquired by the sensor unit. Based on the position information of the user's viewpoint and the coordinate information of the user proximity point (the part closest to the display surface), the calculation unit specifies the overlapping view range, that is, the range within which the user proximity point (which may be, for example, the tip of the user's finger or of a pointer or pen held by the user) appears to overlap the display surface when the user looks at the display surface. When the user proximity point is within the overlapping view range, the calculation unit calculates and outputs the coordinates (Xout, Yout) of the point at which the straight line connecting the user's viewpoint and the user proximity point intersects the display surface. Since this coordinate acquisition device does not require a touch panel, it can appropriately specify (acquire) the coordinate information on the display surface corresponding to the user proximity point (the position of the user's hand and the like), based on the position of the user's hand in three-dimensional space, even from a position away from the display surface.
The second invention is a display device comprising the coordinate acquisition device of the first invention, a display panel, and a display control unit.
The display control unit controls the display panel and receives the coordinate information of the user proximity point calculated by the calculation unit.
The calculation unit detects the positions of the user's hand and face in the captured image acquired by the imaging unit and, when it determines that the distance between the hand and the face is equal to or less than a predetermined value, does not output the coordinate information of the user proximity point to the display control unit.
This makes it possible, in this display device, to suppress malfunctions when, for example, the movement of the user's hand is detected as a gesture and processing according to the detected gesture is executed. That is, by processing in this way, this display device can be appropriately prevented from erroneously recognizing an unconscious movement of the user touching his or her face as a predetermined gesture and malfunctioning.
The third invention is a display device comprising the coordinate acquisition device of the first invention, a display panel, and a display control unit.
The display control unit controls the display panel and receives the coordinate information of the user proximity point calculated by the calculation unit.
The calculation unit detects the position of the user's face in the captured image acquired by the imaging unit and, when the distance from the device itself to the user's face is larger than a certain value, does not output the coordinate information of the user proximity point to the display control unit.
This appropriately prevents this display device from erroneously recognizing a hand movement of a user located farther away than a predetermined distance as a predetermined gesture and malfunctioning.
The fourth invention is a display device comprising the coordinate acquisition device of the first invention, a display panel, and a display control unit.
The display control unit controls the display panel and receives the coordinate information of the user proximity point calculated by the calculation unit.
When (1) the display control unit determines that the ratio between the distance from the display surface to the user proximity point and the distance from the user proximity point to the plane containing the user's two eyes has changed, and (2) it determines that the coordinates of the point at which the straight line connecting the viewpoint, which is the center point between the user's two eyes, and the user proximity point intersects the display surface have not changed, the display control unit determines that a click operation has been performed by the user and executes predetermined processing associated with the display at the coordinates, on the display surface, of the point at which the straight line connecting the viewpoint and the user proximity point intersects the display surface.
Thus, in this display device, when the position of the user's viewpoint does not change, the distance between the user proximity point (for example, the user's finger) and the display device changes, and the coordinate position on the display surface corresponding to the user proximity point does not change, the display control unit can determine that a click operation has been performed, execute predetermined processing, and control the display so that, for example, a display corresponding to that processing is produced on the display surface.
According to the present invention, a coordinate acquisition device and a display device can be realized that do not require a touch panel and that can appropriately specify the coordinates and the like on the display screen of the display device based on the position of the user's hand in three-dimensional space, even from a remote position. The invention is therefore useful in industrial fields related to display devices and can be implemented in those fields.
1000 display device
100 coordinate acquisition device
DS1 sensor
1 imaging unit
2 sensor control unit
3 calculation unit
4 display control unit
LCD display panel
Scn display surface

Claims (4)

  1.  A coordinate acquisition device for acquiring coordinate information on a display surface of a display device, the coordinate acquisition device comprising:
     an imaging unit that acquires a captured image by imaging a subject at a predetermined angle of view;
     a sensor unit that acquires a user proximity point, which is the position of a user's hand or of an object held by the user, and the distance to the user proximity point; and
     a calculation unit that
    (1) detects the user's eyes in the captured image acquired by the imaging unit and calculates the distance from the coordinate acquisition device to a plane containing the user's eyes, based on the positions of the detected eyes in the captured image, the distance between the eyes, and the angle of view,
    (2) calculates an overlapping view range, which is the range over which the user proximity point and the display surface of the display device appear to overlap as seen from the user's viewpoint, based on the user proximity point and the distance from the coordinate acquisition device to the user proximity point acquired by the sensor unit and on the horizontal width and the vertical width of the display surface of the display device, and
    (3) calculates and outputs coordinate information of the user proximity point when it determines that the user proximity point is within the overlapping view range.
  2.  A display device comprising:
     the coordinate acquisition device according to claim 1;
     a display panel; and
     a display control unit that controls the display panel and receives the coordinate information of the user proximity point calculated by the calculation unit,
     wherein the calculation unit detects the positions of the user's hand and face in the captured image acquired by the imaging unit and, when it determines that the distance between the hand and the face is equal to or less than a predetermined value, does not output the coordinate information of the user proximity point to the display control unit.
  3.  A display device comprising:
     the coordinate acquisition device according to claim 1;
     a display panel; and
     a display control unit that controls the display panel and receives the coordinate information of the user proximity point calculated by the calculation unit,
     wherein the calculation unit detects the position of the user's face in the captured image acquired by the imaging unit and, when the distance from the coordinate acquisition device to the user's face is greater than a certain value, does not output the coordinate information of the user proximity point to the display control unit.
  4.  A display device comprising:
     the coordinate acquisition device according to claim 1;
     a display panel; and
     a display control unit that controls the display panel and receives the coordinate information of the user proximity point calculated by the calculation unit,
     wherein the display control unit determines that the user has performed a click operation when (1) it determines that the ratio between the distance from the display surface to the user proximity point and the distance from the user proximity point to a plane containing the user's eyes has changed and (2) it determines that the coordinates of the point at which a straight line connecting the viewpoint, which is the center point between the user's eyes, and the user proximity point intersects the display surface have not changed, and executes a predetermined process associated with the display at the coordinates of that point on the display surface.
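
The sketch below works through the geometry that claim 1 recites: estimating the distance to the plane containing the user's eyes from the eye separation detected in the captured image and the camera's angle of view, and then testing whether the user proximity point lies in the overlapping view range, here realised by projecting the point from the viewpoint onto the display surface and checking that the projection lands on the screen. The pinhole-camera model, the assumed real inter-pupillary distance of 63 mm, and the centred screen coordinate system are assumptions added for this example; the claim itself does not fix these details.

```python
import math

# Assumed typical inter-pupillary distance (mm); the claim only states that the detected
# eye positions, their separation in the image, and the angle of view are used.
REAL_EYE_SEPARATION_MM = 63.0

def distance_to_eye_plane(eye_sep_px: float, image_width_px: int, horizontal_fov_deg: float) -> float:
    """Pinhole-camera estimate of the distance from the device to the plane containing the eyes."""
    focal_px = image_width_px / (2.0 * math.tan(math.radians(horizontal_fov_deg) / 2.0))
    return REAL_EYE_SEPARATION_MM * focal_px / eye_sep_px

def proximity_point_coords(eye_mid, prox, screen_w_mm: float, screen_h_mm: float):
    """Project the proximity point from the viewpoint onto the display plane (z = 0,
    origin at the centre of the display surface). Returns display coordinates when the
    projection falls on the display surface -- i.e. the proximity point is inside the
    overlapping view range -- and None otherwise."""
    ex, ey, ez = eye_mid
    px, py, pz = prox
    t = ez / (ez - pz)
    sx, sy = ex + t * (px - ex), ey + t * (py - ey)
    if abs(sx) <= screen_w_mm / 2.0 and abs(sy) <= screen_h_mm / 2.0:
        return sx, sy
    return None

if __name__ == "__main__":
    ez = distance_to_eye_plane(eye_sep_px=80.0, image_width_px=1920, horizontal_fov_deg=60.0)
    print(round(ez))  # roughly 1310 mm with these example numbers
    print(proximity_point_coords((0.0, 0.0, ez), (30.0, -20.0, 350.0), screen_w_mm=520.0, screen_h_mm=320.0))
```

In this sketch, a non-None return value plays the role of the coordinate information that the calculation unit would output to the display control unit; the same projection is reused by the click decision illustrated earlier.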
PCT/JP2015/079840 2014-10-28 2015-10-22 Coordinate acquisition device and display device WO2016068015A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014219412 2014-10-28
JP2014-219412 2014-10-28

Publications (1)

Publication Number Publication Date
WO2016068015A1 (en) 2016-05-06

Family

ID=55857354

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/079840 WO2016068015A1 (en) 2014-10-28 2015-10-22 Coordinate acquisition device and display device

Country Status (1)

Country Link
WO (1) WO2016068015A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005321869A (en) * 2004-05-06 2005-11-17 Alpine Electronics Inc Operation input device and operation input method
JP2013080413A (en) * 2011-10-05 2013-05-02 Sony Corp Input apparatus and input recognition method
JP2013196329A (en) * 2012-03-19 2013-09-30 Ricoh Co Ltd Information processor, operator determination program and projection system
JP2014199479A (en) * 2013-03-29 2014-10-23 株式会社バンダイナムコゲームス Program and information processor



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 15855174; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
NENP Non-entry into the national phase (Ref country code: JP)
122 Ep: pct application non-entry in european phase (Ref document number: 15855174; Country of ref document: EP; Kind code of ref document: A1)