
CN103955316B - Fingertip touch detection system and method - Google Patents

Fingertip touch detection system and method

Info

Publication number
CN103955316B
CN103955316B
Authority
CN
China
Prior art keywords
finger tip
image sensing
information
processing unit
human
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410175698.0A
Other languages
Chinese (zh)
Other versions
CN103955316A (en)
Inventor
谢翔
李国林
蔡西蕾
宋玮
郑毅
吕众
任力飞
王志华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tsinghua University
Original Assignee
Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsinghua University
Priority to CN201410175698.0A
Publication of CN103955316A
Application granted
Publication of CN103955316B
Legal status: Active
Anticipated expiration

Landscapes

  • Position Input By Displaying (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a fingertip touch detection system and method. The system includes a projection interface processing unit, which receives externally input human-computer interaction interface information and outputs it to the projection unit; a projection unit, which projects the interface information onto a projection plane; an image sensing unit, which captures the projection-region image and sends it to the image processing unit; an image processing unit, which judges whether the projection-region image contains a fingertip and, when it does, calculates the distance between the fingertip and the human-computer interaction interface and the fingertip's position in the interface; and a touch detection unit, which comprises a judgment execution module, an imaging-plane optical-center mapping position calculation module, a fingertip lowest-point acquisition module, a shadow front-point acquisition module, and a fingertip touch judgment module, and which judges touches and outputs touch messages. The invention achieves low-cost, low-power, high-accuracy touch detection.

Description

Fingertip touch detection system and method
Technical field
The present invention relates to the field of computer technology, and in particular to a fingertip touch detection system and method.
Background
In recent years, with the appearance of smartphones, tablet computers, and Google Glass, people communicate ever more closely with the digital world, which further drives the miniaturization of smart devices. Constrained by the volume of conventional display devices, smart devices that rely on traditional displays cannot be miniaturized much further, so attention has turned to projection display devices.
To judge touch actions on a projected screen, many new human-computer interaction systems and methods have appeared in recent years. They fall broadly into two classes. The first embeds implicit fringe structured light in the projected image and judges whether a touch event occurs from the offset of the fringes; its main drawback is that it requires a high-frame-rate projection device to embed the fringe structured light. The second projects the human-computer interaction interface while using a depth camera to capture the spatial information of the projection surface and the touch object, computes the distance between them, and judges touch events by a distance threshold; its drawback is that it requires a depth sensor, whose resolution is still relatively low, so the system accuracy is not high. In addition, using a depth sensor makes the system relatively costly, bulky, and power-hungry.
Summary of the invention
To address the deficiencies of the prior art, the present invention provides a fingertip touch detection system and method capable of low-cost, low-power, high-accuracy touch detection.
To achieve the above object, the present invention adopts the following technical solutions:
A fingertip touch detection system, the system comprising:
a projection interface processing unit, configured to receive externally input human-computer interaction interface information and output the received interface information to the projection unit;
a projection unit, configured to project the human-computer interaction interface information onto a projection plane;
an image sensing unit, configured to capture the projection-region image on the projection plane and send the captured image to the image processing unit;
an image processing unit, configured to judge whether the projection-region image contains a fingertip and, when it does, to calculate the distance between the fingertip and the human-computer interaction interface and the fingertip's position in the interface; if the distance between the fingertip and the interface is less than or equal to a first threshold, touch information is output to the touch detection unit; otherwise the projection-region image containing the fingertip and the fingertip's position in the interface are output to the touch detection unit;
the image processing unit is further configured to obtain the system intrinsic parameters, the fixed system extrinsic parameters, the variable system extrinsic parameters, and the homography matrix between the projection unit and the image sensing unit, and to output all of these to the touch detection unit;
here the system intrinsic parameters are the optical-center position parameters of the projection unit and the image sensing unit; the fixed extrinsic parameters are the relative spatial position and attitude between the projection unit and the image sensing unit; and the variable extrinsic parameters are the relative spatial position and attitude between the image sensing unit and projection unit on one side and the projection plane on the other;
a touch detection unit, comprising a judgment execution module, an imaging-plane optical-center mapping position calculation module, a fingertip lowest-point acquisition module, a shadow front-point acquisition module, and a fingertip touch judgment module, wherein
the judgment execution module directly outputs touch information when the output received from the image processing unit is touch information; if the output received is a projection-region image containing a fingertip together with the fingertip's position in the interface, it invokes the imaging-plane optical-center mapping position calculation module;
the imaging-plane optical-center mapping position calculation module receives the fixed extrinsic parameters, the variable extrinsic parameters, the fixed intrinsic parameters, and the homography matrix between the projection unit and the image sensing unit output by the image processing unit, and from these computes by geometry the intersection point b on the projection surface of the line that passes through the image-sensing-unit optical center and is perpendicular to the projection plane, and the intersection point a on the projection surface of the line that passes through the optical centers of the projection unit and the image sensing unit; both intersection points are output to the fingertip lowest-point acquisition module and the shadow front-point acquisition module;
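To make this geometry concrete, the following minimal numpy sketch (not part of the patent; all names are illustrative) computes the two intersection points in the image-sensing-unit optical-center frame, assuming the projection plane is given by a normal n_c and a point p_c from the variable extrinsic parameters and the projection-unit optical center by T_{c→p}:

```python
import numpy as np

def intersections_a_b(n_c, p_c, T_cp, f_c):
    """Compute reference points a and b in the camera optical-center frame.

    n_c, p_c : plane normal and a point on the projection plane.
    T_cp     : position of the projection-unit optical center.
    f_c      : camera focal length (pixels).
    Returns the 3D points b3, a3 and their projections b2, a2 on the
    camera imaging plane.  Assumes neither line is parallel to the plane.
    """
    n_c, p_c, T_cp = (np.asarray(v, dtype=float) for v in (n_c, p_c, T_cp))
    # Line through the camera center (origin) perpendicular to the plane:
    # x = t * n_c; plane: n_c . (x - p_c) = 0  =>  t = n_c.p_c / n_c.n_c.
    b3 = (np.dot(n_c, p_c) / np.dot(n_c, n_c)) * n_c
    # Line through both optical centers: x = t * T_cp.
    a3 = (np.dot(n_c, p_c) / np.dot(n_c, T_cp)) * T_cp
    project = lambda X: np.array([f_c * X[0] / X[2], f_c * X[1] / X[2]])
    return b3, a3, project(b3), project(a3)

# Example: plane 1 m in front of the camera, projector 10 cm to the side.
b3, a3, b2, a2 = intersections_a_b([0, 0, 1], [0, 0, 1.0], [0.1, 0, 0.02], 800)
```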
the fingertip lowest-point acquisition module receives the projection-region image containing the fingertip and the fingertip's position in the interface from the image processing unit, together with the position of intersection point b from the imaging-plane optical-center mapping position calculation module; it locates the fingertip using the region of the interface where a touch may occur and the fingertip's color information, performs edge detection on the fingertip to obtain the fingertip edge, fits a circle to that edge to obtain the fitted center o, and connects o with intersection point b; the intersection of line ob with the fingertip edge is taken as the fingertip lowest point f, whose coordinates on the image-sensing-unit imaging plane are denoted (x_f^c, y_f^c); the position of f is output to the shadow front-point acquisition module;
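A hedged sketch of this module's core steps follows, using OpenCV contour extraction and a least-squares (Kasa) circle fit; the patent does not prescribe a particular fitting method, and for simplicity the sketch intersects line ob with the fitted circle rather than with the detected edge:

```python
import cv2
import numpy as np

def fingertip_lowest_point(mask, b):
    """Fit a circle to the fingertip edge and intersect line ob with it.

    mask : uint8 binary image of the fingertip (e.g. from skin-color
           segmentation inside the candidate touch region).
    b    : pixel coordinates of intersection point b.
    Returns an estimate of the fingertip lowest point f.
    """
    cnts, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    edge = max(cnts, key=cv2.contourArea).reshape(-1, 2).astype(np.float64)
    # Kasa least-squares circle fit: x^2 + y^2 + D x + E y + F = 0.
    A = np.column_stack([edge[:, 0], edge[:, 1], np.ones(len(edge))])
    rhs = -(edge[:, 0] ** 2 + edge[:, 1] ** 2)
    (D, E, F), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    o = np.array([-D / 2.0, -E / 2.0])              # fitted circle center
    r = np.sqrt(max(o[0] ** 2 + o[1] ** 2 - F, 0.0))  # fitted radius
    d = np.asarray(b, dtype=float) - o              # direction of line ob
    d /= np.linalg.norm(d)
    return o + r * d        # circle/line crossing on the side toward b
```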
the shadow front-point acquisition module receives the projection-region image containing the fingertip and the fingertip's position in the interface from the image processing unit, the position of intersection point a from the imaging-plane optical-center mapping position calculation module, and the position of the fingertip lowest point f from the fingertip lowest-point acquisition module; using the characteristics of shadows in HSV color space it obtains the fingertip shadow region, connects intersection point a with fingertip lowest point f, and takes the intersection of that line with the shadow edge as the shadow front point s, whose coordinates on the image-sensing-unit imaging plane are denoted (x_s^c, y_s^c); the positions of f and s are output to the fingertip touch judgment module;
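As an illustration of shadow detection in HSV space, the sketch below walks along the segment from f toward a and returns the first shadow-like pixel as the shadow front point; the thresholds are assumptions, since the patent only states that shadows are detected by their HSV characteristics:

```python
import cv2
import numpy as np

def shadow_front_point(bgr, a, f, v_max=60, s_max=90, n_samples=200):
    """Return the first pixel on segment f -> a whose HSV values look
    like shadow (dark, desaturated), as a proxy for the intersection of
    line af with the shadow edge.  Thresholds are illustrative only."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    a, f = np.asarray(a, dtype=float), np.asarray(f, dtype=float)
    for t in np.linspace(0.0, 1.0, n_samples):
        p = (1 - t) * f + t * a                  # point on segment f -> a
        x, y = int(round(p[0])), int(round(p[1]))
        if 0 <= y < hsv.shape[0] and 0 <= x < hsv.shape[1]:
            h, s, v = hsv[y, x]
            if v < v_max and s < s_max:          # dark and desaturated
                return p                         # shadow front point s
    return None                                  # no shadow found
```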
the fingertip touch judgment module receives the homography matrix between the projection unit and the image sensing unit output by the image processing unit, together with the fingertip lowest-point and shadow front-point positions output by the shadow front-point acquisition module. Using the homography matrix it computes the shadow front point's position (x_s^p, y_s^p) on the projection-unit plane. From the fixed intrinsic and extrinsic system parameters output by the image processing unit it obtains the rotation matrix R and translation matrix T between the image sensing unit and the projection unit, and expresses the two optical centers in a common coordinate system: unified in the image-sensing-unit optical-center frame, the image-sensing-unit optical center is (0, 0, 0) and the projection-unit optical center is T_c→p. The fingertip lowest point defines the camera ray direction v2 = [x̄_f^c, ȳ_f^c, f_c]^T and the shadow front point defines the projector ray direction v1 = R_p→c · [x̄_s^p, ȳ_s^p, f_p]^T, where R_p→c is the 3×3 rotation matrix from the projection-unit coordinate system to the image-sensing-unit coordinate system, T_c→p is the 1×3 translation from the image-sensing-unit coordinate system to the projection-unit coordinate system, f_c and f_p are the focal lengths of the image sensing unit and the projection unit, and x̄, ȳ denote the coordinates after correction with the intrinsic parameters of the image sensing unit and the projection unit respectively. Given the fingertip lowest point on the image-sensing-unit imaging plane, the shadow front point on the projection-unit imaging plane, and the two optical-center positions, the spatial line through the projection-unit optical center and the shadow front point on the projection-unit plane is [x_c, y_c, z_c]^T = λ1·v1 + T_c→p, and the line through the image-sensing-unit optical center and the fingertip lowest point on the image-sensing-unit plane is [x_c, y_c, z_c]^T = λ2·v2, where λ1 and λ2 are scale coefficients. The intersection of the two lines has spatial coordinates λ2′·v2, where λ2′ is the fixed value, expressed through v1, v2, their transposes v1^T and v2^T, and T_c→p, obtained by solving the two line equations simultaneously; this intersection is taken as the fingertip position. Finally, the projection-surface equation is computed from the variable system extrinsic parameters and expressed by a normal vector n_c and a point p_c on the plane, and the distance between the intersection and the projection surface is d = |n_c^T·(λ2′·v2 − p_c)| / ||n_c||. When this distance is less than a second threshold, a touch operation is judged. When a touch operation occurs, the fingertip touch judgment module further uses the received homography matrix between the image sensing unit and the projection unit to calculate the coordinate position of the touch point in the projected human-computer interaction interface and outputs fingertip touch information, which at least includes the coordinate position, in the originally projected human-computer interaction interface, of the fingertip touch;
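The triangulation and plane-distance test can be sketched as follows; since the closed form of λ2′ is not reproduced here, the sketch reconstructs the intersection as the least-squares closest point of the two rays, which coincides with λ2′·v2 when the rays meet exactly:

```python
import numpy as np

def fingertip_touch_distance(xs_p, ys_p, xf_c, yf_c, f_p, f_c,
                             R_pc, T_cp, n_c, p_c):
    """Triangulate the fingertip and measure its distance to the plane.

    Notation follows the patent, in the camera optical-center frame:
    camera ray l2(t) = t*v2 through the fingertip lowest point, and
    projector ray l1(t) = t*v1 + T_cp through the shadow front point.
    """
    v1 = np.asarray(R_pc) @ np.array([xs_p, ys_p, f_p], dtype=float)
    v2 = np.array([xf_c, yf_c, f_c], dtype=float)
    T_cp, n_c, p_c = (np.asarray(v, dtype=float) for v in (T_cp, n_c, p_c))
    # Solve l1*v1 - l2*v2 = -T_cp in the least-squares sense.
    A = np.column_stack([v1, -v2])
    (l1, l2), *_ = np.linalg.lstsq(A, -T_cp, rcond=None)
    fingertip = l2 * v2                       # corresponds to lambda2'*v2
    d = abs(n_c @ (fingertip - p_c)) / np.linalg.norm(n_c)
    return fingertip, d

# A touch is judged when d falls below the second threshold,
# e.g. a few millimetres, depending on calibration quality.
```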
a control unit, which controls all other units in the system and coordinates their work; the control unit can also place the system in the fixed intrinsic/extrinsic parameter acquisition state, the variable system parameter acquisition state, or the fingertip acquisition and touch operation state.
When the externally input human-computer interaction interface contains points, lines, or surfaces with distinct features, the projection interface processing unit is further configured to extract first feature information from the interface information and output the first feature information to the image processing unit, where the first feature information includes the feature points, lines, and surfaces and their coordinate positions in the interface;
the image processing unit is configured to extract second feature information from the projection-region image output by the image sensing unit, the second feature information including feature points, lines, and surfaces;
the image processing unit compares the second feature information with the first feature information and, from whether the second feature information is deformed relative to the points, lines, or surfaces of the first feature information, judges whether the projection region contains a fingertip, and calculates the distance from the fingertip at the deformation to the human-computer interaction interface and its position in the projection region.
When the externally input human-computer interaction interface contains no points, lines, or surfaces with distinct features, the projection interface processing unit is further configured to extract third feature information from the interface information and output the third feature information to the image processing unit, the third feature information including the border of the projected human-computer interaction interface;
the image processing unit is configured to extract fourth feature information from the projection-region image output by the image sensing unit, the fourth feature information including the border of the interface;
the image processing unit compares the third feature information with the fourth feature information and, from whether the fourth feature information is deformed relative to the third, judges whether the projection region contains a fingertip, and calculates the distance from the fingertip at the deformation to the interface and the fingertip's position in the interface.
A projection interface module is provided between the projection interface processing unit and the projection unit, configured to apply shape pre-distortion correction to the received human-computer interaction interface information.
An image-sensing interface module is provided between the image sensing unit and the image processing unit, configured to apply optical-distortion correction to the captured image.
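A minimal sketch of the optical-distortion correction this module performs, using OpenCV's standard undistortion; the camera matrix and distortion coefficients stand in for the stored fixed intrinsic parameters and are made up for the example:

```python
import cv2
import numpy as np

# Illustrative values: K is the camera matrix, dist the distortion
# coefficients (k1, k2, p1, p2, k3) of the image sensing unit.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.array([-0.12, 0.03, 0.0, 0.0, 0.0])

raw = cv2.imread("projection_region.png")   # captured frame (example path)
undistorted = cv2.undistort(raw, K, dist)   # passed on to image processing
```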
A fingertip touch detection method based on the fingertip touch detection system described above, the method comprising:
Step S1: the projection interface processing unit receives the externally input human-computer interaction interface information and outputs the received interface information to the projection unit;
Step S2: the projection unit projects the human-computer interaction interface information onto the projection plane;
Step S3: the image sensing unit captures the projection-region image on the projection plane and sends it to the image processing unit;
Step S4: the image processing unit obtains the system intrinsic parameters, the fixed extrinsic parameters, and the variable extrinsic parameters, and uses the fixed and variable extrinsic parameters to calculate the homography matrix between the projection unit and the image sensing unit;
here the system intrinsic parameters are the optical-center position parameters of the projection unit and the image sensing unit; the fixed extrinsic parameters are the relative spatial position and attitude between the projection unit and the image sensing unit; and the variable extrinsic parameters are the relative spatial position and attitude between the image sensing unit and projection unit on one side and the projection plane on the other;
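Although the patent derives the homography from the calibrated parameters, an equivalent and common way to obtain it is from point correspondences between the projected interface and the captured image; a hedged OpenCV sketch with made-up correspondences:

```python
import cv2
import numpy as np

# pts_proj: feature coordinates in the projected interface
# (projection-unit plane); pts_cam: the same features detected in the
# captured image.  Four or more pairs suffice; values are illustrative.
pts_proj = np.array([[0, 0], [1280, 0], [1280, 800], [0, 800],
                     [640, 400]], dtype=np.float32)
pts_cam = np.array([[102, 85], [1190, 60], [1230, 830], [80, 790],
                    [650, 430]], dtype=np.float32)

H, inlier_mask = cv2.findHomography(pts_cam, pts_proj, cv2.RANSAC, 3.0)
# H now maps camera-plane pixels to projection-unit-plane pixels.
```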
Step S5: the image processing unit judges whether the projection-region image contains a fingertip and, when it does, calculates the distance between the fingertip and the human-computer interaction interface; if that distance is less than or equal to the first threshold, touch information is output to the touch detection unit; otherwise the projection-region image containing the fingertip and the fingertip's position in the interface are output to the touch detection unit;
Step S6: when the output received by the touch detection unit from the image processing unit is touch information, the touch information is output directly; if the output received is a projection-region image containing a fingertip together with the fingertip's position in the human-computer interaction interface, the fingertip position is obtained and it is judged whether the fingertip touches; if it does, touch information is output. Specifically: the fixed extrinsic parameters, the variable extrinsic parameters, the fixed intrinsic parameters, and the homography matrix between the projection unit and the image sensing unit output by the image processing unit are received, and from these the geometry yields the intersection point b on the projection surface of the line through the image-sensing-unit optical center perpendicular to the projection plane, and the intersection point a on the projection surface of the line through the optical centers of the projection unit and the image sensing unit. The fingertip is located using the region of the interface where a touch may occur and the fingertip's color information; edge detection yields the fingertip edge, a circle is fitted to the edge to obtain the center o, o is connected with intersection point b, and the intersection of line ob with the fingertip edge is the fingertip lowest point f, with coordinates (x_f^c, y_f^c) on the image-sensing-unit imaging plane. The fingertip shadow region is obtained using the characteristics of shadows in HSV color space; connecting intersection point a with the fingertip lowest point f, the intersection of this line with the shadow edge is the shadow front point s, with coordinates (x_s^c, y_s^c) on the image-sensing-unit imaging plane. The homography matrix between the projection unit and the image sensing unit gives the shadow front point's position (x_s^p, y_s^p) on the projection-unit plane. The fixed intrinsic and extrinsic system parameters yield the rotation matrix R and translation matrix T between the image sensing unit and the projection unit, and the two optical centers are expressed in a common frame: unified in the image-sensing-unit optical-center frame, the image-sensing-unit optical center is (0, 0, 0), the projection-unit optical center is T_c→p, the camera ray direction through the fingertip lowest point is v2 = [x̄_f^c, ȳ_f^c, f_c]^T, and the projector ray direction through the shadow front point is v1 = R_p→c · [x̄_s^p, ȳ_s^p, f_p]^T, where R_p→c is the 3×3 rotation matrix from the projection-unit frame to the image-sensing-unit frame, T_c→p is the 1×3 translation from the image-sensing-unit frame to the projection-unit frame, f_c and f_p are the focal lengths, and x̄, ȳ are the coordinates after intrinsic-parameter correction. The spatial line through the projection-unit optical center and the shadow front point is [x_c, y_c, z_c]^T = λ1·v1 + T_c→p, the line through the image-sensing-unit optical center and the fingertip lowest point is [x_c, y_c, z_c]^T = λ2·v2, and their intersection λ2′·v2, with λ2′ the fixed value (expressed through v1^T, v2^T, and T_c→p) obtained by solving the two line equations simultaneously, is taken as the fingertip position. Finally the projection-surface equation, expressed by normal vector n_c and point p_c, is computed from the variable extrinsic parameters, and the distance d = |n_c^T·(λ2′·v2 − p_c)| / ||n_c|| between the intersection and the projection surface is obtained. When this distance is less than the second threshold, a touch operation is judged; when a touch operation occurs, the fingertip touch judgment module further uses the received homography matrix between the image sensing unit and the projection unit to calculate the coordinate position of the touch point in the projected human-computer interaction interface and outputs fingertip touch information, which at least includes the coordinate position of the fingertip touch in the originally projected human-computer interaction interface.
In step S5, the judgment by the image processing unit of whether the projection-region image contains a fingertip includes the following. When the externally input human-computer interaction interface contains points, lines, or surfaces with distinct features, the projection interface processing unit is further configured to extract first feature information from the interface information and output it to the image processing unit, the first feature information including the feature points, lines, and surfaces and their coordinate positions in the interface; the image processing unit extracts second feature information (feature points, lines, and surfaces) from the projection-region image output by the image sensing unit; the image processing unit compares the second feature information with the first and, from whether the second is deformed relative to the points, lines, or surfaces of the first, judges whether the projection region contains a fingertip and calculates the distance from the fingertip at the deformation to the interface and its position in the projection region.
When the externally input human-computer interaction interface contains no points, lines, or surfaces with distinct features, the projection interface processing unit is further configured to extract third feature information (the border of the projected interface) from the interface information and output it to the image processing unit; the image processing unit extracts fourth feature information (the border of the interface) from the projection-region image output by the image sensing unit; the image processing unit compares the third and fourth feature information and, from whether the fourth is deformed relative to the third, judges whether the projection region contains a fingertip and calculates the distance from the fingertip at the deformation to the interface and its position in the projection region.
Before the projection interface processing unit outputs the received interface information to the projection unit, shape pre-distortion correction is applied to the received human-computer interaction interface information.
Before the image sensing unit sends the captured image to the image processing unit, optical-distortion correction is applied to the captured image.
The present invention has at least the following beneficial effects:
1. The invention can judge whether a fingertip touches the projection plane using only one ordinary projection unit and one ordinary image sensing unit, achieving low-cost, low-power, high-accuracy touch detection; moreover, since no depth sensor is required, the system volume is small.
2. The fingertip touch detection algorithm of the invention is not affected by the feature information in the human-computer interaction interface; even if the interface contains no distinct points, lines, or other regular basic feature elements in figures or images, the algorithm still performs well.
3. The fingertip touch detection algorithm of the invention does not depend on image feature information, and the projector does not need to project explicit or implicit structured light, so the requirements on the capture frame rate of the image sensing unit and the projection unit are modest, making the algorithm more universal.
4. The fingertip touch detection algorithm of the invention uses the calculated distance between the fingertip and the projection surface as the basis for the touch judgment, so it is insensitive to fingertip thickness and its touch detection accuracy is high.
Of course, a product or method implementing the present invention does not necessarily need to achieve all of the above advantages simultaneously.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show some embodiments of the present invention; for those of ordinary skill in the art, other drawings can be obtained from them without creative work.
Fig. 1 is a structural diagram of the fingertip touch detection system of an embodiment of the present invention;
Fig. 2 is a schematic diagram of the deformation of feature stripes with and without a touch object, according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of the positions of the projection-unit optical center and the image-sensing-unit optical center relative to the projection surface, according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of the positions of the fingertip lowest point and the shadow front point, according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of the touch judgment method of an embodiment of the present invention;
Fig. 6 is a flow chart of the fingertip touch detection method of an embodiment of the present invention.
Detailed description of the invention
To make the purpose, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are some rather than all of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative work fall within the protection scope of the present invention.
An embodiment of the present invention proposes a fingertip touch detection system; see Fig. 1. The system includes:
a projection interface processing unit 101, configured to receive the externally input human-computer interaction interface information and output the received interface information to the projection unit;
a projection unit 102, configured to project the human-computer interaction interface information onto a projection plane;
an image sensing unit 103, configured to capture the projection-region image on the projection plane and send it to the image processing unit;
an image processing unit 104, configured to judge whether the projection-region image contains a fingertip and, when it does, to calculate the distance between the fingertip and the human-computer interaction interface and the fingertip's position in the interface; if the distance between the fingertip and the interface is less than or equal to the first threshold, touch information is output to the touch detection unit; otherwise the projection-region image containing the fingertip and the fingertip's position in the interface are output to the touch detection unit;
the image processing unit 104 is further configured to obtain the system intrinsic parameters, the fixed extrinsic parameters, the variable extrinsic parameters, and the homography matrix between the projection unit and the image sensing unit, and to output all of these to the touch detection unit;
here the system intrinsic parameters are the optical-center position parameters of the projection unit and the image sensing unit; the fixed extrinsic parameters are the relative spatial position and attitude between the projection unit and the image sensing unit; and the variable extrinsic parameters are the relative spatial position and attitude between the image sensing unit and projection unit on one side and the projection plane on the other;
a touch detection unit 105, comprising a judgment execution module, an imaging-plane optical-center mapping position calculation module, a fingertip lowest-point acquisition module, a shadow front-point acquisition module, and a fingertip touch judgment module, wherein
the judgment execution module directly outputs touch information when the output received from the image processing unit is touch information; if the output received is a projection-region image containing a fingertip together with the fingertip's position in the interface, it invokes the imaging-plane optical-center mapping position calculation module;
the imaging-plane optical-center mapping position calculation module receives the fixed extrinsic parameters, the variable extrinsic parameters, the fixed intrinsic parameters, and the homography matrix between the projection unit and the image sensing unit output by the image processing unit, and from these computes by geometry the intersection point b on the projection surface of the line through the image-sensing-unit optical center perpendicular to the projection plane, and the intersection point a on the projection surface of the line through the optical centers of the projection unit and the image sensing unit, as shown in Fig. 3; both intersection points are output to the fingertip lowest-point acquisition module and the shadow front-point acquisition module;
the fingertip lowest-point acquisition module receives the projection-region image containing the fingertip and the fingertip's position in the interface from the image processing unit, together with the position of intersection point b from the imaging-plane optical-center mapping position calculation module; it locates the fingertip using the region of the interface where a touch may occur and the fingertip's color information, performs edge detection to obtain the fingertip edge, fits a circle to the edge to obtain the center o, and connects o with intersection point b; the intersection of line ob with the fingertip edge is the fingertip lowest point f, with coordinates (x_f^c, y_f^c) on the image-sensing-unit imaging plane; the position of f is output to the shadow front-point acquisition module;
the shadow front-point acquisition module receives the projection-region image containing the fingertip and the fingertip's position in the interface from the image processing unit, the position of intersection point a from the imaging-plane optical-center mapping position calculation module, and the position of the fingertip lowest point f from the fingertip lowest-point acquisition module; as shown in Fig. 4, it obtains the fingertip shadow region using the characteristics of shadows in HSV color space, connects intersection point a with fingertip lowest point f, and takes the intersection of that line with the shadow edge as the shadow front point s, with coordinates (x_s^c, y_s^c) on the image-sensing-unit imaging plane; the positions of f and s are output to the fingertip touch judgment module;
the fingertip touch judgment module receives the homography matrix between the projection unit and the image sensing unit output by the image processing unit and the fingertip lowest-point and shadow front-point positions output by the shadow front-point acquisition module; using the homography matrix it computes the shadow front point's position (x_s^p, y_s^p) on the projection-unit plane; from the fixed intrinsic and extrinsic system parameters it obtains the rotation matrix R and translation matrix T between the image sensing unit and the projection unit and expresses the two optical centers in a common frame: unified in the image-sensing-unit optical-center frame, the image-sensing-unit optical center is (0, 0, 0), the projection-unit optical center is T_c→p, the camera ray direction through the fingertip lowest point is v2 = [x̄_f^c, ȳ_f^c, f_c]^T, and the projector ray direction through the shadow front point is v1 = R_p→c · [x̄_s^p, ȳ_s^p, f_p]^T, with R_p→c, T_c→p, f_c, f_p, x̄, and ȳ as defined above; the spatial line through the projection-unit optical center and the shadow front point is [x_c, y_c, z_c]^T = λ1·v1 + T_c→p, the line through the image-sensing-unit optical center and the fingertip lowest point is [x_c, y_c, z_c]^T = λ2·v2, and their intersection λ2′·v2, with λ2′ the fixed value obtained by solving the two line equations, is taken as the fingertip position; the projection-surface equation, expressed by normal vector n_c and point p_c, is computed from the variable extrinsic parameters, and the distance d = |n_c^T·(λ2′·v2 − p_c)| / ||n_c|| between the intersection and the projection surface is obtained; when this distance is less than the second threshold, a touch operation is judged; when a touch operation occurs, the fingertip touch judgment module further uses the received homography matrix between the image sensing unit and the projection unit to calculate the coordinate position of the touch point in the projected human-computer interaction interface and outputs fingertip touch information, which at least includes the coordinate position of the fingertip touch in the originally projected human-computer interaction interface.
When the externally input human-computer interaction interface contains points, lines, or surfaces with distinct features, the projection interface processing unit 101 is further configured to extract first feature information from the interface information and output it to the image processing unit, the first feature information including the feature points, lines, and surfaces and their coordinate positions in the interface;
the image processing unit 104 extracts second feature information from the projection-region image output by the image sensing unit 103, the second feature information including feature points, lines, and surfaces;
the image processing unit 104 compares the second feature information with the first and, from the deformation of the second relative to the points, lines, or surfaces of the first, calculates the distance from the fingertip at the deformation to the interface and the fingertip's position in the interface.
When the externally input human-computer interaction interface contains no points, lines, or surfaces with distinct features, the projection interface processing unit 101 is further configured to extract third feature information (the border of the projected interface) from the interface information and output it to the image processing unit 104;
the image processing unit 104 extracts fourth feature information (the border of the interface) from the projection-region image output by the image sensing unit;
the image processing unit 104 compares the third and fourth feature information and, from whether the fourth is deformed relative to the third, judges whether the projection region contains a fingertip and calculates the distance from the fingertip at the deformation to the interface and the fingertip's position in the interface.
A projection interface module, belonging to the interface unit 106, is provided between the projection interface processing unit 101 and the projection unit 102 and is configured to apply shape pre-distortion correction to the received human-computer interaction interface information. The projection interface module receives the projection data output by the projection interface processing unit 101 and, based on the optical-distortion parameters of the projection unit in the fixed system intrinsic parameters, applies optical-distortion pre-correction to the interface image to be projected, so as to cancel the optical distortion introduced by the optical components of the projection unit; the corrected projection interface image is output to the projection unit.
An image-sensing interface module, also belonging to the interface unit 106, is provided between the image sensing unit 103 and the image processing unit 104 and is configured to apply optical-distortion correction to the captured image.
The image-sensing interface module receives the optical-distortion parameters of the image sensing unit from the fixed system intrinsic parameters output by the image processing unit 104; it also receives the image data of the image sensing unit 103 and, based on those parameters, applies optical-distortion correction to the images captured by the image sensing unit 103, eliminating the distortion introduced by the optical components of the image sensing unit 103, and outputs the corrected image to the image processing unit 104.
A control unit 107 controls all other units in the system and coordinates their work; the control unit can also place the system in the fixed intrinsic/extrinsic parameter acquisition state, the variable system parameter acquisition state, or the fingertip acquisition and touch operation state.
The image processing unit is configured to receive the feature information (points, lines, or surfaces with distinct features) output by the projection interface processing unit and, using the fixed intrinsic and extrinsic parameters and the variable extrinsic parameters, to calculate the homography matrix between the projection-unit plane and the image-sensing-unit plane (also called the homography matrix between the projection unit and the image sensing unit). It thereby computes the two-dimensional coordinates of these features when imaged on the image-sensing-unit plane (that is, the coordinates on the image-sensing-unit plane of the features of the projected human-computer interaction interface, as captured from the projection surface when no user hand or other touch object is operating in the projection region), referred to as the first feature information. The unit also receives the image data output by the image-sensing interface module and extracts the distinct feature points, lines, or surfaces in that image, referred to as the second feature information. Using the distinct features present in the projected interface, it computes the fingertip position on the projection surface from the magnitude of the deformation of the second feature information relative to the points, lines, or surfaces of the first feature information between the cases with and without an operating object (a hand or other touch object) in the projection region (Fig. 2 shows the projected interface with and without a hand or touch object). When the human-computer interaction interface contains no distinct feature points, lines, or surfaces (as in Fig. 2a), the border of the projected interface serves as the available feature: using the structured-light principle, the distance of the finger from the projection surface at a deformation is measured from the size of the deformation of the border stripes. If this measured distance is less than or equal to the first threshold, a fingertip touch in that region is judged directly, the touch position coordinates in the interface are calculated, and touch information is output to the touch detection unit; if the measured distance is greater than the first threshold, the whole projection interface is treated as the region where a touch may occur and this information is output to the touch detection unit. When the interface does contain distinct points, lines, or surfaces (as in Fig. 2b), the same principle uses the stripe displacement to compute the finger's distance from the projection surface at the stripe deformation; when the measured distance is less than or equal to the first threshold, touch information is output to the touch detection unit, and when it is greater than the first threshold, the region of the interface where a touch may occur (i.e., the fingertip position information, which becomes more accurate as the stripe information in the interface increases) is output, based on the feature stripe information, to the touch detection unit. Under the coordination of the control unit, the image processing unit also calculates the fixed system intrinsic parameters (the intrinsic parameters of the image sensing unit and the projection unit, chiefly the optical-distortion parameters introduced by their respective optical components), stores them in nonvolatile memory, and outputs them to the interface unit. The image processing unit further extracts the corresponding feature information from the image data of the image sensing unit and matches it against the feature information output by the projection interface processing unit, calculating the position and attitude relation between the image sensing unit and the projection unit (the fixed system extrinsic parameters) and the position and attitude relation between the man-machine interaction system and the projection plane (the variable system extrinsic parameters). The image processing unit also outputs the position and attitude parameters between the projection plane and the projection unit, taken from the obtained variable extrinsic parameters, to the projection interface processing unit.
The system receives the human-computer interaction interface information through the projection interface processing unit and, after processing, sends this interface information to the projection unit to be projected onto an everyday surface such as a desktop, wall, paper, palm, or arm, accurately displaying the man-machine interface. The user can then operate with bare hands on this projected interface as on a touch screen, while the image sensing unit captures the projection region, recognizes the interaction, and outputs the corresponding interaction information. The key is to recognize whether a hand touches the projected interface and where the touch point lies in the projected interface; the control unit coordinates the work of all other units in the system.
In this embodiment, the distance from the image sensing unit and the projection unit to the finger is far greater than the finger thickness, so the fingertip point that blocks the projection ray to produce the shadow front and the fingertip lowest point photographed by the image sensing unit can be approximated as the same point; that is, points A and B in Fig. 5 can be treated as coincident. Based on this, the invention takes the intersection of lines PA and CB as the fingertip position. If the fingertip is approximated as a quarter sphere, the fingertip lowest point lies in the plane that passes through the image-sensing-unit optical center and the fingertip sphere center and is perpendicular to the projection surface; in the image captured by the image sensing unit, this plane appears as the line through point b (the intersection with the projection surface of the line that passes through the image-sensing-unit optical center and is perpendicular to the projection plane). As shown in Fig. 4, the fingertip lowest-point acquisition module takes the intersection of line ob (through the fitted circle center o and point b) with the fingertip edge as the fingertip lowest point. To compute the intersection of PA and CB, PA and CB must be coplanar; hence PM and CB are coplanar and M lies in plane PCB. In the captured image this corresponds to the line through intersection point a (of line PC with the projection surface) and the fingertip lowest point f; its intersection s with the shadow edge is the shadow front point. From the coordinates of points f and s, the binocular principle yields the intersection of PM and CB; the algorithm therefore estimates the fingertip position well, is little affected by fingertip thickness, and achieves high accuracy.
More generally, even if f is not the fingertip lowest point, the intersection s of line af with the shadow edge still lies in the plane formed by that fingertip point and line PC, and the intersection of PM and CB still characterizes the fingertip position.
In the touch-operation acquisition process of this embodiment, the control unit places the whole system in the touch-operation acquisition state. The fingertip acquisition module in the image processing unit receives the projection-region image captured by the image sensing unit and, according to the feature information, obtains the position of the fingertip in the image and outputs this result to the touch detection unit. The touch detection unit receives the fingertip position information from the image processing unit, calculates the fingertip lowest point and the corresponding shadow front point, and also receives the fixed intrinsic and extrinsic system parameters from the image processing unit; using the binocular ranging principle it computes the position of the fingertip lowest point relative to the image sensing unit, and finally uses the position and attitude of the image sensing unit relative to the projection surface in the variable extrinsic parameters to compute the fingertip's distance from the projection plane. If the distance value is within a threshold, a touch operation is judged. When a touch operation occurs, the position of the touch point in the projected interface must also be calculated; in this embodiment the fingertip lowest point is preferably taken as the fingertip touch position. When a touch event occurs, the homography matrix between the image sensing unit and the projection unit obtained from the image processing unit is used to calculate the coordinate position of the fingertip lowest point in the original projected interface, and the touch detection unit outputs the touch operation and the coordinate position of the corresponding touch point in the human-computer interaction interface. Finally, the fingertip touch module outputs the touch-point coordinates in the human-computer interaction interface.
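A minimal sketch of this final mapping step, assuming the homography H maps camera pixels to the projected-interface plane; names and thresholds are illustrative:

```python
import cv2
import numpy as np

def touch_point_in_interface(H, f_pixel, threshold, distance):
    """If the fingertip-to-plane distance is within the touch threshold,
    map the fingertip lowest point from the captured image into the
    originally projected interface using the homography H."""
    if distance >= threshold:
        return None                                  # no touch event
    src = np.array([[f_pixel]], dtype=np.float32)    # shape (1, 1, 2)
    dst = cv2.perspectiveTransform(src, H)
    return tuple(dst[0, 0])                          # (u, v) in interface

# e.g. touch_point_in_interface(H, (412.0, 377.5), threshold=5.0, distance=2.1)
```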
Since the projection plane may not be fixed, it may rotate or translate relative to the man-machine interaction system. The variable system extrinsic parameters can then be acquired in real time through the image processing unit, i.e., the relative position and attitude of the projection plane with respect to the interaction system are obtained; when a change in relative position or attitude is detected, the changed variable extrinsic parameters are reacquired and output again to the projection interface processing module, which updates these parameters and performs shape pre-distortion correction based on the new parameters, thereby tracking the projection plane in real time.
It should be noted that in all embodiments of the invention, besides a hand touching the projection plane, other touch objects can be handled in the same way; the principle is identical and is not repeated here.
The embodiment of the present invention can judge whether a fingertip touches the projection plane using only one ordinary projection device and one ordinary image sensing device, achieving low-cost, low-power, high-accuracy touch detection; moreover, since no depth sensor is required, the system volume is small.
The fingertip touch detection algorithm of the embodiment is not affected by the feature information in the human-computer interaction interface; even if the interface contains no distinct points, lines, or other regular basic feature elements in figures or images, the algorithm still performs well.
The fingertip touch detection algorithm of the embodiment does not depend on image feature information, and the projector does not need to project explicit or implicit structured light, so the requirements on the capture frame rate of the image sensing unit and the projection unit are modest, making the algorithm more universal.
The fingertip touch detection algorithm of the embodiment uses the calculated distance between the fingertip and the projection surface as the basis for the touch judgment, so it is insensitive to fingertip thickness and its touch detection accuracy is high.
Another embodiment of the present invention further proposes a fingertip touch detection method; see Fig. 6. The method includes:
Step 601: projection interface processing unit receives the human-computer interaction interface information of outside input, and man-machine by receive Interactive interface information exports to projecting cell.
In this step, before outputting the received human-computer interaction interface information to the projection unit, the projection interface processing unit performs shape pre-distortion correction on it.
Step 602: the projection unit projects the human-computer interaction interface information onto the projection plane.
Step 603: the image sensing unit captures the projection-area image on the projection plane and sends it to the image processing unit.
In this step, before sending the captured image to the image processing unit, the image sensing unit performs optical distortion correction on it.
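A short sketch of this optical distortion correction using OpenCV's standard lens model follows; the intrinsic matrix and distortion coefficients are placeholder values, not calibration results from the patent.

import cv2
import numpy as np

K = np.array([[800.0,   0.0, 320.0],   # assumed image sensing unit intrinsics
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.array([-0.25, 0.08, 0.001, -0.0005, 0.0])  # k1, k2, p1, p2, k3

raw = np.zeros((480, 640, 3), np.uint8)   # stand-in for a captured frame
corrected = cv2.undistort(raw, K, dist)   # lens-distortion-corrected frame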
Step 604: the image processing unit obtains the system intrinsic parameters, the fixed extrinsic parameters, and the variable extrinsic parameters, and uses the fixed and variable extrinsic parameters to compute the homography matrix between the projection unit and the image sensing unit.
Here the system intrinsic parameters are the optical-center position parameters of the projection unit and the image sensing unit; the fixed extrinsic parameters are the relative spatial position and attitude between the projection unit and the image sensing unit; and the variable extrinsic parameters are the relative spatial position and attitude of the image sensing unit and the projection unit with respect to the projection plane.
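One common way to realize this computation, given here only as a sketch under assumed parameter values, is the plane-induced homography H = K_p (R + t·n^T / d) K_c^(-1), where the projection plane satisfies n^T X = d in the image sensing unit's frame and X_p = R X_c + t is the frame change; this is a standard construction, not quoted from the patent.

import numpy as np

def plane_induced_homography(K_c, K_p, R, t, n, d):
    # homography from camera pixels to projector pixels for points on
    # the plane n^T X = d (expressed in the camera frame)
    H = K_p @ (R + np.outer(t, n) / d) @ np.linalg.inv(K_c)
    return H / H[2, 2]

K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])       # assumed intrinsics (both units)
R = np.eye(3)                                # assumed fixed extrinsic rotation
t = np.array([0.1, 0.0, 0.0])                # assumed 10 cm baseline
n, d = np.array([0.0, 0.0, 1.0]), 1.5        # assumed plane: z = 1.5 m
print(plane_induced_homography(K, K, R, t, n, d))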
Step 605: the image processing unit judges whether the projection-area image contains a fingertip and, when it does, computes the distance between the fingertip and the human-computer interaction interface as well as the fingertip position in the interface. If the distance is less than or equal to a first threshold, touch information is output to the touch detection unit; otherwise the projection-area image containing the fingertip and the fingertip position in the interface are output to the touch detection unit.
In this step, when the externally input human-computer interaction interface contains point, line, or surface information with distinct features, the projection interface processing unit additionally extracts first feature information from the externally input interface information and outputs it to the image processing unit; the first feature information includes feature points, lines, and surfaces, together with their coordinate positions in the interface.
The image processing unit extracts second feature information, i.e. feature points, lines, and surfaces, from the projection-area image output by the image sensing unit.
The image processing unit compares the second feature information with the first feature information and, from the deformation of the points, lines, or surfaces in the second feature information relative to the first, computes the distance of the fingertip at the deformation from the human-computer interaction interface and the fingertip position in the interface.
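The patent does not prescribe a particular feature detector; as one hypothetical realization, matched keypoints between the projected interface and the captured image can expose such local deformations. Everything below (detector choice, synthetic images) is an assumption for illustration.

import cv2
import numpy as np

def feature_displacements(reference, observed):
    # match features of the projected interface (first feature information)
    # against the captured image (second feature information); locally large
    # displacements indicate deformation caused by a finger above the plane
    orb = cv2.ORB_create(500)
    k1, d1 = orb.detectAndCompute(reference, None)
    k2, d2 = orb.detectAndCompute(observed, None)
    if d1 is None or d2 is None:
        return []
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    return [(k1[m.queryIdx].pt, k2[m.trainIdx].pt) for m in matches]

rng = np.random.default_rng(0)
ref = rng.integers(0, 256, (480, 640), dtype=np.uint8)  # synthetic texture
obs = np.roll(ref, 3, axis=1)                           # stand-in deformed view
print(len(feature_displacements(ref, obs)), "matched feature pairs")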
When the externally input human-computer interaction interface contains no point, line, or surface information with distinct features, the projection interface processing unit extracts third feature information from the externally input interface information and outputs it to the image processing unit; the third feature information includes the boundary of the projected human-computer interaction interface.
The image processing unit extracts fourth feature information, i.e. the boundary of the human-computer interaction interface, from the projection-area image output by the image sensing unit.
The image processing unit compares the third feature information with the fourth feature information and, from whether the fourth feature information is deformed relative to the third, judges whether the projection area contains a fingertip, and computes the distance of the fingertip at the deformation from the human-computer interaction interface and the fingertip position in the interface.
Step 606: when the output the touch detection unit receives from the image processing unit is touch information, the touch information is output directly. When the received output is a projection-area image containing a fingertip together with the fingertip position in the human-computer interaction interface, the fingertip position is obtained and whether the fingertip touches is judged; if it touches, touch information is output. Specifically: the touch detection unit receives the fixed extrinsic parameters, the variable extrinsic parameters, the intrinsic parameters, and the homography matrix between the projection unit and the image sensing unit output by the image processing unit. From these parameters it obtains by geometric computation the intersection b on the projection plane of the line through the image sensing unit's optical center perpendicular to the projection plane, and the intersection a on the projection plane of the line through the projection unit's optical center and the image sensing unit's optical center. Using the region of the human-computer interaction interface where a touch may occur and the color information of the finger, the fingertip is located; edge detection yields the fingertip edge, a circle is fitted to the edge giving the center o, and the intersection of line ob with the fingertip edge is defined as the fingertip lowest point f, with coordinates (x_f^c, y_f^c) on the image sensing unit's imaging plane. Using the characteristics of shadow in HSV color space, the fingertip shadow region is obtained; the intersection of the line connecting a and f with the shadow edge is the shadow front point s, with coordinates (x_s^c, y_s^c) on the image sensing unit's imaging plane, which the homography matrix between the projection unit and the image sensing unit maps to the position (x_s^p, y_s^p) in the projection unit's plane. From the fixed intrinsic and extrinsic parameters output by the image processing unit, the rotation matrix R and the translation matrix T between the image sensing unit and the projection unit are obtained, and the optical centers of the two units are expressed in a common coordinate system: unified to the image sensing unit's optical-center frame, the image sensing unit's optical center is (0, 0, 0), the projection unit's optical center is T_c→p, the fingertip lowest point direction is v2 = [x̄_f^c, ȳ_f^c, f_c]^T, and the shadow front point direction is v1 = R_p→c·[x̄_s^p, ȳ_s^p, f_p]^T, where R_p→c is the 3x3 rotation matrix from the projection unit frame to the image sensing unit frame, T_c→p is the 1x3 translation from the image sensing unit frame to the projection unit frame, f_c and f_p are the focal lengths of the image sensing unit and the projection unit, and (x̄_f^c, ȳ_f^c) and (x̄_s^p, ȳ_s^p) are (x_f^c, y_f^c) and (x_s^p, y_s^p) after correction with the image sensing unit's and the projection unit's intrinsic parameters. From the fingertip lowest point on the image sensing unit's imaging plane, the shadow front point on the projection unit's imaging plane, and the two optical-center positions, the line in space through the projection unit's optical center and the shadow front point in the projection unit's plane is [x_c, y_c, z_c] = λ1·v1 + T_c→p, and the line through the image sensing unit's optical center and the fingertip lowest point in the image sensing unit's plane is [x_c, y_c, z_c] = λ2·v2. The space coordinates of their intersection are λ2′·v2, and this intersection is taken as the fingertip position, where λ1 and λ2 are proportionality coefficients and λ2′ is the fixed value λ2′ = ((v1^T·v1)(v2^T·T_c→p) − (v1^T·v2)(v1^T·T_c→p)) / ((v1^T·v1)(v2^T·v2) − (v1^T·v2)²). Finally the projection-plane equation is computed from the variable extrinsic parameters and represented by a normal vector n_c and a point p_c, and the distance between the intersection and the projection plane is obtained as D = |n_c^T·(λ2′·v2 − p_c)| / ‖n_c‖. When this distance is less than the second threshold, a touch operation is judged. When a touch operation occurs, the fingertip touch judgment module further uses the received homography matrix between the image sensing unit and the projection unit to compute the coordinates of the touch point in the projected human-computer interaction interface and outputs fingertip touch information, which at least includes the coordinate position of the fingertip touch in the originally projected human-computer interaction interface.
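The two-ray intersection and plane-distance computation above reduce to a few lines. The sketch below implements the λ2′ and D formulas directly, with assumed geometry (camera at the origin, projector 10 cm away, projection plane at z = 0.5 m); the numbers are illustrative only.

import numpy as np

def fingertip_and_distance(v1, v2, T_cp, n_c, p_c):
    # line 1: x = l1*v1 + T_cp (through the projection unit optical center)
    # line 2: x = l2*v2        (through the image sensing unit optical center)
    a, b, c = v1 @ v1, v1 @ v2, v2 @ v2
    lam2 = (a * (v2 @ T_cp) - b * (v1 @ T_cp)) / (a * c - b * b)  # lambda2'
    tip = lam2 * v2                                    # fingertip position
    D = abs(n_c @ (tip - p_c)) / np.linalg.norm(n_c)   # distance to the plane
    return tip, D

T_cp = np.array([0.1, 0.0, 0.0])          # assumed projector optical center
v2 = np.array([0.0, 0.0, 1.0])            # ray to the fingertip lowest point
v1 = np.array([-0.1, 0.0, 0.5]) - T_cp    # ray to the shadow front point
tip, D = fingertip_and_distance(v1, v2, T_cp,
                                n_c=np.array([0.0, 0.0, 1.0]),
                                p_c=np.array([0.0, 0.0, 0.5]))
print(tip, D)   # touch is declared when D falls below the second threshold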
In this embodiment, the distances from the image sensing unit and the projection unit to the finger are much greater than the finger thickness. The fingertip point that blocks the projection ray and produces the shadow front, and the fingertip lowest point photographed by the image sensing unit, can therefore be treated as approximately the same point; that is, points A and B in Fig. 5 are approximately coincident. Based on this, the invention takes the intersection of lines PA and CB as the fingertip position. If the fingertip is approximated as a quarter sphere, the fingertip lowest point lies in the plane that passes through the image sensing unit's optical center and the fingertip sphere center and is perpendicular to the projection plane; in the image captured by the image sensing unit, this plane appears as a line through point b (the intersection with the projection plane of the line through the image sensing unit's optical center perpendicular to the projection plane). As shown in Fig. 4, the fingertip lowest point acquisition module therefore takes the intersection of the fingertip edge with the line ob, through the fitted circle center o and point b, as the fingertip lowest point. To compute the intersection of PA and CB, the two lines must be coplanar, and PM and CB are coplanar: point M lies in plane PCB, which in the captured image corresponds to the line through the intersection a of line PC with the projection plane and the fingertip lowest point f, so the shadow front point is the intersection s of line af with the shadow edge. From the coordinates of points f and s, the binocular principle yields the intersection of PM and CB. The algorithm can therefore compute the fingertip position well and is little affected by fingertip thickness, so the precision is high.
More generally, even if f is not the fingertip lowest point, the intersection s of line af with the shadow edge still lies in the plane formed by that fingertip point and line PC, so the intersection of PM and CB still characterizes the fingertip position.
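For concreteness, a sketch of the fingertip-lowest-point extraction (circle fit plus intersection with line ob) on a synthetic mask is given below. In a real frame the mask would come from the finger color segmentation and the shadow region from an HSV value threshold, as described above; the synthetic blob and point b here are assumptions, and the OpenCV 4 findContours signature is assumed.

import cv2
import numpy as np

def fingertip_lowest_point(finger_mask, b):
    # fit a circle to the fingertip edge and take the intersection of the
    # line ob (fitted center o toward point b) with the edge as point f
    cnts, _ = cv2.findContours(finger_mask, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_NONE)
    edge = max(cnts, key=cv2.contourArea).reshape(-1, 2).astype(np.float32)
    (ox, oy), _ = cv2.minEnclosingCircle(edge)
    o = np.array([ox, oy])
    d = np.asarray(b, dtype=float) - o
    d /= np.linalg.norm(d)
    # the edge point farthest along direction o -> b lies on line ob
    return edge[np.argmax((edge - o) @ d)]

mask = np.zeros((480, 640), np.uint8)
cv2.circle(mask, (300, 200), 40, 255, -1)              # synthetic fingertip blob
print(fingertip_lowest_point(mask, b=(300.0, 460.0)))  # expect near (300, 240)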
In the touch-operation acquisition process of this embodiment, the control unit likewise places the whole system in the touch-operation acquisition state. The fingertip acquisition module in the image processing unit receives the projection-area image captured by the image sensing unit, obtains the position of the fingertip in the image from the characteristic information, and outputs the result to the touch detection unit. The touch detection unit receives the fingertip position information from the image processing unit and computes the fingertip lowest point and its corresponding shadow front point; it also receives the fixed intrinsic and extrinsic system parameters from the image processing unit and, using the binocular ranging principle, computes the position of the fingertip lowest point relative to the image sensing unit. Finally, using the variable extrinsic parameters, namely the position and attitude of the image sensing unit relative to the projection plane, it computes the distance from the fingertip to the projection plane; if the distance falls within a threshold range, a touch operation is judged. When a touch occurs, the position of the touch point in the projected interface must also be computed; this embodiment preferably takes the fingertip lowest point as the touch-point position. When a touch event occurs, the homography matrix between the image sensing unit and the projection unit, obtained from the image processing unit, is used to compute the coordinates of the fingertip lowest point in the originally projected interface, and the touch detection unit outputs the touch operation together with the coordinates of the touch point in the human-computer interaction interface.
Since the projection plane may not be fixed, it can rotate or translate relative to the human-computer interaction system. In that case the image processing unit acquires the variable extrinsic parameters of the system in real time, i.e. the relative position and attitude of the projection plane with respect to the system. When a change in relative position or attitude is detected, the variable extrinsic parameters are re-acquired and output again to the projection interface processing module, which updates these parameters and performs shape pre-distortion of the projection based on the new parameters, thereby tracking the projection plane in real time.
The embodiment of the present invention can determine whether a fingertip touches the projection plane using only an ordinary projection device and an ordinary image sensing device, achieving low-cost, low-power, high-precision touch detection. Moreover, since the invention requires no depth sensor, the system can be made very compact.
The fingertip touch detection algorithm of the embodiment is not affected by the characteristic information in the human-computer interaction interface: even when the interface contains no distinct points, lines, or other basic feature elements such as regular figures or images, the algorithm still performs well.
The fingertip touch detection algorithm of the embodiment does not rely on image feature information, and the projector need not project visible or invisible structured light; the requirements on the acquisition frame rates of the image sensing unit and the projection unit are therefore modest, which makes the algorithm more universal.
The fingertip touch detection algorithm of the embodiment uses the computed distance between the fingertip and the projection plane as the criterion for the touch judgment; it is therefore largely insensitive to fingertip thickness, and the touch detection accuracy is high.
It should be noted that, in all embodiments of the invention, besides the case of a hand touching the projection plane, other touch objects can be handled as well; the principle is identical and is not repeated here.
The above embodiments are merely intended to illustrate the technical solutions of the present invention, not to limit them. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features replaced by equivalents, and that such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. A fingertip touch detection system, characterized in that the system comprises:
a projection interface processing unit, configured to receive human-computer interaction interface information input from outside and output the received interface information to a projection unit;
the projection unit, configured to project the human-computer interaction interface information onto a projection plane;
an image sensing unit, configured to capture a projection-area image on the projection plane and send the captured image to an image processing unit;
the image processing unit, configured to judge whether the projection-area image contains a fingertip and, when it does, compute the distance between the fingertip and the human-computer interaction interface and the fingertip position in the interface; if the distance is less than or equal to a first threshold, output touch information to a touch detection unit; otherwise output the projection-area image containing the fingertip and the fingertip position in the interface to the touch detection unit;
the image processing unit being further configured to obtain the system intrinsic parameters, the fixed extrinsic parameters, the variable extrinsic parameters, and the homography matrix between the projection unit and the image sensing unit, and to output the system intrinsic parameters, the fixed extrinsic parameters, the variable extrinsic parameters, and the homography matrix between the projection unit and the image sensing unit to the touch detection unit;
wherein the system intrinsic parameters are the optical-center position parameters of the projection unit and the image sensing unit; the fixed extrinsic parameters are the relative spatial position and attitude between the projection unit and the image sensing unit; and the variable extrinsic parameters are the relative spatial position and attitude of the image sensing unit and the projection unit with respect to the projection plane;
the touch detection unit, comprising a judgment execution module, an imaging-plane optical-center mapping position computation module, a fingertip lowest point acquisition module, a shadow front point acquisition module, and a fingertip touch judgment module, wherein,
the judgment execution module is configured to output the touch information directly when the output received from the image processing unit is touch information, and to invoke the imaging-plane optical-center mapping position computation module when the output received from the image processing unit is a projection-area image containing a fingertip together with the fingertip position in the human-computer interaction interface;
the imaging-plane optical-center mapping position computation module is configured to receive the fixed extrinsic parameters, the variable extrinsic parameters, the intrinsic parameters, and the homography matrix between the projection unit and the image sensing unit output by the image processing unit, to obtain from these parameters by geometric computation the intersection b on the projection plane of the line through the image sensing unit's optical center perpendicular to the projection plane and the intersection a on the projection plane of the line through the projection unit's optical center and the image sensing unit's optical center, and to output the two intersection positions to the fingertip lowest point acquisition module and the shadow front point acquisition module;
the fingertip lowest point acquisition module is configured to receive the projection-area image containing the fingertip and the fingertip position in the interface output by the image processing unit, together with the position of intersection b output by the imaging-plane optical-center mapping position computation module; to locate the fingertip using the region of the interface where a touch may occur and the color information of the finger; to apply edge detection to obtain the fingertip edge and fit a circle to it, yielding the center o; to define the intersection of line ob with the fingertip edge as the fingertip lowest point f, with coordinates (x_f^c, y_f^c) on the image sensing unit's imaging plane; and to output the position of f to the shadow front point acquisition module;
the shadow front point acquisition module is configured to receive the projection-area image containing the fingertip and the fingertip position in the interface output by the image processing unit, the position of intersection a output by the imaging-plane optical-center mapping position computation module, and the position of the fingertip lowest point f output by the fingertip lowest point acquisition module; to obtain the fingertip shadow region using the characteristics of shadow in HSV color space; to define the intersection of the line connecting a and f with the shadow edge as the shadow front point s, with coordinates (x_s^c, y_s^c) on the image sensing unit's imaging plane; and to output the positions of the fingertip lowest point f and the shadow front point s to the fingertip touch judgment module;
the fingertip touch judgment module is configured to receive the homography matrix between the projection unit and the image sensing unit output by the image processing unit and the fingertip lowest point and shadow front point positions output by the shadow front point acquisition module; to compute via the homography matrix between the projection unit and the image sensing unit the position (x_s^p, y_s^p) of the shadow front point in the projection unit's plane; to obtain, from the fixed intrinsic and extrinsic parameters output by the image processing unit, the rotation matrix R and the translation matrix T between the image sensing unit and the projection unit, and to express the two optical centers in a common coordinate system: unified to the image sensing unit's optical-center frame, the image sensing unit's optical center is (0, 0, 0), the projection unit's optical center is T_c→p, the fingertip lowest point direction is v2 = [x̄_f^c, ȳ_f^c, f_c]^T, and the shadow front point direction is v1 = R_p→c·[x̄_s^p, ȳ_s^p, f_p]^T, where R_p→c is the 3x3 rotation matrix from the projection unit frame to the image sensing unit frame, T_c→p is the 1x3 translation from the image sensing unit frame to the projection unit frame, f_c and f_p are the focal lengths of the image sensing unit and the projection unit, and (x̄_f^c, ȳ_f^c) and (x̄_s^p, ȳ_s^p) are (x_f^c, y_f^c) and (x_s^p, y_s^p) after correction with the image sensing unit's and the projection unit's intrinsic parameters; from the fingertip lowest point on the image sensing unit's imaging plane, the shadow front point on the projection unit's imaging plane, and the two optical-center positions, the line in space through the projection unit's optical center and the shadow front point in the projection unit's plane is [x_c, y_c, z_c] = λ1·v1 + T_c→p, and the line through the image sensing unit's optical center and the fingertip lowest point in the image sensing unit's plane is [x_c, y_c, z_c] = λ2·v2; the space coordinates of their intersection are λ2′·v2, and this intersection is taken as the fingertip position, where λ1 and λ2 are proportionality coefficients and λ2′ is the fixed value λ2′ = ((v1^T·v1)(v2^T·T_c→p) − (v1^T·v2)(v1^T·T_c→p)) / ((v1^T·v1)(v2^T·v2) − (v1^T·v2)²); finally the projection-plane equation is computed from the variable extrinsic parameters and represented by a normal vector n_c and a point p_c, and the distance between the intersection and the projection plane is obtained as D = |n_c^T·(λ2′·v2 − p_c)| / ‖n_c‖; when this distance is less than a second threshold, a touch operation is judged; when a touch operation occurs, the fingertip touch judgment module further uses the received homography matrix between the image sensing unit and the projection unit to compute the coordinates of the touch point in the projected human-computer interaction interface and outputs fingertip touch information, which at least includes the coordinate position of the fingertip touch in the originally projected human-computer interaction interface;
a control unit, configured to control all other units in the system and coordinate their operation; the control unit can further place the system in a fixed intrinsic and extrinsic parameter acquisition state, a system variable parameter acquisition state, or a fingertip acquisition and touch operation state.
2. The system according to claim 1, characterized in that, when the externally input human-computer interaction interface contains point, line, or surface information with distinct features, the projection interface processing unit is further configured to extract first feature information from the externally input human-computer interaction interface information and output the first feature information to the image processing unit, the first feature information including feature points, lines, and surfaces, together with their coordinate positions in the interface;
the image processing unit is configured to extract second feature information, i.e. feature points, lines, and surfaces, from the projection-area image output by the image sensing unit;
the image processing unit compares the second feature information with the first feature information, judges from whether the points, lines, or surfaces in the second feature information are deformed relative to the first feature information whether the projection area contains a fingertip, and computes the distance of the fingertip at the deformation from the human-computer interaction interface and its position in the projection area.
3. The system according to claim 1, characterized in that, when the externally input human-computer interaction interface contains no point, line, or surface information with distinct features, the projection interface processing unit is further configured to extract third feature information from the externally input human-computer interaction interface information and output the third feature information to the image processing unit, the third feature information including the boundary of the projected human-computer interaction interface;
the image processing unit is configured to extract fourth feature information, i.e. the boundary of the human-computer interaction interface, from the projection-area image output by the image sensing unit;
the image processing unit compares the third feature information with the fourth feature information, judges from whether the fourth feature information is deformed relative to the third feature information whether the projection area contains a fingertip, and computes the distance of the fingertip at the deformation from the human-computer interaction interface and the fingertip position in the interface.
4. The system according to claim 1, characterized in that a projection interface module is provided between the projection interface processing unit and the projection unit, configured to perform shape pre-distortion correction on the received human-computer interaction interface information.
5. The system according to claim 1, characterized in that an image sensing interface module is provided between the image sensing unit and the image processing unit, configured to perform optical distortion correction on the captured image.
6. A fingertip touch detection method based on the fingertip touch detection system according to claim 1, characterized in that the method includes:
Step S1: the projection interface processing unit receives the human-computer interaction interface information input from outside and outputs the received interface information to the projection unit;
Step S2: the projection unit projects the human-computer interaction interface information onto the projection plane;
Step S3: the image sensing unit captures the projection-area image on the projection plane and sends it to the image processing unit;
Step S4: the image processing unit obtains the system intrinsic parameters, the fixed extrinsic parameters, and the variable extrinsic parameters, and uses the fixed and variable extrinsic parameters to compute the homography matrix between the projection unit and the image sensing unit;
wherein the system intrinsic parameters are the optical-center position parameters of the projection unit and the image sensing unit; the fixed extrinsic parameters are the relative spatial position and attitude between the projection unit and the image sensing unit; and the variable extrinsic parameters are the relative spatial position and attitude of the image sensing unit and the projection unit with respect to the projection plane;
Step S5: the image processing unit judges whether the projection-area image contains a fingertip and, when it does, computes the distance between the fingertip and the human-computer interaction interface; if the distance is less than or equal to the first threshold, touch information is output to the touch detection unit; otherwise the projection-area image containing the fingertip and the fingertip position in the interface are output to the touch detection unit;
Step S6: when the output the touch detection unit receives from the image processing unit is touch information, the touch information is output directly; when the received output is a projection-area image containing a fingertip together with the fingertip position in the human-computer interaction interface, the fingertip position is obtained, whether the fingertip touches is judged and, if it touches, touch information is output, specifically: the touch detection unit receives the fixed extrinsic parameters, the variable extrinsic parameters, the intrinsic parameters, and the homography matrix between the projection unit and the image sensing unit output by the image processing unit; from these parameters it obtains by geometric computation the intersection b on the projection plane of the line through the image sensing unit's optical center perpendicular to the projection plane, and the intersection a on the projection plane of the line through the projection unit's optical center and the image sensing unit's optical center; using the region of the human-computer interaction interface where a touch may occur and the color information of the finger, the fingertip is located, edge detection yields the fingertip edge, a circle is fitted to the edge giving the center o, and the intersection of line ob with the fingertip edge is defined as the fingertip lowest point f, with coordinates (x_f^c, y_f^c) on the image sensing unit's imaging plane; using the characteristics of shadow in HSV color space, the fingertip shadow region is obtained, and the intersection of the line connecting a and f with the shadow edge is the shadow front point s, with coordinates (x_s^c, y_s^c) on the image sensing unit's imaging plane, which the homography matrix between the projection unit and the image sensing unit maps to the position (x_s^p, y_s^p) in the projection unit's plane; from the fixed intrinsic and extrinsic parameters output by the image processing unit, the rotation matrix R and the translation matrix T between the image sensing unit and the projection unit are obtained, and the optical centers of the two units are expressed in a common coordinate system: unified to the image sensing unit's optical-center frame, the image sensing unit's optical center is (0, 0, 0), the projection unit's optical center is T_c→p, the fingertip lowest point direction is v2 = [x̄_f^c, ȳ_f^c, f_c]^T, and the shadow front point direction is v1 = R_p→c·[x̄_s^p, ȳ_s^p, f_p]^T, where R_p→c is the 3x3 rotation matrix from the projection unit frame to the image sensing unit frame, T_c→p is the 1x3 translation from the image sensing unit frame to the projection unit frame, f_c and f_p are the focal lengths of the image sensing unit and the projection unit, and (x̄_f^c, ȳ_f^c) and (x̄_s^p, ȳ_s^p) are (x_f^c, y_f^c) and (x_s^p, y_s^p) after correction with the image sensing unit's and the projection unit's intrinsic parameters; from the fingertip lowest point on the image sensing unit's imaging plane, the shadow front point on the projection unit's imaging plane, and the two optical-center positions, the line in space through the projection unit's optical center and the shadow front point in the projection unit's plane is [x_c, y_c, z_c] = λ1·v1 + T_c→p, and the line through the image sensing unit's optical center and the fingertip lowest point in the image sensing unit's plane is [x_c, y_c, z_c] = λ2·v2; the space coordinates of their intersection are λ2′·v2, and this intersection is taken as the fingertip position, where λ1 and λ2 are proportionality coefficients and λ2′ is the fixed value λ2′ = ((v1^T·v1)(v2^T·T_c→p) − (v1^T·v2)(v1^T·T_c→p)) / ((v1^T·v1)(v2^T·v2) − (v1^T·v2)²); finally the projection-plane equation is computed from the variable extrinsic parameters and represented by a normal vector n_c and a point p_c, and the distance between the intersection and the projection plane is obtained as D = |n_c^T·(λ2′·v2 − p_c)| / ‖n_c‖; when this distance is less than the second threshold, a touch operation is judged; when a touch operation occurs, the fingertip touch judgment module further uses the received homography matrix between the image sensing unit and the projection unit to compute the coordinates of the touch point in the projected human-computer interaction interface and outputs fingertip touch information, which at least includes the coordinate position of the fingertip touch in the originally projected human-computer interaction interface.
7. The method according to claim 6, characterized in that, in step S5, the image processing unit's judging whether the projection-area image contains a fingertip includes:
when the externally input human-computer interaction interface contains point, line, or surface information with distinct features, the projection interface processing unit further extracts first feature information from the externally input human-computer interaction interface information and outputs the first feature information to the image processing unit, the first feature information including feature points, lines, and surfaces, together with their coordinate positions in the interface;
the image processing unit extracts second feature information, i.e. feature points, lines, and surfaces, from the projection-area image output by the image sensing unit;
the image processing unit compares the second feature information with the first feature information, judges from whether the points, lines, or surfaces in the second feature information are deformed relative to the first feature information whether the projection area contains a fingertip, and computes the distance of the fingertip at the deformation from the human-computer interaction interface and its position in the projection area.
8. The method according to claim 6, characterized in that, in step S5, the image processing unit's judging whether the projection-area image contains a fingertip includes:
when the externally input human-computer interaction interface contains no point, line, or surface information with distinct features, the projection interface processing unit further extracts third feature information from the externally input human-computer interaction interface information and outputs the third feature information to the image processing unit, the third feature information including the boundary of the projected human-computer interaction interface;
the image processing unit extracts fourth feature information, i.e. the boundary of the human-computer interaction interface, from the projection-area image output by the image sensing unit;
the image processing unit compares the third feature information with the fourth feature information, judges from whether the fourth feature information is deformed relative to the third feature information whether the projection area contains a fingertip, and computes the distance of the fingertip at the deformation from the human-computer interaction interface and its position in the projection area.
9. The method according to claim 6, characterized in that, before the projection interface processing unit outputs the received human-computer interaction interface information to the projection unit, shape pre-distortion correction is performed on the received human-computer interaction interface information.
10. The method according to claim 6, characterized in that, before the image sensing unit sends the captured image to the image processing unit, optical distortion correction is performed on the captured image.
GR01 Patent grant