CN104142730A - Method for mapping gesture tracking results to mouse events - Google Patents
Method for mapping gesture tracking results to mouse events
- Publication number
- CN104142730A (application CN201410318600.2A)
- Authority
- CN
- China
- Prior art keywords
- mouse
- mapped
- tracking results
- gesture
- mapping
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Image Analysis (AREA)
- Position Input By Displaying (AREA)
Abstract
The invention discloses a method for mapping gesture tracking results to mouse events. Gesture tracking results in physical space are mapped to mouse movement, realizing accurate and smooth control of the mouse in an operating system. The method comprises the following steps: (1) time warping for the computation speeds of tracking algorithms on different platforms; (2) adaptive distance warping according to the distance between the user and the camera; (3) setting threshold values and parameters; (4) distinguishing jitter from slow movement in the tracking results; (5) mapping the hand position in the image to mouse events through a mapping function; (6) smoothing the speed of the mouse movement; (7) filling in the mouse points mapped by two successive frames using inter-frame interpolation and displaying the interpolated points. The method is not tied to any particular tracking algorithm and can effectively map its tracking results to mouse events; for algorithms with poor tracking quality, the final mapping can repair the mouse events, improving the practical usability of those algorithms.
Description
Technical field
The present invention relates to the fields of computer vision tracking and human-computer interaction, and in particular to a method for mapping the results of computer-vision-based gesture tracking to mouse events.
Background technology
With the development of computer vision technology, gesture recognition and tracking techniques have become increasingly mature, and related products are gradually moving from scientific research to industrialization, entering society as commercial goods. Human-computer interaction technology is the key link between computer-vision-based gesture recognition and tracking on one side and the machine on the other: the user's hand manipulates the cursor through a mapping algorithm, moving the cursor within the user interface and thereby controlling the operating system.
Today's 2D gesture tracking algorithms still have limitations: because of noise, the tracking results inevitably exhibit varying degrees of jitter. At the same time, the friendliness and comfort of the interaction are key factors that an interactive mapping algorithm must consider. Therefore, applicability to common gesture tracking algorithms, improved robustness, and conformity to natural human interaction habits have become the development trends for mapping algorithms in human-computer interaction.
Summary of the invention
The object of the present invention is to overcome the deficiencies that existing gesture-to-cursor mapping algorithms show in practical applications and to provide a natural, harmonious, and efficient method for mapping gesture tracking results to mouse events. The specific technical scheme is as follows.
A method for mapping gesture tracking results to mouse events comprises the following steps:
time warping the current gesture tracking algorithm;
distance warping the tracking results of the current gesture tracking algorithm;
distinguishing jitter from slow movement;
mapping the hand tracking results to mouse events;
smoothing the mouse movement;
filling in the mouse points mapped by two successive frames.
In one embodiment, time warping the current gesture tracking algorithm specifically means: multiplying the gesture tracking result by a coefficient obtained by comparison with the inter-frame time of a standard test platform, so that the same physical hand speed maps to the same mouse movement under different inter-frame processing times.
In one embodiment, distance warping the tracking results of the current gesture tracking algorithm specifically comprises the following steps: (a) using the size of the tracking box in the tracking results, calculate the distance between the hand and the camera; (b) adaptively adjust the hand movement speed computed in the image.
In one embodiment, distinguishing jitter from slow movement specifically comprises the following steps:
(a) accumulate historical tracking points to obtain an average filter point;
(b) compare the current tracking point with this filter point to distinguish slow movement from jitter;
(c) suppress jitter and amplify slow movement.
In one embodiment, the specific scheme for mapping the hand tracking results to mouse events comprises: using a piecewise function to obtain the mapping gain corresponding to the warped hand movement speed.
In one embodiment, smoothing the mouse movement specifically comprises: taking a weighted sum of the currently mapped mouse movement and the mouse movement obtained for the previous frame, so that the displayed mouse speed does not change abruptly during interactive tracking.
In one embodiment, filling in the mouse points mapped by two frames specifically means: within the processing time of two frames, inserting mouse points by inter-frame linear interpolation, with the number of inserted points depending on the specific algorithm's processing time, thereby avoiding visible cursor jumps when the algorithm's processing time is long.
As a further refinement, the method also comprises: regularizing the horizontal and vertical mapping trajectory of the mouse according to the user's movement tendency.
Compared with the prior art, the method for mapping gesture tracking results to mouse events of the present invention has the following advantages and technical effects:
(1) The algorithm has low complexity and consumes few resources in practical applications.
(2) The invention is not limited to a specific tracking algorithm and has general applicability.
(3) The invention can, to a certain extent, compensate for the deficiencies of the tracking algorithm itself and improve its robustness in practice.
(4) The invention provides the user with a natural, motion-sensing interaction with the computer and a strong sense of control.
Brief description of the drawings
Fig. 1 is a flow diagram of the method for mapping gesture tracking results to mouse events of the present invention.
Fig. 2 is an imaging schematic for users at different operating distances.
Fig. 3 is a motion state classification chart.
Fig. 4 shows the mapping function of the present invention.
Fig. 5 compares the results of the present invention with a commonly used distance mapping method.
Embodiment
The specific implementation of the invention is further described below with reference to the accompanying drawings, but the implementation and protection of the invention are not limited thereto. It should be noted that any symbol, step, or process not specifically described below can be implemented by those skilled in the art with reference to the prior art.
As shown in Fig. 1, the method for mapping gesture tracking results to mouse events of the present invention comprises the following steps.
(1) Time warping of the current gesture tracking algorithm: the gesture tracking result is multiplied by a coefficient obtained by comparison with the inter-frame time of a standard test platform, so that the same physical hand speed maps to the same mouse movement under different inter-frame processing times. The time warping is based on:
gesture speed v = inter-frame Euclidean distance d (pixels) / inter-frame time t (ms)
On the standard test platform, the overall processing time of the algorithm is T ms; a stable mapping range and the speed thresholds v1, v2 are determined, and the test parameters that perform best on that platform are chosen. When running on other platforms, these parameters are adjusted by time warping so that the same physical hand speed maps to the same cursor speed under different inter-frame processing times. The implementation is to multiply by a coefficient obtained by comparison with the standard test platform's inter-frame time.
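The following Python sketch illustrates one possible form of this time warping; the constant STANDARD_FRAME_TIME_MS, the helper names, and the direction of the scaling ratio are illustrative assumptions not specified in the patent.

```python
# Minimal sketch of step (1), time warping, under stated assumptions.
STANDARD_FRAME_TIME_MS = 33.0  # assumed inter-frame time T of the standard test platform

def hand_speed(dx_px: float, dy_px: float, frame_time_ms: float) -> float:
    """Gesture speed v = inter-frame Euclidean distance d (pixels) / inter-frame time t (ms)."""
    d = (dx_px ** 2 + dy_px ** 2) ** 0.5
    return d / frame_time_ms

def time_warp(dx_px: float, dy_px: float, frame_time_ms: float) -> tuple:
    """Rescale the per-frame hand displacement to what it would be on the standard platform,
    so the thresholds v1, v2 tuned there remain valid on slower or faster platforms."""
    coeff = STANDARD_FRAME_TIME_MS / frame_time_ms  # assumed direction of the ratio
    return dx_px * coeff, dy_px * coeff
```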
(2) Distance warping according to the distance between the user and the camera: using the size of the tracking box (204, 205) in the tracking results, as shown in Fig. 2, the hand area S and the distance L between the hand 201 and the camera 203 are computed for different operating distances. Comparing these with the standard hand area S_standard on the test platform (at which the mapping is optimal) and the calibrated standard distance L_standard between the hand 201 and the calibration camera 202 yields a zoom factor F for the hand movement speed computed by the current algorithm; this zoom factor is used to adaptively adjust the hand movement speed computed in the image.
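A minimal Python sketch of this adaptive distance warping follows; the constant S_STANDARD and the square-root relation between tracking-box area and apparent hand size are illustrative assumptions, since the patent only states that comparing the current hand area with the standard area and distance yields a zoom factor F.

```python
# Sketch of step (2), adaptive distance warping, under stated assumptions.
S_STANDARD = 6000.0  # assumed hand area (pixels^2) at the calibrated distance L_standard

def distance_warp(dx_px: float, dy_px: float, box_w: float, box_h: float) -> tuple:
    """Scale the in-image hand displacement so users nearer to or farther from the
    camera produce comparable cursor motion."""
    s = box_w * box_h                        # current hand area from the tracking box
    f = (S_STANDARD / max(s, 1e-6)) ** 0.5   # apparent size falls roughly linearly with distance
    # (the patent also references the calibrated distance L_standard; this sketch
    #  derives F from the area ratio alone)
    return dx_px * f, dy_px * f
```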
(3) Distinguishing jitter from slow movement: Fig. 3 shows jitter points 302 and slow movement points 303. The last 25 historical tracking points are averaged to obtain the mean filter point 301, and the current tracking point is compared against this filter point rather than against the point of the previous frame. In this way jitter 304 is attenuated and slow movement 305 is amplified, as shown in Fig. 3.
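The sketch below illustrates this jitter/slow-movement discrimination with a 25-point running average; the class name and the jitter radius threshold are illustrative assumptions.

```python
# Sketch of step (3), separating jitter from slow intentional movement.
from collections import deque

class JitterFilter:
    def __init__(self, history_len: int = 25, jitter_radius_px: float = 3.0):
        self.history = deque(maxlen=history_len)   # last 25 tracked points
        self.jitter_radius_px = jitter_radius_px   # assumed radius below which motion is jitter

    def update(self, x: float, y: float) -> tuple:
        self.history.append((x, y))
        fx = sum(p[0] for p in self.history) / len(self.history)  # mean filter point
        fy = sum(p[1] for p in self.history) / len(self.history)
        dist = ((x - fx) ** 2 + (y - fy) ** 2) ** 0.5
        if dist < self.jitter_radius_px:
            return fx, fy   # jitter: clamp the output to the filter point
        return x, y         # slow or fast movement: pass the tracked point through
```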
(4) Mapping the hand tracking result to mouse events: a piecewise function gives the mapping gain corresponding to the warped hand movement speed. The mapping scheme of the present invention is shown in Fig. 4, where the horizontal axis is the hand movement speed obtained by the tracking algorithm and the vertical axis is the mapping gain (gain k = mouse movement speed to be mapped / hand movement speed). Different speed thresholds v1, v2, v3 are set; the mapping function is a piecewise nonlinear function that includes exponential and logarithmic segments, and the value of α is adjusted according to the mapping gain actually required.
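The following sketch shows one possible piecewise gain function of the kind described; the threshold values v1, v2, v3, the value of α, and the exact exponential and logarithmic segment formulas are illustrative assumptions, not the patent's tuned parameters.

```python
# Sketch of step (4), a piecewise nonlinear mapping gain, under stated assumptions.
import math

V1, V2, V3 = 2.0, 8.0, 20.0  # assumed speed thresholds (after time and distance warping)
ALPHA = 0.15                 # assumed shaping parameter α

def mapping_gain(v: float) -> float:
    """Return the gain k (mouse speed = k * hand speed) for a warped hand speed v."""
    if v < V1:                                  # very slow motion: small gain to hide residual jitter
        return 0.5
    if v < V2:                                  # slow, deliberate motion: exponential growth
        return 0.5 * math.exp(ALPHA * (v - V1))
    k2 = 0.5 * math.exp(ALPHA * (V2 - V1))      # gain at the v2 boundary
    if v < V3:                                  # fast motion: logarithmic, saturating growth
        return k2 + math.log1p(v - V2)
    return k2 + math.log1p(V3 - V2)             # very fast motion: clamp the gain
```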
(5) To reduce speed jumps that the hand tracking algorithm may introduce, the present invention smooths the mouse movement: the currently mapped mouse movement and the mouse movement obtained for the previous frame are combined by a weighted sum, so that the displayed mouse speed does not change abruptly during interactive tracking. The formula is:
mapped speed = 0.5 × previous frame's mapped speed + 0.5 × current frame's unfiltered mapped speed
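A direct transcription of this smoothing rule as a small Python helper; the assumption that "the previous frame's mapped speed" refers to the previous smoothed output is ours.

```python
# Sketch of step (5), smoothing the mapped cursor speed with the 0.5/0.5 weights above.
class SpeedSmoother:
    def __init__(self):
        self.prev = 0.0  # previous frame's (smoothed) mapped speed

    def smooth(self, raw_mapped_speed: float) -> float:
        smoothed = 0.5 * self.prev + 0.5 * raw_mapped_speed
        self.prev = smoothed
        return smoothed
```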
(6) Within the processing time of two frames, the mouse points mapped by the two frames are filled in by inter-frame linear interpolation; the number of inserted mouse points depends on the specific algorithm's processing time, thereby avoiding visible cursor jumps when the algorithm's processing time is long.
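The sketch below illustrates such inter-frame linear interpolation; deriving the number of inserted points as roughly one per 10 ms of processing time is an illustrative assumption, since the patent only states that the number depends on the algorithm's processing time.

```python
# Sketch of step (6), inter-frame linear interpolation of mouse points, under stated assumptions.
def interpolate_mouse_points(p_prev, p_curr, frame_time_ms: float, step_ms: float = 10.0):
    """Return intermediate cursor points between two mapped frames so that slow tracking
    algorithms do not make the cursor visibly jump."""
    n = max(int(frame_time_ms / step_ms) - 1, 0)  # assumed: one inserted point per ~10 ms
    points = []
    for i in range(1, n + 1):
        t = i / (n + 1)
        points.append((p_prev[0] + t * (p_curr[0] - p_prev[0]),
                       p_prev[1] + t * (p_curr[1] - p_prev[1])))
    points.append(p_curr)
    return points
```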
The following is only an example of use with a concrete tracking algorithm. Taking the mean shift algorithm as the target tracking algorithm and the hand as the tracking target, the mapping method of the present invention and the currently common relative distance mapping method are each applied to the tracking results, mapping a roaming gesture to mouse movement. To compare the two mapping methods fairly, the same test video is used as input; in the video the user's hand roams horizontally at constant speed. The absolute displacement of the mouse on the screen under each mapping method is extracted and plotted in the same coordinate system, as shown in Fig. 5: the solid line is the mapping result of the method proposed by the invention, and the dashed line is the mapping result of the relative tracking mapping method. Computing the variance of the two mapping methods along the y axis, the variance of the conventional distance mapping method is approximately 21.9447, while that of the present invention is approximately 11.9944. This shows that the mapping method of the present invention is more stable, with a smoother trajectory and less jitter.
The above is only a specific embodiment of the present invention and does not limit its scope of protection; any replacement or improvement made on the basis of the inventive concept, without departing from it, falls within the scope of protection of the present invention.
Claims (8)
1. A method for mapping gesture tracking results to mouse events, characterized by comprising the following steps:
(1) time warping the current gesture tracking algorithm;
(2) distance warping the tracking results of the current gesture tracking algorithm;
(3) distinguishing jitter from slow movement;
(4) mapping the hand tracking results to mouse events;
(5) smoothing the mouse movement;
(6) filling in the mouse points mapped by two successive frames.
2. The method for mapping gesture tracking results to mouse events of claim 1, characterized in that time warping the current gesture tracking algorithm specifically means: multiplying the gesture tracking result by a coefficient obtained by comparison with the inter-frame time of a standard test platform, so that the same physical hand speed maps to the same mouse movement under different inter-frame processing times.
3. The method for mapping gesture tracking results to mouse events of claim 1, characterized in that the distance warping specifically means: performing distance warping according to the distance between the user and the camera, using the size of the tracking box in the tracking results to calculate the distance between the hand and the camera, and adaptively adjusting the hand movement speed computed in the image.
4. The method for mapping gesture tracking results to mouse events of claim 1, characterized in that distinguishing jitter from slow movement specifically means: accumulating historical tracking points to obtain an average filter point, comparing the current tracking point with this filter point to distinguish slow movement from jitter, suppressing jitter, and amplifying slow movement.
5. The method for mapping gesture tracking results to mouse events of claim 1, characterized in that step (4) uses a piecewise function to obtain the mapping gain corresponding to the warped hand movement speed.
6. The method for mapping gesture tracking results to mouse events of claim 1, characterized in that in step (5) the currently mapped mouse movement and the mouse movement obtained for the previous frame are combined by a weighted sum, so that the displayed mouse speed does not change abruptly during interactive tracking.
7. The method for mapping gesture tracking results to mouse events of claim 1, characterized in that in step (6), within the processing time of two frames, the mouse points mapped by the two frames are filled in by inter-frame linear interpolation, the number of inserted mouse points being determined from the specific gesture tracking algorithm's processing time according to the efficiency of the chip or the hardware of the computer, thereby avoiding visible cursor jumps when the algorithm's processing time is long.
8. The method for mapping gesture tracking results to mouse events of claim 1, characterized by regularizing the horizontal and vertical mapping trajectory of the mouse according to the user's movement tendency.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410318600.2A CN104142730B (en) | 2014-07-04 | 2014-07-04 | Method for mapping gesture tracking results to mouse events |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410318600.2A CN104142730B (en) | 2014-07-04 | 2014-07-04 | Method for mapping gesture tracking results to mouse events |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104142730A true CN104142730A (en) | 2014-11-12 |
CN104142730B CN104142730B (en) | 2017-06-06 |
Family
ID=51851932
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410318600.2A Active CN104142730B (en) | 2014-07-04 | 2014-07-04 | Method for mapping gesture tracking results to mouse events |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104142730B (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104793744A (en) * | 2015-04-16 | 2015-07-22 | 天脉聚源(北京)传媒科技有限公司 | Gesture operation method and device |
CN111330243A (en) * | 2018-12-18 | 2020-06-26 | 上海赢赞数字科技有限公司 | Rock climbing wall somatosensory interaction method, control system and interaction system |
WO2021032097A1 (en) * | 2019-08-19 | 2021-02-25 | 华为技术有限公司 | Air gesture interaction method and electronic device |
CN112671972A (en) * | 2020-12-21 | 2021-04-16 | 四川长虹电器股份有限公司 | Method for controlling movement of large-screen television mouse by mobile phone |
CN114840126A (en) * | 2022-05-23 | 2022-08-02 | 北京字跳网络技术有限公司 | Object control method, device, electronic equipment and storage medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102707802A (en) * | 2012-05-09 | 2012-10-03 | 华南理工大学 | Method for controlling speed of mapping of gesture movement to interface |
US20130083037A1 (en) * | 2011-10-01 | 2013-04-04 | Oracle International Corporation | Moving a display object within a display frame using a discrete gesture |
CN103324281A (en) * | 2013-04-18 | 2013-09-25 | 苏州易乐展示系统工程有限公司 | Filtering method of non-contact interactive display system |
CN103324277A (en) * | 2012-03-22 | 2013-09-25 | 扬州永利宁科技有限公司 | Touch free user input recognition |
CN103400118A (en) * | 2013-07-30 | 2013-11-20 | 华南理工大学 | Gesture control method capable of adaptively adjusting mapping relation |
CN103488294A (en) * | 2013-09-12 | 2014-01-01 | 华南理工大学 | Non-contact gesture control mapping adjustment method based on user interactive habits |
- 2014-07-04 CN CN201410318600.2A patent/CN104142730B/en active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130083037A1 (en) * | 2011-10-01 | 2013-04-04 | Oracle International Corporation | Moving a display object within a display frame using a discrete gesture |
CN103324277A (en) * | 2012-03-22 | 2013-09-25 | 扬州永利宁科技有限公司 | Touch free user input recognition |
CN102707802A (en) * | 2012-05-09 | 2012-10-03 | 华南理工大学 | Method for controlling speed of mapping of gesture movement to interface |
CN103324281A (en) * | 2013-04-18 | 2013-09-25 | 苏州易乐展示系统工程有限公司 | Filtering method of non-contact interactive display system |
CN103400118A (en) * | 2013-07-30 | 2013-11-20 | 华南理工大学 | Gesture control method capable of adaptively adjusting mapping relation |
CN103488294A (en) * | 2013-09-12 | 2014-01-01 | 华南理工大学 | Non-contact gesture control mapping adjustment method based on user interactive habits |
Non-Patent Citations (1)
Title |
---|
Argelaguet F., Andujar C.: "A survey of 3D object selection techniques for virtual environments", Computers & Graphics *
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104793744A (en) * | 2015-04-16 | 2015-07-22 | 天脉聚源(北京)传媒科技有限公司 | Gesture operation method and device |
CN111330243A (en) * | 2018-12-18 | 2020-06-26 | 上海赢赞数字科技有限公司 | Rock climbing wall somatosensory interaction method, control system and interaction system |
WO2021032097A1 (en) * | 2019-08-19 | 2021-02-25 | 华为技术有限公司 | Air gesture interaction method and electronic device |
US12001612B2 (en) | 2019-08-19 | 2024-06-04 | Huawei Technologies Co., Ltd. | Air gesture-based interaction method and electronic device |
CN112671972A (en) * | 2020-12-21 | 2021-04-16 | 四川长虹电器股份有限公司 | Method for controlling movement of large-screen television mouse by mobile phone |
CN114840126A (en) * | 2022-05-23 | 2022-08-02 | 北京字跳网络技术有限公司 | Object control method, device, electronic equipment and storage medium |
CN114840126B (en) * | 2022-05-23 | 2024-01-23 | 北京字跳网络技术有限公司 | Object control method, device, electronic equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN104142730B (en) | 2017-06-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109711304B (en) | Face feature point positioning method and device | |
US9995578B2 (en) | Image depth perception device | |
US10430951B2 (en) | Method and device for straight line detection and image processing | |
CN104142730A (en) | Method for mapping gesture tracking results to mouse events | |
CN103188493B (en) | Image encoding apparatus and image encoding method | |
Barranco et al. | Contour motion estimation for asynchronous event-driven cameras | |
CN103970264B (en) | Gesture recognition and control method and device | |
CN103208006B (en) | Object motion mode identification method and equipment based on range image sequence | |
KR20160019110A (en) | High-performance plane detection with depth camera data | |
CN104574439A (en) | Kalman filtering and TLD (tracking-learning-detection) algorithm integrated target tracking method | |
CN102902355A (en) | Space interaction method of mobile equipment | |
CN103985137A (en) | Moving object tracking method and system applied to human-computer interaction | |
CN104517100B (en) | Gesture pre-judging method and system | |
CN110263714A (en) | Method for detecting lane lines, device, electronic equipment and storage medium | |
CN105511619B (en) | A kind of the human-computer interaction control system and method for view-based access control model infrared induction technology | |
CN103413137B (en) | Based on the interaction gesture movement locus dividing method of more rules | |
US20130336577A1 (en) | Two-Dimensional to Stereoscopic Conversion Systems and Methods | |
Asad et al. | Kinect depth stream pre-processing for hand gesture recognition | |
US9384576B2 (en) | Method and device for computing a change in an image scale of an object | |
CN110889347A (en) | Density traffic flow counting method and system based on space-time counting characteristics | |
CN104423568A (en) | control system, input device and control method for display screen | |
TW201913352A (en) | Method and electronic apparatus for wave detection | |
CN103632365B (en) | A kind of stereoscopic image disparity estimation method | |
CN102831616B (en) | Video stream motion vector calculation method | |
CN103076874A (en) | Method and system for improving high delay of computer-vision-based somatosensory input equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |