CN108446073A - Method, apparatus and terminal for simulating mouse operations using gestures - Google Patents
- Publication number
- CN108446073A (application number CN201810200113.4A)
- Authority
- CN
- China
- Prior art keywords
- gesture
- user
- event
- mouse
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
Abstract
A method, apparatus and terminal for simulating mouse operations using gestures are disclosed. The method includes: obtaining gesture information produced by a gesture acquisition device capturing a user gesture; recognizing the gesture information to obtain a gesture operation event of the user; searching a preset mapping set according to the gesture operation event of the user, the preset mapping set including at least one correspondence between a gesture operation event and a mouse operation event, where the mouse operation events include at least a mouse click event and a mouse move event; and, if the gesture operation event of the user is found in the preset mapping set, triggering the mouse operation event corresponding to the gesture operation event of the user.
Description
Technical field
The embodiments of this specification relate to the technical field of gesture recognition, and in particular to a method, apparatus and terminal for simulating mouse operations using gestures.
Background technology
With the development of information technology, intelligent terminals have become an indispensable part of people's lives, and users can perform a variety of operations through them. At present, a user usually operates an intelligent terminal with a mouse. In actual use, however, situations inevitably arise in which the mouse cannot be used, for example when the mouse fails or its battery runs out. The prior art offers no fallback for such situations, so in these cases the user cannot operate the intelligent terminal at all, and the user experience is poor.
Summary of the invention
In view of the above technical problems, the embodiments of this specification provide a method, apparatus and terminal for simulating mouse operations using gestures. The technical solutions are as follows:
According to a first aspect of the embodiments of this specification, a method for simulating mouse operations using gestures is provided, the method including:

obtaining gesture information produced by a gesture acquisition device capturing a user gesture;

recognizing the gesture information to obtain a gesture operation event of the user;

searching a preset mapping set according to the gesture operation event of the user, the preset mapping set including at least one correspondence between a gesture operation event and a mouse operation event, where the mouse operation events include at least a mouse click event and a mouse move event;

if the gesture operation event of the user is found in the preset mapping set, triggering the mouse operation event corresponding to the gesture operation event of the user.
According to a second aspect of the embodiments of this specification, an apparatus for simulating mouse operations using gestures is provided, the apparatus including:

an acquisition module, configured to obtain gesture information produced by a gesture acquisition device capturing a user gesture;

a recognition module, configured to recognize the gesture information to obtain a gesture operation event of the user;

a search module, configured to search a preset mapping set according to the gesture operation event of the user, the preset mapping set including at least one correspondence between a gesture operation event and a mouse operation event, where the mouse operation events include at least a mouse click event and a mouse move event;

a trigger module, configured to trigger, if the gesture operation event of the user is found in the preset mapping set, the mouse operation event corresponding to the gesture operation event of the user.
According to a third aspect of the embodiments of this specification, a terminal is provided, including a memory, a processor, and a computer program stored on the memory and executable by the processor, where the processor, when executing the program, implements any of the methods for simulating mouse operations using gestures provided by the embodiments of this specification.
In the technical solutions provided by the embodiments of this specification, gesture information produced by a gesture acquisition device capturing a user gesture is obtained; the gesture information is recognized to obtain a gesture operation event of the user; a preset mapping set containing at least one correspondence between a gesture operation event and a mouse operation event is searched according to the gesture operation event of the user; and if the gesture operation event of the user is found in the preset mapping set, the corresponding mouse operation event is triggered. Mouse operations are thus simulated using gestures, providing users with a novel way of operating an intelligent terminal that can, to a certain extent, meet user demands and improve the user experience.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the embodiments of this specification.

In addition, no embodiment of this specification is required to achieve all of the above effects.
Description of the drawings
In order to illustrate the embodiments of this specification or the technical solutions in the prior art more clearly, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some of the embodiments of this specification; those of ordinary skill in the art can also obtain other drawings from them.

Fig. 1 is a schematic diagram of an application scenario for simulating mouse operations using gestures, according to an exemplary embodiment of this specification;

Fig. 2 is a flowchart of an embodiment of a method for simulating mouse operations using gestures, according to an exemplary embodiment of this specification;

Fig. 3 is a schematic diagram of preset gestures, according to an exemplary embodiment of this specification;

Fig. 4 is a block diagram of an embodiment of an apparatus for simulating mouse operations using gestures, according to an exemplary embodiment of this specification;

Fig. 5 is a more specific schematic diagram of a terminal hardware structure provided by the embodiments of this specification.
Detailed description of the embodiments
To help those skilled in the art better understand the technical solutions in the embodiments of this specification, these solutions are described in detail below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of this specification. All other embodiments obtained by those of ordinary skill in the art based on the embodiments in this specification shall fall within the scope of protection.
Referring to Fig. 1, which is a schematic diagram of an application scenario for simulating mouse operations using gestures according to an exemplary embodiment of this specification, the scenario includes an intelligent terminal 110 and an image acquisition device 120. In this scenario, the image acquisition device 120 captures gesture information for a user gesture (not shown in Fig. 1) and transmits the captured gesture information to the intelligent terminal 110. The intelligent terminal 110 then executes the method for simulating mouse operations using gestures provided by the embodiments of this specification: by executing the method it determines the user gesture, determines the mouse operation event corresponding to that gesture, and triggers the mouse operation event, thereby operating the intelligent terminal 110.
As an example, assume that a user is watching a video on the intelligent terminal 110 and wants to pause playback. If the user pauses the video by operating a mouse (not shown in Fig. 1), the specific sequence of actions may be: the user moves the mouse so that the mouse pointer appears on the display interface of the intelligent terminal 110; the user then moves the mouse so that the pointer rests on the "Pause" control; finally, the user presses and releases the left mouse button, and when the left button is released, the video playback pauses.
Corresponding to the above sequence of mouse actions for pausing video playback, in the embodiments of this specification the user first makes, facing the image acquisition device 120, a gesture indicating that the mouse pointer should be displayed on the display interface of the intelligent terminal 110, and the intelligent terminal 110 displays the mouse pointer on the display interface accordingly. The user then makes, facing the image acquisition device 120, a gesture indicating that the pointer should be moved on the display interface, and the intelligent terminal 110 moves the pointer according to that gesture until the pointer reaches the "Pause" control. Finally, the user makes, facing the image acquisition device 120, a gesture indicating that the left mouse button is pressed and released, and the intelligent terminal 110 triggers a click of the "Pause" control according to that gesture, pausing the video playback.
It should be noted that capturing the gesture information of a user gesture with the image acquisition device 120 is only an example; in practical applications, other devices, such as an infrared sensor, may also capture the gesture information, and the embodiments of this specification are not limited in this respect.

It should also be noted that the arrangement of the image acquisition device 120 and the intelligent terminal 110 shown in Fig. 1 is only an example; in practical applications, the intelligent terminal 110 may have a built-in camera or infrared sensor, and the embodiments of this specification are not limited in this respect.
In the following, with reference to the application scenario shown in Fig. 1, the method for simulating mouse operations using gestures provided by the embodiments of this specification is described through the embodiments below.

Referring to Fig. 2, which is a flowchart of an embodiment of a method for simulating mouse operations using gestures according to an exemplary embodiment of this specification, the method is based on the application scenario shown in Fig. 1 and can be applied to the intelligent terminal 110 shown in Fig. 1. The method includes the following steps:
Step 202: Obtain gesture information produced by a gesture acquisition device capturing a user gesture.

In the embodiments of this specification, based on the application scenario illustrated in Fig. 1, the image acquisition device 120 serves as the gesture acquisition device, and the gesture information it produces by capturing the user gesture is the user gesture image captured by the image acquisition device 120.

In addition, as described above, the gesture acquisition device may also be an infrared sensor; correspondingly, the gesture information it produces by capturing the user gesture is the infrared sensing signal captured by the infrared sensor.
Step 204: Recognize the gesture information to obtain a gesture operation event of the user.

First, in the embodiments of this specification, to simulate mouse operations using gestures, some gestures can be defined based on the mouse operations used in practice; for convenience of description, the defined gestures are called preset gestures.

In one embodiment, three classes of preset gestures can be defined, used respectively to indicate that the mouse pointer should be displayed on the display interface of the intelligent terminal 110, that the left mouse button is pressed, and that the left mouse button is not pressed. For example, referring to Fig. 3, which is a schematic diagram of the preset gestures according to an exemplary embodiment of this specification, the preset gestures are: a fist gesture (shown in Fig. 3(a)), an open-palm gesture (shown in Fig. 3(b)), and a single-finger gesture (shown in Fig. 3(c)). The open-palm gesture indicates that the mouse pointer should be displayed on the display interface of the intelligent terminal 110, the fist gesture indicates that the left mouse button is pressed, and the single-finger gesture indicates that the left mouse button is not pressed.
Meanwhile utilizing gesture to simulate mouse action to realize, it can the type based on mouse action in practical application stroke
Divide mouse action event, for example, two class mouse action events can be at least marked off, respectively mouse-click event, mouse movement
Event.Further, the operating characteristics of the mouse action event based on each type, establish mouse action event and gesture operation thing
The correspondence of part, for example, for mouse moving event, operating characteristics are " mouse are moved ", are based on this, can be with
Mobile first gesture action event occurs for a kind of gesture for indicating user of definition, which corresponds to
Mouse moving event;For mouse-click event, operating characteristics are " left mouse button are pressed ", it can be seen that, for
For mouse-click event, it is related to the transformation of user gesture, is based on this, a kind of gesture hair for indicating user can be defined
Change the second gesture action event changed, the i.e. corresponding mouse click event of the second gesture action event.
Based on above-mentioned default gesture, the definition of above-mentioned first gesture action event and second gesture action event can obtain
To the gesture operation event of example as shown in table 1 below:
As can be seen from Table 1, by mapping gesture operation events to mouse move events and mouse click events in this way, gesture operation events can be mapped onto existing mouse events. For example, Table 2 shows one example of the mapping between gesture operation events and existing mouse events:
Table 2

| Gesture operation event | Mouse event |
| Single finger becomes a fist | MouseDown (triggered when the left mouse button is pressed) |
| Single finger or fist moves | MouseOver (triggered when the mouse pointer slides over an element) |
| Fist becomes a single finger | MouseUp (triggered when the left mouse button is released) |
| Single finger or fist moves | MouseOut (triggered when the mouse pointer slides off an element) |
| Single finger or fist moves | MouseMove (triggered when the mouse pointer moves) |
As can be seen from Table 2, in the embodiments of this specification, by making the preset gestures the user produces the corresponding gesture operation events, which are multiplexed onto existing mouse events, so that the mouse events already built into existing controls remain compatible.
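The multiplexing of Table 2 can be sketched as a simple lookup table. The sketch below is illustrative only: the key names paraphrase Table 2, the returned strings stand in for real event dispatch (which the patent does not specify), and a real dispatcher would additionally raise MouseOver/MouseOut depending on which element the pointer crosses.

```python
# Illustrative sketch of the Table 2 mapping: gesture operation events
# are keys, and the values name the existing mouse event each one is
# multiplexed onto. Key names are paraphrases, not identifiers from
# the patent; the move event is shown mapping to MouseMove only.
GESTURE_TO_MOUSE_EVENT = {
    "finger_becomes_fist": "MouseDown",   # left button pressed
    "fist_becomes_finger": "MouseUp",     # left button released
    "finger_or_fist_moves": "MouseMove",  # pointer moved
}

def trigger_mouse_event(gesture_event):
    """Look up a gesture operation event and return the mouse event to
    trigger, or None when the event is not in the mapping set."""
    return GESTURE_TO_MOUSE_EVENT.get(gesture_event)
```

Because the lookup falls through to `None` for unknown events, unrecognized gestures simply trigger nothing, which matches the behavior of steps 206-208.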
In addition to the gesture operation events shown in Table 1, the gesture operation events may also include: a palm-to-finger event, indicating that the state of the mouse pointer is adjusted from the suspended state to the working state; and a finger-to-palm event, indicating that the state of the mouse pointer is adjusted from the working state to the suspended state.

It should be noted that while the mouse pointer is in the suspended state, it cannot be moved on the display interface; if the pointer needs to be moved, a palm-to-finger event can first be used to adjust its state from suspended to working.
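The suspended/working rule above amounts to a two-state machine. A minimal sketch, in which the state and event names are illustrative rather than the patent's own identifiers:

```python
# Two-state sketch of the pointer-state rule: the pointer starts
# suspended (it cannot be moved); a palm-to-finger event switches it
# to working, and a finger-to-palm event switches it back.
class PointerState:
    def __init__(self):
        self.state = "suspended"

    def on_event(self, event):
        if event == "palm_to_finger":
            self.state = "working"
        elif event == "finger_to_palm":
            self.state = "suspended"
        return self.state

    def can_move(self):
        # move events are only honored while the pointer is working
        return self.state == "working"
```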
As can be seen from the above description, both the first gesture operation event and the second gesture operation event refer to a difference between two successive gestures made by the user (specifically: the gestures are identical but their relative position has changed, or the gestures differ). Therefore, in the embodiments of this specification, the currently acquired gesture information and the previously acquired gesture information can each be recognized to obtain the gesture the user is currently making and the gesture the user made previously. For convenience of description, the gesture the user is currently making is called the first gesture, and the gesture the user made previously is called the second gesture.

Subsequently, it can first be judged whether the first gesture and the second gesture belong to the preset gestures. If so, it is further judged whether the first gesture and the second gesture are identical. If they are identical, the physical displacement of the first gesture relative to the second gesture is determined; if this physical displacement exceeds a preset threshold, a first gesture operation event is obtained, indicating that the user's gesture has moved from the position of the second gesture to the position of the first gesture. If the first gesture and the second gesture differ, a second gesture operation event is obtained, indicating that the user's gesture has changed from the second gesture to the first gesture.
It should be noted that, in the above process, obtaining the first gesture operation event only when the physical displacement of the first gesture relative to the second gesture exceeds the preset threshold avoids erroneous operations caused by slight, unintentional movements of the user.
In addition, in the embodiments of this specification, if the recognized gesture does not belong to the preset gestures, the state of the mouse pointer can be set to the suspended state.
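Putting the three judgments above together (preset-gesture check, identical/different check, displacement threshold), a minimal sketch might look like the following; the gesture names, the coordinate representation, and the threshold value are all assumptions made for illustration.

```python
import math

# Assumed preset gestures and threshold; the patent fixes neither the
# representation nor the numeric value.
PRESET_GESTURES = {"fist", "open_palm", "single_finger"}
MOVE_THRESHOLD = 10.0

def classify_event(current, previous):
    """current/previous: (gesture_name, (x, y)) for the currently and the
    previously recognized gesture. Returns the gesture operation event,
    or None when no event should be produced."""
    (g1, p1), (g2, p2) = current, previous
    if g1 not in PRESET_GESTURES or g2 not in PRESET_GESTURES:
        return None  # unrecognized gesture: pointer goes to suspended state
    if g1 != g2:
        # gestures differ -> second gesture operation event (click-type)
        return "second_gesture_event"
    displacement = math.hypot(p1[0] - p2[0], p1[1] - p2[1])
    if displacement > MOVE_THRESHOLD:
        # same gesture moved far enough -> first gesture operation event
        return "first_gesture_event"
    return None  # slight movement below the threshold is ignored
```

The final `None` branch is the anti-jitter rule of the preceding paragraph: identical gestures whose displacement stays under the threshold produce no event.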
In the following, taking the case where the gesture information is a user gesture image as an example, the process of recognizing the gesture information is described:

First, the gesture region of the user is extracted from the user gesture image. In practical applications, the user's gesture is usually placed in front of the user's body, so the gesture region has different depth values from the background region, and this feature can be used to extract the gesture region from the user gesture image. Specifically, a gray-level histogram of the image is computed from the depth values of its pixels; the histogram represents, for each gray level, the number of pixels in the image with that gray level. Since in the user gesture image the gesture region is smaller in area than the background region and has smaller gray values, the histogram can be scanned in order of decreasing gray value for a gray value at which the pixel count changes sharply, and the gray value found is used as the gray threshold for region segmentation (for example, a gray threshold of 235). The user gesture image can then be binarized according to this gray threshold, and in the resulting binary image, the region represented by the white pixels is the gesture region.
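The histogram scan and binarization just described might be sketched as follows in pure Python over a flat list of gray values; the "sharp change" criterion and the fallback threshold are assumptions, and a real implementation would operate on a 2-D depth image with a library such as OpenCV.

```python
def find_gray_threshold(pixels, jump=50):
    """Build a 256-bin gray-level histogram and scan it from high gray
    values downward; return the gray value just below the first sharp
    change in pixel count. `jump` (what counts as "sharp") is an
    assumed tuning parameter."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    for g in range(255, 0, -1):
        if abs(hist[g] - hist[g - 1]) > jump:
            return g - 1
    return 128  # assumed fallback when no sharp change is found

def binarize(pixels, threshold):
    """Mark pixels with gray value below the threshold (the nearer,
    darker gesture region) as 1, and the background as 0."""
    return [1 if p < threshold else 0 for p in pixels]
```

On a toy image with a small dark gesture region and a large bright background, the scan stops at the background's gray level and the binarization isolates the gesture pixels.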
Next, feature extraction is performed on the gesture region using a preset feature extraction algorithm. For example, the preset feature extraction algorithm may be the SIFT feature extraction algorithm, a shape feature extraction and matching algorithm based on wavelets and relative moments, a model-based method, and so on; the extracted features may include the centroid of the gesture region, the feature vector of the gesture region, the number of fingers, etc.

Finally, gesture recognition is performed on the extracted features to determine the gesture made by the user.
In the process described above, in the scenario where the first gesture is identical to the second gesture, the specific process of determining the physical displacement of the first gesture relative to the second gesture may follow descriptions in the prior art and is not detailed again in the embodiments of this specification.

Subsequently, the determined physical displacement can be converted into inches and then divided by the actual distance, also in inches, covered by each pixel on the screen of the intelligent terminal 110; the result is the number of pixels by which the mouse pointer moves.
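The unit conversion above amounts to dividing the displacement in inches by the size of one pixel in inches, which is the same as multiplying by the screen's pixel density. A sketch, with the DPI value assumed:

```python
def displacement_to_pixels(displacement_inches, dots_per_inch=96):
    """Convert a physical displacement (in inches) into a pointer
    movement in pixels. Each pixel covers 1/dots_per_inch inches, so
    dividing by that per-pixel distance equals multiplying by the DPI.
    The default DPI is an assumption, not a value from the patent."""
    inches_per_pixel = 1.0 / dots_per_inch
    return round(displacement_inches / inches_per_pixel)
```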
Step 206: Search a preset mapping set according to the gesture operation event of the user, the preset mapping set including at least one correspondence between a gesture operation event and a mouse operation event, where the mouse operation events include at least a mouse click event and a mouse move event.

Step 208: If the gesture operation event of the user is found in the preset mapping set, trigger the mouse operation event corresponding to the gesture operation event of the user.
Steps 206 to 208 are described in detail as follows:

In the embodiments of this specification, a mapping set can be set in advance, the mapping set including at least one correspondence between a gesture operation event and a mouse operation event; for example, as described above, the mapping set may be as shown in Table 3:

Table 3

Based on the mapping set illustrated in Table 3, in the embodiments of this specification, after the gesture operation event of the user is obtained, the mapping set illustrated in Table 3 can be searched according to that gesture operation event, and if the gesture operation event is found, the corresponding mouse operation event is triggered.
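Steps 202 through 208 together form a small pipeline: acquire, recognize, look up, trigger. A hypothetical sketch of that flow, with the recognizer stubbed out since the patent does not fix an implementation and with all names chosen for illustration:

```python
# Hypothetical end-to-end flow of steps 202-208. The preset mapping
# set and the stubbed recognizer are illustrative assumptions.
PRESET_MAPPING_SET = {
    "first_gesture_event": "mouse_move_event",
    "second_gesture_event": "mouse_click_event",
}

def recognize(gesture_info):
    """Stub for step 204: map raw gesture information to a gesture
    operation event (a real system would run image recognition here)."""
    return gesture_info.get("event")

def process(gesture_info):
    """Steps 206-208: search the preset mapping set and return the
    mouse operation event to trigger, or None if the lookup fails."""
    gesture_event = recognize(gesture_info)
    return PRESET_MAPPING_SET.get(gesture_event)
```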
In the technical solution provided by the present invention, gesture information produced by a gesture acquisition device capturing a user gesture is obtained; the gesture information is recognized to obtain a gesture operation event of the user; a preset mapping set containing at least one correspondence between a gesture operation event and a mouse operation event is searched according to the gesture operation event of the user; and if the gesture operation event of the user is found in the preset mapping set, the corresponding mouse operation event is triggered. Mouse operations are thus simulated using gestures, providing users with a novel way of operating an intelligent terminal that can, to a certain extent, meet user demands and improve the user experience.
Corresponding to the above method embodiments, the embodiments of this specification also provide an apparatus for simulating mouse operations using gestures. Referring to Fig. 4, which is a block diagram of an embodiment of an apparatus for simulating mouse operations using gestures according to an exemplary embodiment of this specification, the apparatus may include: an acquisition module 41, a recognition module 42, a search module 43, and a trigger module 44.
The acquisition module 41 can be used to obtain gesture information produced by a gesture acquisition device capturing a user gesture;

the recognition module 42 can be used to recognize the gesture information to obtain a gesture operation event of the user;

the search module 43 can be used to search a preset mapping set according to the gesture operation event of the user, the preset mapping set including at least one correspondence between a gesture operation event and a mouse operation event, where the mouse operation events include at least a mouse click event and a mouse move event;

the trigger module 44 can be used to trigger, if the gesture operation event of the user is found in the preset mapping set, the mouse operation event corresponding to the gesture operation event of the user.
In one embodiment, the gesture acquisition device is an image acquisition device, and the gesture information is the user gesture image captured by the image acquisition device.
In one embodiment, the recognition module 42 may include (not shown in Fig. 4):

a region extraction submodule, configured to extract the gesture region of the user from the user gesture image;

a feature extraction submodule, configured to perform feature extraction on the gesture region using a preset feature extraction algorithm;

a feature recognition submodule, configured to perform gesture recognition on the extracted features to obtain the gesture operation event of the user.
In one embodiment, the gesture operation events of the user include at least: a first gesture operation event indicating that the user's gesture has moved, and a second gesture operation event indicating that the user's gesture has changed;

where the first gesture operation event corresponds to the mouse move event, and the second gesture operation event corresponds to the mouse click event.
In one embodiment, the recognition module 42 may include (not shown in Fig. 4):

a gesture recognition submodule, configured to recognize the currently acquired gesture information and the previously acquired gesture information respectively, to obtain the first gesture the user is currently making and the second gesture the user made previously;

a first judgment submodule, configured to judge whether the first gesture and the second gesture belong to the preset gestures;

a second judgment submodule, configured to judge, if the first gesture and the second gesture belong to the preset gestures, whether the first gesture and the second gesture are identical;

a displacement determination submodule, configured to determine, if the first gesture is identical to the second gesture, the physical displacement of the first gesture relative to the second gesture;

a first determination submodule, configured to obtain, if the physical displacement exceeds a preset threshold, a first gesture operation event indicating that the user's gesture has moved from the position of the second gesture to the position of the first gesture;

a second determination submodule, configured to obtain, if the first gesture differs from the second gesture, a second gesture operation event indicating that the user's gesture has changed from the second gesture to the first gesture.
In one embodiment, the preset gestures include at least: the fist gesture, the open-palm gesture, and the single-finger gesture.
It can be understood that the acquisition module 41, the recognition module 42, the search module 43, and the trigger module 44, as four functionally independent modules, can be configured in the apparatus simultaneously as shown in Fig. 4, or each can be configured in the apparatus individually; the structure shown in Fig. 4 should therefore not be construed as limiting the solutions of the embodiments of this specification.

In addition, for the functions and roles of each module in the above apparatus, reference may be made to the implementation process of the corresponding steps in the above method; details are not repeated here.
This specification embodiment also provides a kind of terminal, includes at least memory, processor and storage on a memory
And the computer program that can be run on a processor, wherein processor realizes utilization gesture mould above-mentioned when executing described program
The method of quasi- mouse action.This method includes at least:It obtains gesture collecting device and acquires the obtained gesture information of user gesture;
The gesture information is identified, the gesture operation event of user is obtained;It is searched according to the gesture operation event of the user
Default mapping ensemblen, the default mapping ensemblen includes the correspondence of at least one set of gesture operation event and mouse action event,
In, the mouse action event includes at least mouse-click event, mouse moving event;If being searched in the default mapping ensemblen
To the gesture operation event of the user, then mouse action event corresponding with the gesture operation event of the user is triggered.
In one embodiment, the gesture acquisition device is an image acquisition device, and the gesture information is a user gesture image captured by the image acquisition device.
In one embodiment, recognizing the gesture information to obtain the gesture operation event of the user includes:
extracting the gesture region of the user from the user gesture image;
performing feature extraction on the gesture region using a preset feature extraction algorithm;
and performing gesture recognition on the extracted features to obtain the gesture operation event of the user.
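The three recognition steps above can be sketched as follows. Everything concrete here is an assumption: the brightness threshold stands in for the segmentation step, the per-row density stands in for the "preset feature extraction algorithm", and the nearest-template matcher stands in for the recognizer; the original disclosure does not fix these choices.

```python
def extract_gesture_region(image):
    """Step 1: crude segmentation -- keep pixels above a brightness
    threshold; image is a 2-D list of grey values, result is binary."""
    return [[1 if px > 128 else 0 for px in row] for row in image]

def extract_features(region):
    """Step 2: toy feature vector -- fraction of foreground pixels
    in each row of the binary gesture region."""
    return [sum(row) / len(row) for row in region]

def recognise(features, templates):
    """Step 3: match against gesture templates (label, feature vector)
    by L1 distance and return the label of the nearest template."""
    def dist(template):
        return sum(abs(a - b) for a, b in zip(features, template[1]))
    return min(templates, key=dist)[0]

# Hypothetical templates for two preset gestures.
templates = [("fist", [1.0, 1.0]), ("palm_open", [0.5, 0.5])]
img = [[200, 200], [200, 200]]  # a tiny all-bright test image
label = recognise(extract_features(extract_gesture_region(img)), templates)
```

A production system would replace each stage with a real algorithm (e.g. skin-colour segmentation and a trained classifier), but the pipeline shape — region, features, label — is the one the embodiment describes.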
In one embodiment, the gesture operation events of the user include at least: a first gesture operation event indicating that the user's gesture has moved, and a second gesture operation event indicating that the user's gesture has changed;
wherein the first gesture operation event corresponds to the mouse move event, and the second gesture operation event corresponds to the mouse click event.
In one embodiment, recognizing the gesture information to obtain the gesture operation event of the user includes:
recognizing the currently obtained gesture information and the previously obtained gesture information separately, to obtain the first gesture currently made by the user and the second gesture previously made by the user;
judging whether the first gesture and the second gesture belong to the preset gestures, and if so, judging whether the first gesture is identical to the second gesture;
if identical, determining the physical displacement of the first gesture relative to the second gesture; if the physical displacement exceeds a preset threshold, obtaining the first gesture operation event indicating that the user's gesture has moved from the position of the second gesture to the position of the first gesture;
if different, obtaining the second gesture operation event indicating that the user's gesture has changed from the second gesture to the first gesture.
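The two-frame decision rule above can be sketched as follows. The gesture labels, the centroid coordinates, and the threshold value are illustrative assumptions; the disclosure only specifies the structure of the rule, not these particulars.

```python
PRESET_GESTURES = {"fist", "palm_open", "single_finger"}
MOVE_THRESHOLD = 5.0  # assumed displacement threshold, in pixels

def classify_pair(first, second):
    """first/second: (label, (x, y)) for the current frame and the
    previous frame. Returns the gesture operation event to emit,
    or None if no event should be produced."""
    (f_label, f_pos), (s_label, s_pos) = first, second
    # Both gestures must belong to the preset gesture set.
    if f_label not in PRESET_GESTURES or s_label not in PRESET_GESTURES:
        return None
    if f_label == s_label:
        # Same gesture shape: emit a move event only if the physical
        # displacement exceeds the preset threshold.
        dx, dy = f_pos[0] - s_pos[0], f_pos[1] - s_pos[1]
        displacement = (dx * dx + dy * dy) ** 0.5
        if displacement > MOVE_THRESHOLD:
            return ("first_gesture_event", s_pos, f_pos)
        return None
    # Gesture shape changed between frames: emit a click-type event.
    return ("second_gesture_event", s_label, f_label)
```

The threshold keeps small jitters of a held gesture from generating spurious move events, which is the evident purpose of the "exceeds a preset threshold" condition.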
In one embodiment, the preset gestures include at least: a fist gesture, a palm-open gesture, and a single-finger-extended gesture.
Fig. 5 shows a more specific schematic diagram of the hardware structure of a terminal provided by an embodiment of this specification. The terminal may include a processor 510, a memory 520, an input/output interface 530, a communication interface 540, and a bus 550, where the processor 510, memory 520, input/output interface 530, and communication interface 540 communicate with one another within the device via the bus 550.
The processor 510 may be implemented as a general-purpose CPU (Central Processing Unit), a microprocessor, an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), or one or more integrated circuits, and executes the relevant programs so as to realize the technical solutions provided by the embodiments of this specification.
The memory 520 may be implemented as ROM (Read-Only Memory), RAM (Random Access Memory), a static storage device, a dynamic storage device, or the like. The memory 520 may store an operating system and other application programs; when the technical solutions provided by the embodiments of this specification are realized in software or firmware, the relevant program code is stored in the memory 520 and is invoked and executed by the processor 510.
The input/output interface 530 connects input/output modules to realize information input and output. An input/output module may be configured in the device as a component (not shown in Fig. 5) or may be externally connected to the device to provide the corresponding function. Input devices may include a keyboard, mouse, touch screen, microphone, and various sensors; output devices may include a display, loudspeaker, vibrator, indicator light, and the like.
The communication interface 540 connects a communication module (not shown in Fig. 5) to realize communication between this device and other devices. The communication module may communicate in a wired manner (such as USB or cable) or wirelessly (such as a mobile network, WiFi, or Bluetooth).
The bus 550 includes a path that transfers information between the components of the device (such as the processor 510, memory 520, input/output interface 530, and communication interface 540).
It should be noted that although only the processor 510, memory 520, input/output interface 530, communication interface 540, and bus 550 are shown for the above device, in a specific implementation the device may also include other components necessary for normal operation. In addition, those skilled in the art will appreciate that the above device may also include only the components necessary to realize the solutions of the embodiments of this specification, without including all the components shown in the figure.
An embodiment of this specification further provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the program implements the above method for simulating mouse operations with gestures. The method includes at least: obtaining gesture information produced by a gesture acquisition device capturing a user gesture; recognizing the gesture information to obtain a gesture operation event of the user; looking up a preset mapping set according to the gesture operation event of the user, the preset mapping set including at least one correspondence between a gesture operation event and a mouse operation event, wherein the mouse operation events include at least a mouse click event and a mouse move event; and, if the gesture operation event of the user is found in the preset mapping set, triggering the mouse operation event corresponding to the gesture operation event of the user.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may store information by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
From the above description of the embodiments, those skilled in the art can clearly understand that the embodiments of this specification can be realized by software plus a necessary general-purpose hardware platform. Based on this understanding, the technical solutions of the embodiments of this specification, in essence or in the part contributing to the prior art, may be embodied in the form of a software product. The computer software product may be stored in a storage medium such as ROM/RAM, a magnetic disk, or an optical disc, and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods described in the embodiments, or in certain parts of the embodiments, of this specification.
The systems, apparatuses, modules, or units illustrated in the above embodiments may be realized by a computer chip or entity, or by a product with a certain function. A typical implementation device is a computer, which may take the form of a personal computer, laptop computer, cellular phone, camera phone, smart phone, personal digital assistant, media player, navigation device, e-mail transceiver, game console, tablet computer, wearable device, or a combination of any of these devices.
The embodiments in this specification are described in a progressive manner; for identical or similar parts, the embodiments may be referred to mutually, and each embodiment focuses on its differences from the others. In particular, since the apparatus embodiments are substantially similar to the method embodiments, their description is relatively brief, and reference may be made to the description of the method embodiments for relevant details. The apparatus embodiments described above are merely illustrative: the modules described as separate components may or may not be physically separate, and when implementing the solutions of the embodiments of this specification, the functions of the modules may be realized in one or more pieces of software and/or hardware. Some or all of the modules may also be selected according to actual needs to achieve the purpose of the solution of a given embodiment. Those of ordinary skill in the art can understand and implement this without creative effort.
The above are only specific implementations of the embodiments of this specification. It should be noted that those of ordinary skill in the art may make several improvements and refinements without departing from the principles of the embodiments of this specification, and such improvements and refinements shall also fall within the protection scope of the embodiments of this specification.
Claims (13)
1. A method for simulating mouse operations with gestures, the method comprising:
obtaining gesture information produced by a gesture acquisition device capturing a user gesture;
recognizing the gesture information to obtain a gesture operation event of the user;
looking up a preset mapping set according to the gesture operation event of the user, the preset mapping set comprising at least one correspondence between a gesture operation event and a mouse operation event, wherein the mouse operation events comprise at least a mouse click event and a mouse move event; and
if the gesture operation event of the user is found in the preset mapping set, triggering the mouse operation event corresponding to the gesture operation event of the user.
2. The method according to claim 1, wherein the gesture acquisition device is an image acquisition device and the gesture information is a user gesture image captured by the image acquisition device.
3. The method according to claim 2, wherein recognizing the gesture information to obtain the gesture operation event of the user comprises:
extracting the gesture region of the user from the user gesture image;
performing feature extraction on the gesture region using a preset feature extraction algorithm; and
performing gesture recognition on the extracted features to obtain the gesture operation event of the user.
4. The method according to claim 1, wherein the gesture operation events of the user comprise at least: a first gesture operation event indicating that the user's gesture has moved, and a second gesture operation event indicating that the user's gesture has changed;
wherein the first gesture operation event corresponds to the mouse move event, and the second gesture operation event corresponds to the mouse click event.
5. The method according to claim 4, wherein recognizing the gesture information to obtain the gesture operation event of the user comprises:
recognizing the currently obtained gesture information and the previously obtained gesture information separately, to obtain the first gesture currently made by the user and the second gesture previously made by the user;
judging whether the first gesture and the second gesture belong to the preset gestures, and if so, judging whether the first gesture is identical to the second gesture;
if identical, determining the physical displacement of the first gesture relative to the second gesture, and if the physical displacement exceeds a preset threshold, obtaining the first gesture operation event indicating that the user's gesture has moved from the position of the second gesture to the position of the first gesture;
if different, obtaining the second gesture operation event indicating that the user's gesture has changed from the second gesture to the first gesture.
6. The method according to claim 5, wherein the preset gestures comprise at least:
a fist gesture, a palm-open gesture, and a single-finger-extended gesture.
7. An apparatus for simulating mouse operations with gestures, the apparatus comprising:
an acquisition module, configured to obtain gesture information produced by a gesture acquisition device capturing a user gesture;
a recognition module, configured to recognize the gesture information to obtain a gesture operation event of the user;
a searching module, configured to look up a preset mapping set according to the gesture operation event of the user, the preset mapping set comprising at least one correspondence between a gesture operation event and a mouse operation event, wherein the mouse operation events comprise at least a mouse click event and a mouse move event; and
a trigger module, configured to trigger, if the gesture operation event of the user is found in the preset mapping set, the mouse operation event corresponding to the gesture operation event of the user.
8. The apparatus according to claim 7, wherein the gesture acquisition device is an image acquisition device and the gesture information is a user gesture image captured by the image acquisition device.
9. The apparatus according to claim 8, wherein the recognition module comprises:
a region extraction sub-module, configured to extract the gesture region of the user from the user gesture image;
a feature extraction sub-module, configured to perform feature extraction on the gesture region using a preset feature extraction algorithm; and
a feature recognition sub-module, configured to perform gesture recognition on the extracted features to obtain the gesture operation event of the user.
10. The apparatus according to claim 7, wherein the gesture operation events of the user comprise at least: a first gesture operation event indicating that the user's gesture has moved, and a second gesture operation event indicating that the user's gesture has changed;
wherein the first gesture operation event corresponds to the mouse move event, and the second gesture operation event corresponds to the mouse click event.
11. The apparatus according to claim 10, wherein the recognition module comprises:
a gesture recognition sub-module, configured to recognize the currently obtained gesture information and the previously obtained gesture information separately, to obtain the first gesture currently made by the user and the second gesture previously made by the user;
a first judging sub-module, configured to judge whether the first gesture and the second gesture belong to the preset gestures;
a second judging sub-module, configured to judge, if the first gesture and the second gesture belong to the preset gestures, whether the first gesture is identical to the second gesture;
a displacement determination sub-module, configured to determine, if the first gesture is identical to the second gesture, the physical displacement of the first gesture relative to the second gesture;
a first determination sub-module, configured to obtain, if the physical displacement exceeds a preset threshold, the first gesture operation event indicating that the user's gesture has moved from the position of the second gesture to the position of the first gesture; and
a second determination sub-module, configured to obtain, if the first gesture is different from the second gesture, the second gesture operation event indicating that the user's gesture has changed from the second gesture to the first gesture.
12. The apparatus according to claim 11, wherein the preset gestures comprise at least:
a fist gesture, a palm-open gesture, and a single-finger-extended gesture.
13. A terminal comprising a memory, a processor, and a computer program stored in the memory and executable by the processor, wherein the processor, when executing the program, implements the method according to any one of claims 1 to 6.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810200113.4A CN108446073A (en) | 2018-03-12 | 2018-03-12 | A kind of method, apparatus and terminal for simulating mouse action using gesture |
TW107147676A TWI695311B (en) | 2018-03-12 | 2018-12-28 | Method, device and terminal for simulating mouse operation using gestures |
PCT/CN2019/072077 WO2019174398A1 (en) | 2018-03-12 | 2019-01-17 | Method, apparatus, and terminal for simulating mouse operation by using gesture |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810200113.4A CN108446073A (en) | 2018-03-12 | 2018-03-12 | A kind of method, apparatus and terminal for simulating mouse action using gesture |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108446073A true CN108446073A (en) | 2018-08-24 |
Family
ID=63194033
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810200113.4A Pending CN108446073A (en) | 2018-03-12 | 2018-03-12 | A kind of method, apparatus and terminal for simulating mouse action using gesture |
Country Status (3)
Country | Link |
---|---|
CN (1) | CN108446073A (en) |
TW (1) | TWI695311B (en) |
WO (1) | WO2019174398A1 (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109696958A (en) * | 2018-11-28 | 2019-04-30 | 南京华捷艾米软件科技有限公司 | A kind of gestural control method and system based on depth transducer gesture identification |
CN110221717A (en) * | 2019-05-24 | 2019-09-10 | 李锦华 | Virtual mouse driving device, gesture identification method and equipment for virtual mouse |
WO2019174398A1 (en) * | 2018-03-12 | 2019-09-19 | 阿里巴巴集团控股有限公司 | Method, apparatus, and terminal for simulating mouse operation by using gesture |
CN111221406A (en) * | 2018-11-23 | 2020-06-02 | 杭州萤石软件有限公司 | Information interaction method and device |
CN112068699A (en) * | 2020-08-31 | 2020-12-11 | 北京市商汤科技开发有限公司 | Interaction method, interaction device, electronic equipment and storage medium |
CN114115536A (en) * | 2021-11-22 | 2022-03-01 | 北京字节跳动网络技术有限公司 | Interaction method, interaction device, electronic equipment and storage medium |
CN114138119A (en) * | 2021-12-08 | 2022-03-04 | 武汉卡比特信息有限公司 | Gesture recognition system and method for mobile phone interconnection split screen projection |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112671972A (en) * | 2020-12-21 | 2021-04-16 | 四川长虹电器股份有限公司 | Method for controlling movement of large-screen television mouse by mobile phone |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101339453A (en) * | 2008-08-15 | 2009-01-07 | 广东威创视讯科技股份有限公司 | Simulated mouse input method based on interactive input apparatus |
CN102854983A (en) * | 2012-09-10 | 2013-01-02 | 中国电子科技集团公司第二十八研究所 | Man-machine interaction method based on gesture recognition |
US20160253044A1 (en) * | 2013-10-10 | 2016-09-01 | Eyesight Mobile Technologies Ltd. | Systems, devices, and methods for touch-free typing |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2474536B (en) * | 2009-10-13 | 2011-11-02 | Pointgrab Ltd | Computer vision gesture based control of a device |
CN103926999B (en) * | 2013-01-16 | 2017-03-01 | 株式会社理光 | Palm folding gesture identification method and device, man-machine interaction method and equipment |
CN103530613B (en) * | 2013-10-15 | 2017-02-01 | 易视腾科技股份有限公司 | Target person hand gesture interaction method based on monocular video sequence |
CN107885316A (en) * | 2016-09-29 | 2018-04-06 | 阿里巴巴集团控股有限公司 | A kind of exchange method and device based on gesture |
CN108446073A (en) * | 2018-03-12 | 2018-08-24 | 阿里巴巴集团控股有限公司 | A kind of method, apparatus and terminal for simulating mouse action using gesture |
2018
- 2018-03-12 CN CN201810200113.4A patent/CN108446073A/en active Pending
- 2018-12-28 TW TW107147676A patent/TWI695311B/en active
2019
- 2019-01-17 WO PCT/CN2019/072077 patent/WO2019174398A1/en active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101339453A (en) * | 2008-08-15 | 2009-01-07 | 广东威创视讯科技股份有限公司 | Simulated mouse input method based on interactive input apparatus |
CN102854983A (en) * | 2012-09-10 | 2013-01-02 | 中国电子科技集团公司第二十八研究所 | Man-machine interaction method based on gesture recognition |
US20160253044A1 (en) * | 2013-10-10 | 2016-09-01 | Eyesight Mobile Technologies Ltd. | Systems, devices, and methods for touch-free typing |
CN105980965A (en) * | 2013-10-10 | 2016-09-28 | 视力移动科技公司 | Systems, devices, and methods for touch-free typing |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019174398A1 (en) * | 2018-03-12 | 2019-09-19 | 阿里巴巴集团控股有限公司 | Method, apparatus, and terminal for simulating mouse operation by using gesture |
CN111221406A (en) * | 2018-11-23 | 2020-06-02 | 杭州萤石软件有限公司 | Information interaction method and device |
CN111221406B (en) * | 2018-11-23 | 2023-10-13 | 杭州萤石软件有限公司 | Information interaction method and device |
CN109696958A (en) * | 2018-11-28 | 2019-04-30 | 南京华捷艾米软件科技有限公司 | A kind of gestural control method and system based on depth transducer gesture identification |
CN110221717A (en) * | 2019-05-24 | 2019-09-10 | 李锦华 | Virtual mouse driving device, gesture identification method and equipment for virtual mouse |
CN110221717B (en) * | 2019-05-24 | 2024-07-09 | 李锦华 | Virtual mouse driving device, gesture recognition method and device for virtual mouse |
CN112068699A (en) * | 2020-08-31 | 2020-12-11 | 北京市商汤科技开发有限公司 | Interaction method, interaction device, electronic equipment and storage medium |
CN114115536A (en) * | 2021-11-22 | 2022-03-01 | 北京字节跳动网络技术有限公司 | Interaction method, interaction device, electronic equipment and storage medium |
CN114138119A (en) * | 2021-12-08 | 2022-03-04 | 武汉卡比特信息有限公司 | Gesture recognition system and method for mobile phone interconnection split screen projection |
Also Published As
Publication number | Publication date |
---|---|
WO2019174398A1 (en) | 2019-09-19 |
TW201939260A (en) | 2019-10-01 |
TWI695311B (en) | 2020-06-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108446073A (en) | A kind of method, apparatus and terminal for simulating mouse action using gesture | |
JP7391102B2 (en) | Gesture processing methods and devices | |
CN110163048B (en) | Hand key point recognition model training method, hand key point recognition method and hand key point recognition equipment | |
CN111738220B (en) | Three-dimensional human body posture estimation method, device, equipment and medium | |
CN110807361B (en) | Human body identification method, device, computer equipment and storage medium | |
CN102906671B (en) | Gesture input device and gesture input method | |
CN109726659A (en) | Detection method, device, electronic equipment and the readable medium of skeleton key point | |
CN105338238B (en) | A kind of photographic method and electronic equipment | |
CN108958627B (en) | Touch operation method and device, storage medium and electronic equipment | |
CN108777766B (en) | Multi-person photographing method, terminal and storage medium | |
CN112749613B (en) | Video data processing method, device, computer equipment and storage medium | |
KR20140019950A (en) | Method for generating 3d coordinate using finger image from mono camera in terminal and mobile terminal for generating 3d coordinate using finger image from mono camera | |
CN112541375A (en) | Hand key point identification method and device | |
Jiang et al. | independent hand gesture recognition with Kinect | |
CN115223248A (en) | Hand gesture recognition method, and training method and device of hand gesture recognition model | |
US11755119B2 (en) | Scene controlling method, device and electronic equipment | |
KR101995799B1 (en) | Place recognizing device and method for providing context awareness service | |
Sarkar et al. | Augmented reality-based virtual smartphone | |
CN113763931A (en) | Waveform feature extraction method and device, computer equipment and storage medium | |
CN108829600B (en) | Method and device for testing algorithm library, storage medium and electronic equipment | |
CN114359335A (en) | Target tracking method and electronic equipment | |
CN114299615A (en) | Key point-based multi-feature fusion action identification method, device, medium and equipment | |
CN104113632B (en) | A kind of information processing method and electronic equipment | |
CN115221888A (en) | Entity mention identification method, device, equipment and storage medium | |
WO2013175341A2 (en) | Method and apparatus for controlling multiple devices |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20180824 |