CN111580666A - Equipment control method, electronic equipment, equipment control system and storage medium - Google Patents
Equipment control method, electronic equipment, equipment control system and storage medium
- Publication number
- CN111580666A (application CN202010392640.7A)
- Authority
- CN
- China
- Prior art keywords
- data
- gesture
- angular velocity
- axis
- gravity
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Software Systems (AREA)
- Human Computer Interaction (AREA)
- Medical Informatics (AREA)
- Evolutionary Computation (AREA)
- Computing Systems (AREA)
- Data Mining & Analysis (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Mathematical Physics (AREA)
- Artificial Intelligence (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application discloses an equipment control method applied to an electronic device connected with an intelligent wearable device. The method comprises: acquiring angular velocity data and gravitational acceleration data of a wearing part collected by the intelligent wearable device; generating gesture feature data of the wearing part according to the angular velocity data and the gravitational acceleration data, the gesture feature data comprising triaxial acceleration data, triaxial angular velocity data, triaxial gravity data, a finger pitch angle and a data timestamp; and determining the user gesture corresponding to the gesture feature data and executing the target instruction corresponding to that user gesture. The electronic device can thus be controlled without visual participation. The application also discloses an electronic device, a device control system and a storage medium having the same beneficial effects.
Description
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to an apparatus control method, an electronic apparatus, an apparatus control system, and a storage medium.
Background
With the development of science and technology, electronic devices have become a necessity of daily life, and controlling an electronic device through its human-computer interaction interface has become the mainstream control mode. However, this control mode relies on the user's vision: users with visual impairment, or users who cannot conveniently look at the human-computer interface, are unable to control the device in this way.
Therefore, how to realize control of an electronic device without visual participation is a technical problem that needs to be solved by those skilled in the art at present.
Disclosure of Invention
An object of the present application is to provide an apparatus control method, an electronic apparatus, an apparatus control system, and a storage medium, which can realize control of an electronic apparatus without visual participation.
In order to solve the technical problem, the application provides an apparatus control method applied to an electronic apparatus connected to an intelligent wearable apparatus, the apparatus control method including:
acquiring angular velocity data and gravitational acceleration data of a wearing part acquired by the intelligent wearing equipment;
generating gesture feature data of the wearing part according to the angular velocity data and the gravity acceleration data; the gesture feature data comprise triaxial acceleration data, triaxial angular velocity data, triaxial gravity data, a finger pitch angle and a data timestamp;
and determining a user gesture corresponding to the gesture feature data, and executing a target instruction corresponding to the user gesture.
Optionally, generating gesture feature data of the wearing part according to the angular velocity data and the gravitational acceleration data includes:
determining data time stamps corresponding to the angular velocity data and the gravity acceleration data;
performing separation operation on the gravity acceleration data to obtain gravity data and acceleration data;
calculating the finger pitch angle according to the angular velocity data, the gravity data and the acceleration data;
determining an X-axis angular velocity, a Y-axis angular velocity and a Z-axis angular velocity according to the angular velocity data;
determining X-axis gravity, Y-axis gravity and Z-axis gravity according to the gravity data;
determining an X-axis acceleration, a Y-axis acceleration and a Z-axis acceleration according to the acceleration data;
generating gesture feature data comprising the X-axis angular velocity, the Y-axis angular velocity, the Z-axis angular velocity, the X-axis gravity, the Y-axis gravity, the Z-axis gravity, the X-axis acceleration, the Y-axis acceleration, the Z-axis acceleration, the data timestamp, and the pitch angle.
Optionally, determining the user gesture corresponding to the gesture feature data includes:
determining a user gesture corresponding to the gesture feature data by using a machine learning model;
or determining the user gesture corresponding to the gesture feature data by using the gesture data set.
Optionally, before determining the user gesture corresponding to the gesture feature data by using the machine learning model, the method further includes:
obtaining a training sample; the training sample comprises preset motion characteristic data corresponding to each preset gesture, wherein the preset motion characteristic data comprise preset X-axis angular velocity, preset Y-axis angular velocity, preset Z-axis angular velocity, preset X-axis gravity, preset Y-axis gravity, preset Z-axis gravity, preset X-axis acceleration, preset Y-axis acceleration, preset Z-axis acceleration, a preset finger pitch angle and a preset data timestamp;
and training an original model by using the training sample to obtain the machine learning model.
Optionally, the determining, by using the gesture data set, the user gesture corresponding to the gesture feature data includes:
determining a touch mode of the wearing part according to the finger pitch angle;
when the touch mode is fingertip touch, comparing the similarity of the gesture feature data with the fingertip motion feature data corresponding to each fingertip gesture in a fingertip gesture data set, and setting the fingertip gesture corresponding to the fingertip motion feature data with the highest similarity as the user gesture;
and when the touch mode is finger-belly touch, comparing the similarity of the gesture feature data with the finger-belly motion feature data corresponding to each finger-belly gesture in the finger-belly gesture data set, and setting the finger-belly gesture corresponding to the finger-belly motion feature data with the highest similarity as the user gesture.
Optionally, determining the user gesture corresponding to the gesture feature data includes:
executing a gesture restoration operation based on the gesture feature data, and determining the user gesture according to the restoration result; wherein the user gesture comprises a fingertip gesture or a finger belly gesture.
Optionally, before executing the target instruction, the method further includes:
judging whether the instruction recognition mode of the electronic equipment is a gesture recognition mode;
if yes, entering an operation step of executing the target instruction;
if not, judging whether the target instruction is a gesture recognition mode starting instruction or not; if the target instruction is a gesture recognition mode starting instruction, setting the instruction recognition mode of the electronic equipment to be the gesture recognition mode so as to execute the target instruction corresponding to the next user gesture; and if the target instruction is not the gesture recognition mode starting instruction, ending the process.
Optionally, the method further includes:
when a gesture recognition mode exit instruction is received, setting the instruction recognition mode of the electronic equipment to a non-gesture recognition mode; the gesture recognition mode exit instruction comprises a touch instruction or a key instruction.
The present application also provides an electronic device, including:
the data acquisition module is used for acquiring angular velocity data and gravitational acceleration data of the wearing part acquired by the intelligent wearing equipment; wherein the wearing part is a finger;
the data analysis module is used for generating gesture feature data of the wearing part according to the angular velocity data and the gravity acceleration data; the gesture feature data comprise triaxial acceleration data, triaxial angular velocity data, triaxial gravity data, a finger pitch angle and a data timestamp;
and the instruction execution module is used for determining the user gesture corresponding to the gesture characteristic data and executing a target instruction corresponding to the user gesture.
The present application also provides an apparatus control system, including:
the intelligent wearable device is used for acquiring angular velocity data and gravitational acceleration data of a wearing part and transmitting the angular velocity data and the gravitational acceleration data to the electronic device;
the electronic equipment is connected with the intelligent wearable equipment and is used for acquiring the angular velocity data and gravitational acceleration data of the wearing part collected by the intelligent wearable equipment; generating gesture feature data of the wearing part according to the angular velocity data and the gravitational acceleration data, the gesture feature data comprising triaxial acceleration data, triaxial angular velocity data, triaxial gravity data, a finger pitch angle and a data timestamp; and determining a user gesture corresponding to the gesture feature data and executing a target instruction corresponding to the user gesture.
The present application also provides a storage medium having stored thereon a computer program that, when executed, implements the steps performed by the above-described apparatus control method.
The application provides an equipment control method which is applied to electronic equipment connected with intelligent wearable equipment and comprises the steps of obtaining angular velocity data and gravitational acceleration data of a wearing part, wherein the angular velocity data and the gravitational acceleration data are collected by the intelligent wearable equipment; generating gesture feature data of the wearing part according to the angular velocity data and the gravity acceleration data; the gesture feature data comprise triaxial acceleration data, triaxial angular velocity data, triaxial gravity data, a finger pitch angle and a data timestamp; and determining a user gesture corresponding to the gesture feature data, and executing a target instruction corresponding to the user gesture.
In the method and device of the application, the angular velocity data and gravitational acceleration data of the wearing part are first acquired, and gesture feature data describing the motion of the wearing part are generated from them, so that different gestures correspond to different gesture feature data. After the gesture feature data are obtained, the target instruction corresponding to the user gesture can be determined and executed according to the correspondence between gesture feature data and user gestures. This process requires no visual participation by the user: the target instruction is determined solely by collecting and analysing the motion data of the part wearing the intelligent wearable device, so the electronic device is controlled without visual participation. The application also provides an electronic device, a device control system and a storage medium, which have the same beneficial effects and are not described again here.
Drawings
In order to more clearly illustrate the embodiments of the present application, the drawings needed for the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings can be obtained by those skilled in the art without inventive effort.
Fig. 1 is a flowchart of an apparatus control method according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a gesture provided in an embodiment of the present application;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Referring to fig. 1, fig. 1 is a flowchart of an apparatus control method according to an embodiment of the present disclosure.
The specific steps may include:
s101: acquiring angular velocity data and gravitational acceleration data of a wearing part acquired by the intelligent wearing equipment;
the execution main body of the embodiment can be an electronic device connected with the intelligent wearable device, the intelligent wearable device can be an intelligent ring, an intelligent finger stall or an intelligent glove, and the electronic device can be a mobile phone, a tablet personal computer or a personal computer. In this embodiment, the connection mode between the intelligent wearable device and the electronic device may be a wireless connection (e.g., a bluetooth connection or a network connection), or a wired connection (e.g., a data line connection). It can be understood that a sensor for detecting angular velocity and acceleration is arranged in the intelligent wearable device, for example, an Inertial Measurement Unit (IMU) may be arranged in the intelligent wearable device, the IMU may report acceleration and angular velocity of itself to the electronic device at a frequency of 100 to 4000 hz, and after data filtering, the direction of gravity may also be obtained. When the intelligent wearable device is an intelligent ring, the wearing part of the intelligent ring can be a first knuckle, a second knuckle or a third knuckle of a finger of a user. When a finger wearing the intelligent ring moves, the IMU sensor in the intelligent ring can acquire angular velocity data and gravitational acceleration data of the finger, the gravitational acceleration data are data describing actual acceleration of the intelligent wearable device, and the data transmission device in the intelligent ring can transmit the angular velocity data and the gravitational acceleration data acquired in real time to the electronic device.
S102: generating gesture feature data of the wearing part according to the angular velocity data and the gravity acceleration data;
the method comprises the following steps of establishing a step of separating the gravity acceleration data on the basis of receiving the angular velocity data and the gravity acceleration data, and finally obtaining triaxial acceleration data, triaxial angular velocity data, triaxial gravity data, a finger pitch angle and a data timestamp. The data time stamp is used for describing information of the acquisition time of the angular velocity data and the gravitational acceleration data.
S103: and determining a user gesture corresponding to the gesture feature data, and executing a target instruction corresponding to the user gesture.
In this embodiment, the correspondence between gesture feature data and each preset instruction in the instruction set may be stored in advance, and the target instruction corresponding to the gesture feature data may be determined once the gesture feature data has been determined, so that the electronic device executes the target instruction. Further, this embodiment may perform a gesture restoration operation based on the gesture feature data and determine the user gesture according to the restoration result, the user gesture comprising a fingertip gesture or a finger belly gesture. A gesture restoration operation means reconstructing, from the gesture feature data, the gesture action to which that data corresponds.
As a feasible implementation, after the gesture feature data are obtained, the target gesture action corresponding to the gesture feature data can be determined according to the correspondence between gesture feature data and gesture actions, and the target instruction corresponding to that gesture action can then be determined. Please refer to Table 1, which is the correspondence table between instructions and gesture actions. Fig. 2 is the gesture diagram provided in this embodiment of the application; each gesture illustration in Fig. 2 corresponds to a figure code in Table 1.
TABLE 1 Instruction and gesture correspondence table
Instruction | Gesture action | Figure code |
---|---|---|
Previous focus | Finger belly swipe left | a1 |
Next focus | Finger belly swipe right | a2 |
Select current focus | Finger belly double tap | a3 |
Back | Finger belly rub | a4 |
Open hidden menu | Finger belly quadruple tap | a5 |
Select first focus | Fingertip rub left-right | a6 |
Select last focus | Fingertip rub right-left | a7 |
Page up | Swipe up | a8 |
Page down | Swipe down | a9 |
Voice input | Fingertip triple tap | a10 |
Page right | Fingertip swipe right | a11 |
Page left | Fingertip swipe left | a12 |
Read aloud from current focus | Fingertip double tap | a13 |
Shortcut operation | Fingertip quadruple tap | a14 |
Enter/exit ring mode | Finger triple tap | a15 |
When the user performs any gesture in Table 1 or Fig. 2, the intelligent wearable device transmits the angular velocity data and gravitational acceleration data detected in real time to the electronic device; the electronic device generates gesture feature data from these data, determines the user's current gesture action according to the correspondence between gesture feature data and gesture actions, and then determines the corresponding target instruction in combination with Table 1. Further, the gesture actions recognizable from the gesture feature data may also include: four taps on the finger belly, finger belly rub right-left, four taps on the fingertip, fingertip rub left-right, fingertip rub right-left, rub up-down, rub down-up, swipe up then left, swipe down then left, swipe left then up, swipe up then right, swipe down then right, swipe right then down, long press, finger belly tap + swipe left, finger belly tap + swipe right, fingertip tap + swipe left, finger belly tap + fingertip tap, finger belly tap + swipe up, fingertip tap + swipe right, fingertip tap + finger belly tap, and finger belly tap + swipe down.
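For illustration, the correspondence in Table 1 could be held as a simple lookup table on the electronic device; the gesture label strings and instruction names below are our own naming, not terms from the patent.

```python
from typing import Optional

# Table 1 as a lookup table; keys are the recognizer's output labels (our naming).
GESTURE_TO_INSTRUCTION = {
    "belly_swipe_left":   "previous_focus",          # a1
    "belly_swipe_right":  "next_focus",              # a2
    "belly_double_tap":   "select_current_focus",    # a3
    "belly_rub":          "back",                    # a4
    "belly_quad_tap":     "open_hidden_menu",        # a5
    "tip_rub_left_right": "select_first_focus",      # a6
    "tip_rub_right_left": "select_last_focus",       # a7
    "swipe_up":           "page_up",                 # a8
    "swipe_down":         "page_down",               # a9
    "tip_triple_tap":     "voice_input",             # a10
    "tip_swipe_right":    "page_right",              # a11
    "tip_swipe_left":     "page_left",               # a12
    "tip_double_tap":     "read_from_current_focus", # a13
    "tip_quad_tap":       "shortcut_operation",      # a14
    "triple_tap":         "toggle_ring_mode",        # a15
}

def target_instruction(user_gesture: str) -> Optional[str]:
    """Return the target instruction for a recognized user gesture,
    or None if the gesture is not mapped."""
    return GESTURE_TO_INSTRUCTION.get(user_gesture)
```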
In this embodiment, the angular velocity data and gravitational acceleration data of the wearing part are first acquired, and gesture feature data describing the motion of the wearing part are generated from them, so that different gestures correspond to different gesture feature data. After the gesture feature data are obtained, the target instruction corresponding to the user gesture can be determined and executed according to the correspondence between gesture feature data and user gestures. This process requires no visual participation by the user: the target instruction is determined solely by collecting and analysing the motion data of the part wearing the intelligent wearable device, so the electronic device is controlled without visual participation. When the electronic device of this embodiment is a mobile phone, the scheme allows the phone to be operated anytime and anywhere without taking it out, which also facilitates one-handed interaction. Because the gestures are small and private, the risk of shoulder surfing when a visually impaired user operates the phone is effectively reduced, the user's privacy is protected, and the social awkwardness caused by special phone-operating postures is avoided. When the intelligent wearable device uses an IMU sensor, the user's demand for a low price can be met while leaving enough room for the appearance design of the wearable device.
As a further description of the corresponding embodiment of fig. 1, the process of generating the gesture feature data of the wearing part in S102 in this embodiment may include the following operations: determining data time stamps corresponding to the angular velocity data and the gravity acceleration data; performing separation operation on the gravity acceleration data to obtain gravity data and acceleration data; calculating the finger pitch angle according to the angular velocity data, the gravity data and the acceleration data; determining an X-axis angular velocity, a Y-axis angular velocity and a Z-axis angular velocity according to the angular velocity data; determining X-axis gravity, Y-axis gravity and Z-axis gravity according to the gravity data; determining an X-axis acceleration, a Y-axis acceleration and a Z-axis acceleration according to the acceleration data; generating gesture feature data comprising the X-axis angular velocity, the Y-axis angular velocity, the Z-axis angular velocity, the X-axis gravity, the Y-axis gravity, the Z-axis gravity, the X-axis acceleration, the Y-axis acceleration, the Z-axis acceleration, the data timestamp, and the pitch angle.
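As an illustration only, the eleven values listed above could be packed into one feature record per sample; the field names below are ours, and the gravity and pitch helpers are the assumed ones sketched earlier.

```python
def build_gesture_features(timestamp, gyro, gravity, linear_accel, pitch_deg):
    """Assemble the 11-dimensional gesture feature data described above
    (X/Y/Z angular velocity, X/Y/Z gravity, X/Y/Z acceleration,
    data timestamp and finger pitch angle)."""
    gx, gy, gz = gyro          # X-, Y-, Z-axis angular velocity
    wx, wy, wz = gravity       # X-, Y-, Z-axis gravity
    ax, ay, az = linear_accel  # X-, Y-, Z-axis acceleration (gravity removed)
    return {
        "gyro_x": gx, "gyro_y": gy, "gyro_z": gz,
        "grav_x": wx, "grav_y": wy, "grav_z": wz,
        "acc_x": ax, "acc_y": ay, "acc_z": az,
        "timestamp": timestamp,
        "pitch": pitch_deg,
    }
```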
As a further introduction to the corresponding embodiment of fig. 1, the determination of the user gesture corresponding to the gesture feature data in S103 may be implemented in the following two ways:
the first method is as follows: determining a user gesture corresponding to the gesture feature data by using a machine learning model;
the second method comprises the following steps: and determining the user gesture corresponding to the gesture feature data by using the gesture data set.
For the first mode, before the machine learning model is used to determine the user gesture corresponding to the gesture feature data, there may also be an operation of training the machine learning model, specifically as follows: obtaining a training sample; and training an original model by using the training sample to obtain the machine learning model. The training sample comprises preset motion characteristic data corresponding to each preset gesture, and the preset motion characteristic data comprise preset X-axis angular velocity, preset Y-axis angular velocity, preset Z-axis angular velocity, preset X-axis gravity, preset Y-axis gravity, preset Z-axis gravity, preset X-axis acceleration, preset Y-axis acceleration, preset Z-axis acceleration, preset finger pitch angle and preset data timestamp.
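The patent does not name a model family, so the sketch below trains an ordinary scikit-learn classifier on fixed-length windows of the 11-dimensional preset motion feature data; the windowing, the choice of a random forest and all names are assumptions for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def train_gesture_model(windows: np.ndarray, labels: np.ndarray):
    """Train the machine learning model.

    windows: array of shape (num_samples, window_len, 11) holding the preset
             motion feature data for each preset gesture sample;
    labels:  array of shape (num_samples,) holding the preset gesture ids.
    """
    x = windows.reshape(len(windows), -1)   # flatten each window to one row
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(x, labels)
    return model

def predict_gesture(model, window: np.ndarray) -> int:
    """Classify one window of gesture feature data into a gesture id."""
    return int(model.predict(window.reshape(1, -1))[0])
```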
For the second mode, a plurality of preset gestures and the reference gesture feature data corresponding to each preset gesture may be stored in the gesture data set in advance; the gesture feature data is compared for similarity with the reference gesture feature data, and the preset gesture corresponding to the reference gesture feature data with the highest similarity is taken as the user gesture. The reference gesture feature data comprise eleven dimensions: X-axis angular velocity, Y-axis angular velocity, Z-axis angular velocity, X-axis gravity, Y-axis gravity, Z-axis gravity, X-axis acceleration, Y-axis acceleration, Z-axis acceleration, finger pitch angle and data timestamp.
Further, the process of determining the user gesture corresponding to the gesture feature data by using the gesture data set may include:
determining a touch mode of the wearing part according to the finger pitch angle;
when the touch mode is fingertip touch, comparing the similarity of the gesture feature data with the fingertip motion feature data corresponding to each fingertip gesture in the fingertip gesture data set, and setting the fingertip gesture corresponding to the fingertip motion feature data with the highest similarity as the user gesture;
and when the touch mode is finger-belly touch, comparing the similarity of the gesture feature data with the finger-belly motion feature data corresponding to each finger-belly gesture in the finger-belly gesture data set, and setting the finger-belly gesture corresponding to the finger-belly motion feature data with the highest similarity as the user gesture.
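A minimal sketch of this matching follows. The patent specifies neither the pitch-angle threshold that distinguishes fingertip touch from finger-belly touch nor the similarity measure, so the 40-degree threshold, the cosine similarity and the crude length alignment are all assumptions.

```python
import numpy as np

FINGERTIP_PITCH_DEG = 40.0   # assumed threshold separating fingertip / finger-belly touch

def match_gesture(features: np.ndarray, pitch_deg: float,
                  fingertip_set: dict, belly_set: dict) -> str:
    """Pick the reference gesture whose motion feature data is most similar
    to the observed gesture feature data.

    fingertip_set / belly_set map gesture names to reference feature arrays.
    """
    dataset = fingertip_set if pitch_deg >= FINGERTIP_PITCH_DEG else belly_set
    query = features.ravel()

    def cosine(a: np.ndarray, b: np.ndarray) -> float:
        a, b = a.ravel(), b.ravel()
        n = min(len(a), len(b))              # crude alignment of unequal lengths
        a, b = a[:n], b[:n]
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

    return max(dataset, key=lambda name: cosine(query, dataset[name]))
```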
As a further introduction to the embodiment corresponding to fig. 1, before the target instruction is executed it may also be judged whether the instruction recognition mode of the electronic device is the gesture recognition mode. If it is, the operation step of executing the target instruction in S103 is entered; if it is not, it is judged whether the target instruction is a gesture recognition mode start instruction. If the target instruction is a gesture recognition mode start instruction, the instruction recognition mode of the electronic device is set to the gesture recognition mode, so that the target instruction corresponding to the next user gesture is executed; if the target instruction is not a gesture recognition mode start instruction, the process ends.
When the electronic device is in the gesture recognition mode, it receives the angular velocity data and gravitational acceleration data sent by the intelligent wearable device, and determines and executes the corresponding target instruction. When the electronic device is in a non-gesture recognition mode, it may still receive the angular velocity data and gravitational acceleration data and determine the corresponding target instruction, but it only acts on that instruction if it is the gesture recognition mode start instruction, in which case the instruction recognition mode is set to the gesture recognition mode; any other target instruction is not executed. Further, when a gesture recognition mode exit instruction is received, the electronic device may set its instruction recognition mode to a non-gesture recognition mode. Specifically, the gesture recognition mode exit instruction comprises a touch instruction or a key instruction: the key instruction is generated by clicking a button on the ring, and the touch instruction is generated by touching a touch panel on the intelligent wearable device.
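The mode logic described above can be summarised as a small state machine; the sketch below is illustrative only, and the instruction name used for entering gesture recognition mode is hypothetical.

```python
class GestureModeController:
    """Sketch of the instruction-recognition-mode handling described above."""

    ENTER_GESTURE_MODE = "toggle_ring_mode"   # hypothetical start instruction

    def __init__(self):
        self.gesture_mode = False             # non-gesture recognition mode by default

    def handle_instruction(self, instruction: str, execute) -> None:
        """Execute the target instruction only while gesture recognition mode
        is on; otherwise only the start instruction is honoured."""
        if self.gesture_mode:
            execute(instruction)
        elif instruction == self.ENTER_GESTURE_MODE:
            self.gesture_mode = True          # the next user gesture will be executed
        # any other instruction is ignored while the mode is off

    def handle_exit_instruction(self) -> None:
        """Handle a touch or key instruction that exits gesture recognition mode."""
        self.gesture_mode = False
```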
Referring to fig. 3, fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
the apparatus may include:
the data acquisition module 100 is configured to acquire angular velocity data and gravitational acceleration data of a wearing part acquired by the intelligent wearable device; wherein the wearing part is a finger;
the data analysis module 200 is configured to generate gesture feature data of the wearing part according to the angular velocity data and the gravitational acceleration data; the gesture feature data comprise triaxial acceleration data, triaxial angular velocity data, triaxial gravity data, a finger pitch angle and a data timestamp;
the instruction executing module 300 is configured to determine a user gesture corresponding to the gesture feature data, and execute a target instruction corresponding to the user gesture.
In this embodiment, the angular velocity data and gravitational acceleration data of the wearing part are first acquired, and gesture feature data describing the motion of the wearing part are generated from them, so that different gestures correspond to different gesture feature data. After the gesture feature data are obtained, the target instruction corresponding to the user gesture can be determined and executed according to the correspondence between gesture feature data and user gestures. This process requires no visual participation by the user: the target instruction is determined solely by collecting and analysing the motion data of the part wearing the intelligent wearable device, so the electronic device is controlled without visual participation.
Further, the data parsing module 200 includes:
the time determining unit is used for determining data time stamps corresponding to the angular velocity data and the gravity acceleration data;
the gravity acceleration separation unit is used for performing separation operation on the gravity acceleration data to obtain gravity data and acceleration data;
the pitch angle calculation unit is used for calculating the finger pitch angle according to the angular velocity data, the gravity data and the acceleration data;
the triaxial angular velocity determining unit is used for determining an X-axis angular velocity, a Y-axis angular velocity and a Z-axis angular velocity according to the angular velocity data;
the three-axis gravity determining unit is used for determining X-axis gravity, Y-axis gravity and Z-axis gravity according to the gravity data;
the three-axis acceleration determining unit is used for determining X-axis acceleration, Y-axis acceleration and Z-axis acceleration according to the acceleration data;
a result generation unit for generating gesture feature data including the X-axis angular velocity, the Y-axis angular velocity, the Z-axis angular velocity, the X-axis gravity, the Y-axis gravity, the Z-axis gravity, the X-axis acceleration, the Y-axis acceleration, the Z-axis acceleration, the data timestamp, and the pitch angle.
Further, the instruction execution module 300 includes:
the first instruction determining unit is used for determining a user gesture corresponding to the gesture feature data by using a machine learning model;
or the second instruction determining unit is used for determining the user gesture corresponding to the gesture feature data by utilizing the gesture data set.
Further, the electronic device may also include:
the model training unit is used for acquiring a training sample before the user gesture corresponding to the gesture feature data is determined by using a machine learning model; the training sample comprises preset motion characteristic data corresponding to each preset gesture, wherein the preset motion characteristic data comprise preset X-axis angular velocity, preset Y-axis angular velocity, preset Z-axis angular velocity, preset X-axis gravity, preset Y-axis gravity, preset Z-axis gravity, preset X-axis acceleration, preset Y-axis acceleration, preset Z-axis acceleration, a preset finger pitch angle and a preset data timestamp; and the training sample is used for training an original model to obtain the machine learning model.
Further, the second instruction determining unit is configured to determine the touch mode of the wearing part according to the finger pitch angle; when the touch mode is fingertip touch, compare the similarity of the gesture feature data with the fingertip motion feature data corresponding to each fingertip gesture in the fingertip gesture data set, and set the fingertip gesture corresponding to the fingertip motion feature data with the highest similarity as the user gesture; and when the touch mode is finger belly touch, compare the similarity of the gesture feature data with the finger belly motion feature data corresponding to each finger belly gesture in the finger belly gesture data set, and set the finger belly gesture corresponding to the finger belly motion feature data with the highest similarity as the user gesture.
Further, the instruction execution module 300 includes:
the gesture restoration unit is used for executing a gesture restoration operation based on the gesture feature data and determining the user gesture according to the restoration result; wherein the user gesture comprises a fingertip gesture or a finger belly gesture.
Further, the electronic device may also include:
the mode judging module is used for judging whether the instruction identification mode of the electronic equipment is a gesture identification mode before the target instruction is executed; if yes, entering an operation step of executing the target instruction in the instruction execution module 300; if not, judging whether the target instruction is a gesture recognition mode starting instruction or not; if the target instruction is a gesture recognition mode starting instruction, setting the instruction recognition mode of the electronic equipment to be the gesture recognition mode so as to execute the target instruction corresponding to the next user gesture; and if the target instruction is not the gesture recognition mode starting instruction, ending the process.
Further, the electronic device may also include:
the mode switching module is used for setting the instruction recognition mode of the electronic device to a non-gesture recognition mode when a gesture recognition mode exit instruction is received; the gesture recognition mode exit instruction comprises a touch instruction or a key instruction.
Since the embodiment of the electronic device portion and the embodiment of the method portion correspond to each other, please refer to the description of the embodiment of the method portion for the embodiment of the electronic device portion, which is not repeated here.
The present application also provides a storage medium having a computer program stored thereon, which when executed, may implement the steps provided by the above-described embodiments. The storage medium may include: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The present application also provides an apparatus control system, including:
the intelligent wearable device is used for acquiring angular velocity data and gravitational acceleration data of a wearing part and transmitting the angular velocity data and the gravitational acceleration data to the electronic device;
the electronic equipment is connected with the intelligent wearable equipment and is used for acquiring the angular velocity data and gravitational acceleration data of the wearing part collected by the intelligent wearable equipment; generating gesture feature data of the wearing part according to the angular velocity data and the gravitational acceleration data, the gesture feature data comprising triaxial acceleration data, triaxial angular velocity data, triaxial gravity data, a finger pitch angle and a data timestamp; and determining a user gesture corresponding to the gesture feature data and executing a target instruction corresponding to the user gesture.
The embodiments are described in a progressive manner in the specification, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. The device disclosed by the embodiment corresponds to the method disclosed by the embodiment, so that the description is simple, and the relevant points can be referred to the method part for description. It should be noted that, for those skilled in the art, it is possible to make several improvements and modifications to the present application without departing from the principle of the present application, and such improvements and modifications also fall within the scope of the claims of the present application.
It is further noted that, in the present specification, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
Claims (10)
1. A device control method, applied to an electronic device connected with an intelligent wearable device, the method comprising:
acquiring angular velocity data and gravitational acceleration data of a wearing part acquired by the intelligent wearing equipment;
generating gesture feature data of the wearing part according to the angular velocity data and the gravity acceleration data; the gesture feature data comprise triaxial acceleration data, triaxial angular velocity data, triaxial gravity data, a finger pitch angle and a data timestamp;
and determining a user gesture corresponding to the gesture feature data, and executing a target instruction corresponding to the user gesture.
2. The device control method according to claim 1, wherein generating gesture feature data of the wearing part from the angular velocity data and the gravitational acceleration data includes:
determining data time stamps corresponding to the angular velocity data and the gravity acceleration data;
performing separation operation on the gravity acceleration data to obtain gravity data and acceleration data;
calculating the finger pitch angle according to the angular velocity data, the gravity data and the acceleration data;
determining an X-axis angular velocity, a Y-axis angular velocity and a Z-axis angular velocity according to the angular velocity data;
determining X-axis gravity, Y-axis gravity and Z-axis gravity according to the gravity data;
determining an X-axis acceleration, a Y-axis acceleration and a Z-axis acceleration according to the acceleration data;
generating gesture feature data comprising the X-axis angular velocity, the Y-axis angular velocity, the Z-axis angular velocity, the X-axis gravity, the Y-axis gravity, the Z-axis gravity, the X-axis acceleration, the Y-axis acceleration, the Z-axis acceleration, the data timestamp, and the pitch angle.
3. The device control method of claim 1, wherein determining the user gesture corresponding to the gesture feature data comprises:
determining a user gesture corresponding to the gesture feature data by using a machine learning model;
or determining the user gesture corresponding to the gesture feature data by using the gesture data set.
4. The device control method according to claim 3, before determining the user gesture corresponding to the gesture feature data by using a machine learning model, further comprising:
obtaining a training sample; the training sample comprises preset motion characteristic data corresponding to each preset gesture, wherein the preset motion characteristic data comprise preset X-axis angular velocity, preset Y-axis angular velocity, preset Z-axis angular velocity, preset X-axis gravity, preset Y-axis gravity, preset Z-axis gravity, preset X-axis acceleration, preset Y-axis acceleration, preset Z-axis acceleration, a preset finger pitch angle and a preset data timestamp;
and training an original model by using the training sample to obtain the machine learning model.
5. The device control method according to claim 3, wherein the determining, by using the gesture data set, the user gesture corresponding to the gesture feature data comprises:
determining a touch mode of the wearing part according to the finger pitch angle;
when the touch mode is fingertip touch, comparing the similarity of the gesture characteristic data with fingertip movement characteristic data corresponding to fingertip gestures in a fingertip gesture data set, and setting the fingertip gesture corresponding to the fingertip movement characteristic data with the highest similarity as the user gesture;
and when the touch mode is finger-belly touch, comparing the similarity of the gesture characteristic data with finger-belly motion characteristic data corresponding to finger-belly gestures in the finger-belly gesture data set, and setting the finger-belly gesture corresponding to the finger-belly motion characteristic data with the highest similarity as the user gesture.
6. The device control method of claim 1, wherein determining the user gesture corresponding to the gesture feature data comprises:
executing a gesture restoration operation based on the gesture feature data, and determining the user gesture according to the restoration result; wherein the user gesture comprises a fingertip gesture or a finger belly gesture.
7. The apparatus control method according to any one of claims 1 to 6, further comprising, before executing the target instruction:
judging whether the instruction recognition mode of the electronic equipment is a gesture recognition mode;
if yes, entering an operation step of executing the target instruction;
if not, judging whether the target instruction is a gesture recognition mode starting instruction or not; if the target instruction is a gesture recognition mode starting instruction, setting the instruction recognition mode of the electronic equipment to be the gesture recognition mode so as to execute the target instruction corresponding to the next user gesture; if the target instruction is not a gesture recognition mode starting instruction, ending the process;
when a gesture recognition mode exit instruction is received, setting an instruction recognition mode of the electronic equipment to be a non-gesture recognition mode; the gesture recognition mode exit instruction comprises a touch instruction or a key instruction.
8. An electronic device, comprising:
the data acquisition module is used for acquiring angular velocity data and gravitational acceleration data of the wearing part acquired by the intelligent wearing equipment; wherein the wearing part is a finger;
the data analysis module is used for generating gesture feature data of the wearing part according to the angular velocity data and the gravity acceleration data; the gesture feature data comprise triaxial acceleration data, triaxial angular velocity data, triaxial gravity data, a finger pitch angle and a data timestamp;
and the instruction execution module is used for determining the user gesture corresponding to the gesture characteristic data and executing a target instruction corresponding to the user gesture.
9. An appliance control system, comprising:
the intelligent wearable device is used for acquiring angular velocity data and gravitational acceleration data of a wearing part and transmitting the angular velocity data and the gravitational acceleration data to the electronic device;
the electronic equipment is connected with the intelligent wearable equipment and is used for acquiring the angular velocity data and gravitational acceleration data of the wearing part collected by the intelligent wearable equipment; generating gesture feature data of the wearing part according to the angular velocity data and the gravitational acceleration data, the gesture feature data comprising triaxial acceleration data, triaxial angular velocity data, triaxial gravity data, a finger pitch angle and a data timestamp; and determining a user gesture corresponding to the gesture feature data and executing a target instruction corresponding to the user gesture.
10. A storage medium having stored thereon computer-executable instructions which, when loaded and executed by a processor, carry out the steps of a device control method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010392640.7A CN111580666B (en) | 2020-05-11 | 2020-05-11 | Equipment control method, electronic equipment, equipment control system and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010392640.7A CN111580666B (en) | 2020-05-11 | 2020-05-11 | Equipment control method, electronic equipment, equipment control system and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111580666A true CN111580666A (en) | 2020-08-25 |
CN111580666B CN111580666B (en) | 2022-04-29 |
Family
ID=72112638
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010392640.7A Active CN111580666B (en) | 2020-05-11 | 2020-05-11 | Equipment control method, electronic equipment, equipment control system and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111580666B (en) |
Patent Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103809912A (en) * | 2014-03-03 | 2014-05-21 | 欧浦登(福建)光学有限公司 | Tablet personal computer based on multi-touch screen |
CN104185050A (en) * | 2014-07-30 | 2014-12-03 | 哈尔滨工业大学深圳研究生院 | OTT television based intelligent remote control system and control method thereof |
CN104793747A (en) * | 2015-04-24 | 2015-07-22 | 百度在线网络技术(北京)有限公司 | Method, device and system for inputting through wearable device |
CN106468945A (en) * | 2015-08-20 | 2017-03-01 | 上海汽车集团股份有限公司 | Wearable device and its control method |
CN105929940A (en) * | 2016-04-13 | 2016-09-07 | 哈尔滨工业大学深圳研究生院 | Rapid three-dimensional dynamic gesture recognition method and system based on character value subdivision method |
CN106056085A (en) * | 2016-05-31 | 2016-10-26 | 广东美的制冷设备有限公司 | Gesture recognition method, gesture recognition device and equipment |
CN107457762A (en) * | 2016-06-02 | 2017-12-12 | 巨擘科技股份有限公司 | Robot arm control device, robot arm system including the same, and robot arm control method |
CN106468951A (en) * | 2016-08-29 | 2017-03-01 | 华东师范大学 | A kind of intelligent remote control systems based on the fusion of both hands ring sensor and its method |
CN106445130A (en) * | 2016-09-19 | 2017-02-22 | 武汉元生创新科技有限公司 | Motion capture glove for gesture recognition and calibration method thereof |
CN106527466A (en) * | 2016-12-15 | 2017-03-22 | 鹰眼电子科技有限公司 | Wearing type unmanned aerial vehicle control system |
CN107036596A (en) * | 2017-04-12 | 2017-08-11 | 无锡研测技术有限公司 | Industrial bracelet based on MEMS inertial sensor module |
CN107485864A (en) * | 2017-08-17 | 2017-12-19 | 刘嘉成 | A kind of wearable gesture remote control method and its wearable gesture remote control device |
CN110348275A (en) * | 2018-04-08 | 2019-10-18 | 中兴通讯股份有限公司 | Gesture identification method, device, smart machine and computer readable storage medium |
CN109085885A (en) * | 2018-08-14 | 2018-12-25 | 李兴伟 | Intelligent ring |
CN109993073A (en) * | 2019-03-14 | 2019-07-09 | 北京工业大学 | A kind of complicated dynamic gesture identification method based on Leap Motion |
CN110308795A (en) * | 2019-07-05 | 2019-10-08 | 济南大学 | A kind of dynamic gesture identification method and system |
Non-Patent Citations (1)
Title |
---|
Ye Zhichao: "Kinect-based Mid-air Handwriting System", China Master's Theses Full-text Database, Information Science and Technology Series *
Also Published As
Publication number | Publication date |
---|---|
CN111580666B (en) | 2022-04-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Zhang et al. | TapSkin: Recognizing on-skin input for smartwatches | |
US9965044B2 (en) | Information processing method, apparatus, and device | |
Kim et al. | A new wearable input device: SCURRY | |
US20150084859A1 (en) | System and Method for Recognition and Response to Gesture Based Input | |
CN106569613A (en) | Multi-modal man-machine interaction system and control method thereof | |
CN104463152B (en) | A kind of gesture identification method, system, terminal device and Wearable | |
CN105159539A (en) | Touch control response method of wearable equipment, touch control response device of wearable equipment and wearable equipment | |
US20240077948A1 (en) | Gesture-based display interface control method and apparatus, device and storage medium | |
WO2011092549A1 (en) | Method and apparatus for assigning a feature class value | |
Yin et al. | A high-performance training-free approach for hand gesture recognition with accelerometer | |
Siddhpuria et al. | Exploring at-your-side gestural interaction for ubiquitous environments | |
Bakhtiyari et al. | Fuzzy model of dominance emotions in affective computing | |
CN109634439B (en) | Intelligent text input method | |
Li et al. | Hand gesture recognition and real-time game control based on a wearable band with 6-axis sensors | |
CN108829239A (en) | Control method, device and the terminal of terminal | |
Farooq et al. | A comparison of hardware based approaches for sign language gesture recognition systems | |
CN111145891A (en) | Information processing method and device and electronic equipment | |
CN111580666B (en) | Equipment control method, electronic equipment, equipment control system and storage medium | |
TW202004433A (en) | Control instruction input method and apparatus | |
CN106547339B (en) | Control method and device of computer equipment | |
Ellavarason et al. | A framework for assessing factors influencing user interaction for touch-based biometrics | |
Bakhtiyari et al. | Implementation of emotional-aware computer systems using typical input devices | |
Huang et al. | SpeciFingers: Finger Identification and Error Correction on Capacitive Touchscreens | |
Khandagale et al. | Jarvis-AI Based Virtual Mouse | |
Guerreiro et al. | Mnemonical body shortcuts: improving mobile interaction |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |