
CN112426709A - Forearm movement posture recognition method, interface interaction control method and device - Google Patents

Forearm movement posture recognition method, interface interaction control method and device Download PDF

Info

Publication number
CN112426709A
CN112426709A
Authority
CN
China
Prior art keywords
forearm
detector
interface interaction
axis acceleration
axis
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011331040.6A
Other languages
Chinese (zh)
Other versions
CN112426709B (en)
Inventor
Jiang Chao (蒋超)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Jinling Era Technology Co.,Ltd.
Original Assignee
Shenzhen Jinling Mutual Entertainment Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Jinling Mutual Entertainment Technology Co., Ltd.
Priority to CN202011331040.6A
Publication of CN112426709A
Application granted
Publication of CN112426709B
Legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20: Input arrangements for video game devices
    • A63F13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/212: Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1012: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals involving biosensors worn by the player, e.g. for measuring heart beat, limb activity

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Cardiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a forearm movement posture recognition method, an interface interaction control method and an interface interaction control device. The recognition method uses detectors worn on the left and right wrists to judge the motion posture of the forearms automatically from the ratio of the X-axis acceleration to the Z-axis acceleration, the magnitude of the acceleration, and the angular velocity, and can automatically recognize several forearm motion postures, such as the forearms lifted and crossed over the chest, the forearms lifted in parallel, a left swing and a right swing. The interface interaction method and device use the posture parameters obtained from posture recognition together with interface interaction control rules: when a posture parameter satisfies the condition of an interface interaction control rule, the corresponding interface interaction operation is executed, thereby improving the sense of immersion in game or software control.

Description

Forearm movement posture recognition method, interface interaction control method and device
Technical Field
The invention relates to the technical field of wearable equipment, in particular to a forearm movement posture recognition method and device, an electronic equipment interface interaction control method and device, and an electronic device adopting the control method.
Background
In traditional software and games, interface interaction generally relies on conventional input modes such as a mouse, a keyboard and a touch screen. However, when a user wears a somatosensory (motion-sensing) input device in a somatosensory game, falling back on a conventional input mode splits the experience between somatosensory and conventional input: the sense of immersion in the game is weakened and the advantage of the somatosensory input device is not exploited.
Disclosure of Invention
The invention aims to provide a forearm movement posture recognition method, an interface interaction control method and corresponding devices, which at least alleviate the defects in the related art to a certain extent.
In order to achieve the purpose, the technical scheme adopted by the invention is as follows:
a forearm motion gesture recognition method, the forearm motion gesture including a first motion gesture in which two forearms lift upward to intersect the chest at 45 degrees, the recognition method comprising the steps of: acquiring X-axis acceleration and Z-axis acceleration detected by a detector of the left wrist; acquiring X-axis acceleration and Z-axis acceleration detected by a detector of the right wrist; performing division operation on the X-axis acceleration and the Z-axis acceleration detected by the detector of the left wrist to obtain a first ratio; performing division operation on the X-axis acceleration and the Z-axis acceleration detected by the detector of the right wrist to obtain a second ratio; judging whether the first ratio and the second ratio fall into a first preset interval with 1 as the center; if yes, outputting that the forearm is in the first motion posture; wherein, the direction of the X axis is the same as the length direction of the front arm, and the direction of the Z axis is vertical to the top surface of the detector.
Preferably, the forearm movement posture further includes a second motion posture in which the two forearms are lifted in parallel, and the recognition method further includes the steps of: judging whether the X-axis acceleration detected by the detector of the left wrist and the X-axis acceleration detected by the detector of the right wrist both fall into a second preset interval with 9.8 as the center, and if so, outputting that the forearm is in the second motion posture.
Preferably, the forearm movement postures further comprise a third movement posture of a single-arm right swing and a fourth movement posture of a single-arm left swing, and the recognition method further comprises the following steps: acquiring the Y-axis angular velocity detected by a detector of the left wrist or the right wrist; saving a plurality of Y-axis angular velocities acquired within a set time duration according to a time sequence to form a Y-axis angular velocity list; judging whether the position median in the Y-axis angular velocity list is the maximum value or the minimum value; if the position median is the maximum value, judging whether the position median is larger than a preset right swing threshold value; if the position median is larger than a preset right swing threshold value, outputting that the forearm is in the third motion posture; if the position median is the minimum value, judging whether the position median is smaller than a preset left swing threshold value; and if the position median is smaller than a preset left swing threshold value, outputting that the forearm is in the fourth motion posture.
A forearm motion gesture recognition device, comprising: the acceleration ratio acquisition module is used for performing division operation on the X-axis acceleration and the Z-axis acceleration detected by the detector of the left wrist to acquire a first ratio; performing division operation on the X-axis acceleration and the Z-axis acceleration detected by the detector of the right wrist to obtain a second ratio; the ratio judging module is used for judging whether the first ratio and the second ratio fall into a first preset interval with 1 as the center; and the result output module is used for outputting that the forearm is in the first motion posture when the first ratio and the second ratio both fall into a first preset interval taking 1 as the center.
Preferably, the forearm movement posture recognition device further includes: the acceleration acquisition module is used for acquiring the X-axis acceleration detected by the detector of the left wrist and the X-axis acceleration detected by the detector of the right wrist; the acceleration judging module is used for judging whether the X-axis acceleration detected by the detector of the left wrist and the X-axis acceleration detected by the detector of the right wrist both fall into a second preset interval with 9.8 as the center; and the result output module is also used for outputting that the forearm is in a second motion posture when the X-axis acceleration detected by the detector of the left wrist and the X-axis acceleration detected by the detector of the right wrist both fall into a second preset interval taking 9.8 as the center.
Preferably, the forearm movement posture recognition device further includes: the angular velocity acquisition module is used for acquiring the Y-axis angular velocity detected by the detector of the left wrist or the right wrist; the angular velocity judging module is used for storing a plurality of Y-axis angular velocities acquired within a set time length according to a time sequence to form a Y-axis angular velocity list; judging whether the position median in the Y-axis angular velocity list is the maximum value or the minimum value; if the position median is the maximum value, judging whether the position median is larger than a preset right swing threshold value; if the position median is the minimum value, judging whether the position median is smaller than a preset left swing threshold value; the result output module is further used for outputting that the forearm is in a third motion posture when the position median is larger than a preset right swing threshold value, and outputting that the forearm is in a fourth motion posture when the position median is smaller than a preset left swing threshold value.
Preferably, the detector is a smart watch or a bracelet, and an IMU chip is arranged in the smart watch or the bracelet.
A control method for interface interaction of electronic equipment comprises the following steps: acquiring posture parameters output by a forearm movement posture recognition device; acquiring an interface interaction control rule; judging whether the posture parameters meet the conditions of the interface interaction control rule; and if so, executing the interface interaction operation corresponding to the interface interaction control rule. The forearm movement posture recognition device is any one of the forearm movement posture recognition devices described above.
A control apparatus for interface interaction of electronic equipment comprises: a posture parameter acquisition module used for acquiring the posture parameters output by a forearm movement posture recognition device; an interface interaction control rule acquisition module used for acquiring an interface interaction control rule of the electronic equipment; a judgment module used for judging whether the posture parameters meet the conditions of the interface interaction control rule; and a control execution module used for executing the interface interaction operation corresponding to the interface interaction control rule when the posture parameter meets the condition of the interface interaction control rule. The forearm movement posture recognition device is any one of the forearm movement posture recognition devices described above.
An electronic device comprising a processor, a memory, and a computer program stored in the memory and executable by the processor, the electronic device implementing the steps of the control method of electronic device interface interaction as described above when the computer program is executed by the processor.
Compared with the prior art, the invention has at least the following beneficial effects:
the method can automatically recognize various motion postures of the forearm lifting crossing the chest, the parallel forearm lifting, the left swinging and the right swinging, and can automatically trigger the operation of the interface by utilizing the motion postures of the forearm, thereby improving the immersion feeling of games or software control.
Drawings
FIG. 1 is a schematic illustration of four kinematic poses of the forearm;
FIG. 2 is a flow chart of a method of identifying a first motion gesture;
FIG. 3 is a definition of the X, Y, and Z axes;
FIG. 4 is a flow chart of a method of recognition of a second motion gesture;
FIG. 5 is a flow chart of a method of identifying a third motion gesture and a fourth motion gesture;
FIG. 6 is a block diagram of the forearm movement posture recognition apparatus;
FIG. 7 is a flow chart of a method of controlling interface interaction of an electronic device;
FIG. 8 is a block diagram of the control device for electronic device interface interaction.
Detailed Description
The invention is further illustrated with reference to the following figures and examples.
Four motion postures defined by some embodiments are shown in fig. 1: a first motion posture in which the two forearms are lifted upward and crossed at 45 degrees in front of the chest, a second motion posture in which the two forearms are lifted upward in parallel, a third motion posture of a single-arm right swing, and a fourth motion posture of a single-arm left swing.
According to the invention, the electronic equipment automatically recognizes the motion gesture through a specific gesture recognition method, so that the automatic control of the interface interaction operation of the electronic equipment is realized. The details are as follows.
Fig. 2 shows a flow of the above-described first motion gesture recognition method. As shown in fig. 2, the method for recognizing the first motion gesture includes:
step S1-1, acquiring X-axis acceleration Lx and Z-axis acceleration Lz detected by the detector of the left wrist, and acquiring X-axis acceleration Rx and Z-axis acceleration Rz detected by the detector of the right wrist;
An inertial measurement unit (IMU) chip is built into the detector, and the X-axis acceleration, Y-axis acceleration, Z-axis acceleration, X-axis angular velocity, Y-axis angular velocity and Z-axis angular velocity can all be detected by the IMU chip. The detector is preferably, but not limited to, a smart bracelet or a smart watch with a built-in IMU chip. The X-axis acceleration and the Z-axis acceleration can be acquired in real time through a wired or wireless connection with the detector, preferably via a Bluetooth module.
The definitions of the X, Y and Z axes are shown in fig. 3. As shown in fig. 3, the X-axis is oriented in the same direction as the length of the forearm, and the Z-axis is oriented perpendicular to the top surface of the detector.
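By way of illustration only, the per-sample data obtained from such a detector can be modelled as in the minimal Python sketch below; the field names, units and the Bluetooth callback are assumptions for the example, not any particular device's API.

```python
from dataclasses import dataclass


@dataclass
class ImuSample:
    """One reading from a wrist-worn detector's IMU, using the axes defined in fig. 3."""
    ax: float  # X-axis acceleration, along the length direction of the forearm (m/s^2)
    ay: float  # Y-axis acceleration (m/s^2)
    az: float  # Z-axis acceleration, perpendicular to the top surface of the detector (m/s^2)
    gx: float  # X-axis angular velocity
    gy: float  # Y-axis angular velocity
    gz: float  # Z-axis angular velocity
    t: float   # timestamp, in seconds


def on_sample(wrist: str, sample: ImuSample) -> None:
    """Hypothetical callback invoked for every sample received over Bluetooth."""
    ...  # feed the sample into the recognition steps described below
```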
Then, the X-axis acceleration Lx detected by the detector of the left wrist is divided by the Z-axis acceleration Lz to obtain a first ratio Lx/Lz, and the X-axis acceleration Rx detected by the detector of the right wrist is divided by the Z-axis acceleration Rz to obtain a second ratio Rx/Rz.
Step S1-2, judging whether the first ratio Lx/Lz and the second ratio Rx/Rz both fall into a first preset interval with 1 as the center.
In this embodiment, the first preset interval is [0.95, 1.05]. The endpoints of the first preset interval may take other values, following this principle: the wider the first preset interval, the lower the probability of missed recognition but the higher the probability of false recognition; conversely, the narrower the interval, the lower the probability of false recognition but the higher the probability of missed recognition.
Step S1-3, when the first ratio Lx/Lz and the second ratio Rx/Rz both fall within the first preset interval centered at 1, further determining whether the state maintaining time exceeds a set time duration, where the set time duration is 1 second in this embodiment.
Specifically, after the first ratio Lx/Lz and the second ratio Rx/Rz both fall within the first preset interval centered at 1, it is determined whether the second reading state has already been entered. If not, the second reading state is entered, the timestamp at which it is entered is recorded, and the method returns to step S1-1. If so, it is judged whether the difference between the current timestamp and the timestamp recorded on entering the second reading state is greater than 1 second; if not, the method returns to step S1-1, and if so, step S1-4 is executed.
Step S1-4, outputting that the forearms are in the first motion posture. In this embodiment, the first motion posture corresponds to a cancel operation.
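A minimal Python sketch of steps S1-1 to S1-4 is given below, using the example values of this embodiment (interval [0.95, 1.05], 1-second hold). The function name, the module-level state variable and the resetting of the hold timer when the condition breaks are assumptions made for illustration.

```python
import time

FIRST_INTERVAL = (0.95, 1.05)  # first preset interval centered at 1 (this embodiment)
HOLD_SECONDS = 1.0             # set time duration of this embodiment

_first_hold_start = None  # timestamp recorded when the second reading state is entered


def check_first_posture(lx, lz, rx, rz, now=None):
    """Return True once both ratios have stayed inside FIRST_INTERVAL for HOLD_SECONDS."""
    global _first_hold_start
    now = time.monotonic() if now is None else now
    try:
        first_ratio = lx / lz    # left wrist: X-axis / Z-axis acceleration
        second_ratio = rx / rz   # right wrist: X-axis / Z-axis acceleration
    except ZeroDivisionError:
        _first_hold_start = None
        return False
    lo, hi = FIRST_INTERVAL
    if lo <= first_ratio <= hi and lo <= second_ratio <= hi:
        if _first_hold_start is None:
            _first_hold_start = now          # enter the second reading state
            return False
        return now - _first_hold_start > HOLD_SECONDS
    _first_hold_start = None                 # condition broken: leave the state (assumption)
    return False
```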
Fig. 4 shows a flow of the above-described recognition method of the second motion gesture. As shown in fig. 4, the method for recognizing the second motion gesture includes:
Step S2-1, acquiring the X-axis acceleration La detected by the detector of the left wrist and the X-axis acceleration Ra detected by the detector of the right wrist.
As described in the foregoing step S1-1, the detector incorporates an IMU chip, and the X-axis acceleration can be detected by the IMU chip.
Step S2-2, it is determined whether both the X-axis acceleration La detected by the detector of the left wrist and the X-axis acceleration Ra detected by the detector of the right wrist fall within a second preset interval centered at 9.8 (approximately the gravitational acceleration, in m/s²).
In this embodiment, the second preset interval is [9.5, 10]. The endpoints of the second preset interval may take other values, following the same principle: the wider the second preset interval, the lower the probability of missed recognition but the higher the probability of false recognition; conversely, the narrower the interval, the lower the probability of false recognition but the higher the probability of missed recognition.
Step S2-3, after the X-axis acceleration La detected by the detector of the left wrist and the X-axis acceleration Ra detected by the detector of the right wrist both fall into the second preset interval, further determining whether the state maintaining time exceeds a set time period, where the set time period is 1 second in this embodiment.
Specifically, after the X-axis acceleration La detected by the detector of the left wrist and the X-axis acceleration Ra detected by the detector of the right wrist both fall within the second preset interval, it is determined whether the second reading state has already been entered. If not, the second reading state is entered, the timestamp at which it is entered is recorded, and the method returns to step S2-1. If so, it is judged whether the difference between the current timestamp and the timestamp recorded on entering the second reading state is greater than 1 second; if not, the method returns to step S2-1, and if so, step S2-4 is executed.
Step S2-4, outputting that the forearms are in the second motion posture. In this embodiment, the second motion posture corresponds to a confirm operation.
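A corresponding sketch of steps S2-1 to S2-4, reusing HOLD_SECONDS from the sketch above and the example interval [9.5, 10] centered at 9.8 (approximately g, in m/s²); names are again illustrative assumptions.

```python
SECOND_INTERVAL = (9.5, 10.0)  # second preset interval centered at 9.8 (approximately g, in m/s^2)

_second_hold_start = None


def check_second_posture(la, ra, now):
    """Return True once both X-axis accelerations stay inside SECOND_INTERVAL for HOLD_SECONDS."""
    global _second_hold_start
    lo, hi = SECOND_INTERVAL
    if lo <= la <= hi and lo <= ra <= hi:
        if _second_hold_start is None:
            _second_hold_start = now   # enter the second reading state
            return False
        return now - _second_hold_start > HOLD_SECONDS
    _second_hold_start = None          # condition broken: leave the state (assumption)
    return False
```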
Fig. 5 shows a flow of the above-described recognition method of the third motion gesture and the fourth motion gesture. As shown in fig. 5, the method for recognizing the third motion gesture and the fourth motion gesture includes:
Step S3-1, acquiring the Y-axis angular velocity detected by the detector of the left wrist or the right wrist;
as described in the foregoing step S1-1, the detector incorporates an IMU chip, and the Y-axis angular velocity can be detected by the IMU chip.
Step S3-2, storing a plurality of Y-axis angular velocities acquired within a set time duration according to a time sequence to form a Y-axis angular velocity list;
In this embodiment, the set time period is specifically 1 second. The Y-axis angular velocity list thus formed is a temporary list.
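One possible way to maintain such a temporary list is a time-based sliding window; the Python sketch below is illustrative only, and the class and method names are assumptions.

```python
from collections import deque

WINDOW_SECONDS = 1.0  # set time period of this embodiment


class AngularVelocityWindow:
    """Temporary list of (timestamp, Y-axis angular velocity) pairs for the last second."""

    def __init__(self):
        self._buf = deque()

    def push(self, t, gy):
        """Append a new sample and drop samples older than WINDOW_SECONDS."""
        self._buf.append((t, gy))
        while self._buf and t - self._buf[0][0] > WINDOW_SECONDS:
            self._buf.popleft()

    def values(self):
        """Return the buffered Y-axis angular velocities in time order."""
        return [gy for _, gy in self._buf]
```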
Step S3-3, judging whether the position median (i.e., the value at the middle position of the list) in the Y-axis angular velocity list is the maximum value; if not, go to step S3-4, if yes, go to step S3-5;
step S3-4, judging whether the position median in the Y-axis angular velocity list is the minimum value; if not, go to step S3-1; if yes, go to step S3-6;
step S3-5, judging whether the position median is larger than a preset right swing threshold value; if not, go to step S3-1; if yes, go to step S3-7;
step S3-6, judging whether the position median is smaller than a preset left swing threshold value; if not, go to step S3-1; if yes, go to step S3-8;
Step S3-7, outputting that the forearm is in the third motion posture;
Step S3-8, outputting that the forearm is in the fourth motion posture.
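Steps S3-3 to S3-8 then reduce to checking whether the middle element of the list is the extremum and whether it exceeds the corresponding threshold; a minimal sketch follows, in which the threshold values are placeholders and not taken from the patent.

```python
RIGHT_SWING_THRESHOLD = 3.0   # placeholder value; tune for the actual IMU and its units
LEFT_SWING_THRESHOLD = -3.0   # placeholder value


def classify_swing(gy_list):
    """Return 'third' (right swing), 'fourth' (left swing) or None for a 1-second list."""
    if not gy_list:
        return None
    mid = gy_list[len(gy_list) // 2]          # value at the middle position of the list
    if mid == max(gy_list) and mid > RIGHT_SWING_THRESHOLD:
        return "third"                        # single-arm right swing (step S3-7)
    if mid == min(gy_list) and mid < LEFT_SWING_THRESHOLD:
        return "fourth"                       # single-arm left swing (step S3-8)
    return None                               # keep sampling (back to step S3-1)
```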
As can be seen from the above, with this technical solution the processor can automatically recognize whether the forearm has entered the first, second, third or fourth motion posture.
A block diagram of the components of an embodiment of the forearm motion gesture recognition apparatus is shown in fig. 6. As shown in fig. 6, the forearm movement posture recognition device specifically includes the following modules:
the acceleration ratio acquisition module 1 is used for performing division operation on the X-axis acceleration and the Z-axis acceleration detected by the detector of the left wrist to acquire a first ratio; performing division operation on the X-axis acceleration and the Z-axis acceleration detected by the detector of the right wrist to obtain a second ratio;
the ratio judging module 2 is used for judging whether the first ratio and the second ratio fall into a first preset interval with 1 as the center;
and the result output module 7 is used for outputting that the forearm is in the first motion posture when the first ratio and the second ratio both fall into a first preset interval with 1 as the center.
Preferably, the forearm movement posture recognition device further includes the following modules:
the acceleration acquisition module 3 is used for acquiring the X-axis acceleration detected by the detector of the left wrist and the X-axis acceleration detected by the detector of the right wrist;
the acceleration judging module 4 is used for judging whether the X-axis acceleration detected by the detector of the left wrist and the X-axis acceleration detected by the detector of the right wrist both fall into a second preset interval with 9.8 as the center;
and the result output module 7 is further configured to output that the forearm is in a second motion posture when the X-axis acceleration detected by the detector of the left wrist and the X-axis acceleration detected by the detector of the right wrist both fall within a second preset interval centered at 9.8.
Preferably, the forearm movement posture recognition device further includes the following modules:
The angular velocity acquisition module 5 is used for acquiring the Y-axis angular velocity detected by the detector of the left wrist or the right wrist;
the angular velocity determination module 6 is configured to store a plurality of Y-axis angular velocities acquired within a set time duration in a time sequence to form a Y-axis angular velocity list; judging whether the position median in the Y-axis angular velocity list is the maximum value or the minimum value; if the position median is the maximum value, judging whether the position median is larger than a preset right swing threshold value; if the position median is the minimum value, judging whether the position median is smaller than a preset left swing threshold value;
the result output module 7 is further configured to output that the forearm is in a third motion posture when the median value of the positions is greater than a preset right-swing threshold, and output that the forearm is in a fourth motion posture when the median value of the positions is less than a preset left-swing threshold.
The detector is preferably a smart watch or a bracelet with a built-in IMU chip, and the smart watch or bracelet preferably also includes a Bluetooth module to enable wireless transmission of the detected data.
The flow of the control method of the electronic device interface interaction is shown in fig. 7. As shown in fig. 7, the interface interaction control method specifically includes:
and step S4-1, acquiring the posture parameters output by the forearm movement posture recognition device.
The forearm movement posture recognition device here is the forearm movement posture recognition device described above. In a specific implementation, the posture parameter is preferably represented by a digital code, for example, 00 represents the first motion posture, 01 represents the second motion posture, 10 represents the third motion posture, and 11 represents the fourth motion posture.
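For example, such digital codes could be gathered in a small enumeration; the sketch below is illustrative and the enum name is an assumption.

```python
from enum import Enum


class Posture(Enum):
    """Digital codes used as posture parameters in this example."""
    FIRST = "00"   # forearms lifted and crossed over the chest
    SECOND = "01"  # forearms lifted in parallel
    THIRD = "10"   # single-arm right swing
    FOURTH = "11"  # single-arm left swing
```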
Step S4-2, acquiring the interface interaction control rules.
Specifically, the interface interaction operations include a cancel operation, a confirm operation, a left page-turning operation and a right page-turning operation, and the interface interaction control rules specifically include: the first motion posture corresponds to the cancel operation, the second motion posture corresponds to the confirm operation, the third motion posture corresponds to the right page-turning operation, and the fourth motion posture corresponds to the left page-turning operation.
The rules are pre-stored in the memory.
Step S4-3, judging whether the posture parameters meet the conditions of the interface interaction control rules.
That is, it is determined whether the posture parameter output in step S4-1 matches a posture parameter to which an interface interaction operation corresponds.
Step S4-4, if the posture parameters meet the conditions of the interface interaction control rules, executing the interface interaction operation corresponding to the interface interaction control rule.
For example, if the posture parameter output in step S4-1 is 00, the cancel operation is performed; if the posture parameter output in step S4-1 is 01, the confirm operation is performed; if the posture parameter output in step S4-1 is 10, a page-turning operation to the right is performed; and if the posture parameter output in step S4-1 is 11, a page-turning operation to the left is performed.
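Taken together, steps S4-1 to S4-4 amount to a table lookup from posture code to operation; the sketch below illustrates this, with the operation callables left as placeholders.

```python
# Interface interaction control rules of this embodiment, keyed by posture code.
CONTROL_RULES = {
    "00": "cancel",      # first motion posture
    "01": "confirm",     # second motion posture
    "10": "page_right",  # third motion posture
    "11": "page_left",   # fourth motion posture
}


def handle_posture(code, operations):
    """Execute the interface operation mapped to the posture code, if any.

    `operations` maps rule names to callables, e.g. {"cancel": ui.cancel, ...}
    (the callable names are placeholders).
    """
    action = CONTROL_RULES.get(code)
    if action is not None and action in operations:
        operations[action]()  # perform the corresponding interface interaction
```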
Therefore, the control method realizes automatic control of interface interaction of the electronic equipment based on the motion posture of the forearm. Compared with existing interface interaction modes such as a mouse, a keyboard and a touch screen, the control method can improve the sense of immersion in game or software control.
Compared with the fingers, the forearm has a larger range of motion, so its postures are easier to recognize and place lower demands on sensor precision. Compared with precision sensors such as optical sensors, an IMU chip also has the advantage of low cost. Realizing automatic control of the interface interaction of the electronic equipment by combining an IMU chip with forearm motion therefore offers both reliable control and low cost.
A block diagram of the components of the control device for electronic device interface interaction is shown in fig. 8. As shown in fig. 8, the interface interaction control device specifically includes the following modules:
and the posture parameter acquisition module 8 is used for acquiring the posture parameters output by the forearm movement posture recognition device. Wherein the forearm movement posture recognition device is the forearm movement posture recognition device;
the interface interaction control rule obtaining module 9 is used for obtaining an interface interaction control rule of the electronic equipment;
the judgment module 10 is used for judging whether the attitude parameters meet the conditions of the interface interaction control rules;
and the control execution module 11 is configured to execute the interface interaction operation corresponding to the interface interaction control rule when the gesture parameter meets the condition of the interface interaction control rule.
An embodiment of the present invention further provides an electronic device, which includes a processor, a memory, and a computer program stored in the memory and capable of being executed by the processor, wherein when the computer program is executed by the processor, the electronic device implements the steps of the control method for interface interaction of the electronic device as described above.
The present invention has been described in detail with reference to specific embodiments. The detailed description is intended only to help those skilled in the art understand the present invention and is not to be construed as limiting its scope. Various modifications, equivalent changes and the like made by those skilled in the art within the spirit of the present invention shall fall within the protection scope of the present invention.

Claims (10)

1. A forearm movement gesture recognition method, wherein the forearm movement gesture includes a first movement gesture in which two forearms are lifted upward to cross at 45 degrees in front of the chest, the recognition method comprising the steps of:
acquiring X-axis acceleration and Z-axis acceleration detected by a detector of the left wrist;
acquiring X-axis acceleration and Z-axis acceleration detected by a detector of the right wrist;
performing division operation on the X-axis acceleration and the Z-axis acceleration detected by the detector of the left wrist to obtain a first ratio;
performing division operation on the X-axis acceleration and the Z-axis acceleration detected by the detector of the right wrist to obtain a second ratio;
judging whether the first ratio and the second ratio fall into a first preset interval with 1 as the center;
if yes, outputting that the forearm is in the first motion posture;
wherein the direction of the X axis is the same as the length direction of the forearm, and the direction of the Z axis is perpendicular to the top surface of the detector.
2. The forearm movement gesture recognition method of claim 1, wherein the forearm movement gesture further includes a second motion gesture in which both forearms are lifted in parallel, the recognition method further including the steps of: judging whether the X-axis acceleration detected by the detector of the left wrist and the X-axis acceleration detected by the detector of the right wrist both fall into a second preset interval with 9.8 as the center, and if so, outputting that the forearm is in the second motion posture.
3. The forearm movement gesture recognition method of claim 1, wherein the forearm movement gesture further includes a third movement gesture of a single-arm right swing and a fourth movement gesture of a single-arm left swing, the recognition method further comprising the steps of:
acquiring the Y-axis angular velocity detected by a detector of the left wrist or the right wrist;
saving a plurality of Y-axis angular velocities acquired within a set time duration according to a time sequence to form a Y-axis angular velocity list;
judging whether the position median in the Y-axis angular velocity list is the maximum value or the minimum value;
if the position median is the maximum value, judging whether the position median is larger than a preset right swing threshold value;
if the position median is larger than a preset right swing threshold value, outputting that the forearm is in the third motion posture;
if the position median is the minimum value, judging whether the position median is smaller than a preset left swing threshold value;
and if the position median is smaller than a preset left swing threshold value, outputting that the forearm is in the fourth motion posture.
4. A forearm movement posture recognition apparatus, characterized by comprising:
the acceleration ratio acquisition module is used for performing division operation on the X-axis acceleration and the Z-axis acceleration detected by the detector of the left wrist to acquire a first ratio; performing division operation on the X-axis acceleration and the Z-axis acceleration detected by the detector of the right wrist to obtain a second ratio;
the ratio judging module is used for judging whether the first ratio and the second ratio fall into a first preset interval with 1 as the center;
and the result output module is used for outputting that the forearm is in the first motion posture when the first ratio and the second ratio both fall into a first preset interval taking 1 as the center.
5. The forearm movement gesture recognition device of claim 4, further comprising:
the acceleration acquisition module is used for acquiring the X-axis acceleration detected by the detector of the left wrist and the X-axis acceleration detected by the detector of the right wrist;
the acceleration judging module is used for judging whether the X-axis acceleration detected by the detector of the left wrist and the X-axis acceleration detected by the detector of the right wrist both fall into a second preset interval with 9.8 as the center;
and the result output module is also used for outputting that the forearm is in a second motion posture when the X-axis acceleration detected by the detector of the left wrist and the X-axis acceleration detected by the detector of the right wrist both fall into a second preset interval taking 9.8 as the center.
6. The forearm movement gesture recognition device of claim 4, further comprising:
the angular velocity acquisition module is used for acquiring the Y-axis angular velocity detected by the detector of the left wrist or the right wrist;
the angular velocity judging module is used for storing a plurality of Y-axis angular velocities acquired within a set time length according to a time sequence to form a Y-axis angular velocity list; judging whether the position median in the Y-axis angular velocity list is the maximum value or the minimum value; if the position median is the maximum value, judging whether the position median is larger than a preset right swing threshold value; if the position median is the minimum value, judging whether the position median is smaller than a preset left swing threshold value;
the result output module is further used for outputting that the forearm is in a third motion posture when the position median is larger than a preset right swing threshold value, and outputting that the forearm is in a fourth motion posture when the position median is smaller than a preset left swing threshold value.
7. The forearm movement gesture recognition device of claim 4, wherein the detector is a smart watch or bracelet with an IMU chip built in.
8. A method for controlling interface interaction of electronic equipment, characterized by comprising the following steps:
acquiring posture parameters output by a forearm movement posture recognition device, wherein the forearm movement posture recognition device is the forearm movement posture recognition device according to any one of claims 4 to 7;
acquiring an interface interaction control rule;
judging whether the attitude parameters meet the conditions of the interface interaction control rules or not;
and if so, executing interface interaction operation corresponding to the interface interaction control rule.
9. A control device for interface interaction of electronic equipment, characterized by comprising:
a posture parameter acquiring module, configured to acquire a posture parameter output by a forearm movement posture recognition device, where the forearm movement posture recognition device is a forearm movement posture recognition device according to any one of claims 4 to 7;
the interface interaction control rule acquisition module is used for acquiring an interface interaction control rule of the electronic equipment;
the judgment module is used for judging whether the attitude parameters meet the conditions of the interface interaction control rules or not;
and the control execution module is used for executing the interface interaction operation corresponding to the interface interaction control rule when the attitude parameter meets the condition of the interface interaction control rule.
10. An electronic device, comprising a processor, a memory, and a computer program stored in the memory and executable by the processor, the computer program, when executed by the processor, implementing the steps of the control method as claimed in claim 8.
CN202011331040.6A 2020-11-24 2020-11-24 Forearm movement posture recognition method, interface interaction control method and device Active CN112426709B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011331040.6A 2020-11-24 2020-11-24 Forearm movement posture recognition method, interface interaction control method and device

Publications (2)

Publication Number Publication Date
CN112426709A 2021-03-02
CN112426709B 2022-11-18

Family

ID=74694540

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011331040.6A Active CN112426709B (en) 2020-11-24 2020-11-24 Forearm movement posture recognition method, interface interaction control method and device

Country Status (1)

Country Link
CN (1) CN112426709B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024001828A1 (en) * 2022-06-29 2024-01-04 Huawei Technologies Co., Ltd. Wrist-worn-device control method and related system, and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160078289A1 (en) * 2014-09-16 2016-03-17 Foundation for Research and Technology - Hellas (FORTH) (acting through its Institute of Computer Gesture Recognition Apparatuses, Methods and Systems for Human-Machine Interaction
CN106468951A (en) * 2016-08-29 2017-03-01 华东师范大学 A kind of intelligent remote control systems based on the fusion of both hands ring sensor and its method
CN108245880A (en) * 2018-01-05 2018-07-06 华东师范大学 Body-sensing detection method for visualizing and system based on more wearing annulus sensor fusions

Also Published As

Publication number Publication date
CN112426709B (en) 2022-11-18

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20221025

Address after: 518000 2801, block a, building 2, Shenzhen Bay innovation and technology center, No. 3156, Keyuan South Road, community, high tech Zone, Yuehai street, Nanshan District, Shenzhen, Guangdong

Applicant after: Shenzhen Jinling Technology Co.,Ltd.

Address before: 811-1, building 10, Shenzhen Bay science and technology ecological park, No.10, Gaoxin South 9th Road, high tech Zone community, Yuehai street, Nanshan District, Shenzhen, Guangdong 518000

Applicant before: Shenzhen Jinling mutual Entertainment Technology Co.,Ltd.

GR01 Patent grant
CP03 Change of name, title or address

Address after: 518000, 5th floor, 501A, 502, Xinghe WORLD Ecological Building, No. 8 Yaxing Road, Nankeng Community, Bantian Street, Longgang District, Shenzhen City, Guangdong Province

Patentee after: Shenzhen Jinling Era Technology Co.,Ltd.

Country or region after: China

Address before: 518000 2801, block a, building 2, Shenzhen Bay innovation and technology center, No. 3156, Keyuan South Road, community, high tech Zone, Yuehai street, Nanshan District, Shenzhen, Guangdong

Patentee before: Shenzhen Jinling Technology Co.,Ltd.

Country or region before: China