CN103777748A - Motion sensing input method and device
- Publication number: CN103777748A (application CN201210417290.0A)
- Authority: CN (China)
- Legal status: Pending (assumed; not a legal conclusion)
Landscapes
- User Interface Of Digital Computer (AREA)
Abstract
The embodiment of the invention discloses a motion sensing input method and device, and relates to the technical field of electronics. The method and device can recognize finer motion changes, allowing a user to conveniently input more kinds of information of greater complexity, thereby widening the application range of motion sensing devices. The method comprises: determining the change trajectory of the hand posture; determining the motion trajectory of the hand; and determining the input information according to the mapping relation between the hand posture change trajectory, the hand motion trajectory, and the input information. The motion sensing input method and device are suitable for man-machine interaction based on motion sensing input technology.
Description
Technical Field
The invention relates to the technical field of electronics, in particular to a motion sensing input method and device.
Background
With the development of electronic technology, motion sensing recognition technology is widely applied to multimedia platforms as a new man-machine interaction technology. For example: the motion sensing device in the household game machine can collect human body images, skeleton images and the like of a user, and can identify the motion tracks of the trunk and the limbs of the user, so that the user can input instructions in a game through the motion sensing device.
However, since conventional motion sensing devices can recognize only large body parts and motions, such as those of the limbs, trunk, whole palm, and arms, the information that a user can input through them is relatively simple, and detailed, complex actions cannot be identified. This limits motion sensing devices to a narrow range of applications.
Disclosure of Invention
Embodiments of the present invention provide a motion sensing input method and apparatus, which can identify more precise motion changes, so that a user can input more types of information with higher complexity, thereby increasing the application range of a motion sensing device.
In order to achieve the above purpose, the embodiment of the invention adopts the following technical scheme:
in a first aspect, an embodiment of the present invention provides a somatosensory input method, including:
determining a hand posture change track;
determining a motion trajectory of the hand;
and determining input information according to the mapping relation between the hand posture change trajectory, the hand motion trajectory, and the input information.
In one possible implementation manner of the first aspect, the determining of the hand posture change trajectory includes:
obtaining at least two depth maps of the hand;
determining a hand gesture displayed by a depth map of the hand;
determining an order of the hand gestures in accordance with a temporal order in which the depth maps are obtained;
and determining the posture change track of the hand according to the sequence of the hand postures.
Optionally, the obtaining of at least two depth maps of the hand includes:
obtaining a depth map and a skeleton map of a human body;
determining skeletal points of the hand on a skeletal map of the human body;
determining a designated range by taking the skeleton point of the hand as a reference, and acquiring a depth map in the designated range on the depth map of the human body to obtain the depth map of the hand;
and repeating the process at preset time intervals to obtain at least two depth maps of the hand.
Optionally, the determining the hand gesture displayed by the depth map of the hand comprises:
determining the number and shape of fingers in a depth map of the hand;
determining the gesture of the hand according to the number and the shape of the fingers.
In another possible implementation manner of the first aspect, the determining the motion trajectory of the hand includes:
analyzing the displacement condition of the skeleton points of the hand according to the skeleton points of the hand;
and determining the motion trail of the hand according to the displacement condition of the hand skeleton point.
In a second aspect, an embodiment of the present invention provides a somatosensory input device, including:
the action analysis module is used for determining a hand posture change trajectory;
the motion analysis module is used for determining the motion track of the hand;
and the information analysis module is used for determining the input information according to the mapping relation between the hand posture change trajectory, the hand motion trajectory, and the input information.
In one possible implementation manner of the second aspect, the action analysis module includes:
the sampling unit is used for obtaining at least two depth maps of the hand;
a recognition unit for determining a hand gesture displayed by a depth map of the hand;
a ranking unit for determining an order of the hand gestures in accordance with a time order in which the depth maps are obtained;
a change trajectory acquisition unit configured to determine a gesture change trajectory of the hand according to the order of the hand gestures.
Optionally, the sampling unit includes:
the scanning subunit is used for obtaining a depth map and a skeleton map of the human body;
the positioning subunit is used for determining a bone point of the hand on a bone map of the human body;
and the local sampling subunit is used for determining a specified range by taking the skeleton point of the hand as a reference, and acquiring a depth map in the specified range on the depth map of the human body to obtain the depth map of the hand.
Optionally, the identification unit includes:
a finger recognition subunit, configured to determine the number and shape of fingers in the depth map of the hand;
and the gesture determining subunit is used for determining the gesture of the hand according to the number and the shape of the fingers.
In another possible implementation manner of the second aspect, the motion analysis module includes:
the motion capture unit is used for analyzing the displacement condition of the skeleton points of the hand according to the skeleton points of the hand;
and the motion analysis unit is used for determining the motion track of the hand according to the displacement condition of the hand skeleton point.
The somatosensory input method and device provided by the embodiments of the invention can identify the shape changes and motion changes of fine body parts such as the fingers, and determine the information input by the user according to the shape change trajectory and the motion trajectory, so that the user can make more complex motions through the somatosensory device, increasing the variety of motions and motion combinations. Compared with the prior art, the method and device can identify more precise motion changes, so that the user can input more types of information with higher complexity, and the application range of the motion sensing device is enlarged.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a flowchart of a somatosensory input method according to an embodiment of the present invention;
fig. 2a is a flowchart of a somatosensory input method according to another embodiment of the invention;
FIG. 2b is a schematic diagram according to another embodiment of the invention;
fig. 3a is a flowchart of a somatosensory input method according to yet another embodiment of the invention;
FIG. 3b is a schematic diagram of an embodiment of the present invention;
FIG. 3c is a flowchart of another somatosensory input method according to yet another embodiment of the invention;
FIG. 3d is a schematic diagram illustrating another embodiment of the present invention;
FIG. 3e is a flowchart of another somatosensory input method according to another embodiment of the invention;
fig. 4 is a schematic structural diagram of a motion sensing input device according to an embodiment of the present invention;
fig. 5a is a schematic structural diagram of a motion sensing input device according to another embodiment of the present invention;
fig. 5b is a schematic structural diagram of another motion sensing input device according to another embodiment of the present invention;
fig. 5c is a schematic structural diagram of another motion sensing input device according to another embodiment of the present invention;
fig. 5d is a schematic structural diagram of a motion sensing input device according to still another embodiment of the present invention;
fig. 6 is a schematic structural diagram of a motion sensing device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
An embodiment of the present invention provides a somatosensory input method, as shown in fig. 1, including:
and 201, determining a hand posture change track.
Optionally, the motion sensing device may be any device that captures the user's motion by motion capture or acquisition means and is capable of processing the captured motion.
202, determining the motion track of the hand.
203, determining input information according to the mapping relation between the hand posture change trajectory, the hand motion trajectory, and the input information.
In this embodiment, a mapping relationship between a hand posture change trajectory, a hand motion trajectory, and input information may be pre-stored in the motion sensing device, and for convenience of understanding, a part of the mapping relationship may be as shown in table 1:
TABLE 1
Hand posture change trajectory | Hand motion trajectory | Input information
---|---|---
Fingers fully open → one finger straight | Null | Select an object
One finger straight → one finger bent | Null | Left mouse click
One finger straight → hand makes a fist | Null | Right mouse click
Two fingers straight → one finger bent | Null | Right mouse click
Fingers fully open → one finger straight | Up-down or left-right | Drag target after selection
Single palm → single palm | Move | Screen movement
Hand makes a fist → palm open | Two palms moving apart | Zoom in
Hand makes a fist → palm open | Two palms moving together | Zoom out
Where "Null" indicates that the hand undergoes no displacement motion during the change in hand posture. The mapping relation between the hand posture change trajectory, the hand motion trajectory, and the input information is flexible: it can be customized by the user and updated at any time.
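As a purely illustrative sketch (in Python, with hypothetical trajectory labels; the embodiment does not prescribe a storage format), such a mapping table could be stored and queried as follows:

```python
# Hypothetical sketch: the mapping from (posture-change trajectory,
# motion trajectory) to input information, stored as a dictionary.
# "Null" means the hand did not move while its posture changed.
GESTURE_MAP = {
    ("fingers open -> one finger straight", "Null"): "select object",
    ("one finger straight -> one finger bent", "Null"): "left mouse click",
    ("one finger straight -> fist", "Null"): "right mouse click",
    ("fist -> palm open", "palms apart"): "zoom in",
    ("fist -> palm open", "palms together"): "zoom out",
}

def resolve_input(posture_trajectory, motion_trajectory):
    """Return the mapped input information, or None if no entry matches."""
    return GESTURE_MAP.get((posture_trajectory, motion_trajectory))

print(resolve_input("one finger straight -> fist", "Null"))  # right mouse click
```

Because the table is an ordinary dictionary, user customization or updating amounts to adding or overwriting entries at run time.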
Optionally, as shown in fig. 2a, in this embodiment, the foregoing 201 may include:
2011, at least two depth maps of the hand are obtained.
In an embodiment of the present invention, the depth map of the hand is a two-dimensional picture showing the outline of the hand, in which the distances between different parts of the hand and the motion sensing device are represented by different gray levels. In general, a lighter region indicates a longer distance, and a darker region indicates a shorter distance.
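The gray-level encoding described above can be sketched as follows (a minimal NumPy illustration; the normalization scheme and raw millimeter values are assumptions, not taken from the embodiment):

```python
# Sketch of the encoding described above: map raw distances to gray
# levels so that farther points come out lighter (higher values).
import numpy as np

def depth_to_gray(depth_mm):
    """Normalize raw depth values (e.g. millimeters) to 0..255."""
    d = np.asarray(depth_mm, dtype=float)
    span = d.max() - d.min()
    if span == 0:
        return np.zeros_like(d, dtype=np.uint8)
    # Nearest point -> 0 (dark), farthest point -> 255 (light).
    return ((d - d.min()) / span * 255).astype(np.uint8)

gray = depth_to_gray([[500, 750], [1000, 1500]])
print(gray)
```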
2012, determining the hand gesture displayed by the depth map of the hand.
In this embodiment, the motion sensing device can recognize the hand posture displayed by a depth map of the hand by using techniques such as image recognition, for example, palm open or index finger bent.
2013, determining the order of the hand gestures according to the time sequence of obtaining the depth map;
2014, determining the posture change trajectory of the hand according to the order of the hand postures.
In this embodiment, the motion sensing device may continuously sample the hand and obtain a series of consecutive depth maps of the hand, for example, at an image frame rate of 15 frames per second or higher. The change of the hand posture within a certain time is then recognized from the acquired series of consecutive depth maps of the hand, thereby obtaining the posture change trajectory of the hand.
Taking fig. 2b as an example, the somatosensory device acquires 4 depth maps of the hand in total and analyzes them to obtain 4 hand postures. From the 1st depth map the posture is recognized as "one finger straight" (1-1); from the 2nd, "one finger bent" (1-2); from the 3rd, "hand makes a fist" (1-3); and from the 4th, "all fingers open" (1-4). The posture change trajectory of the hand is thus 1-1 → 1-2 → 1-3 → 1-4.
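Steps 2011–2014 can be sketched as follows (an illustrative Python outline; `classify_posture` is a hypothetical stand-in for the device's image-recognition step, and the frame timestamps are assumed values):

```python
# Sketch: recognize one posture per sampled frame, order the postures
# by sample time, and join them into a posture change trajectory.
def classify_posture(depth_map):
    # Hypothetical stand-in: a real device would analyze the depth map.
    return depth_map["label"]

def posture_trajectory(samples):
    """samples: list of (timestamp, depth_map); returns 'a -> b -> c'."""
    ordered = sorted(samples, key=lambda s: s[0])   # step 2013: time order
    postures = [classify_posture(dm) for _, dm in ordered]  # step 2012
    # Collapse consecutive duplicates so only posture *changes* remain.
    trajectory = [postures[0]]
    for p in postures[1:]:
        if p != trajectory[-1]:
            trajectory.append(p)
    return " -> ".join(trajectory)                  # step 2014

frames = [(0.000, {"label": "one finger straight"}),
          (0.066, {"label": "one finger bent"}),
          (0.133, {"label": "fist"}),
          (0.200, {"label": "fingers open"})]
print(posture_trajectory(frames))
```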
Optionally, as shown in fig. 3a, the 2011 may include:
20111, depth and bone maps of the human body are obtained.
In this embodiment, the motion sensing device may irradiate infrared rays on a human body through an infrared scanning technology, and obtain a depth map showing a contour of the human body according to the reflected infrared rays. Since different parts of the human body have different distances from the motion sensing device, the distance of the scanned part from the motion sensing device is represented by different color gradations in the depth map. In general, a region with lighter color gradation indicates a longer distance, and a region with darker color gradation indicates a shorter distance.
The motion sensing device can set skeleton points on the depth map of the human body, for example, at key positions such as the hands, head, and joints; connecting the skeleton points yields a human skeleton map (see, for example, fig. 3b).
20112, the skeletal points of the hand are determined on the skeletal map of the human body.
20113, a designated range is determined based on the skeleton point of the hand, and a depth map within the designated range is obtained on the depth map of the human body, so as to obtain the depth map of the hand.
The size of the designated range can be automatically determined by the motion sensing device according to the specific operation scene, can be set when the motion sensing device leaves the factory, or can be customized by the user; preferably, the designated range is large enough to cover the entire hand.
In this embodiment, the motion sensing device may repeat 20111, 20112, and 20113 periodically to capture at least two depth maps of the hand. For example, the image frame rate of the sampling may be the standard 15 frames per second or higher, and may also be set by the user or a technician.
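Step 20113 amounts to cropping a window around the hand's skeleton point. A minimal NumPy sketch (the window size, image resolution, and coordinate convention are hypothetical choices, not specified by the embodiment):

```python
# Sketch: crop a fixed window around the hand's skeleton point in the
# full-body depth map to obtain the depth map of the hand.
import numpy as np

def crop_hand(depth_map, hand_point, half_size=32):
    """depth_map: 2-D array; hand_point: (row, col) of the hand skeleton point."""
    r, c = hand_point
    h, w = depth_map.shape
    # Clamp the window to the image borders so the crop stays valid.
    r0, r1 = max(0, r - half_size), min(h, r + half_size)
    c0, c1 = max(0, c - half_size), min(w, c + half_size)
    return depth_map[r0:r1, c0:c1]

body = np.arange(240 * 320).reshape(240, 320)  # stand-in full-body depth map
hand = crop_hand(body, (120, 160))
print(hand.shape)  # (64, 64)
```

Repeating the crop on each sampled frame yields the series of hand depth maps used in step 2011.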
Optionally, as shown in fig. 3c, the 2012 may include:
20121, the number and shape of fingers are determined in the depth map of the hand.
20122, determining the posture of the hand according to the number and the shape of the fingers.
In this embodiment, as shown in fig. 3d, a gallery may be stored in advance in the motion sensing device, and the motion sensing device may obtain the graphics data or the number of the corresponding hand gesture from the gallery according to the determined hand gesture.
In this embodiment, the gallery stored in the motion sensing device may be input into the motion sensing device when the motion sensing device leaves a factory, or may be customized by a user or updated in real time. If the motion sensing device collects a new hand gesture, the motion sensing device can generate graphic data according to the hand gesture, the graphic data are numbered, and the graphic data are stored in a gallery.
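The gallery lookup and the registration of new postures described above can be sketched as follows (all keys and labels are hypothetical; the embodiment does not specify how postures are encoded):

```python
# Sketch: a small gesture gallery keyed by (finger_count, shape).
# Unknown postures are numbered and stored, as the description suggests.
gallery = {
    (5, "straight"): "fingers open",
    (1, "straight"): "one finger straight",
    (1, "bent"): "one finger bent",
    (0, "curled"): "fist",
}

def lookup_posture(finger_count, shape):
    """Return the posture label for the observed finger count and shape."""
    key = (finger_count, shape)
    if key not in gallery:
        # Newly collected posture: assign it a number and store it.
        gallery[key] = f"posture-{len(gallery) + 1}"
    return gallery[key]
```

For example, an unseen posture such as two straight fingers would be registered once and then recognized consistently on later lookups.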
Optionally, as shown in fig. 3e, the step 202 may include:
2021, analyzing the displacement of the bone points of the hand according to the bone points of the hand;
2022, determining the motion track of the hand according to the displacement condition of the hand skeleton points.
In this embodiment, after acquiring the skeleton points of the hand in step 20112, the motion sensing device may acquire the motion trajectory of the hand according to the skeleton points of the hand. It should be noted that the process of determining the motion trajectory of the hand may be performed simultaneously with the process of acquiring the hand posture change trajectory.
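Steps 2021–2022 can be sketched as follows (an illustrative classification of the net displacement into a trajectory label; the threshold and the label set are assumptions, not taken from the embodiment):

```python
# Sketch: derive the hand motion trajectory from the displacement of its
# skeleton point across time-ordered samples. Threshold is hypothetical.
def motion_trajectory(points, threshold=10.0):
    """points: time-ordered (x, y) positions of the hand skeleton point."""
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if abs(dx) < threshold and abs(dy) < threshold:
        return "Null"  # no displacement during the posture change
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

print(motion_trajectory([(0, 0), (5, 2), (40, 4)]))  # right
```

The "Null" case corresponds to the table entries above where a posture change carries meaning without any hand displacement.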
The somatosensory input method provided by the embodiment of the invention can identify the gesture change track and the motion track of the hand and determine the input information, so that the somatosensory equipment can identify more precise motion changes, and the application range of the somatosensory equipment is enlarged.
One embodiment of the present invention provides a motion sensing apparatus, as shown in fig. 4, including:
an action analysis module 41, configured to determine a hand posture change trajectory;
a motion analysis module 42 for determining a motion trajectory of the hand;
and the information analysis module 43 is configured to determine the input information according to the mapping relation between the hand posture change trajectory, the hand motion trajectory, and the input information.
Optionally, as shown in fig. 5a, the action analysis module 41 may include:
the sampling unit 411 is configured to acquire at least two depth maps of the hand.
A recognition unit 412 for determining a hand gesture displayed by the depth map of the hand.
A sorting unit 413 for determining an order of the hand gestures in chronological order of obtaining the depth map.
A trajectory determination unit 414 configured to determine a gesture change trajectory of the hand according to the order of the hand gestures.
Optionally, as shown in fig. 5b, the sampling unit 411 may include:
a scanning subunit 4111, configured to obtain a depth map and a bone map of the human body.
A positioning subunit 4112, configured to determine a skeletal point of the hand on a skeletal map of the human body.
A local sampling subunit 4113, configured to determine a specified range based on the skeleton point of the hand, and obtain a depth map within the specified range from the depth map of the human body to obtain the depth map of the hand.
Optionally, as shown in fig. 5c, the identifying unit 412 may include:
a finger recognition subunit 4121 for determining the number and shape of fingers in the depth map of the hand.
A gesture determination subunit 4122 for determining the posture of the hand according to the number and shape of the fingers.
Optionally, as shown in fig. 5d, the motion analysis module 42 may include:
and a motion capture unit 421, configured to analyze a displacement condition of the skeleton point of the hand according to the skeleton point of the hand.
And the motion analysis unit 422 is configured to determine a motion trajectory of the hand according to the displacement condition of the hand skeleton point.
The motion sensing device provided by the embodiment of the invention can identify the gesture change track and the motion track of the hand and determine the input information, so that the motion sensing device can identify more precise motion changes, and the application range of the motion sensing device is enlarged.
Still another embodiment of the present invention provides a motion sensing device, as shown in fig. 6, including a processor 61 and a memory 62. Wherein,
the memory 62 is used for storing instructions;
the processor 61 is configured to execute the instructions for:
determining a hand posture change track;
determining a motion trajectory of the hand;
and determining input information according to the mapping relation between the hand posture change trajectory, the hand motion trajectory, and the input information.
According to the motion sensing device provided by the embodiment of the invention, the gesture change track and the motion track of the hand can be identified through the processor, and the input information is determined, so that the motion sensing device can identify more precise motion changes, and the application range of the motion sensing device is enlarged.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The above description is only for the specific embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (10)
1. A somatosensory input method, comprising:
determining a hand posture change track;
determining a motion trajectory of the hand;
and determining input information according to the mapping relation between the hand posture change trajectory, the hand motion trajectory, and the input information.
2. The somatosensory input method of claim 1, wherein determining a hand gesture change trajectory comprises:
obtaining at least two depth maps of the hand;
determining a hand gesture displayed by a depth map of the hand;
determining an order of the hand gestures in accordance with a temporal order in which the depth maps are obtained;
and determining the posture change track of the hand according to the sequence of the hand postures.
3. The somatosensory input method of claim 2, wherein the obtaining of at least two depth maps of the hand comprises:
obtaining a depth map and a skeleton map of a human body;
determining skeletal points of the hand on a skeletal map of the human body;
and determining a specified range by taking the skeleton point of the hand as a reference, and acquiring a depth map in the specified range on the depth map of the human body to obtain the depth map of the hand.
4. A somatosensory input method according to claim 2, wherein determining a hand gesture displayed by the depth map of the hand comprises:
determining the number and shape of fingers in a depth map of the hand;
determining the gesture of the hand according to the number and the shape of the fingers.
5. The somatosensory input method according to any one of claims 1-4, wherein the determining a motion trajectory of the hand comprises:
analyzing the displacement condition of the skeleton points of the hand according to the skeleton points of the hand;
and determining the motion trail of the hand according to the displacement condition of the hand skeleton point.
6. A somatosensory input device, comprising:
the action analysis module is used for determining a hand posture change trajectory;
the motion analysis module is used for determining the motion track of the hand;
and the information analysis module is used for determining the input information according to the mapping relation between the hand posture change trajectory, the hand motion trajectory, and the input information.
7. The somatosensory input device of claim 6, wherein the action analysis module comprises:
the sampling unit is used for obtaining at least two depth maps of the hand;
a recognition unit for determining a hand gesture displayed by a depth map of the hand;
a ranking unit for determining an order of the hand gestures in accordance with a time order in which the depth maps are obtained;
a change trajectory acquisition unit configured to determine a gesture change trajectory of the hand according to the order of the hand gestures.
8. The somatosensory input device according to claim 7, wherein the sampling unit comprises:
the scanning subunit is used for obtaining a depth map and a skeleton map of the human body;
the positioning subunit is used for determining a bone point of the hand on a bone map of the human body;
and the local sampling subunit is used for determining a specified range by taking the skeleton point of the hand as a reference, and acquiring a depth map in the specified range on the depth map of the human body to obtain the depth map of the hand.
9. The somatosensory input device according to claim 7 or 8, wherein the recognition unit comprises:
a finger recognition subunit, configured to determine the number and shape of fingers in the depth map of the hand;
and the gesture determining subunit is used for determining the gesture of the hand according to the number and the shape of the fingers.
10. The somatosensory input device according to any one of claims 6-9, wherein the motion analysis module comprises:
the motion capture unit is used for analyzing the displacement condition of the skeleton points of the hand according to the skeleton points of the hand;
and the motion analysis unit is used for determining the motion track of the hand according to the displacement condition of the hand skeleton point.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201210417290.0A CN103777748A (en) | 2012-10-26 | 2012-10-26 | Motion sensing input method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201210417290.0A CN103777748A (en) | 2012-10-26 | 2012-10-26 | Motion sensing input method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN103777748A true CN103777748A (en) | 2014-05-07 |
Family
ID=50570094
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201210417290.0A Pending CN103777748A (en) | 2012-10-26 | 2012-10-26 | Motion sensing input method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103777748A (en) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104881122A (en) * | 2015-05-29 | 2015-09-02 | 深圳奥比中光科技有限公司 | Somatosensory interactive system activation method and somatosensory interactive method and system |
CN105022485A (en) * | 2015-07-09 | 2015-11-04 | 中山大学 | Suspension interaction method and system for automatic teller machine device |
CN105302295A (en) * | 2015-09-07 | 2016-02-03 | 哈尔滨市一舍科技有限公司 | Virtual reality interaction device having 3D camera assembly |
CN105302294A (en) * | 2015-09-07 | 2016-02-03 | 哈尔滨市一舍科技有限公司 | Interactive virtual reality presentation device |
CN105912117A (en) * | 2016-04-12 | 2016-08-31 | 北京锤子数码科技有限公司 | Motion state capture method and system |
CN106054627A (en) * | 2016-06-12 | 2016-10-26 | 珠海格力电器股份有限公司 | Control method and device based on gesture recognition and air conditioner |
CN106325493A (en) * | 2015-06-30 | 2017-01-11 | 现代自动车株式会社 | Vehicle and method of controlling the same |
CN106681497A (en) * | 2016-12-07 | 2017-05-17 | 南京仁光电子科技有限公司 | Method and device based on somatosensory control application program |
CN107436686A (en) * | 2017-08-28 | 2017-12-05 | 山东浪潮商用系统有限公司 | A kind of methods, devices and systems for controlling target to be controlled |
US9990031B2 (en) | 2016-01-18 | 2018-06-05 | Boe Technology Group Co., Ltd. | Indicating method and device for correcting failure of motion-sensing interaction tracking |
CN108874141A (en) * | 2018-06-25 | 2018-11-23 | 北京京东金融科技控股有限公司 | A kind of body-sensing browsing method and device |
CN109240494A (en) * | 2018-08-23 | 2019-01-18 | 京东方科技集团股份有限公司 | Control method, computer readable storage medium and the control system of electronic data display |
WO2020087204A1 (en) * | 2018-10-29 | 2020-05-07 | 深圳市欢太科技有限公司 | Display screen operating method, electronic device, and readable storage medium |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010088035A2 (en) * | 2009-01-30 | 2010-08-05 | Microsoft Corporation | Gesture recognizer system architecture |
US20110080475A1 (en) * | 2009-10-07 | 2011-04-07 | Microsoft Corporation | Methods And Systems For Determining And Tracking Extremities Of A Target |
CN102270035A (en) * | 2010-06-04 | 2011-12-07 | 三星电子株式会社 | Apparatus and method for selecting and operating object in non-touch mode |
CN102307309A (en) * | 2011-07-29 | 2012-01-04 | 杭州电子科技大学 | Somatosensory interactive broadcasting guide system and method based on free viewpoints |
US20120062558A1 (en) * | 2010-09-15 | 2012-03-15 | Lg Electronics Inc. | Mobile terminal and method for controlling operation of the mobile terminal |
US20120174213A1 (en) * | 2010-12-29 | 2012-07-05 | Microsoft Corporation | User identification with biokinematic input |
CN102638653A (en) * | 2012-03-01 | 2012-08-15 | 北京航空航天大学 | Automatic face tracing method on basis of Kinect |
CN102722249A (en) * | 2012-06-05 | 2012-10-10 | 上海鼎为软件技术有限公司 | Manipulating method, manipulating device and electronic device |
- 2012-10-26: CN application CN201210417290.0A filed; published as CN103777748A (legal status: Pending)
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104881122A (en) * | 2015-05-29 | 2015-09-02 | 深圳奥比中光科技有限公司 | Somatosensory interactive system activation method and somatosensory interactive method and system |
CN106325493A (en) * | 2015-06-30 | 2017-01-11 | 现代自动车株式会社 | Vehicle and method of controlling the same |
CN105022485A (en) * | 2015-07-09 | 2015-11-04 | 中山大学 | Suspension interaction method and system for automatic teller machine device |
CN105022485B (en) * | 2015-07-09 | 2018-09-28 | 中山大学 | Suspension interaction method and system for automated teller machine equipment |
CN105302295A (en) * | 2015-09-07 | 2016-02-03 | 哈尔滨市一舍科技有限公司 | Virtual reality interaction device having 3D camera assembly |
CN105302294A (en) * | 2015-09-07 | 2016-02-03 | 哈尔滨市一舍科技有限公司 | Interactive virtual reality presentation device |
CN105302294B (en) * | 2015-09-07 | 2018-08-03 | 哈尔滨市一舍科技有限公司 | Interactive virtual reality presentation device |
CN105302295B (en) * | 2015-09-07 | 2018-06-26 | 哈尔滨市一舍科技有限公司 | Virtual reality interaction device with 3D camera assembly |
US9990031B2 (en) | 2016-01-18 | 2018-06-05 | Boe Technology Group Co., Ltd. | Indicating method and device for correcting failure of motion-sensing interaction tracking |
CN105912117B (en) * | 2016-04-12 | 2019-05-07 | 北京锤子数码科技有限公司 | Motion state capture method and system |
CN105912117A (en) * | 2016-04-12 | 2016-08-31 | 北京锤子数码科技有限公司 | Motion state capture method and system |
CN106054627A (en) * | 2016-06-12 | 2016-10-26 | 珠海格力电器股份有限公司 | Control method and device based on gesture recognition and air conditioner |
CN106681497A (en) * | 2016-12-07 | 2017-05-17 | 南京仁光电子科技有限公司 | Method and device based on somatosensory control application program |
CN107436686A (en) * | 2017-08-28 | 2017-12-05 | 山东浪潮商用系统有限公司 | Method, device and system for controlling a target to be controlled |
CN108874141A (en) * | 2018-06-25 | 2018-11-23 | 北京京东金融科技控股有限公司 | Somatosensory browsing method and device |
CN108874141B (en) * | 2018-06-25 | 2021-03-30 | 京东数字科技控股有限公司 | Somatosensory browsing method and device |
CN109240494A (en) * | 2018-08-23 | 2019-01-18 | 京东方科技集团股份有限公司 | Control method, computer-readable storage medium and control system for electronic display panel |
CN109240494B (en) * | 2018-08-23 | 2023-09-12 | 京东方科技集团股份有限公司 | Control method, computer-readable storage medium and control system for electronic display panel |
WO2020087204A1 (en) * | 2018-10-29 | 2020-05-07 | 深圳市欢太科技有限公司 | Display screen operating method, electronic device, and readable storage medium |
CN112714900A (en) * | 2018-10-29 | 2021-04-27 | 深圳市欢太科技有限公司 | Display screen operation method, electronic device and readable storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103777748A (en) | Motion sensing input method and device | |
US8824802B2 (en) | Method and system for gesture recognition | |
US8509484B2 (en) | Information processing device and information processing method | |
CN104350509B (en) | Fast pose detector |
US9183431B2 (en) | Apparatus and method for providing activity recognition based application service | |
RU2013154102A (en) | FINGER RECOGNITION AND TRACKING SYSTEM | |
CN107832736B (en) | Real-time human body action recognition method and real-time human body action recognition device | |
JP6066093B2 (en) | Finger shape estimation device, finger shape estimation method, and finger shape estimation program | |
CN104346816A (en) | Depth determining method and device and electronic equipment | |
CN106502390B (en) | Virtual human interaction system and method based on dynamic 3D handwritten digit recognition |
CN109815776A (en) | Action prompt method and apparatus, storage medium and electronic device | |
CN105068662B (en) | Electronic device for human-computer interaction |
CN110633004A (en) | Interaction method, device and system based on human body posture estimation | |
CN104914989A (en) | Gesture recognition apparatus and control method of gesture recognition apparatus | |
CN109325408A (en) | Gesture determination method and storage medium |
CN111857334A (en) | Human body gesture letter recognition method and device, computer equipment and storage medium | |
CN112199015B (en) | Intelligent interaction all-in-one machine and writing method and device thereof | |
JP6651388B2 (en) | Gesture modeling device, gesture modeling method, program for gesture modeling system, and gesture modeling system | |
CN110866468A (en) | Gesture recognition system and method based on passive RFID | |
Thakur et al. | Vision based computer mouse control using hand gestures | |
CN115331314A (en) | Exercise effect evaluation method and system based on APP screening function | |
CN111651038A (en) | Gesture recognition control method based on ToF and control system thereof | |
CN110910426A (en) | Action process and action trend identification method, storage medium and electronic device | |
Cohen et al. | Recognition of continuous sign language alphabet using leap motion controller | |
KR101447958B1 (en) | Method and apparatus for recognizing body point |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 2014-05-07 |