
CN101901339A - Hand movement detecting method - Google Patents

Hand movement detecting method Download PDF

Info

Publication number
CN101901339A
CN101901339A CN201010242832A CN101901339B
Authority
CN
China
Prior art keywords
hand
camera
detected
palm
face
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 201010242832
Other languages
Chinese (zh)
Other versions
CN101901339B (en)
Inventor
徐向民
梁卓锐
苗捷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN2010102428326A priority Critical patent/CN101901339B/en
Publication of CN101901339A publication Critical patent/CN101901339A/en
Application granted granted Critical
Publication of CN101901339B publication Critical patent/CN101901339B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Landscapes

  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a hand movement detecting method comprising at least the following steps: A, starting the machine; B, capturing an image sequence with a camera and searching for the user's two hands in the captured sequence; C, if a first hand is raised and a second hand supports the elbow joint of the first hand, judging the first hand to be the detected hand and the second hand to be the supporting hand; then extracting the palm position information of the detected hand and of the supporting hand, and processing the extracted position information. With this method, detection requires neither wearing a marker on the hand nor exposing the forearm, so detection is more convenient and faster.

Description

Hand movement detecting method
Technical field
The present invention relates to a hand movement detecting method.
Background technology
Existing methods for detecting human hand and forearm motion mainly obtain accurate hand information in one of two ways: (1) wearing a marker on the hand, or (2) applying skin-color detection to the arm, which requires the arm to be exposed. The first approach is inconvenient because a marker must be worn during detection; the second is inconvenient when the user wears long sleeves, imposes many restrictions, and also slows down detection.
On the other hand, conventional detection requires two cameras observing from two different directions (often orthogonal directions) to recover the three-dimensional motion of the forearm, which increases the cost of detection.
Summary of the invention
The object of the present invention is to provide a hand movement detecting method that requires neither wearing a marker on the hand nor exposing the forearm, making detection more convenient and faster.
Its technical scheme is as follows:
A hand movement detecting method comprising at least the following steps: A, start-up; B, capturing an image sequence with a camera and searching for the user's two hands in the captured sequence; C, if a first hand is found to be raised and a second hand supports the elbow joint of the first hand, judging the first hand to be the detected hand and the second hand to be the supporting hand; then extracting the palm position information of the detected hand and of the supporting hand, and processing the extracted position information.
The invention locates the hand by skin-color detection, takes the palm of the detected hand as the end point of the forearm vector and the palm of the supporting hand as its start point, builds a forearm model, and thereby detects the hand easily, avoiding the prior-art drawbacks of wearing a marker on the hand or exposing the arm; detection is more convenient and faster.
The above scheme can be further refined as follows:
In step C, the palm of the supporting hand is kept stationary and both hands are tracked by a single camera. When the detected hand is raised vertically, the absolute distance L from the palm of the detected hand to the palm of the supporting hand is detected. When the detected hand is raised at an incline, the projected distance d of the detected palm relative to the supporting palm in the horizontal or vertical direction is detected, together with the angle θ of the detected arm relative to the horizontal or vertical direction. The three-dimensional motion state of the detected palm is then determined from L, d and θ.
Step B comprises the steps: B1, the camera searches for a human face and judges whether one is present; B2, if a face is present, a sensitive region is established near the face and the camera searches for the user's two hands within the sensitive region.
Step B2 comprises the steps: B21, the camera searches the sensitive region for a palm facing the camera; if one is found, that hand is judged to be the detected hand; B22, the camera searches for a skin-color region below the detected hand; if a skin-color region satisfies the area and aspect-ratio criteria, it is judged to be the supporting hand.
In summary, the advantages of the present invention are:
1. Detection requires neither wearing a marker on the hand nor exposing the arm, so detection is more convenient and faster;
2. The values of L, d and θ are obtained from a single camera, from which the three-dimensional motion state of the detected palm can be determined, reducing the cost of detection.
Description of drawings
Fig. 1 shows the image sequence captured by the camera when the detection method of the embodiment is used;
Fig. 2 shows the state in which the detected hand is raised vertically;
Fig. 3 shows the state in which the detected hand is raised at an incline;
Fig. 4 shows the three-dimensional vector diagram of the forearm.
Embodiment
Embodiments of the invention are described in detail below with reference to the drawings:
As shown in Fig. 1, a hand movement detecting method comprises at least the following steps: A, start-up; B, capturing an image sequence with a camera and searching for the user's two hands in the captured sequence; C, if a first hand is found to be raised and a second hand supports the elbow joint of the first hand, judging the first hand to be the detected hand and the second hand to be the supporting hand; then extracting the palm position information of the detected hand and of the supporting hand, and processing the extracted position information.
Step B comprises the steps: B1, the camera searches for a human face and judges whether one is present; B2, if a face is present, a sensitive region is established near the face and the camera searches for the user's two hands within the sensitive region (step B2 comprising: B21, the camera searches the sensitive region for a palm facing the camera, and if one is found that hand is judged to be the detected hand; B22, the camera searches for a skin-color region below the detected hand, and if a skin-color region satisfies the area and aspect-ratio criteria it is judged to be the supporting hand).
In step C, the palm of the supporting hand is kept stationary and both hands are tracked by a single camera. As shown in Fig. 2, when the detected hand is raised vertically, the absolute distance L from the palm of the detected hand to the palm of the supporting hand is detected. As shown in Fig. 3, when the detected hand is raised at an incline, the projected distance d of the detected palm relative to the supporting palm in the horizontal direction is detected, together with the angle θ of the detected arm relative to the vertical direction. As shown in Fig. 4, L, d and θ are mapped to three-dimensional space by geometric calculation, and the three-dimensional motion state of the detected palm is determined.
The detection method of this embodiment is now described in detail:
1. The user enters the camera's field of view, facing the camera, with arm-motion detection not yet activated. The system searches the captured image for a human face: features are extracted with the Haar wavelet transform, and an Adaboost classifier is selected to perform face recognition. Once a face is found, a rectangular hand-search region is determined according to the size of the face, to further refine and narrow the search range of hand detection.
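The face-size-based search rectangle can be sketched as follows; the placement and scale factors here are illustrative assumptions, since the description only states that the rectangle is determined according to the size of the face:

```python
def hand_search_region(face_x, face_y, face_w, face_h,
                       img_w, img_h, scale=2.0):
    """Return a rectangular hand-search region derived from a face box.

    The 'scale' factor and the downward extension are hypothetical
    values chosen for illustration; the patent only says the region
    is sized according to the face.
    """
    # Grow a region around the face, 'scale' face-widths to each side,
    # extending downward where a raised hand and its support appear.
    x0 = max(0, int(face_x - scale * face_w))
    y0 = max(0, int(face_y - 0.5 * face_h))
    x1 = min(img_w, int(face_x + face_w + scale * face_w))
    y1 = min(img_h, int(face_y + face_h + 2.0 * face_h))
    return x0, y0, x1, y1
```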
2. After the face is detected, the system searches for the hand in the region near the face, mainly using skin-color detection and shape judgment. The specific algorithm is as follows:
2.1 First, the RGB color space is converted to the YCbCr space using the following formulas:

Y = 0.257R + 0.504G + 0.098B + 16
Cb = -0.148R - 0.291G + 0.439B + 128
Cr = 0.439R - 0.368G - 0.071B + 128
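A per-pixel conversion helper can be sketched directly from these formulas. It uses the standard BT.601 studio-swing coefficients; note the Cb green coefficient is taken as -0.291 (the Cb coefficients must sum to zero so that gray maps to Cb = 128), on the assumption that the extracted text garbled this digit:

```python
def rgb_to_ycbcr(r, g, b):
    """Convert one RGB pixel (components 0-255) to YCbCr using the
    BT.601 studio-swing coefficients given in the description."""
    y = 0.257 * r + 0.504 * g + 0.098 * b + 16
    cb = -0.148 * r - 0.291 * g + 0.439 * b + 128
    cr = 0.439 * r - 0.368 * g - 0.071 * b + 128
    return y, cb, cr
```

For gray pixels (r = g = b) the chroma channels stay at 128, which is what the skin-color model expects as the neutral point.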
2.2 Skin-color judgment adopts the Gaussian model method, assuming the skin-color distribution obeys a unimodal Gaussian:

p(x) = (1 / (√(2π)·σ)) · exp(-(x - μ)² / (2σ²))
In the YCbCr color space, the skin-color probability parameters are obtained by training on skin-color samples collected under different illumination conditions. The skin-color probability is computed as follows:

p(Cb, Cr) = exp[-0.5(x - m)^T C^(-1)(x - m)]

where:
x = (Cb, Cr)^T is a pixel in the CbCr space;
m = E{x} is the mean of all pixels in the CbCr space;
C = E{(x - m)(x - m)^T} is the covariance of all pixels in the CbCr space.
The mean and covariance are obtained statistically from the collected skin-color samples. Finally, p(Cb, Cr) is computed for each pixel with 0.6 taken as the threshold: a pixel whose probability exceeds 0.6 is regarded as a skin pixel;
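A minimal sketch of the training and thresholding described above, assuming the skin samples are available as (Cb, Cr) pairs:

```python
import numpy as np

def train_skin_model(samples):
    """Estimate the mean m and covariance C from (Cb, Cr) skin samples
    collected, as in the description, under varied illumination."""
    samples = np.asarray(samples, dtype=float)
    m = samples.mean(axis=0)
    C = np.cov(samples, rowvar=False)
    return m, C

def skin_probability(cb, cr, m, C):
    """p(Cb, Cr) = exp[-0.5 (x - m)^T C^-1 (x - m)] as in the text."""
    x = np.array([cb, cr], dtype=float)
    d = x - m
    return float(np.exp(-0.5 * d @ np.linalg.inv(C) @ d))

def is_skin(cb, cr, m, C, threshold=0.6):
    """Apply the 0.6 probability threshold specified in the description."""
    return skin_probability(cb, cr, m, C) > threshold
```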
2.3 The binary image produced by skin-color clustering is processed morphologically; connected regions are labeled, and the hand is chosen by region area and aspect ratio;
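The connected-region selection can be sketched with a simple BFS labeling pass; the numeric area and aspect-ratio thresholds below are illustrative assumptions, since the description does not give values:

```python
from collections import deque

def find_hand_regions(binary, min_area=4, min_ratio=0.5, max_ratio=3.0):
    """Label 4-connected foreground regions of a binary image (list of
    lists of 0/1) and keep those whose pixel area and bounding-box
    height/width ratio fall within the given ranges, as the text's
    area and aspect-ratio selection requires.

    Returns (x, y, width, height, area) tuples for kept regions.
    """
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for sy in range(h):
        for sx in range(w):
            if binary[sy][sx] and not seen[sy][sx]:
                # BFS over one connected component.
                q, pixels = deque([(sy, sx)]), []
                seen[sy][sx] = True
                while q:
                    y, x = q.popleft()
                    pixels.append((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if 0 <= ny < h and 0 <= nx < w \
                           and binary[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                ys = [p[0] for p in pixels]
                xs = [p[1] for p in pixels]
                bh = max(ys) - min(ys) + 1
                bw = max(xs) - min(xs) + 1
                if len(pixels) >= min_area and min_ratio <= bh / bw <= max_ratio:
                    regions.append((min(xs), min(ys), bw, bh, len(pixels)))
    return regions
```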
3. After the hand is detected, whether forearm motion detection should be triggered is judged on the following two criteria:
The supporting hand rests against the elbow joint of the detected hand, i.e. another skin-color region can be found at a certain distance (generally 1.5 times the face height) below the detected hand, and that region is identified as the supporting hand;
The detected arm is in a roughly vertical position (generally taken as an angle with the horizontal greater than 85 degrees). Once a valid inclination of the detected hand is found, forearm motion detection is triggered.
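The two trigger criteria can be sketched as a single predicate; the image-coordinate convention (y growing downward) and the use of the palm-to-palm direction to approximate the arm's angle with the horizontal are assumptions not stated in the text:

```python
import math

def should_trigger(detected_palm, support_palm, face_height,
                   max_gap_factor=1.5, min_angle_deg=85.0):
    """Return True when forearm motion detection should be triggered.

    Criterion 1: the supporting palm lies below the detected palm by at
    most max_gap_factor * face_height (1.5 face heights in the text).
    Criterion 2: the arm direction makes an angle with the horizontal
    greater than min_angle_deg (85 degrees in the text).
    Points are (x, y) pixel coordinates with y increasing downward.
    """
    dx = support_palm[0] - detected_palm[0]
    dy = support_palm[1] - detected_palm[1]
    if not (0 < dy <= max_gap_factor * face_height):
        return False  # no skin region in the band below the detected hand
    angle = math.degrees(math.atan2(abs(dy), abs(dx)))
    return angle > min_angle_deg
```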
4. The camera auto-focuses on the detected hand so that the detected hand occupies 70% of the camera's output image, narrowing the tracking range and improving detection accuracy; from the detected hand and the supporting hand, the reference distance L from the palm of the detected hand to the supporting hand is calculated.
During forearm detection, the supporting hand remains stationary while the hand is searched for in the current frame. Once the hand is found, the detected hand is considered valid; a skin-color region is then searched for below the detected hand's position, and if a region satisfies the area and aspect-ratio criteria it is judged to be the supporting hand.
The horizontal projected distance d from the detected hand to the supporting hand is calculated, together with the off-axis angle θ of the forearm vector (the angle between OA and OD); from the relation among L, d and θ, the three-dimensional motion state of the forearm is computed as follows:
As shown in Fig. 4, OA is the forearm vector in the initial state, of length L. Taking O as the origin of coordinates and the OA direction as the Z axis, with the dotted plane facing the camera and the direction perpendicular to that plane as the X axis, a coordinate system is constructed. Given the range of motion of the arm, the forearm can only lie in front of the upper arm. OB is the forearm vector in the current state, the length of BC is d, and the angle DOA is θ. The coordinates (x, y, z) of point B are then:

x = d;
y = d·tanθ;
z = √(L² - d² - y²)
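The mapping from the single-camera measurements (L, d, θ) to the three-dimensional palm position follows directly from these formulas:

```python
import math

def palm_coordinates(L, d, theta_deg):
    """Map the measurements (L, d, θ) to the 3-D coordinates of point B,
    following the description: x = d, y = d·tanθ, z = √(L² - d² - y²)."""
    theta = math.radians(theta_deg)
    x = d
    y = d * math.tan(theta)
    radicand = L * L - d * d - y * y
    if radicand < 0:
        # d and θ cannot place the palm farther than the forearm length L.
        raise ValueError("inconsistent measurements: projection exceeds L")
    z = math.sqrt(radicand)
    return x, y, z
```

Note that the result always satisfies x² + y² + z² = L², i.e. the palm stays on a sphere of radius L about the elbow, which is what the fixed supporting hand guarantees.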
The above is only a specific embodiment of the invention and does not limit its scope of protection; any substitution or improvement made without departing from the concept of the invention falls within the scope of protection of the invention.

Claims (4)

1. A hand movement detecting method, characterized in that the method comprises at least the following steps:
A, start-up;
B, capturing an image sequence with a camera, and searching for the user's two hands in the captured image sequence;
C, if a first hand is found to be raised and a second hand supports the elbow joint of the first hand, judging the first hand to be the detected hand and the second hand to be the supporting hand; then extracting the palm position information of the detected hand and of the supporting hand, and processing the extracted position information.
2. The hand movement detecting method according to claim 1, characterized in that in step C the palm of the supporting hand is kept stationary and both hands are tracked by a single camera; when the detected hand is raised vertically, the absolute distance L from the palm of the detected hand to the palm of the supporting hand is detected; when the detected hand is raised at an incline, the projected distance d of the detected palm relative to the supporting palm in the horizontal or vertical direction is detected, together with the angle θ of the detected arm relative to the horizontal or vertical direction; the three-dimensional motion state of the detected palm is then determined from L, d and θ.
3. The hand movement detecting method according to claim 1 or 2, characterized in that step B comprises the steps:
B1, the camera searches for a human face, and judges whether one is present;
B2, if a face is present, a sensitive region is established near the face, and the camera searches for the user's two hands within the sensitive region.
4. The hand movement detecting method according to claim 3, characterized in that step B2 comprises the steps:
B21, the camera searches the sensitive region for a palm facing the camera; if one is found, that hand is judged to be the detected hand;
B22, the camera searches for a skin-color region below the detected hand; if a skin-color region satisfies the area and aspect-ratio criteria, it is judged to be the supporting hand.
CN2010102428326A 2010-07-30 2010-07-30 Hand movement detecting method Expired - Fee Related CN101901339B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2010102428326A CN101901339B (en) 2010-07-30 2010-07-30 Hand movement detecting method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2010102428326A CN101901339B (en) 2010-07-30 2010-07-30 Hand movement detecting method

Publications (2)

Publication Number Publication Date
CN101901339A true CN101901339A (en) 2010-12-01
CN101901339B CN101901339B (en) 2012-11-14

Family

ID=43226863

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2010102428326A Expired - Fee Related CN101901339B (en) 2010-07-30 2010-07-30 Hand movement detecting method

Country Status (1)

Country Link
CN (1) CN101901339B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105719279A (en) * 2016-01-15 2016-06-29 上海交通大学 Elliptic cylinder-based human trunk modeling, arm area segmentation and arm skeleton extraction method
CN106383452A (en) * 2016-11-24 2017-02-08 北京地平线机器人技术研发有限公司 Smart control module and kitchen appliances employing same
CN107300877A (en) * 2017-07-26 2017-10-27 佛山伊贝尔科技有限公司 A kind of hologram three-dimensional projects robot
CN109815828A (en) * 2018-12-28 2019-05-28 公安部第三研究所 Realize the system and method for initiative alarming or help-seeking behavior detection control
CN110633382A (en) * 2019-07-26 2019-12-31 上海工程技术大学 Automatic carpal plane searching method based on three-dimensional human body scanning

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060077175A1 (en) * 2004-10-07 2006-04-13 Maurizio Pilu Machine-human interface
CN101033963A (en) * 2007-04-10 2007-09-12 南京航空航天大学 Location system of video finger and location method based on finger tip marking
CN101073089A (en) * 2004-04-15 2007-11-14 格斯图尔泰克股份有限公司 Tracking bimanual movements
CN101344816A (en) * 2008-08-15 2009-01-14 华南理工大学 Human-machine interaction method and device based on sight tracing and gesture discriminating
JP2010079651A (en) * 2008-09-26 2010-04-08 Toshiba Corp Movement recognition device, method and program

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101073089A (en) * 2004-04-15 2007-11-14 格斯图尔泰克股份有限公司 Tracking bimanual movements
US20060077175A1 (en) * 2004-10-07 2006-04-13 Maurizio Pilu Machine-human interface
CN101033963A (en) * 2007-04-10 2007-09-12 南京航空航天大学 Location system of video finger and location method based on finger tip marking
CN101344816A (en) * 2008-08-15 2009-01-14 华南理工大学 Human-machine interaction method and device based on sight tracing and gesture discriminating
JP2010079651A (en) * 2008-09-26 2010-04-08 Toshiba Corp Movement recognition device, method and program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Kye Kyung Kim et al., "Gesture Analysis for Human-robot Interaction", ICACT, 2006-02-22, full text (relevant to claims 1-4) *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105719279A (en) * 2016-01-15 2016-06-29 上海交通大学 Elliptic cylinder-based human trunk modeling, arm area segmentation and arm skeleton extraction method
CN105719279B (en) * 2016-01-15 2018-07-13 上海交通大学 Based on the modeling of cylindroid trunk and arm regions segmentation and arm framework extraction method
CN106383452A (en) * 2016-11-24 2017-02-08 北京地平线机器人技术研发有限公司 Smart control module and kitchen appliances employing same
CN107300877A (en) * 2017-07-26 2017-10-27 佛山伊贝尔科技有限公司 A kind of hologram three-dimensional projects robot
CN109815828A (en) * 2018-12-28 2019-05-28 公安部第三研究所 Realize the system and method for initiative alarming or help-seeking behavior detection control
CN110633382A (en) * 2019-07-26 2019-12-31 上海工程技术大学 Automatic carpal plane searching method based on three-dimensional human body scanning
CN110633382B (en) * 2019-07-26 2022-04-12 上海工程技术大学 Automatic carpal plane searching method based on three-dimensional human body scanning

Also Published As

Publication number Publication date
CN101901339B (en) 2012-11-14

Similar Documents

Publication Publication Date Title
CN101901339B (en) Hand movement detecting method
US10466797B2 (en) Pointing interaction method, apparatus, and system
CN104484645B (en) A kind of " 1 " gesture identification method and system towards man-machine interaction
CN103854292B (en) A kind of number and the computational methods and device in crowd movement direction
CN101807257A (en) Method for identifying information of image tag
CN101593022A (en) A kind of quick human-computer interaction of following the tracks of based on finger tip
WO2018076392A1 (en) Pedestrian statistical method and apparatus based on recognition of parietal region of human body
CN109685827B (en) Target detection and tracking method based on DSP
EP3036714B1 (en) Unstructured road boundary detection
CN103413120A (en) Tracking method based on integral and partial recognition of object
CN106339657B (en) Crop straw burning monitoring method based on monitor video, device
CN103257713A (en) Gesture control method
CN107993224B (en) Object detection and positioning method based on circular marker
CN103530892A (en) Kinect sensor based two-hand tracking method and device
CN106971406A (en) Object pose detection method and device
CN104700088A (en) Gesture track recognition method based on monocular vision motion shooting
CN105894540A (en) Method and system for counting vertical reciprocating movements based on mobile terminal
CN108182381A (en) Escalator occupant detection algorithm based on quick Adaboost training algorithms
CN103106409A (en) Composite character extraction method aiming at head shoulder detection
US9160986B2 (en) Device for monitoring surroundings of a vehicle
CN101908150B (en) Human body detection method
JP5534432B2 (en) Information terminal equipment
Zhang et al. Dynamic fry counting based on multi-object tracking and one-stage detection
CN114612933B (en) Monocular social distance detection tracking method
CN113591973B (en) Intelligent comparison method for appearance state change of track plate

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20121114

Termination date: 20210730

CF01 Termination of patent right due to non-payment of annual fee