
CN106570489A - Living body determination method and apparatus, and identity authentication method and device - Google Patents

Living body determination method and apparatus, and identity authentication method and device

Info

Publication number
CN106570489A
CN106570489A (Application CN201610992121.8A)
Authority
CN
China
Prior art keywords
face
pulse
characteristic curve
pulse characteristic
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610992121.8A
Other languages
Chinese (zh)
Inventor
赵凌
李季檩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201610992121.8A priority Critical patent/CN106570489A/en
Publication of CN106570489A publication Critical patent/CN106570489A/en
Priority to PCT/CN2017/109989 priority patent/WO2018086543A1/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/40Spoof detection, e.g. liveness detection
    • G06V40/45Detection of the body part being alive
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/15Biometric patterns based on physiological signals, e.g. heartbeat, blood flow

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The invention relates to a living body determination method. The method comprises: collecting a plurality of frames of face images; extracting a face region from each frame of face image; obtaining the illumination intensity of the face region and calculating from it the pulse characteristic corresponding to each frame of face image; establishing a pulse characteristic curve from the pulse characteristics corresponding to the frames; and comparing the pulse characteristic curve with a pre-stored standard non-living-body pulse characteristic curve. If the difference between the characteristic value of the pulse characteristic curve and that of the pre-stored standard non-living-body pulse characteristic curve exceeds a preset characteristic threshold, the collected images are determined to be living-body face images; otherwise, they are determined to be non-living-body face images. Hardware cost is thereby saved and the detection rate is improved. In addition, the invention also provides a living body determination apparatus and an identity authentication method and device.

Description

Living body distinguishing method and device, identity authentication method and device
Technical Field
The invention relates to the technical field of computers, in particular to a living body distinguishing method and device and an identity authentication method and device.
Background
With increasing demands on information security, living human face identification has been widely applied in fields such as face-based access control and financial verification, in order to prevent attackers from using illicit photos to pass a face recognition system, for example to register bank accounts automatically and at scale by machine.
In traditional living body distinguishing techniques, some interaction is usually required in the application scenario, such as shaking the head or blinking, and a real person is distinguished from a photo by the movement of certain facial landmarks. Traditional optical plethysmography, by contrast, uses close-range contact: an additional instrument detects the blood volume change at an extremity of the human body, estimates the person's pulse, and distinguishes a real person from a picture by the pulse variation.
However, interaction-based living body discrimination passes only after the user performs the correct interaction as prompted, resulting in a low detection rate; and optical plethysmography requires additional instrumentation, making the hardware costly.
Disclosure of Invention
In view of the above, it is desirable to provide a living body identification method and apparatus, and an identity authentication method and apparatus, which can improve a detection rate while saving hardware costs.
A living body discrimination method, the method comprising:
collecting a plurality of frames of face images;
extracting a face area from each frame of face image;
acquiring the illumination intensity of a face area, and calculating the pulse characteristic corresponding to each frame of face image according to the illumination intensity of the face area;
establishing a pulse characteristic curve according to the pulse characteristics corresponding to each frame of face image;
and comparing the pulse characteristic curve with a pre-stored standard non-living body pulse characteristic curve, if the difference between the characteristic value of the pulse characteristic curve and the characteristic value of the pre-stored standard non-living body pulse characteristic curve exceeds a preset characteristic threshold value, judging that the collected image is a living body face image, otherwise, judging that the collected image is a non-living body face image.
A living body discriminating apparatus, the apparatus comprising:
the face image acquisition module is used for acquiring a plurality of frames of face images;
the face region extraction module is used for extracting a face region from each frame of face image;
the pulse feature calculation module is used for acquiring the illumination intensity of the face area and calculating the pulse feature corresponding to each frame of face image according to the illumination intensity;
the pulse characteristic curve establishing module is used for establishing a pulse characteristic curve according to the pulse characteristics corresponding to each frame of face image;
and the living body judging module is used for comparing the pulse characteristic curve with a pre-stored standard non-living body pulse characteristic curve, judging that the collected image is a living body face image if the difference between the characteristic value of the pulse characteristic curve and the characteristic value of the pre-stored standard non-living body pulse characteristic curve exceeds a preset characteristic threshold value, and otherwise, judging that the collected image is a non-living body face image.
The living body distinguishing method and apparatus collect a plurality of frames of face images, extract a face region from each frame, obtain the illumination intensity of the face region, calculate from it the pulse characteristic corresponding to each frame, establish a pulse characteristic curve from the per-frame pulse characteristics, compare the curve with a pre-stored standard non-living body pulse characteristic curve, and make the living body judgment from the comparison result. Because the pulse characteristics are calculated from the illumination intensity of the face region and the judgment is made by analyzing the pulse characteristic curve, no additional instrument or equipment is needed to estimate the pulse characteristics, which saves hardware cost; and no user interaction is required to complete the judgment, which improves the detection rate of living body discrimination.
A method of identity authentication, the method comprising:
receiving a user identity authentication request, wherein the user identity authentication request carries a user identifier;
acquiring a plurality of frames of face images of the user according to the user identity authentication request;
extracting a face area from each frame of face image;
acquiring the illumination intensity of a face area, and calculating the pulse characteristic corresponding to each frame of face image according to the illumination intensity of the face area;
establishing a pulse characteristic curve according to the pulse characteristics corresponding to each frame of face image;
and comparing the pulse characteristic curve with a pre-stored standard pulse characteristic curve corresponding to the user identifier, if the difference between the characteristic value of the pulse characteristic curve and the characteristic value of the standard pulse characteristic curve is within a preset range, the user identity authentication is passed, otherwise, the user identity authentication is not passed.
An identity authentication apparatus, the apparatus comprising:
the identity authentication request acquisition module is used for receiving an identity authentication request, and the user identity authentication request carries a user identifier;
the face image acquisition module is used for acquiring multi-frame face images of the user according to the user identity authentication request;
the human face region extraction module is used for extracting a human face region from each frame of face image;
the pulse feature calculation module is used for acquiring the illumination intensity of a face area and calculating the pulse feature corresponding to each frame of face image according to the illumination intensity of the face area;
the pulse characteristic curve establishing module is used for establishing a pulse characteristic curve according to the pulse characteristics corresponding to each frame of face image;
and the identity authentication module is used for comparing the pulse characteristic curve with a pre-stored standard pulse characteristic curve corresponding to the user identifier, if the difference between the characteristic value of the pulse characteristic curve and the characteristic value of the standard pulse characteristic curve is within a preset range, the user identity authentication is passed, otherwise, the user identity authentication is not passed.
The identity authentication method and apparatus apply the living body distinguishing method to the field of identity authentication: the pulse characteristic curve obtained during living body discrimination is compared with a pre-stored standard pulse characteristic curve corresponding to the user identifier; if the difference between the characteristic values of the two curves is within a preset range, the user passes identity authentication, and otherwise the user fails. Because authentication is performed by comparing the user's pulse characteristic curves, no extra interaction actions from the user and no additional instruments are needed, which improves the efficiency of identity authentication and saves hardware cost.
Drawings
FIG. 1 is a diagram illustrating an exemplary embodiment of an application environment of a living body discrimination method and an identity authentication method;
FIG. 2A is a diagram illustrating an internal architecture of a server in one embodiment;
fig. 2B is an internal structural view of the terminal in one embodiment;
FIG. 3 is a flowchart of a living body discriminating method in one embodiment;
FIG. 4 is a diagram illustrating a segmentation result of the forehead region of the face shown in FIG. 3 according to an embodiment;
FIG. 5 is a flowchart illustrating a pulse feature calculation method of FIG. 3 according to an embodiment;
FIG. 6 is a flow diagram of a method of identity authentication in one embodiment;
FIG. 7 is a flowchart of a pulse feature calculation method of FIG. 6 according to one embodiment;
FIG. 8 is a flow chart of a method of identity authentication in another embodiment;
FIG. 9 is a block diagram showing the configuration of a living body discriminating apparatus in one embodiment;
FIG. 10 is a block diagram showing the structure of an authentication apparatus according to an embodiment;
fig. 11 is a block diagram showing the structure of an authentication device according to another embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The living body identification method provided by the embodiment of the invention can be applied to the environment shown in FIG. 1. Referring to fig. 1, a server 102 may receive and process a plurality of frames of face images collected by a terminal 104. Specifically, the server 102 communicates with the terminal 104 through a network, receives a plurality of frames of face images collected and sent by the terminal 104, extracts a face area of each frame of face image, obtains an illumination intensity of the face area, calculates a pulse feature corresponding to each frame of face image and establishes a pulse feature curve, compares the established pulse feature curve with a standard non-living body pulse feature curve, performs living body judgment according to a comparison result, and sends the judgment result to the terminal 104. The terminal includes, but is not limited to, various personal computers, notebook computers, smart phones, tablet computers, portable wearable devices, and the like. It should be noted that, in other embodiments, a terminal may also directly process multiple frames of face images to determine whether the images are living bodies.
The identity authentication method provided by the embodiment of the invention can also be applied to the environment shown in fig. 1. Referring to fig. 1, the server 102 may receive a user authentication request sent by the terminal 104, and may also return a user authentication result to the terminal 104. Specifically, the server 102 communicates with the terminal 104 through a network, receives a user identity authentication request sent by the terminal 104 and a plurality of frames of face images of the user collected by the terminal 104 according to the user identity authentication request, extracts a face area of each frame of face image, obtains an illumination intensity of the face area, calculates pulse characteristics corresponding to each frame of face image and establishes a pulse characteristic curve, compares the established pulse characteristic curve with a pre-stored standard pulse characteristic curve corresponding to a user identifier, performs identity authentication according to a comparison result, and sends an identity authentication result to the terminal 104. The terminal includes, but is not limited to, various personal computers, notebook computers, smart phones, tablet computers, portable wearable devices, and the like. In other embodiments, the terminal 104 may obtain the user identity authentication request, or may directly verify the user identity. Specifically, the terminal 104 may directly process the collected multiple frames of face images of the user to obtain a pulse characteristic curve of the user, and compare the pulse characteristic curve with a standard pulse characteristic curve of the user to obtain a user authentication result.
In one embodiment, as shown in fig. 2A, there is also provided a server including a processor, a nonvolatile storage medium, an internal memory, and a network interface connected by a system bus. The nonvolatile storage medium stores an operating system and a living body discriminating apparatus for performing a living body discriminating method, or an identity authenticating apparatus for performing an identity authenticating method. The processor provides the computing and control capability that supports the operation of the whole server. The internal memory provides an environment for the operation of the living body identification apparatus or the identity authentication apparatus in the nonvolatile storage medium, and can store computer-readable instructions which, when executed by the processor, cause the processor to execute a living body identification method or an identity authentication method. The network interface performs network communication with the terminal, receiving or sending data, for example receiving a face image sent by the terminal and sending a living body judgment result to the terminal, or receiving an identity authentication request and the collected face images sent by the terminal and sending an identity authentication result to the terminal.
In one embodiment, as shown in fig. 2B, there is also provided a terminal including a processor, a nonvolatile storage medium, an internal memory, a network interface, and a display screen connected by a system bus. The nonvolatile storage medium stores an operating system and a living body discriminating device for performing a living body discriminating method, or an identity authenticating device for performing an identity authenticating method. The processor provides the computing and control capability that supports the operation of the whole terminal. The internal memory provides an environment for the operation of the living body identification device in the nonvolatile storage medium, and stores computer-readable instructions which, when executed by the processor, cause the processor to execute a living body identification method. The network interface performs network communication with the server, receiving or sending data, for example receiving a pulse feature comparison result sent by the server.
As shown in fig. 3, in one embodiment, a living body distinguishing method is provided, which is exemplified by being applied to a server shown in fig. 2A or a terminal shown in fig. 2B, and includes:
step 302, collecting a plurality of frames of face images.
In this embodiment, because the pulse characteristic curve needs to be acquired, multiple frames of face images need to be processed. Specifically, a video including a face image within a preset time may be acquired, or a face image may be acquired at preset intervals to obtain multiple frames of face images.
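As a rough illustration of the sampling strategy above (grab one frame every preset interval from a fixed-rate video), the index computation might look like the following sketch; the function name, frame rate and interval are illustrative assumptions, not taken from the patent:

```python
def sample_frame_indices(total_frames, fps, interval_s):
    """Indices of the frames to keep when grabbing one frame every
    `interval_s` seconds from a clip of `total_frames` frames at `fps`."""
    step = max(1, round(fps * interval_s))
    return list(range(0, total_frames, step))

# e.g. a 10-second clip at 30 fps, sampled every 0.5 s, yields 20 frames
indices = sample_frame_indices(300, 30, 0.5)
```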
And step 304, extracting a face area of each frame of face image.
In a face image, the face region is the region that best reflects the pulse characteristics of a living body; regions such as hair and clothing do not reflect living-body characteristics. Therefore, in this embodiment, a face region is extracted from each frame of face image.
Specifically, the face region can be extracted from each frame of face image using an image integral graph and the Adaboost method: Haar face features are obtained rapidly by computing the image integral graph; classifiers are trained on the training samples with the Adaboost classification algorithm over these Haar features; and the final classifier obtained from training is used to classify test samples, thereby extracting the face region. The Haar feature is a common feature descriptor in the field of computer vision; its value reflects the gray-level variation of an image. Adaboost is an iterative algorithm whose core idea is to train different classifiers (weak classifiers) on the same training set and then combine the weak classifiers into a stronger final classifier (a strong classifier).
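The image integral graph mentioned above is what makes Haar features cheap to evaluate: any rectangular pixel sum, and hence any Haar feature value, takes only four table lookups. A minimal pure-Python sketch (the function names and the simple two-rectangle feature are illustrative; in practice a trained cascade such as OpenCV's Haar cascades would be used):

```python
def integral_image(img):
    """ii[y][x] = sum of img over the rectangle of rows [0, y) and cols [0, x)."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y + 1][x + 1] = ii[y][x + 1] + row_sum
    return ii

def rect_sum(ii, top, left, bottom, right):
    """Sum of the image over rows [top, bottom) and cols [left, right),
    using four lookups into the integral image."""
    return ii[bottom][right] - ii[top][right] - ii[bottom][left] + ii[top][left]

def haar_two_rect(ii, top, left, h, w):
    """A simple two-rectangle Haar feature: left half minus right half."""
    mid = left + w // 2
    return (rect_sum(ii, top, left, top + h, mid)
            - rect_sum(ii, top, mid, top + h, left + w))
```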
And step 306, acquiring the illumination intensity of the face area, and calculating the pulse characteristic corresponding to each frame of face image according to the illumination intensity of the face area.
In this embodiment, the face region is first divided into a plurality of face sub-regions. Specifically: face key points, such as points on the forehead, the left eye, the right eye, and the left and right sides of the cheek, are obtained with a face registration algorithm, and the face region is then segmented according to these key points into sub-regions such as the forehead, the left eye, the right eye, and the left and right sides of the cheek. The illumination intensities of all pixels in each face sub-region are obtained; if the difference in illumination intensity between any two pixels in a sub-region exceeds the preset light intensity threshold, the sub-region is segmented further, until the difference between any two pixels in every resulting sub-region does not exceed the threshold. This makes the segmentation of the face region finer; for example, the forehead region is segmented into left, middle and right parts, as shown in fig. 4.
In this embodiment, to reduce the influence of ambient light on the observed change of face skin color, and so accurately reflect the skin color change of each face sub-region caused by changes in blood vessel oxygen saturation and blood volume, the face region is subdivided into a plurality of face sub-regions such that the illumination intensity at all positions within each sub-region can be approximated as a constant; that constant is taken as the illumination intensity of the sub-region.
Next, the illumination intensity of each face sub-region and the weight corresponding to each sub-region are obtained, and the illumination intensities are summed with their corresponding weights; the weighted sum is the pulse characteristic corresponding to the face image. If the difference between the maximum and minimum intensity in a face sub-region exceeds a certain threshold within a certain time range, indicating that the region is strongly influenced by ambient light, the region is ignored, that is, it does not participate in the calculation of the pulse characteristic.
And step 308, establishing a pulse characteristic curve according to the pulse characteristics corresponding to each frame of face image.
The pulse characteristic corresponding to a single frame of face image is a static value, and a living body cannot be distinguished from a non-living body from one static value alone. A pulse characteristic curve can therefore be established by connecting the pulse characteristic points corresponding to all the collected face images, and the subsequent living body judgment is made by analyzing attributes of this curve: whether it changes periodically, and if so, its change period, maximum amplitude, and so on.
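The curve attributes mentioned above (whether the curve varies, its amplitude, its period) could be estimated along the following lines. The patent does not fix a particular estimator, so the peak-to-peak amplitude and the mean-crossing period estimate here are assumptions for illustration:

```python
def curve_attributes(features):
    """Peak-to-peak amplitude of a pulse-feature series, plus a rough
    period estimate (in frames) from successive upward mean-crossings.
    Returns (amplitude, period); period is None if fewer than two crossings."""
    amplitude = max(features) - min(features)
    mean = sum(features) / len(features)
    # Frames where the series crosses its mean going upward
    crossings = [i for i in range(1, len(features))
                 if features[i - 1] < mean <= features[i]]
    period = None
    if len(crossings) >= 2:
        gaps = [b - a for a, b in zip(crossings, crossings[1:])]
        period = sum(gaps) / len(gaps)
    return amplitude, period
```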
And 310, comparing the pulse characteristic curve with a pre-stored standard non-living body pulse characteristic curve, if the difference between the characteristic value of the pulse characteristic curve and the characteristic value of the pre-stored standard non-living body pulse characteristic curve exceeds a preset characteristic threshold value, judging that the collected image is a living body face image, otherwise, judging that the collected image is a non-living body face image.
A non-living body here is an object without vital signs, such as a piece of paper. A video containing the non-living body is captured within a preset time, or images of it are captured at preset intervals, to obtain multiple frames of non-living body images. Since a non-living body shows no skin color change caused by changes in blood volume and oxygen saturation, its illumination intensity is constant at all positions. Within the preset time, if the ambient light intensity stays unchanged, the pre-stored standard non-living body pulse characteristic curve is a straight line whose pulse characteristic value approximates the ambient light intensity; if the ambient light intensity changes, the pre-stored standard non-living body pulse characteristic curve follows the ambient light intensity.
Because the oxygen saturation and blood volume in the blood vessels of a non-living body are fixed, its apparent skin color does not change, so the illumination intensity in the non-living face region does not change between adjacent frames, and the pre-stored standard non-living body pulse characteristic curve stays in a single, unchanging state. If the difference between the characteristic value of the pulse characteristic curve obtained in the previous step and that of the pre-stored standard non-living body pulse characteristic curve exceeds the preset characteristic threshold, the obtained curve is changing and the corresponding face images can be judged to be living body face images; otherwise the curve is unchanging and the images are non-living body face images.
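Under the description above, the decision in step 310 reduces to a threshold test on a feature value of each curve. A toy sketch, in which the peak-to-peak variation stands in for the unspecified "characteristic value" (the feature choice and the threshold are assumptions):

```python
def is_live(curve, reference_curve, threshold):
    """Step-310-style decision: compare a feature value of the measured
    pulse curve against that of the stored non-living reference curve."""
    def feature_value(c):
        # Peak-to-peak variation as the characteristic value (assumption)
        return max(c) - min(c)
    return abs(feature_value(curve) - feature_value(reference_curve)) > threshold

# A flat reference (non-living: constant ambient light) vs. a pulsing curve
reference = [100.0] * 8
live_curve = [100, 103, 106, 103, 100, 97, 94, 97]
```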
In the embodiment, the pulse characteristics corresponding to the face image are calculated by utilizing the illumination intensity of the face area, and living body judgment is realized by analyzing the comparison result of the pulse characteristic curve and the pre-stored standard non-living body pulse characteristic curve without additional instrument equipment for estimating the pulse characteristics, so that the hardware cost is saved; and the living body judgment is not required to be completed by the interaction of the user, so that the detection rate of the living body judgment is improved.
In one embodiment, as shown in FIG. 5, the step 306 includes:
and step 316, segmenting the face region to obtain a face subregion.
In this embodiment, the segmentation of the face region may be implemented by using a region segmentation algorithm, but before the segmentation, some face key points, such as some points on the forehead, the left eye, the right eye, the left side of the cheek, and the right side of the cheek, are usually obtained according to a face registration algorithm, and then the region segmentation is performed on the face region according to the key points to obtain the face sub-regions, such as the forehead, the left eye, the right eye, the left side of the cheek, and the right side of the cheek.
In one embodiment, after the region segmentation algorithm is used for the face region, the illumination intensities of all pixels in the face sub-region are obtained, and if the difference value of the illumination intensities of any two pixels exceeds a preset light intensity threshold, the face sub-region is continuously segmented until the difference value of the illumination intensities of any two pixels in the face sub-region obtained after segmentation does not exceed the preset light intensity threshold. In the embodiment, the subdivision degree of the face sub-regions is determined by judging whether the difference value of the illumination intensities of any two pixels in the face sub-regions is within the range of the preset light intensity threshold value, so that the illumination intensities of all positions in any face sub-region can be approximately expressed as a constant, the influence of illumination on skin color change is reduced, and the skin color change caused by blood flow is accurately reflected.
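The refinement rule above (keep splitting while some pair of pixels in a sub-region differs by more than the light intensity threshold) can be sketched as a recursive half-split on a 2-D intensity array. Splitting along halves rather than along facial landmarks is a simplification for illustration:

```python
def split_until_uniform(region, threshold):
    """Return sub-regions (lists of pixel rows) whose max-min intensity
    difference does not exceed `threshold`, splitting halves recursively."""
    flat = [p for row in region for p in row]
    if max(flat) - min(flat) <= threshold:
        return [region]
    h, w = len(region), len(region[0])
    if h >= w and h > 1:                       # split into top/bottom halves
        halves = [region[:h // 2], region[h // 2:]]
    elif w > 1:                                # split into left/right halves
        halves = [[row[:w // 2] for row in region],
                  [row[w // 2:] for row in region]]
    else:
        return [region]                        # single pixel: cannot split further
    return [sub for half in halves
            for sub in split_until_uniform(half, threshold)]
```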
And 336, acquiring the illumination intensity corresponding to each face subregion and the weight corresponding to each face subregion, and calculating the pulse characteristic corresponding to each frame of face image according to the illumination intensity corresponding to each face subregion and the weight corresponding to each face subregion.
Because the blood vessels of the face are not uniformly distributed, the skin color change of different areas of the face is different, and the skin color change is relatively obvious in an area with concentrated blood vessels, the weight corresponding to the area is relatively large, otherwise, the weight corresponding to the area is relatively small. In this embodiment, the obtained illumination intensity corresponding to each face subregion is subjected to weighted summation according to the obtained weight corresponding to each face subregion, so as to obtain the pulse feature corresponding to each frame of face image.
In one embodiment, the pulse characteristics may be calculated according to the following formula:
p = Σ_{i=1}^{n} G_i · I_i · L_i
wherein p is the pulse characteristic, n is the total number of face sub-regions, G_i is the weight corresponding to sub-region i, L_i is the illumination intensity of sub-region i, and I_i is an indicator function that is 0 when the difference between the maximum and minimum intensity in sub-region i exceeds a certain threshold within a certain time range, so that the region is ignored and does not participate in the calculation of the pulse characteristic, and 1 otherwise.
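A direct implementation of the weighted sum with the indicator function, dropping sub-regions whose intensity swing over the observation window exceeds the threshold; the variable names and the absence of weight renormalization are assumptions, since the source formula is only partially legible:

```python
def pulse_feature(intensities, weights, histories, ambient_threshold):
    """Weighted sum of sub-region intensities: p = sum_i G_i * I_i * L_i,
    where the indicator I_i is 0 for regions whose max-min intensity over
    `histories[i]` exceeds `ambient_threshold` (dominated by ambient light)."""
    p = 0.0
    for L_i, G_i, hist in zip(intensities, weights, histories):
        if max(hist) - min(hist) > ambient_threshold:
            continue                 # I_i = 0: region ignored
        p += G_i * L_i               # I_i = 1: region contributes
    return p
```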
In this embodiment, the face sub-regions are obtained by segmenting the face region, and the pulse characteristic corresponding to each frame of face image is calculated from the illumination intensity and the weight corresponding to each face sub-region. Estimating the pulse characteristic by weighted summation improves the accuracy of the pulse characteristic calculation.
In one embodiment, as shown in fig. 6, there is provided an identity authentication method, which is exemplified as applied in the server shown in fig. 1, and which includes:
step 602, receiving a user identity authentication request, where the user identity authentication request carries a user identifier.
The user identity authentication request is a request for verifying the identity, which is sent to the server by the terminal where the user is located; the user identifier is used for identifying each user, has uniqueness, and can be any one of an identity card number, an instant communication number, a social account, an electronic mailbox or a mobile communication number of the user.
For example, in a face access control, a user places an identity card in a designated scannable area, a terminal where the user is located scans the identity card, obtains a user identifier, namely an identity card number, of the user, and sends an identity authentication request to a server after the identity card number is successfully obtained, so that a subsequent server can find out a standard pulse characteristic curve corresponding to the user identifier from a database.
And step 604, collecting multiple frames of face images of the user according to the user identity authentication request.
In this embodiment, because a pulse characteristic curve needs to be established, multiple frames of face images need to be processed. Specifically, a video containing the face may be captured within a preset time, or a face image may be acquired at preset intervals, to obtain multiple frames of face images.
Step 606, extracting the face region for each frame of face image.
In a face image, the face region is the region that best reflects the pulse characteristics of a living body; regions such as hair and clothes do not reflect living-body characteristics. Therefore, in this embodiment, a face region needs to be extracted from each frame of face image.
Specifically, the face region can be extracted from each frame of face image by using an image integral graph together with the Adaboost method. The method specifically comprises: rapidly obtaining Haar face features by computing the image integral graph, training classifiers on labeled samples with the Adaboost algorithm according to the Haar features, and applying the final classifier obtained from training to the input image, thereby extracting the face region. The Haar feature is a common feature-description operator in the field of computer vision, and the Haar feature value reflects the gray-level variation of an image; Adaboost is an iterative algorithm whose core idea is to train different classifiers (weak classifiers) on the same training set and then combine the weak classifiers into a stronger final classifier (strong classifier).
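The image integral graph mentioned above can be sketched in a few lines: each cell of the integral image stores the sum of all pixels above and to its left, so the sum over any rectangular (Haar) region can be read with at most four lookups. This is only an illustrative sketch of the general technique, not the patented implementation; names are hypothetical:

```python
def integral_image(img):
    """Summed-area table: ii[y][x] = sum of img[0..y][0..x]."""
    h, w = len(img), len(img[0])
    ii = [[0] * w for _ in range(h)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y][x] = row_sum + (ii[y - 1][x] if y > 0 else 0)
    return ii

def rect_sum(ii, top, left, bottom, right):
    """Sum of img[top..bottom][left..right] via four integral-image lookups."""
    total = ii[bottom][right]
    if top > 0:
        total -= ii[top - 1][right]
    if left > 0:
        total -= ii[bottom][left - 1]
    if top > 0 and left > 0:
        total += ii[top - 1][left - 1]
    return total
```

A Haar feature value is then the difference between `rect_sum` results of adjacent rectangles, which is why the integral image makes feature extraction fast.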
Step 608, obtaining the illumination intensity of the face area, and calculating the pulse feature corresponding to each frame of face image according to the illumination intensity of the face area.
In this embodiment, first, the face region is divided into a plurality of face sub-regions. Specifically, face key points, such as points on the forehead, the left eye, the right eye, the left side of the cheek and the right side of the cheek, are obtained with a face registration algorithm, and the face region is then segmented according to these key points into face sub-regions such as the forehead, the left eye, the right eye, the left side of the cheek and the right side of the cheek. Next, the illumination intensities of all pixels in each face sub-region are obtained; if the difference between the illumination intensities of any two pixels in a face sub-region exceeds the preset light intensity threshold, that sub-region is segmented further, until the difference between the illumination intensities of any two pixels in every sub-region obtained after segmentation does not exceed the preset light intensity threshold. The segmentation of the face region thereby becomes finer; for example, the forehead region is segmented into a left part, a middle part and a right part, as shown in fig. 4.
In this embodiment, in order to reduce the influence of ambient light on the skin color change of the face, so as to accurately reflect the skin color change of the face sub-region caused by the change of the oxygen saturation of the blood vessel and the blood volume, the face region is subdivided into a plurality of face sub-regions, so that the light intensity of all positions in each face sub-region can be approximately represented as a constant, and the light intensity of the face sub-region is the constant.
Secondly, the illumination intensity of each face sub-region and the weight corresponding to each face sub-region are obtained, and the illumination intensities of the face sub-regions are weighted and summed according to the corresponding weights; the summation result is the pulse feature corresponding to the face image. If the difference between the maximum and minimum intensity in a face sub-region exceeds a certain threshold within a certain time range, which indicates that the region is strongly influenced by ambient light, that region is ignored, that is, it does not participate in the calculation of the pulse feature.
Step 610, establishing a pulse characteristic curve according to the pulse characteristics corresponding to each frame of face image.
The pulse feature corresponding to a single frame of face image is one static value, and a real person cannot be distinguished from a picture from one static value alone. Therefore, the pulse features corresponding to all of the collected face images are connected into a line to establish a pulse characteristic curve, and the subsequent living-body judgment is performed by analyzing attributes of this curve: whether it changes periodically, and, if it does, its change period, maximum amplitude, and the like.
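One possible way to extract the attributes named above (change period and maximum amplitude) from a pulse characteristic curve is sketched below; the simple peak-detection approach and all names are illustrative assumptions, not mandated by the patent:

```python
def curve_attributes(curve, frame_interval_s):
    """Estimate the change period (seconds) and maximum amplitude of a
    pulse characteristic curve sampled once per face frame.
    Returns (None, amplitude) when no periodic change is detected.
    """
    amplitude = max(curve) - min(curve)
    # Peaks: samples strictly greater than both neighbours.
    peaks = [i for i in range(1, len(curve) - 1)
             if curve[i] > curve[i - 1] and curve[i] > curve[i + 1]]
    if len(peaks) < 2:
        return None, amplitude  # flat or aperiodic curve
    gaps = [b - a for a, b in zip(peaks, peaks[1:])]
    period = sum(gaps) / len(gaps) * frame_interval_s
    return period, amplitude
```

A flat curve (as produced by a printed photo under constant light) yields no period and near-zero amplitude, while a living face yields a period on the order of the heartbeat interval.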
And step 612, comparing the pulse characteristic curve with a pre-stored standard pulse characteristic curve corresponding to the user identifier, if the difference between the characteristic value of the pulse characteristic curve and the characteristic value of the standard pulse characteristic curve is within a preset range, passing the user identity authentication, otherwise, failing to pass the user identity authentication.
A standard pulse characteristic curve corresponding to the user identifier is stored in a database in advance: multiple frames of face images of the user are collected in advance, a face region is extracted from each frame of face image, the illumination intensity of the face region is obtained, the pulse feature corresponding to each frame is calculated according to that illumination intensity, and a pulse characteristic curve is established from the per-frame pulse features. This curve is the user's standard pulse characteristic curve; it is stored in the database in correspondence with the user identifier and compared against when identity authentication is required.
In this embodiment, the pulse characteristic curve obtained during living-body identification is compared with the pre-stored standard pulse characteristic curve corresponding to the user identifier. As long as the difference between the characteristic values of the two (such as the change period, the maximum amplitude, and the like) is within an acceptable range, it is indicated first that the acquired face images are living-body face images, and further that the user corresponding to the acquired face images is the same person as the user corresponding to the user identifier in the database, that is, the identity authentication is passed. The database may be an online database or a local database.
In this embodiment, the pulse characteristic curve is compared with the pre-stored standard pulse characteristic curve corresponding to the user identifier, so that no additional interaction from the user and no extra instruments are required; the efficiency of identity authentication can therefore be improved and hardware cost saved.
In one embodiment, before comparing the pulse characteristic curve with a pre-stored standard pulse characteristic curve corresponding to the user identifier, the method further comprises: comparing the pulse characteristic curve with a pre-stored standard non-living body pulse characteristic curve, and, if the difference between the characteristic value of the pulse characteristic curve and the characteristic value of the pre-stored standard non-living body pulse characteristic curve exceeds a threshold value, performing the comparison between the pulse characteristic curve and the pre-stored standard pulse characteristic curve corresponding to the user identifier. In this embodiment, subsequent identity authentication is performed only after living-body judgment, which eliminates the possibility of spoofing with pictures; identity authentication is then performed by comparing the currently acquired pulse characteristic curve with the pre-stored standard pulse characteristic curve corresponding to the user identifier, which eliminates the possibility of impersonation by others, so the identity authentication method has high security.
In one embodiment, as shown in fig. 7, the step 608 includes:
and step 618, segmenting the face region to obtain a face sub-region.
The segmentation of the face region can be realized by using a region segmentation algorithm, but before the segmentation, some face key points, such as some points on the forehead, the left eye, the right eye, the left side of the cheek and the right side of the cheek, are usually obtained according to a face registration algorithm, and then the region segmentation is performed on the face region according to the key points to obtain regions of the forehead, the left eye, the right eye, the left side of the cheek, the right side of the cheek and the like.
In one embodiment, after the region segmentation algorithm is used for the face region, the illumination intensities of all pixels in the face sub-region are obtained, and if the difference value of the illumination intensities of any two pixels exceeds a preset light intensity threshold, the face sub-region is continuously segmented until the difference value of the illumination intensities of any two pixels in the face sub-region obtained after segmentation does not exceed the preset light intensity threshold. In the embodiment, the subdivision degree of the face sub-regions is determined by judging whether the difference value of the illumination intensities of any two pixels in the face sub-regions is within the range of the preset light intensity threshold value, so that the illumination intensities of all positions in any face sub-region can be approximately expressed as a constant, the influence of illumination on skin color change is reduced, and the skin color change caused by blood flow is accurately reflected.
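The iterative subdivision rule of this embodiment might be sketched as follows, assuming a sub-region is given as a 2-D list of pixel intensities and that splits are made in half along the wider axis (both assumptions are illustrative, not fixed by the patent):

```python
def subdivide(region, threshold):
    """Recursively split a face sub-region until, inside every resulting
    piece, no two pixels differ in illumination intensity by more than
    the preset light-intensity threshold. Returns the list of pieces.
    """
    pixels = [p for row in region for p in row]
    if max(pixels) - min(pixels) <= threshold:
        return [region]               # intensity is approximately constant
    h, w = len(region), len(region[0])
    if w >= h and w > 1:              # split left / right
        mid = w // 2
        left = [row[:mid] for row in region]
        right = [row[mid:] for row in region]
        return subdivide(left, threshold) + subdivide(right, threshold)
    if h > 1:                         # split top / bottom
        return (subdivide(region[:h // 2], threshold)
                + subdivide(region[h // 2:], threshold))
    return [region]                   # single pixel: cannot split further
```

A forehead strip whose left half is lit differently from its right half, for instance, is split into two pieces, within each of which the intensity can be treated as a constant.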
Step 638, obtaining the illumination intensity corresponding to each face subregion and the weight corresponding to each face subregion, and calculating the pulse characteristics corresponding to each frame of face image according to the illumination intensity corresponding to the face subregion and the weight corresponding to the face subregion.
Because the blood vessels of the face are not uniformly distributed, the skin color change differs from region to region: in a region where blood vessels are concentrated, the skin color change is relatively obvious and the corresponding weight is relatively large; otherwise, the weight corresponding to the region is relatively small. In this embodiment, the obtained illumination intensities of the face sub-regions are weighted and summed according to the obtained weights corresponding to the face sub-regions, so as to obtain the pulse feature corresponding to each frame of face image.
In this embodiment, the face sub-region is obtained by segmenting the face region, and the pulse feature corresponding to each frame of face image is calculated according to the illumination intensity corresponding to the face sub-region and the weight corresponding to the face sub-region. The estimated value of the pulse characteristic is obtained through a weighting summation mode, and the accuracy rate of pulse characteristic calculation is improved.
In one embodiment, as shown in fig. 8, another identity authentication method is provided, the method comprising:
step 802, receiving a user identity authentication request, wherein the user identity authentication request carries a user identifier.
Similarly, the user identity authentication request is a request for verifying the identity, which is sent to the server by the terminal where the user is located; the user identifier is a unique credential for distinguishing each user, such as a phone number, a social account, or a mailbox of the user.
And step 804, collecting a plurality of frames of face images of the user according to the user identity authentication request.
Similarly, in this embodiment of identity authentication based on face images, living-body discrimination needs to be performed first in order to ensure high accuracy of identity authentication, and living-body discrimination requires feature analysis over multiple images, so multiple frames of face images are collected first.
Step 806, extracting a face region for each frame of face image.
In a face image, the face region is the region that best reflects the pulse characteristics of a living body; regions such as hair and clothes do not reflect living-body characteristics. Therefore, in this embodiment, a face region needs to be extracted from each frame of face image.
Specifically, the face region can be extracted from each frame of face image by using an image integral graph together with the Adaboost method. The method specifically comprises: rapidly obtaining Haar face features by computing the image integral graph, training classifiers on labeled samples with the Adaboost algorithm according to the Haar features, and applying the final classifier obtained from training to the input image, thereby extracting the face region. The Haar feature is a common feature-description operator in the field of computer vision, and the Haar feature value reflects the gray-level variation of an image; Adaboost is an iterative algorithm whose core idea is to train different classifiers (weak classifiers) on the same training set and then combine the weak classifiers into a stronger final classifier (strong classifier).
And 808, segmenting the face area to obtain a face subarea.
The segmentation of the face region can be realized by using a region segmentation algorithm, but before the segmentation, some face key points, such as some points on the forehead, the left eye, the right eye, the left side of the cheek and the right side of the cheek, are usually obtained according to a face registration algorithm, and then the region segmentation is performed on the face region according to the key points to obtain regions of the forehead, the left eye, the right eye, the left side of the cheek, the right side of the cheek and the like.
And 810, acquiring the illumination intensity of all pixels in the face sub-region, if the difference value of the illumination intensity of any two pixels exceeds a preset light intensity threshold value, continuing to segment the face sub-region, and otherwise, stopping segmenting the face sub-region.
The subdivision degree of the face sub-regions is determined by judging whether the difference value of the illumination intensities of any two pixels in the face sub-regions is within the range of the preset light intensity threshold value, so that the illumination intensities of all the positions in any face sub-region can be approximately expressed as a constant, the influence of illumination on skin color change is reduced, and the skin color change caused by blood flow is accurately reflected.
Step 812, acquiring the illumination intensity corresponding to each face subregion and the weight corresponding to each face subregion, and calculating the pulse characteristic corresponding to each frame of face image according to the illumination intensity corresponding to the face subregion and the weight corresponding to the face subregion.
Because the blood vessels of the face are not uniformly distributed, the skin color change differs from region to region: in a region where blood vessels are concentrated, the skin color change is relatively obvious and the corresponding weight is relatively large; otherwise, the weight corresponding to the region is relatively small.
And carrying out weighted summation on the acquired illumination intensity corresponding to each face subregion according to the acquired weight corresponding to each face subregion to obtain the pulse characteristic corresponding to each frame of face image. The specific calculation formula is as follows:
P = Σ_{i=1}^{n} G_i · I(i) · L_i

wherein P is the pulse characteristic, n is the total number of face sub-regions, G_i is the weight corresponding to region i, L_i is the illumination intensity of region i, and I(·) is an indicator function: when the difference between the maximum intensity and the minimum intensity observed in region i within a certain time range exceeds a certain threshold, I(i) = 0, so that the region is ignored and does not participate in the calculation of the pulse characteristic; otherwise I(i) = 1.
Step 814, a pulse characteristic curve is established according to the pulse characteristics corresponding to each frame of face image.
The pulse feature corresponding to a single frame of face image is one static value, and a real person cannot be distinguished from a picture from one static value alone. Therefore, the pulse features corresponding to all of the collected face images are connected into a line to establish a pulse characteristic curve, and the subsequent living-body judgment is performed by analyzing attributes of this curve: whether it changes periodically, and, if it does, its change period, maximum amplitude, and the like.
Step 816, comparing the pulse characteristic curve with a pre-stored standard non-living body pulse characteristic curve, and if the difference between the characteristic value of the pulse characteristic curve and the characteristic value of the pre-stored standard non-living body pulse characteristic curve exceeds a preset characteristic threshold, determining that the collected image is a living body face image.
The non-living body here is an object having no vital sign, such as a sheet of paper. A video containing the non-living body is acquired within a preset time, or images of the non-living body are acquired at preset intervals, to obtain multiple frames of non-living body images. A non-living subject exhibits no skin color change caused by changes in blood volume and oxygen saturation; that is, apart from ambient light, the intensity over the entire non-living subject area is constant. Within the preset time, if the ambient light intensity remains unchanged, the pre-stored standard non-living body pulse characteristic curve is a straight line whose pulse feature value approximates the ambient light intensity; if the ambient light intensity changes, the pre-stored standard non-living body pulse characteristic curve is a curve whose pulse feature value follows the ambient light intensity.
A non-living body has no blood vessels in which the oxygen saturation and blood volume vary, so its corresponding skin color does not change, the illumination intensity in the non-living face region of one or more adjacent frames does not change, and the pre-stored standard non-living body pulse characteristic curve is essentially unchanging. If the difference between the characteristic value of the pulse characteristic curve obtained in the previous step and the characteristic value of the pre-stored standard non-living body pulse characteristic curve exceeds the preset characteristic threshold, the pulse characteristic curve obtained in the previous step changes over time, and the face images corresponding to it can be judged to be living body face images.
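A minimal sketch of this living-body decision, assuming each curve is summarized by the feature pair (change period, maximum amplitude) and that a period of None denotes a curve with no periodic change (names and the exact feature set are illustrative assumptions):

```python
def is_live(curve_attrs, non_live_attrs, feature_threshold):
    """Judge liveness by comparing the feature values of the measured
    pulse characteristic curve against the pre-stored standard
    non-living baseline: the image is judged live when the difference
    exceeds the preset characteristic threshold.
    """
    measured_period, measured_amp = curve_attrs
    baseline_period, baseline_amp = non_live_attrs
    # A non-living baseline has no period; any periodic measured curve
    # therefore differs from it and indicates a living face.
    if baseline_period is None and measured_period is not None:
        return True
    return abs(measured_amp - baseline_amp) > feature_threshold
```

For example, a measured curve with a period near one heartbeat and an amplitude well above the flat baseline is judged live, while a flat curve close to the baseline amplitude is judged non-living.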
Step 818, comparing the pulse characteristic curve with a pre-stored standard pulse characteristic curve corresponding to the user identifier, if the difference between the characteristic value of the pulse characteristic curve and the characteristic value of the standard pulse characteristic curve is within a preset range, the user identity authentication is passed, otherwise, the user identity authentication is not passed.
The standard pulse characteristic curve corresponding to the user identification is prestored in the database, and by comparing the pulse characteristic curve obtained during living body identification with the standard pulse characteristic curve corresponding to the prestored user identification, as long as the difference between the characteristic values (such as the change period, the maximum amplitude and the like) of the two is within an acceptable range, it is indicated that the user corresponding to the acquired image and the user corresponding to the user identification in the database are the same person, namely that the identity authentication is passed. The database may be an online database or a local database.
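The acceptance test of step 818 could be sketched as a comparison of feature-value pairs, with separate preset tolerances for the change period and the maximum amplitude (a simplifying assumption; the patent does not fix the exact feature set or tolerances, and all names are hypothetical):

```python
def authenticate(measured, standard, period_tol, amp_tol):
    """Pass identity authentication when the differences between the
    feature values (change period, maximum amplitude) of the measured
    pulse characteristic curve and of the pre-stored standard curve for
    the user identifier are both within their preset ranges.
    """
    measured_period, measured_amp = measured
    standard_period, standard_amp = standard
    return (abs(measured_period - standard_period) <= period_tol
            and abs(measured_amp - standard_amp) <= amp_tol)
```

A curve whose period or amplitude drifts outside the tolerance of the enrolled standard curve fails authentication even if it passed the living-body check.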
In this embodiment, subsequent identity authentication is performed only after living-body judgment, which eliminates the possibility of spoofing with pictures; identity authentication is realized by comparing the currently acquired pulse characteristic curve with the pre-stored standard pulse characteristic curve corresponding to the user identifier, which eliminates the possibility of impersonation by others, so the identity authentication method has high security.
In one embodiment, as shown in fig. 9, there is provided a living body discriminating apparatus including:
a face image collecting module 902, configured to collect multiple frames of face images.
A face region extracting module 904, configured to extract a face region for each frame of face image.
And the pulse feature calculating module 906 is configured to obtain the illumination intensity of the face area, and calculate a pulse feature corresponding to each frame of face image according to the illumination intensity of the face area.
The pulse characteristic curve establishing module 908 is configured to establish a pulse characteristic curve according to a pulse characteristic corresponding to each frame of the face image.
The living body judging module 910 is configured to compare the pulse characteristic curve with a pre-stored standard non-living body pulse characteristic curve, determine that the collected image is a living body face image if a difference between a characteristic value of the pulse characteristic curve and a characteristic value of the pre-stored standard non-living body pulse characteristic curve exceeds a preset characteristic threshold, and determine that the collected image is a non-living body face image if the difference is not greater than the preset characteristic threshold.
In one embodiment, the pulse feature calculation module 906 is configured to segment the face region into face subregions; and acquiring the illumination intensity corresponding to each face subregion and the weight corresponding to each face subregion, and calculating the pulse characteristic corresponding to each frame of face image according to the illumination intensity corresponding to the face subregion and the weight corresponding to the face subregion.
In one embodiment, the pulse feature calculating module 906 is configured to obtain illumination intensities of all pixels in the face sub-region, continue to segment the face sub-region if a difference between the illumination intensities of any two pixels exceeds a preset light intensity threshold, and otherwise, stop segmenting the face sub-region.
In one embodiment, as shown in fig. 10, there is provided an identity authentication apparatus, including:
the identity authentication request receiving module 1002 is configured to receive a user identity authentication request, where the user identity authentication request carries a user identifier.
And the face image acquisition module 1004 is configured to acquire multiple frames of face images of the user according to the user identity authentication request.
A face region extraction module 1006, configured to extract a face region for each frame of face image.
And the pulse feature calculation module 1008 is configured to obtain the illumination intensity of the face area, and calculate a pulse feature corresponding to each frame of face image according to the illumination intensity of the face area.
The pulse characteristic curve establishing module 1010 is configured to establish a pulse characteristic curve according to a pulse characteristic corresponding to each frame of the face image.
And an identity authentication module 1012, configured to compare the pulse characteristic curve with a pre-stored standard pulse characteristic curve corresponding to the user identifier, where if a difference between a characteristic value of the pulse characteristic curve and a characteristic value of the standard pulse characteristic curve is within a preset range, the user identity authentication is passed, and otherwise, the user identity authentication is not passed.
In one embodiment, the pulse feature calculation module 1008 is configured to segment the face region to obtain a face sub-region; and acquiring the illumination intensity corresponding to each face subregion and the weight corresponding to each face subregion, and calculating the pulse characteristic corresponding to each frame of face image according to the illumination intensity corresponding to the face subregion and the weight corresponding to the face subregion.
In one embodiment, as shown in fig. 11, the identity authentication apparatus further includes: and the living body judging module 1011 is configured to compare the pulse characteristic curve with a pre-stored standard non-living body pulse characteristic curve, and if a difference between a characteristic value of the pulse characteristic curve and a characteristic value of the pre-stored standard non-living body pulse characteristic curve exceeds a preset characteristic threshold, determine that the collected image is a living body face image.
In an embodiment, the pulse feature calculation module 1008 is configured to obtain illumination intensities of all pixels in the face sub-region, and if a difference between the illumination intensities of any two pixels exceeds a preset light intensity threshold, continue to segment the face sub-region until the difference between the illumination intensities of any two pixels in the face sub-region obtained after the segmentation does not exceed the preset light intensity threshold.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present invention, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the inventive concept, which falls within the scope of the present invention. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (14)

1. A living body discrimination method, the method comprising:
collecting a plurality of frames of face images;
extracting a face area from each frame of face image;
acquiring the illumination intensity of a face area, and calculating the pulse characteristic corresponding to each frame of face image according to the illumination intensity of the face area;
establishing a pulse characteristic curve according to the pulse characteristics corresponding to each frame of face image;
and comparing the pulse characteristic curve with a pre-stored standard non-living body pulse characteristic curve, if the difference between the characteristic value of the pulse characteristic curve and the characteristic value of the pre-stored standard non-living body pulse characteristic curve exceeds a preset characteristic threshold value, judging that the collected image is a living body face image, otherwise, judging that the collected image is a non-living body face image.
2. The method of claim 1, wherein the obtaining of the illumination intensity of the face area and the calculating of the pulse feature corresponding to each frame of the face image according to the illumination intensity of the face area comprise:
segmenting the face region to obtain a face subregion;
and acquiring the illumination intensity corresponding to each face subregion and the weight corresponding to each face subregion, and calculating the pulse characteristic corresponding to each frame of face image according to the illumination intensity corresponding to the face subregion and the weight corresponding to the face subregion.
3. The method of claim 2, wherein the segmenting the face region into face subregions comprises:
and obtaining the illumination intensity of all pixels in the face sub-region, if the difference value of the illumination intensity of any two pixels exceeds a preset light intensity threshold value, continuing to segment the face sub-region until the difference value of the illumination intensity of any two pixels in the face sub-region obtained after segmentation does not exceed the preset light intensity threshold value.
4. A method of identity authentication, the method comprising:
receiving a user identity authentication request, wherein the user identity authentication request carries a user identifier;
acquiring a plurality of frames of face images of the user according to the user identity authentication request;
extracting a face area from each frame of face image;
acquiring the illumination intensity of a face area, and calculating the pulse characteristic corresponding to each frame of face image according to the illumination intensity of the face area;
establishing a pulse characteristic curve according to the pulse characteristics corresponding to each frame of face image;
and comparing the pulse characteristic curve with a pre-stored standard pulse characteristic curve corresponding to the user identifier, if the difference between the characteristic value of the pulse characteristic curve and the characteristic value of the standard pulse characteristic curve is within a preset range, the user identity authentication is passed, otherwise, the user identity authentication is not passed.
5. The method of claim 4, further comprising, prior to comparing the pulse characteristic curve with a pre-stored standard pulse characteristic curve corresponding to the user identification:
and comparing the pulse characteristic curve with a pre-stored standard non-living body pulse characteristic curve, and if the difference between the characteristic value of the pulse characteristic curve and the characteristic value of the pre-stored standard non-living body pulse characteristic curve exceeds a preset characteristic threshold, performing the step of comparing the pulse characteristic curve with the pre-stored standard pulse characteristic curve corresponding to the user identifier.
6. The method of claim 4, wherein the acquiring of the illumination intensity of the face region and the calculating of the pulse characteristic corresponding to each frame of face image according to the illumination intensity of the face region comprise:
segmenting the face region to obtain a face subregion;
and acquiring the illumination intensity of each face subregion and the weight of each face subregion, and calculating the pulse characteristic corresponding to each frame of face image according to the illumination intensity of the face subregions and the weight of the face subregions.
7. The method of claim 6, wherein the segmenting the face region to obtain a face subregion comprises:
obtaining the illumination intensities of all pixels in the face subregion, and if the difference between the illumination intensities of any two pixels exceeds a preset light intensity threshold, continuing to segment the face subregion until the difference between the illumination intensities of any two pixels in each face subregion obtained after segmentation does not exceed the preset light intensity threshold.
8. A living body discriminating apparatus, comprising:
the face image acquisition module is used for acquiring a plurality of frames of face images;
the face region extraction module is used for extracting a face region from each frame of face image;
the pulse feature calculation module is used for acquiring the illumination intensity of the face region and calculating the pulse feature corresponding to each frame of face image according to the illumination intensity;
the pulse characteristic curve establishing module is used for establishing a pulse characteristic curve according to the pulse characteristics corresponding to each frame of face image;
and the living body judging module is used for comparing the pulse characteristic curve with a pre-stored standard non-living body pulse characteristic curve, judging that the collected images are living body face images if the difference between the characteristic value of the pulse characteristic curve and the characteristic value of the pre-stored standard non-living body pulse characteristic curve exceeds a preset characteristic threshold, and otherwise judging that the collected images are non-living body face images.
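The living body judging module's decision rule can be sketched as follows. The standard deviation of the curve stands in for the unspecified characteristic value (an assumption, chosen because a photo or screen replay yields a nearly flat pulse curve while a real pulse oscillates); `is_live` and `thresh` are illustrative names:

```python
import numpy as np

def is_live(curve, non_live_curve, thresh=1.0):
    """Judge liveness: the collected curve is deemed live when its
    characteristic value differs from that of the stored non-living
    reference by more than the preset threshold. Standard deviation
    is a stand-in for the unspecified characteristic value."""
    feat = lambda c: float(np.std(np.asarray(c, dtype=float)))
    return abs(feat(curve) - feat(non_live_curve)) > thresh
```

Note the direction of the test: unlike authentication, where a *small* difference passes, liveness passes when the difference from the non-living reference is *large*.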
9. The apparatus of claim 8, wherein the pulse feature calculation module is configured to segment the face region to obtain face subregions, acquire the illumination intensity of each face subregion and the weight of each face subregion, and calculate the pulse feature corresponding to each frame of face image according to the illumination intensities and weights of the face subregions.
10. The apparatus according to claim 8, wherein the pulse feature calculation module is configured to obtain the illumination intensities of all pixels in the face subregion, and if the difference between the illumination intensities of any two pixels exceeds a preset light intensity threshold, continue to segment the face subregion until the difference between the illumination intensities of any two pixels in each face subregion obtained after segmentation does not exceed the preset light intensity threshold.
11. An identity authentication apparatus, the apparatus comprising:
the identity authentication request acquisition module is used for receiving a user identity authentication request, wherein the user identity authentication request carries a user identifier;
the face image acquisition module is used for acquiring a plurality of frames of face images of the user according to the user identity authentication request;
the face region extraction module is used for extracting a face region from each frame of face image;
the pulse feature calculation module is used for acquiring the illumination intensity of the face region and calculating the pulse feature corresponding to each frame of face image according to the illumination intensity of the face region;
the pulse characteristic curve establishing module is used for establishing a pulse characteristic curve according to the pulse characteristics corresponding to each frame of face image;
and the identity authentication module is used for comparing the pulse characteristic curve with a pre-stored standard pulse characteristic curve corresponding to the user identifier, wherein if the difference between the characteristic value of the pulse characteristic curve and the characteristic value of the standard pulse characteristic curve is within a preset range, the user identity authentication passes; otherwise, the user identity authentication fails.
12. The apparatus of claim 11, further comprising:
and the living body judging module is used for comparing the pulse characteristic curve with a pre-stored standard non-living body pulse characteristic curve, and if the difference between the characteristic value of the pulse characteristic curve and the characteristic value of the pre-stored standard non-living body pulse characteristic curve exceeds a preset characteristic threshold, the step of comparing the pulse characteristic curve with the pre-stored standard pulse characteristic curve corresponding to the user identifier is performed.
13. The apparatus of claim 11, wherein the pulse feature calculation module is configured to segment the face region to obtain face subregions, acquire the illumination intensity corresponding to each face subregion and the weight corresponding to each face subregion, and calculate the pulse characteristic corresponding to each frame of face image according to the illumination intensity and weight corresponding to each face subregion.
14. The apparatus according to claim 11, wherein the pulse feature calculation module is configured to obtain the illumination intensities of all pixels in the face subregion, and if the difference between the illumination intensities of any two pixels exceeds a preset light intensity threshold, continue to segment the face subregion until the difference between the illumination intensities of any two pixels in each face subregion obtained after segmentation does not exceed the preset light intensity threshold.
CN201610992121.8A 2016-11-10 2016-11-10 Living body determination method and apparatus, and identity authentication method and device Pending CN106570489A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201610992121.8A CN106570489A (en) 2016-11-10 2016-11-10 Living body determination method and apparatus, and identity authentication method and device
PCT/CN2017/109989 WO2018086543A1 (en) 2016-11-10 2017-11-08 Living body identification method, identity authentication method, terminal, server and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610992121.8A CN106570489A (en) 2016-11-10 2016-11-10 Living body determination method and apparatus, and identity authentication method and device

Publications (1)

Publication Number Publication Date
CN106570489A true CN106570489A (en) 2017-04-19

Family

ID=58541303

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610992121.8A Pending CN106570489A (en) 2016-11-10 2016-11-10 Living body determination method and apparatus, and identity authentication method and device

Country Status (2)

Country Link
CN (1) CN106570489A (en)
WO (1) WO2018086543A1 (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107038428A (en) * 2017-04-28 2017-08-11 北京小米移动软件有限公司 Vivo identification method and device
CN107506713A (en) * 2017-08-15 2017-12-22 哈尔滨工业大学深圳研究生院 Living body faces detection method and storage device
WO2018086543A1 (en) * 2016-11-10 2018-05-17 腾讯科技(深圳)有限公司 Living body identification method, identity authentication method, terminal, server and storage medium
CN108197279A (en) * 2018-01-09 2018-06-22 北京旷视科技有限公司 Attack data creation method, device, system and computer readable storage medium
CN108875333A (en) * 2017-09-22 2018-11-23 北京旷视科技有限公司 Terminal unlock method, terminal and computer readable storage medium
WO2019001427A1 (en) * 2017-06-28 2019-01-03 阿里巴巴集团控股有限公司 Account management method and device
CN109446981A (en) * 2018-10-25 2019-03-08 腾讯科技(深圳)有限公司 A kind of face's In vivo detection, identity identifying method and device
CN109766849A (en) * 2019-01-15 2019-05-17 深圳市凯广荣科技发展有限公司 A kind of biopsy method, detection device and self-help terminal equipment
CN109858375A (en) * 2018-12-29 2019-06-07 深圳市软数科技有限公司 Living body faces detection method, terminal and computer readable storage medium
CN110141246A (en) * 2018-02-10 2019-08-20 上海聚虹光电科技有限公司 Biopsy method based on colour of skin variation
CN110335216A (en) * 2019-07-09 2019-10-15 Oppo广东移动通信有限公司 Image processing method, image processing apparatus, terminal device and readable storage medium storing program for executing
WO2019214295A1 (en) * 2018-05-09 2019-11-14 杭州海康威视数字技术股份有限公司 Illegal attack prevention
CN111464519A (en) * 2020-03-26 2020-07-28 支付宝(杭州)信息技术有限公司 Account registration method and system based on voice interaction
CN111523438A (en) * 2020-04-20 2020-08-11 支付宝实验室(新加坡)有限公司 Living body identification method, terminal device and electronic device
CN111931153A (en) * 2020-10-16 2020-11-13 腾讯科技(深圳)有限公司 Identity verification method and device based on artificial intelligence and computer equipment

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108470169A (en) * 2018-05-23 2018-08-31 国政通科技股份有限公司 Face identification system and method
CN111666786B (en) * 2019-03-06 2024-05-03 杭州海康威视数字技术股份有限公司 Image processing method, device, electronic equipment and storage medium
TWI731461B (en) * 2019-11-01 2021-06-21 宏碁股份有限公司 Identification method of real face and identification device using the same
CN112016482B (en) * 2020-08-31 2022-10-25 成都新潮传媒集团有限公司 Method and device for distinguishing false face and computer equipment
US11443527B2 (en) 2021-01-13 2022-09-13 Ford Global Technologies, Llc Material spectroscopy
US11657589B2 (en) 2021-01-13 2023-05-23 Ford Global Technologies, Llc Material spectroscopy
US11741747B2 (en) 2021-01-13 2023-08-29 Ford Global Technologies, Llc Material spectroscopy
CN116389647B (en) * 2023-06-02 2023-08-08 深圳市尚哲医健科技有限责任公司 Emergency first-aid integrated platform

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101369315A (en) * 2007-08-17 2009-02-18 上海银晨智能识别科技有限公司 Human face detection method
EP2562687A2 (en) * 2011-08-22 2013-02-27 Fujitsu Limited Biometric authentication device and method
CN103761465A (en) * 2014-02-14 2014-04-30 上海云亨科技有限公司 Method and device for identity authentication
US20150206019A1 (en) * 2014-01-17 2015-07-23 Htc Corporation Methods for identity authentication and handheld electronic devices utilizing the same
CN105844206A (en) * 2015-01-15 2016-08-10 北京市商汤科技开发有限公司 Identity authentication method and identity authentication device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5581696B2 (en) * 2010-01-05 2014-09-03 セイコーエプソン株式会社 Biological information detector and biological information measuring device
JP5625688B2 (en) * 2010-02-01 2014-11-19 セイコーエプソン株式会社 Biological information measuring device
CN106570489A (en) * 2016-11-10 2017-04-19 腾讯科技(深圳)有限公司 Living body determination method and apparatus, and identity authentication method and device


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
MAYANK KUMAR 等: "DistancePPG: Robust non-contact vital signs monitoring using a camera", 《BIOMEDICAL OPTICS》 *
林卉 等: "《数字摄像测量学》", 30 August 2015, 北京:中国矿业大学出版社 *
管伟光 等: "《体视化技术及应用》", 31 January 1998, 北京:电子工业出版社 *

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018086543A1 (en) * 2016-11-10 2018-05-17 腾讯科技(深圳)有限公司 Living body identification method, identity authentication method, terminal, server and storage medium
CN107038428A (en) * 2017-04-28 2017-08-11 北京小米移动软件有限公司 Vivo identification method and device
CN107038428B (en) * 2017-04-28 2020-04-07 北京小米移动软件有限公司 Living body identification method and apparatus
WO2019001427A1 (en) * 2017-06-28 2019-01-03 阿里巴巴集团控股有限公司 Account management method and device
CN107506713A (en) * 2017-08-15 2017-12-22 哈尔滨工业大学深圳研究生院 Living body faces detection method and storage device
CN108875333A (en) * 2017-09-22 2018-11-23 北京旷视科技有限公司 Terminal unlock method, terminal and computer readable storage medium
CN108197279A (en) * 2018-01-09 2018-06-22 北京旷视科技有限公司 Attack data creation method, device, system and computer readable storage medium
CN110141246A (en) * 2018-02-10 2019-08-20 上海聚虹光电科技有限公司 Biopsy method based on colour of skin variation
US11270140B2 (en) 2018-05-09 2022-03-08 Hangzhou Hikvision Digital Technology Co., Ltd. Illegal attack prevention
WO2019214295A1 (en) * 2018-05-09 2019-11-14 杭州海康威视数字技术股份有限公司 Illegal attack prevention
CN109446981A (en) * 2018-10-25 2019-03-08 腾讯科技(深圳)有限公司 A kind of face's In vivo detection, identity identifying method and device
CN109858375A (en) * 2018-12-29 2019-06-07 深圳市软数科技有限公司 Living body faces detection method, terminal and computer readable storage medium
CN109858375B (en) * 2018-12-29 2023-09-26 简图创智(深圳)科技有限公司 Living body face detection method, terminal and computer readable storage medium
CN109766849A (en) * 2019-01-15 2019-05-17 深圳市凯广荣科技发展有限公司 A kind of biopsy method, detection device and self-help terminal equipment
CN109766849B (en) * 2019-01-15 2023-06-20 深圳市凯广荣科技发展有限公司 Living body detection method, detection device and self-service terminal equipment
CN110335216A (en) * 2019-07-09 2019-10-15 Oppo广东移动通信有限公司 Image processing method, image processing apparatus, terminal device and readable storage medium storing program for executing
CN110335216B (en) * 2019-07-09 2021-11-30 Oppo广东移动通信有限公司 Image processing method, image processing apparatus, terminal device, and readable storage medium
CN111464519A (en) * 2020-03-26 2020-07-28 支付宝(杭州)信息技术有限公司 Account registration method and system based on voice interaction
CN111523438A (en) * 2020-04-20 2020-08-11 支付宝实验室(新加坡)有限公司 Living body identification method, terminal device and electronic device
CN111523438B (en) * 2020-04-20 2024-02-23 支付宝实验室(新加坡)有限公司 Living body identification method, terminal equipment and electronic equipment
CN111931153B (en) * 2020-10-16 2021-02-19 腾讯科技(深圳)有限公司 Identity verification method and device based on artificial intelligence and computer equipment
CN111931153A (en) * 2020-10-16 2020-11-13 腾讯科技(深圳)有限公司 Identity verification method and device based on artificial intelligence and computer equipment

Also Published As

Publication number Publication date
WO2018086543A1 (en) 2018-05-17

Similar Documents

Publication Publication Date Title
CN106570489A (en) Living body determination method and apparatus, and identity authentication method and device
US10664581B2 (en) Biometric-based authentication method, apparatus and system
CN108985134B (en) Face living body detection and face brushing transaction method and system based on binocular camera
CN105956578B (en) A kind of face verification method of identity-based certificate information
CN105930709B (en) Face recognition technology is applied to the method and device of testimony of a witness consistency check
WO2019134536A1 (en) Neural network model-based human face living body detection
US20180034852A1 (en) Anti-spoofing system and methods useful in conjunction therewith
CN104123543B (en) A kind of eye movement recognition methods based on recognition of face
US9262614B2 (en) Image processing device, image processing method, and storage medium storing image processing program
WO2019033572A1 (en) Method for detecting whether face is blocked, device and storage medium
US10924476B2 (en) Security gesture authentication
WO2020244071A1 (en) Neural network-based gesture recognition method and apparatus, storage medium, and device
CN106663157A (en) User authentication method, device for executing same, and recording medium for storing same
Tiwari et al. A touch-less fingerphoto recognition system for mobile hand-held devices
WO2019200872A1 (en) Authentication method and apparatus, and electronic device, computer program, and storage medium
CN106056083B (en) A kind of information processing method and terminal
CN109635625B (en) Intelligent identity verification method, equipment, storage medium and device
EP3203416A1 (en) Method computer program and system for facial recognition
KR20150069799A (en) Method for certifying face and apparatus thereof
Rahouma et al. Design and implementation of a face recognition system based on API mobile vision and normalized features of still images
CN112183167A (en) Attendance checking method, authentication method, living body detection method, device and equipment
Hassan et al. Facial image detection based on the Viola-Jones algorithm for gender recognition
CN108596127B (en) Fingerprint identification method, identity verification method and device and identity verification machine
CN111126283A (en) Rapid in-vivo detection method and system for automatically filtering fuzzy human face
CN108764033A (en) Auth method and device, electronic equipment, computer program and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination