CN110287899B - Area fingerprint identification method and related product
- Publication number: CN110287899B (application CN201910568548.9A)
- Authority: CN (China)
- Prior art keywords: target, contour, area, image, fingerprint
- Legal status: Active (the status listed is an assumption and is not a legal conclusion)
Classifications
- G06F1/3234 — Power management; power saving characterised by the action undertaken
- G06V10/94 — Hardware or software architectures specially adapted for image or video understanding
- G06V40/1306 — Fingerprint or palmprint sensors; non-optical sensing, e.g. ultrasonic or capacitive
- G06V40/1347 — Fingerprints or palmprints; preprocessing; feature extraction
- G06V40/1365 — Fingerprints or palmprints; matching; classification
Abstract
Embodiments of the present application provide an area fingerprint identification method and a related product. The method is applied to an electronic device that includes an ultrasonic fingerprint identification module and comprises the following steps: when the electronic device is in a one-handed operation mode, acquiring reference position information of a finger; determining a target fingerprint acquisition area according to the reference position information, the target fingerprint acquisition area being a partial area of the fingerprint identification area corresponding to the ultrasonic fingerprint identification module; and performing fingerprint acquisition on the finger through the target fingerprint acquisition area to obtain a target fingerprint image, so that power consumption during fingerprint identification can be reduced.
Description
Technical Field
The present application relates to the technical field of identification, and in particular to an area fingerprint identification method and a related product.
Background
With the continuous development of communication technology, electronic devices have evolved rapidly. Mobile phones, for example, once offered only basic communication functions and have since developed into today's smartphones. To meet the demand for larger screens, fingerprint recognition on smartphones has moved from the earlier contact-type key recognition to key-less (in-display) recognition. However, existing key-less recognition schemes tend to incur high power consumption during fingerprint recognition.
Disclosure of Invention
Embodiments of the present application provide an area fingerprint identification method and a related product, which can reduce power consumption during fingerprint identification.
A first aspect of the embodiments of the present application provides an area fingerprint identification method, applied to an electronic device that includes an ultrasonic fingerprint identification module, the method comprising:
when the electronic device is in a one-handed operation mode, acquiring reference position information of a finger;
determining a target fingerprint acquisition area according to the reference position information, where the target fingerprint acquisition area is a partial area of the fingerprint identification area corresponding to the ultrasonic fingerprint identification module;
and performing fingerprint acquisition on the finger through the target fingerprint acquisition area to obtain a target fingerprint image.
A second aspect of the embodiments of the present application provides an area fingerprint identification device, applied to an electronic device that includes an ultrasonic fingerprint identification module, the device comprising: an obtaining unit, a determining unit, and an acquisition unit, wherein,
the obtaining unit is configured to acquire reference position information of a finger when the electronic device is in a one-handed operation mode;
the determining unit is configured to determine a target fingerprint acquisition area according to the reference position information, where the target fingerprint acquisition area is a partial area of the fingerprint identification area corresponding to the ultrasonic fingerprint identification module;
the acquisition unit is configured to perform fingerprint acquisition on the finger through the target fingerprint acquisition area to obtain a target fingerprint image.
A third aspect of the embodiments of the present application provides an electronic device, including a processor, an input device, an output device, and a memory, where the processor, the input device, the output device, and the memory are connected to each other, where the memory is used to store a computer program, and the computer program includes program instructions, and the processor is configured to call the program instructions to execute the step instructions in the first aspect of the embodiments of the present application.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program makes a computer perform part or all of the steps as described in the first aspect of embodiments of the present application.
A fifth aspect of embodiments of the present application provides a computer program product, wherein the computer program product comprises a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps as described in the first aspect of embodiments of the present application. The computer program product may be a software installation package.
The embodiment of the application has at least the following beneficial effects:
in the embodiments of the present application, when the electronic device is in the one-handed operation mode, the reference position information of the finger is acquired; the target fingerprint acquisition area is determined according to the reference position information, the target fingerprint acquisition area being a partial area of the fingerprint identification area corresponding to the ultrasonic fingerprint identification module; and fingerprint acquisition is performed on the finger through the target fingerprint acquisition area to obtain the target fingerprint image. Since the target fingerprint acquisition area is determined according to the reference position information of the finger and the fingerprint is acquired only through this area, power consumption during fingerprint identification can be reduced to a certain extent.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a schematic structural diagram of an ultrasonic fingerprint identification module according to an embodiment of the present disclosure;
fig. 2A is a schematic flowchart of a method for recognizing a regional fingerprint according to an embodiment of the present application;
FIG. 2B is a schematic diagram of an initial fingerprint acquisition area according to an embodiment of the present application;
FIG. 2C is a schematic illustration of a contour repair provided by an embodiment of the present application;
FIG. 2D is a schematic diagram of a sub-magnified region provided by an embodiment of the present application;
FIG. 2E is a schematic diagram of a method for determining a shape of a target according to an embodiment of the present application;
FIG. 2F is a schematic diagram of another method for determining a target shape according to an embodiment of the present disclosure;
FIG. 3 is a schematic flow chart illustrating another method for area fingerprinting in accordance with an embodiment of the present application;
FIG. 4 is a schematic flow chart illustrating another method for area fingerprinting in accordance with an embodiment of the present application;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of an area fingerprint identification device according to an embodiment of the present application.
Detailed Description
The embodiments of the present application aim to solve the problem in the prior art that power consumption is high when a fingerprint is acquired in a key-less recognition mode. By analyzing the position of the finger to determine a corresponding acquisition area, power consumption during fingerprint identification can be reduced to a certain extent and the performance of the electronic device can be improved.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The electronic device according to the embodiments of the present application may include various handheld devices, vehicle-mounted devices, wearable devices, computing devices or other processing devices connected to a wireless modem, and various forms of User Equipment (UE), Mobile Stations (MS), terminal equipment (terminal device), and so on. For convenience of description, the above-mentioned apparatuses are collectively referred to as electronic devices.
In order to better understand the area fingerprint identification method provided by the embodiments of the present application, the ultrasonic fingerprint identification module to which the method is applied is first briefly described. Referring to fig. 1, fig. 1 is a schematic structural diagram of an ultrasonic fingerprint identification module according to an embodiment of the present application. As shown in fig. 1, the ultrasonic fingerprint identification module comprises a substrate 101, a transmitting/receiving cathode 102, a piezoelectric material 103, and a transmitting/receiving anode 104. The transmitting/receiving cathode 102 is arranged on the lower side of the substrate 101, the piezoelectric material 103 on the lower side of the transmitting/receiving cathode 102, and the transmitting/receiving anode 104 on the lower side of the piezoelectric material 103; a layer of cured adhesive is arranged on the lower side of the transmitting/receiving anode 104 to protect the module. The operating principle of the ultrasonic fingerprint identification module can be as follows:
and (3) a transmitting state: by supplying an oscillation signal of high frequency (typically in the order of 10 MHz), such as a sine wave, to the electrodes (the transmitting/receiving cathode 102 and the transmitting/receiving anode 104) at both ends of the piezoelectric material 103, the piezoelectric material 103 generates vibration of a response frequency and emits a transmitted ultrasonic wave 109. The transmitted ultrasonic waves 109 transmitted upward reach the fingerprint in contact with the screen surface after passing through the Organic Light-Emitting Diode (OLED) screen 105, and since the acoustic resistance characteristic of the air in the fingerprint valley 106 is greatly different from the acoustic resistance characteristic of the screen surface glass when the fingerprint ridge 107 is attached to the screen, the acoustic resistance characteristic of the skin tissue of the fingerprint ridge 107 is relatively greatly different from the acoustic resistance characteristic of the screen surface glass, and therefore, the intensity of the reflected signal of the transmitted ultrasonic waves 109 by the fingerprint ridge 107 is different.
Receiving state: when the reflected ultrasonic waves 108 pass back through the display screen and reach the transmitting/receiving cathode 102, the piezoelectric material vibrates and generates electrical signals. Because the vibration intensity of the piezoelectric material differs between the regions corresponding to fingerprint valleys and fingerprint ridges at different positions, the potential differences received at different positions of the transmitting/receiving cathode 102 differ (the transmitting/receiving anode 104 is equipotential). These potential differences are converted into a two-dimensional image signal, thereby obtaining an ultrasonic fingerprint image.
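To make the final conversion step concrete, the following Python sketch (not part of the patent; the function name, array contents, and the simple min-max normalization are assumptions for illustration) turns a grid of per-electrode potential differences into an 8-bit grayscale fingerprint image:

```python
import numpy as np

def potentials_to_fingerprint_image(potential_diff, out_max=255):
    """Map a 2D grid of received potential differences to an 8-bit grayscale image.

    potential_diff: 2D numpy array with one value per receiving-electrode position.
    Ridges and valleys reflect the ultrasonic wave with different intensity, so
    their potential differences differ; min-max normalization turns that contrast
    into a visible image.
    """
    v = potential_diff.astype(np.float64)
    v_min, v_max = v.min(), v.max()
    if v_max == v_min:                       # flat signal: no finger contrast
        return np.zeros_like(v, dtype=np.uint8)
    normalized = (v - v_min) / (v_max - v_min)
    return (normalized * out_max).astype(np.uint8)

# Toy 4x4 grid of potential differences (arbitrary units), alternating
# valley-like and ridge-like columns.
grid = np.array([[0.12, 0.80, 0.15, 0.78],
                 [0.11, 0.79, 0.14, 0.81],
                 [0.13, 0.82, 0.16, 0.80],
                 [0.10, 0.77, 0.12, 0.79]])
image = potentials_to_fingerprint_image(grid)
```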
Referring to fig. 2A, fig. 2A is a schematic flowchart of an area fingerprint identification method according to an embodiment of the present application. As shown in fig. 2A, the area fingerprint identification method is applied to an electronic device that includes an ultrasonic fingerprint identification module, and comprises steps 201-203, as follows:
201. When the electronic device is in a one-handed operation mode, acquire reference position information of a finger.
The one-handed operation mode may be understood as a mode selected by the user. When the electronic device is in the one-handed operation mode, only a partial area of the screen is used for display and operation, and the electronic device is usually held with one hand.
Optionally, the reference position information of the finger may be understood as the position information of the finger when the user brings the finger close to the electronic device. The reference position information can be obtained by turning on the ultrasonic fingerprint identification module and determining the finger position from the echo of the emitted ultrasonic waves. The position information of the finger may be acquired multiple times while the finger approaches the display screen of the electronic device, and the position information corresponding to the finger's last movement within a preset range is taken as the reference position information: because the finger may tremble during fingerprint identification, the position at which the finger finally settles within the preset range is used as the reference position. The preset range may be set by an empirical value or historical data.
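As an illustration of selecting the reference position from repeated position samples, the following sketch assumes 2D finger coordinates and a Euclidean preset range; these details and the function name are assumptions, since the patent only states that the last position whose movement stays within the preset range is used:

```python
import math

def select_reference_position(samples, preset_range=2.0):
    """Return the reference position once the finger has settled.

    samples: (x, y) positions recorded while the finger approaches the display.
    The last sample is taken as the reference position only if its displacement
    from the preceding sample stays within `preset_range`, i.e. the remaining
    movement is just tremor; otherwise the finger is still moving and more
    samples are needed (None is returned).
    """
    if len(samples) < 2:
        return None
    (x0, y0), (x1, y1) = samples[-2], samples[-1]
    if math.hypot(x1 - x0, y1 - y0) <= preset_range:
        return samples[-1]
    return None

positions = [(10.0, 40.0), (6.0, 22.0), (5.1, 12.4), (5.0, 12.1)]
ref = select_reference_position(positions)   # -> (5.0, 12.1): the finger has settled
```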
202. Determine a target fingerprint acquisition area according to the reference position information, where the target fingerprint acquisition area is a partial area of the fingerprint identification area corresponding to the ultrasonic fingerprint identification module.
Optionally, the target fingerprint acquisition area may include target position information and a target shape, the target position information may be understood as position information of the target fingerprint acquisition area, the position information may be position information of a central point of the target fingerprint acquisition area, and the target shape may be understood as a specific shape of the target fingerprint acquisition area, for example, a rectangle, an ellipse, and the like.
Optionally, the fingerprint identification area corresponding to the ultrasonic fingerprint identification module may be the entire area of the display screen of the electronic device; that is, every area of the display screen has the fingerprint identification function. Of course, the fingerprint identification area corresponding to the ultrasonic fingerprint identification module may also be a sub-area of the display screen, and this sub-area may be any partial area of the display screen.
203. Perform fingerprint acquisition on the finger through the target fingerprint acquisition area to obtain a target fingerprint image.
Optionally, after the target fingerprint image is acquired, the fingerprint image may be sent to a processor for fingerprint recognition.
In one possible embodiment, a possible method for determining the target fingerprint acquisition area according to the reference position information includes steps A1-A5, as follows:
a1, acquiring a target holding gesture for holding the electronic equipment;
a2, determining a target initial fingerprint acquisition area corresponding to a target holding gesture according to a mapping relation between a preset holding gesture and the initial fingerprint acquisition area;
a3, adjusting the target initial fingerprint acquisition area according to the reference position information to obtain the target position information of the target fingerprint acquisition area;
a4, acquiring a reference contour image of the finger;
and A5, determining the target shape of the target fingerprint acquisition area according to the contour image.
The holding gesture may include left-hand holding and right-hand holding. The target holding gesture with which the electronic device is held may be acquired through a pressure sensor of the electronic device, and the pressure sensor may be disposed on the housing of the electronic device. The target holding gesture carries position information, namely the holding position.
Optionally, the mapping relationship between the preset holding gesture and the initial fingerprint acquisition area may be: the initial fingerprint collection area corresponding to the left hand grip is the left initial fingerprint collection area, and the initial fingerprint collection area corresponding to the right hand grip is the right initial fingerprint collection area, as shown in fig. 2B.
Optionally, the reference position information includes a reference position coordinate point, and a possible method for adjusting the target initial fingerprint acquisition area according to the reference position information to obtain the target position information of the target fingerprint acquisition area is as follows: the reference position coordinate point is projected onto the plane where the ultrasonic fingerprint identification module is located to obtain the coordinates of a reference projection point, and the target initial fingerprint acquisition area is moved to the coordinates of the reference projection point to obtain the target position information of the target fingerprint acquisition area, where the target position information may be the coordinates of the projection point. Alternatively, the target initial fingerprint acquisition area may be moved to the coordinates of any point within a preset radius around the coordinates of the reference projection point, the preset radius being set by an empirical value or historical data.
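The following sketch illustrates the grip-to-initial-area mapping and the adjustment to the projected reference point. The AcquisitionArea structure, the coordinate values, and the function names are hypothetical; the patent specifies only that the initial area is selected by holding gesture and then moved to the reference projection point (or a point within a preset radius of it):

```python
from dataclasses import dataclass

@dataclass
class AcquisitionArea:
    cx: float       # center x on the sensor plane
    cy: float       # center y on the sensor plane
    width: float
    height: float

# Mapping from holding gesture to an initial fingerprint acquisition area
# (the coordinate values are illustrative, not taken from the patent).
INITIAL_AREAS = {
    "left_hand":  AcquisitionArea(cx=15.0, cy=120.0, width=20.0, height=25.0),
    "right_hand": AcquisitionArea(cx=55.0, cy=120.0, width=20.0, height=25.0),
}

def adjust_initial_area(grip, reference_projection):
    """Move the grip-specific initial area so that it is centred on the
    reference position coordinate point projected onto the sensor plane."""
    area = INITIAL_AREAS[grip]
    px, py = reference_projection
    return AcquisitionArea(cx=px, cy=py, width=area.width, height=area.height)

target_area = adjust_initial_area("right_hand", reference_projection=(48.0, 110.0))
```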
Optionally, the reference contour image of the finger may be acquired through the ultrasonic fingerprint identification module: after the target position information is determined, ultrasonic waves are transmitted in a preset region around the coordinate point corresponding to the target position information, and the reference contour image is determined from the received reflected waves. The preset region may be set by empirical values or historical data and may generally be rectangular, circular, or the like. The reference contour image may be a complete fingerprint contour image of the finger or a partial fingerprint contour image, depending on the preset region and the finger: if the projection of the finger falls completely within the preset region, a complete fingerprint contour image is obtained; otherwise, a partial fingerprint contour image is obtained.
Optionally, one possible method for determining the target shape of the target fingerprint acquisition area according to the contour image may be: when the contour image is a partial fingerprint contour image, the contour image is repaired into a complete fingerprint contour image, and the target shape is then determined according to the repaired complete fingerprint contour image.
In this example, the initial fingerprint acquisition area is determined from the target holding gesture with which the electronic device is held, the initial fingerprint acquisition area is adjusted according to the reference position information of the finger to obtain the target position information, a reference contour image of the finger is acquired, and the target shape is determined according to the contour image.
In one possible embodiment, where the reference contour image includes a plurality of contour lines, a possible method for determining the target shape of the target fingerprint acquisition area according to the contour image includes steps B1-B5, as follows:
b1, determining the reference integrity of the reference contour image according to a preset contour integrity determination method;
b2, if the reference integrity is smaller than a preset integrity threshold, obtaining contour lines in the reference contour image to obtain a plurality of reference contour lines;
b3, repairing the at least one contour line to be repaired to obtain a target contour line corresponding to each contour line to be repaired in the at least one contour line to be repaired, wherein the contour line to be repaired is a contour line which is positioned at the boundary position of the reference contour image in the plurality of reference contour lines;
b4, determining a target contour image according to the plurality of reference contour lines and the target contour line;
and B5, determining the shape of the target according to the target contour image.
A possible method for determining the reference integrity of the reference contour image according to the preset contour integrity determination method is as follows: the reference contour image is compared with a preset complete image to obtain the reference integrity, where the missing portion is understood to be the part of the complete image that is absent from the reference contour image. The preset complete image may be a pre-stored complete image of the finger corresponding to the reference contour image; for example, the electronic device may pre-store a fingerprint image capable of unlocking the device, and this fingerprint image may serve as the preset complete image. During comparison, the area of overlap between the reference contour image and the preset complete image may be determined, and the overlapping area divided by the area of the preset complete image is taken as the reference integrity. If the reference contour image and the preset complete image differ in size, the reference contour image may be enlarged or reduced so that its size matches that of the preset complete image, where size refers to the scale of the shape; that is, after restoration the reference contour image has the same scale as the preset complete image. The preset integrity threshold may be set by empirical values or historical data.
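A minimal sketch of the integrity computation, assuming the contour images are available as aligned binary masks of equal size (the threshold value 0.8 and the helper names are assumptions):

```python
import numpy as np

def reference_integrity(reference_mask, complete_mask):
    """Compute the reference integrity as (overlapping area) / (complete image area).

    Both inputs are boolean numpy arrays of the same shape: True where the
    contour image covers a pixel. Scaling/alignment of the reference contour
    image to the preset complete image is assumed to have been done already.
    """
    complete_area = complete_mask.sum()
    if complete_area == 0:
        return 0.0
    overlap_area = np.logical_and(reference_mask, complete_mask).sum()
    return float(overlap_area) / float(complete_area)

# Toy example: the captured contour covers half of the stored complete image.
complete = np.zeros((8, 8), dtype=bool); complete[2:6, 2:6] = True   # 16 pixels
partial  = np.zeros((8, 8), dtype=bool); partial[2:6, 2:4]  = True   # 8 pixels
integrity = reference_integrity(partial, complete)                   # 0.5
needs_repair = integrity < 0.8   # 0.8 stands in for the preset integrity threshold
```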
Optionally, a possible method for repairing a contour line to be repaired may be: acquire the contour line adjacent to the contour line to be repaired, and repair the contour line to be repaired according to the trend of the adjacent contour line to obtain the target contour line corresponding to the contour line to be repaired, as shown in fig. 2C.
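One plausible reading of "repairing according to the contour trend of the adjacent contour line" is to extend the truncated contour by reusing the point-to-point displacements of the neighbouring, complete contour. The sketch below implements that reading only as an illustration; the representation of contours as point arrays and the parameter names are assumptions, not details from the patent:

```python
import numpy as np

def repair_contour(broken, neighbor, num_points=20):
    """Extend a truncated contour line by following the trend of an adjacent,
    complete contour line.

    broken, neighbor: arrays of (x, y) points ordered along the contour.
    The missing tail of `broken` is synthesized by reusing the point-to-point
    displacements of the corresponding tail of `neighbor`.
    """
    broken = np.asarray(broken, dtype=float)
    neighbor = np.asarray(neighbor, dtype=float)
    steps = np.diff(neighbor[-(num_points + 1):], axis=0)   # local trend of the neighbor
    if len(steps) == 0:
        return broken
    extension = [broken[-1]]
    for step in steps:
        extension.append(extension[-1] + step)              # continue along that trend
    return np.vstack([broken, extension[1:]])
```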
In this example, when the reference integrity of the reference contour image is smaller than the preset integrity threshold, the contour lines to be repaired among the plurality of reference contour lines are repaired to obtain the target contour lines, the target contour image is determined according to the repaired contour lines and the reference contour lines, and the target shape is determined according to the target contour image.
In one possible embodiment, a possible method for determining a shape of an object from an object contour image includes the following steps C1-C3:
c1, determining a reference shape according to the contour of the target contour image;
c2, acquiring a magnification area of the reference shape and a magnification coefficient corresponding to the magnification area;
and C3, carrying out amplification processing on the amplification area by adopting the corresponding amplification factor to obtain the target shape.
Optionally, the contour of the target contour image may be used as a reference shape, or the contour of the target contour image may be amplified according to a preset amplification factor, and the amplified contour is used as the reference shape. The preset magnification is set by empirical values or historical data.
Optionally, the amplified region includes a plurality of sub-amplified regions, and one possible method for obtaining the amplified region of the reference shape may be: take the repaired area in the target contour image as a first sub-amplified region; acquire the amplification end points in the reference shape and take the area between the amplification end points as a second sub-amplified region. The amplification end points may be acquired as follows: obtain the two intersection points of the symmetry axis of the first sub-amplified region with the target contour image, obtain the two intersection points of the perpendicular bisector between those two points with the target contour image, and take the four points on the outermost contour line of the target contour image lying a preset distance from these intersection points as the amplification end points. As shown in fig. 2D, only the second sub-amplified region on one side is drawn; the sub-amplified region on the other side is determined in the same manner. The amplification factor may be set by an empirical value or historical data, and the amplification factor of the first sub-amplified region is higher than that of the second sub-amplified region.
Optionally, one possible method for amplifying the amplified region with the corresponding amplification factor to obtain the target shape is as follows: the region is extended outward along the sector formed between the center point of the target contour image and the amplification end points, and the region is enlarged in a sector-amplification manner, which is understood to mean that the radius of the sector is increased in proportion (by the amplification factor) while the central angle of the sector remains unchanged.
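A minimal sketch of the sector-style amplification, assuming contours are represented as 2D point lists and the center point is known; the function name, point values, and factors are illustrative only:

```python
import math

def sector_amplify(points, center, factor):
    """Scale contour points radially away from `center` by `factor`.

    Keeping each point's polar angle fixed while multiplying its radius by
    `factor` matches the sector-style amplification described above: the
    sector's central angle stays the same and only its radius grows.
    """
    cx, cy = center
    amplified = []
    for x, y in points:
        r = math.hypot(x - cx, y - cy) * factor
        theta = math.atan2(y - cy, x - cx)
        amplified.append((cx + r * math.cos(theta), cy + r * math.sin(theta)))
    return amplified

# The repaired (first) sub-amplified region gets a higher factor than the second.
first_sub  = sector_amplify([(3.0, 0.0), (2.8, 1.0)], center=(0.0, 0.0), factor=1.4)
second_sub = sector_amplify([(0.0, 3.0), (1.0, 2.9)], center=(0.0, 0.0), factor=1.2)
```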
In this example, the reference shape is determined from the target contour image, and the target shape is obtained by partially amplifying the reference shape, so that a target fingerprint acquisition area with the target shape can better capture the fingerprint image of the finger, improving the accuracy of fingerprint image acquisition for the user's finger.
In a possible embodiment, when the reference integrity is greater than or equal to the preset integrity threshold, the target shape may be determined by the following method, specifically including steps D1-D6, as follows:
d1, if the reference integrity is larger than or equal to a preset integrity threshold, acquiring a target line segment, wherein the target line segment is a line segment with the longest length between two points in the reference contour image;
d2, determining a perpendicular bisector of the target line segment, and a first intersection point and a second intersection point of the perpendicular bisector and the reference contour image;
d3, acquiring a first parallel line and a second parallel line, wherein the first parallel line is a straight line which passes through the first intersection point and is parallel to the target line segment, and the second parallel line is a straight line which passes through the second intersection point and is parallel to the target line segment;
d4, if the reference contour image only exists in the target area, determining the target shape according to the first parallel line, the second parallel line and the line segment between the first intersection point and the second intersection point, wherein the target area is the area between the first parallel line and the second parallel line;
d5, if the reference contour image has an image in the area outside the target area, acquiring a first target point and a second target point, wherein the first target point is the point in the first reference area image with the longest distance from the first parallel line, the second target point is the point in the second reference area image with the longest distance from the second parallel line, the first reference area is the area of the reference contour area outside the first parallel line, and the second reference area is the area of the reference contour image outside the second parallel line;
d6, determining a target shape according to the first target point, the second target point and the target line segment.
Optionally, referring to fig. 2E, fig. 2E is a schematic diagram of a method for determining a target shape according to an embodiment of the present application. As shown in fig. 2E, one possible method for determining the target shape according to the first parallel line, the second parallel line, and the line segment between the first intersection point and the second intersection point is as follows: the line segment between the first intersection point and the second intersection point is translated to each end point of the target line segment, so that the area enclosed by the first parallel line, the second parallel line, and the translated segments is the target fingerprint acquisition area, and the enclosed shape is the target shape.
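For the simple case of step D4, the construction amounts to enclosing the contour between two lines parallel to its longest chord. The sketch below reconstructs that idea from 2D contour points; it simplifies the patent's construction (which works explicitly with the perpendicular-bisector intersection points), and all names are assumptions:

```python
import itertools
import math

def longest_chord(points):
    """Return the pair of contour points with the greatest mutual distance
    (the target line segment of step D1)."""
    return max(itertools.combinations(points, 2),
               key=lambda pair: math.dist(pair[0], pair[1]))

def simple_target_shape(points):
    """Simple case (D4): the whole contour lies between the two parallel lines,
    so the target shape is the parallelogram bounded by the direction of the
    target line segment and the contour's extent perpendicular to it."""
    (x1, y1), (x2, y2) = longest_chord(points)
    length = math.dist((x1, y1), (x2, y2))
    ux, uy = (x2 - x1) / length, (y2 - y1) / length     # unit vector along the chord
    nx, ny = -uy, ux                                    # unit normal (bisector direction)
    offsets = [(px - x1) * nx + (py - y1) * ny for px, py in points]
    lo, hi = min(offsets), max(offsets)                 # extent along the normal
    return [(x1 + lo * nx, y1 + lo * ny), (x2 + lo * nx, y2 + lo * ny),
            (x2 + hi * nx, y2 + hi * ny), (x1 + hi * nx, y1 + hi * ny)]
```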
Optionally, referring to fig. 2F, fig. 2F is a schematic diagram of another method for determining a target shape according to an embodiment of the present application. As shown in fig. 2F, the method for determining the target shape according to the first target point, the second target point, and the target line segment is as follows: determine a third parallel line and a fourth parallel line, where the third parallel line is a straight line passing through the first target point and parallel to the target line segment, and the fourth parallel line is a straight line passing through the second target point and parallel to the target line segment. Obtain a third intersection point and a fourth intersection point of the first parallel line with the target contour image; obtain the intersection point (a fifth intersection point) of the third parallel line with the straight line passing through the third intersection point and an end point of the target line segment; obtain the straight line passing through the fifth intersection point and tangent to the target fingerprint contour image, and its intersection point (a sixth intersection point) with the line passing through an end point of the target line segment (a seventh intersection point) and parallel to the perpendicular bisector. Connect the first target point, the fifth intersection point, the sixth intersection point, and the seventh intersection point in sequence to obtain part of the target fingerprint acquisition area; the remaining part of the target fingerprint acquisition area can be determined in the same manner, thereby obtaining the target shape of the complete target fingerprint acquisition area, which may be a dodecagon.
Optionally, the target contour image may be further subjected to an amplification process, so as to obtain a target shape, where the target shape is the same as the shape of the target contour image.
In this example, when the reference integrity is greater than or equal to the preset integrity threshold, the target shape can be determined by means of the parallel lines and the target points, so that a target shape adapted to the finger can be determined and the energy consumption during fingerprint acquisition can be further reduced.
Referring to fig. 3, fig. 3 is a schematic flowchart of another area fingerprint identification method according to an embodiment of the present application. As shown in fig. 3, the area fingerprint identification method is applied to an electronic device that includes an ultrasonic fingerprint identification module, and comprises steps 301-307, as follows:
301. when the electronic equipment is in a single-hand operation mode, acquiring reference position information of fingers;
302. acquiring a target holding gesture for holding the electronic equipment;
303. determining a target initial fingerprint acquisition area corresponding to the target holding gesture according to a mapping relation between a preset holding gesture and the initial fingerprint acquisition area;
304. adjusting the target initial fingerprint acquisition area according to the reference position information to obtain target position information of the target fingerprint acquisition area;
The target fingerprint acquisition area includes the target position information and a target shape, and is a partial area of the fingerprint identification area corresponding to the ultrasonic fingerprint identification module.
305. Acquiring a reference contour image of a finger;
306. determining the target shape of a target fingerprint acquisition area according to the contour image;
307. Perform fingerprint acquisition on the finger through the target fingerprint acquisition area to obtain a target fingerprint image.
In this example, the initial fingerprint acquisition area is determined from the target holding gesture with which the electronic device is held, the initial fingerprint acquisition area is adjusted according to the reference position information of the finger to obtain the target position information, a reference contour image of the finger is acquired, and the target shape is determined according to the contour image.
Referring to fig. 4, fig. 4 is a schematic flowchart of another area fingerprint identification method according to an embodiment of the present application. As shown in fig. 4, the area fingerprint identification method is applied to an electronic device that includes an ultrasonic fingerprint identification module, and comprises steps 401-411, as follows:
401. when the electronic equipment is in a single-hand operation mode, acquiring reference position information of fingers;
402. acquiring a target holding gesture for holding the electronic equipment;
403. determining a target initial fingerprint acquisition area corresponding to the target holding gesture according to a mapping relation between a preset holding gesture and the initial fingerprint acquisition area;
404. adjusting the target initial fingerprint acquisition area according to the reference position information to obtain target position information of the target fingerprint acquisition area;
the target fingerprint acquisition area comprises target position information and a target shape.
405. Acquiring a reference contour image of a finger;
406. determining the reference integrity of the reference contour image according to a preset contour integrity determination method;
407. if the reference integrity is smaller than a preset integrity threshold, obtaining contour lines in the reference contour image to obtain a plurality of reference contour lines;
408. repairing at least one contour line to be repaired to obtain a target contour line corresponding to each contour line to be repaired in the at least one contour line to be repaired, wherein the contour line to be repaired is a contour line which is positioned at the boundary position of a reference contour image in a plurality of reference contour lines;
409. determining a target contour image according to the plurality of reference contour lines and the target contour line;
410. determining a target shape according to the target contour image;
411. Perform fingerprint acquisition on the finger through the target fingerprint acquisition area to obtain a target fingerprint image.
In this example, when the reference integrity of the reference contour image is smaller than the preset integrity threshold, the contour lines to be repaired among the plurality of reference contour lines are repaired to obtain the target contour lines, the target contour image is determined according to the repaired contour lines and the reference contour lines, and the target shape is determined according to the target contour image.
In accordance with the foregoing embodiments, please refer to fig. 5. Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in the figure, the electronic device includes a processor, an input device, an output device, and a memory, which are connected to each other. The memory is used to store a computer program comprising program instructions, and the processor is configured to call the program instructions, the program including instructions for performing the following steps:
when the electronic device is in a one-handed operation mode, acquiring reference position information of a finger;
determining a target fingerprint acquisition area according to the reference position information, where the target fingerprint acquisition area is a partial area of the fingerprint identification area corresponding to the ultrasonic fingerprint identification module;
and performing fingerprint acquisition on the finger through the target fingerprint acquisition area to obtain a target fingerprint image.
The foregoing has described the solutions of the embodiments of the present application mainly from the perspective of the method-side implementation process. It is understood that, in order to realize the above functions, the electronic device comprises corresponding hardware structures and/or software modules for performing the respective functions. Those skilled in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments provided herein can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the electronic device may be divided into the functional units according to the method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
In accordance with the above, please refer to fig. 6. Fig. 6 is a schematic structural diagram of an area fingerprint identification device according to an embodiment of the present application. The device is applied to an electronic device that includes an ultrasonic fingerprint identification module, and comprises: an obtaining unit 601, a determining unit 602, and an acquisition unit 603, wherein,
an obtaining unit 601, configured to obtain reference position information of a finger when the electronic device is in a single-hand operation mode;
a determining unit 602, configured to determine a target fingerprint acquisition area according to the reference position information, where the target fingerprint acquisition area is a partial area in a fingerprint identification area corresponding to the ultrasonic fingerprint identification module;
and an acquisition unit 603, configured to perform fingerprint acquisition on the finger through the target fingerprint acquisition area to obtain a target fingerprint image.
Optionally, the target fingerprint acquisition area includes target position information and a target shape, and in terms of determining the target fingerprint acquisition area according to the reference position information, the determining unit 602 is configured to:
acquiring a target holding gesture for holding the electronic equipment;
determining a target initial fingerprint acquisition area corresponding to the target holding gesture according to a mapping relation between a preset holding gesture and the initial fingerprint acquisition area;
adjusting the target initial fingerprint acquisition area according to the reference position information to obtain target position information of the target fingerprint acquisition area;
acquiring a reference contour image of a finger;
and determining the target shape of the target fingerprint acquisition area according to the contour image.
Optionally, the reference contour image includes a plurality of contour lines, and in terms of determining the target shape of the target fingerprint acquisition area according to the contour image, the determining unit 602 is configured to:
determining the reference integrity of the reference contour image according to a preset contour integrity determination method;
if the reference integrity is smaller than a preset integrity threshold, obtaining contour lines in the reference contour image to obtain a plurality of reference contour lines;
repairing at least one contour line to be repaired to obtain a target contour line corresponding to each contour line to be repaired in the at least one contour line to be repaired, wherein the contour line to be repaired is a contour line which is positioned at the boundary position of a reference contour image in a plurality of reference contour lines;
determining a target contour image according to the plurality of reference contour lines and the target contour line;
and determining the shape of the target according to the target contour image.
Optionally, in determining the target shape according to the target contour image, the determining unit 602 is configured to:
determining a reference shape according to the contour of the target contour image;
acquiring an amplification area of the reference shape and an amplification factor corresponding to the amplification area;
and amplifying the amplified region by adopting a corresponding amplification factor to obtain a target shape.
Optionally, the apparatus is further configured to:
if the reference integrity is greater than or equal to a preset integrity threshold, acquiring a target line segment, wherein the target line segment is a line segment with the longest length between two points in the reference contour image;
determining a perpendicular bisector of the target line segment, and a first intersection point and a second intersection point of the perpendicular bisector and the reference contour image;
acquiring a first parallel line and a second parallel line, wherein the first parallel line is a straight line which passes through the first intersection point and is parallel to the target line segment, and the second parallel line is a straight line which passes through the second intersection point and is parallel to the target line segment;
if the reference contour image only exists in the target area, determining the target shape according to the first parallel line, the second parallel line and the line segment between the first intersection point and the second intersection point, wherein the target area is the area between the first parallel line and the second parallel line;
if the reference contour image has an image in a region outside the target region, acquiring a first target point and a second target point, wherein the first target point is a point in the first reference region image with the longest distance from a first parallel line, the second target point is a point in the second reference region image with the longest distance from a second parallel line, the first reference region is a region of the reference contour region outside the first parallel line, and the second reference region is a region of the reference contour image outside the second parallel line;
and determining the target shape according to the first target point, the second target point and the target line segment.
Embodiments of the present application also provide a computer storage medium, wherein the computer storage medium stores a computer program for electronic data exchange, and the computer program enables a computer to execute part or all of the steps of any one of the area fingerprint identification methods as described in the above method embodiments.
Embodiments of the present application also provide a computer program product, which includes a non-transitory computer-readable storage medium storing a computer program, and the computer program causes a computer to execute part or all of the steps of any one of the area fingerprint identification methods as described in the above method embodiments.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one type of division of logical functions, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of some interfaces, devices or units, and may be an electric or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may be implemented in the form of a software program module.
The integrated units, if implemented in the form of software program modules and sold or used as stand-alone products, may be stored in a computer readable memory. Based on such understanding, the technical solution of the present application, in essence, or the part thereof contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a memory and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash disk, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable memory, which may include: flash memory disks, read-only memory, random access memory, magnetic or optical disks, and the like.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.
Claims (9)
1. An area fingerprint identification method, applied to an electronic device, wherein the electronic device comprises an ultrasonic fingerprint identification module, and the method comprises:
when the electronic device is in a one-handed operation mode, acquiring position information of a finger multiple times while the finger approaches a display screen of the electronic device, and taking the position information corresponding to the finger's last movement within a preset range as reference position information of the finger;
determining a target fingerprint acquisition area according to the reference position information, wherein the target fingerprint acquisition area is a partial area of the fingerprint identification area corresponding to the ultrasonic fingerprint identification module, and the target fingerprint acquisition area comprises a target shape; acquiring a reference contour image of the finger, wherein the reference contour image comprises a plurality of contour lines; if the reference integrity of the reference contour image is smaller than a preset integrity threshold, obtaining contour lines in the reference contour image to obtain a plurality of reference contour lines; repairing at least one contour line to be repaired to obtain a target contour line corresponding to each contour line to be repaired in the at least one contour line to be repaired, wherein a contour line to be repaired is a contour line, among the plurality of reference contour lines, located at a boundary position of the reference contour image; determining a target contour image according to the plurality of reference contour lines and the target contour line; determining a reference shape according to the contour of the target contour image; acquiring an amplification area of the reference shape and an amplification coefficient corresponding to the amplification area, wherein the amplification area comprises a first sub-amplification area that has been repaired; amplifying the amplification area with the corresponding amplification coefficient to obtain the target shape;
and fingerprint collection is carried out on the finger through the target fingerprint collection area to obtain a target fingerprint image.
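To make the first step of claim 1 concrete, the snippet below picks the reference position from a series of finger positions sampled while the finger approaches the screen: the last sample whose displacement from the preceding sample stays within the preset range is taken as the reference. This is only an illustrative sketch of that selection rule; the `PRESET_RANGE` value, the helper name `pick_reference_position` and the sample coordinates are assumptions, not taken from the specification.

```python
from math import hypot

PRESET_RANGE = 5.0  # assumed threshold (in pixels) for "moves within a preset range"

def pick_reference_position(samples):
    """Return the last sampled (x, y) position whose displacement from the
    previous sample stays within PRESET_RANGE, i.e. the point at which the
    approaching finger has effectively come to rest."""
    reference = None
    for prev, cur in zip(samples, samples[1:]):
        if hypot(cur[0] - prev[0], cur[1] - prev[1]) <= PRESET_RANGE:
            reference = cur  # keep overwriting so that the *last* stable sample wins
    return reference

# Hypothetical positions reported while the finger approaches the display screen.
approach_samples = [(210, 890), (190, 760), (178, 652), (172, 600), (171, 598)]
print(pick_reference_position(approach_samples))  # -> (171, 598)
```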
2. The method of claim 1, wherein the target fingerprint acquisition area further comprises target position information, and wherein the determining the target fingerprint acquisition area according to the reference position information further comprises:
acquiring a target holding gesture with which the electronic device is held;
determining a target initial fingerprint acquisition area corresponding to the target holding gesture according to a preset mapping relation between holding gestures and initial fingerprint acquisition areas;
and adjusting the target initial fingerprint acquisition area according to the reference position information to obtain the target position information of the target fingerprint acquisition area.
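A rough picture of how claim 2 composes with claim 1: a preset table maps each holding gesture to an initial acquisition rectangle, which is then re-centred on the finger's reference position and clamped inside the area covered by the ultrasonic fingerprint identification module. The gesture names, the coordinate values and the re-centring/clamping rule below are illustrative assumptions rather than the patented adjustment itself.

```python
# Rectangles are (x, y, width, height) in screen coordinates; all values are assumed.
FINGERPRINT_ID_AREA = (0, 400, 1080, 800)      # region covered by the ultrasonic module
GESTURE_TO_INITIAL_AREA = {                    # preset mapping of claim 2 (assumed values)
    "left_hand":  (60, 700, 300, 300),
    "right_hand": (720, 700, 300, 300),
}

def clamp(value, low, high):
    return max(low, min(value, high))

def target_acquisition_area(holding_gesture, reference_position):
    """Look up the preset initial area for the holding gesture, then adjust it by
    re-centring it on the finger's reference position, clamped so that it stays
    inside the fingerprint identification area."""
    _ix, _iy, w, h = GESTURE_TO_INITIAL_AREA[holding_gesture]  # keep the preset size
    area_x, area_y, area_w, area_h = FINGERPRINT_ID_AREA
    ref_x, ref_y = reference_position
    new_x = clamp(ref_x - w // 2, area_x, area_x + area_w - w)
    new_y = clamp(ref_y - h // 2, area_y, area_y + area_h - h)
    return (new_x, new_y, w, h)

print(target_acquisition_area("right_hand", (171, 598)))  # -> (21, 448, 300, 300)
```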
3. The method according to claim 2, wherein before the contour lines in the reference contour image are obtained when the reference integrity of the reference contour image is less than the preset integrity threshold, the method further comprises:
and determining the reference integrity of the reference contour image according to a preset contour integrity determination method.
4. The method of claim 3, further comprising:
if the reference integrity is greater than or equal to the preset integrity threshold, acquiring a target line segment, wherein the target line segment is a line segment with the longest length between two points in the reference contour image;
determining a perpendicular bisector of the target line segment, and a first intersection point and a second intersection point of the perpendicular bisector with the reference contour image;
acquiring a first parallel line and a second parallel line, wherein the first parallel line is a straight line which passes through the first intersection point and is parallel to the target line segment, and the second parallel line is a straight line which passes through the second intersection point and is parallel to the target line segment;
if the reference contour image exists only within a target area, determining the target shape according to the first parallel line, the second parallel line and the line segment between the first intersection point and the second intersection point, wherein the target area is the area between the first parallel line and the second parallel line;
if the reference contour image has an image in a region outside the target area, acquiring a first target point and a second target point, wherein the first target point is the point in the image of the first reference region that is farthest from the first parallel line, the second target point is the point in the image of the second reference region that is farthest from the second parallel line, the first reference region is the region of the reference contour image outside the first parallel line, and the second reference region is the region of the reference contour image outside the second parallel line;
and determining the target shape according to the first target point, the second target point and the target line segment.
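The geometric construction in claim 4 can be traced on a toy contour: find the longest chord of the contour (the target line segment), intersect its perpendicular bisector with the contour, draw the two parallels to the chord through those intersections, and, if any contour points fall outside the strip between the parallels, take the farthest such point on each side. The polygon below, the brute-force chord search and the edge-intersection routine are illustrative assumptions; a real implementation would run on the contour extracted from the finger image.

```python
import numpy as np

def longest_chord(points):
    """Brute-force search for the two contour points that are farthest apart
    (the 'target line segment' of claim 4)."""
    best, pair = -1.0, None
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = np.linalg.norm(points[i] - points[j])
            if d > best:
                best, pair = d, (points[i], points[j])
    return pair

def bisector_contour_intersections(points, a, b):
    """Intersect the perpendicular bisector of segment a-b with the closed contour
    polygon; the returned points play the role of the first and second
    intersection points in claim 4."""
    mid = (a + b) / 2.0
    direction = np.array([-(b - a)[1], (b - a)[0]], dtype=float)  # perpendicular to a-b
    hits = []
    for p, q in zip(points, np.roll(points, -1, axis=0)):         # walk the polygon edges
        m = np.column_stack((direction, p - q))                   # solve mid + t*dir = p + s*(q - p)
        if abs(np.linalg.det(m)) < 1e-12:
            continue                                              # edge parallel to the bisector
        t, s = np.linalg.solve(m, p - mid)
        if 0.0 <= s <= 1.0:
            hits.append(mid + t * direction)
    return hits

def farthest_point_outside(points, a, b, line_point):
    """Among the contour points lying beyond the parallel to a-b that passes
    through line_point (i.e. outside the strip), return the one farthest from
    that parallel, or None if no point lies outside."""
    u = (b - a) / np.linalg.norm(b - a)
    normal = np.array([-u[1], u[0]])
    inside_sign = np.sign(np.dot((a + b) / 2.0 - line_point, normal))
    dists = [np.dot(p - line_point, normal) * -inside_sign for p in points]
    k = int(np.argmax(dists))
    return points[k] if dists[k] > 1e-9 else None

# Toy "reference contour image" given as a closed polygon (assumed data).
contour = np.array([(0, 0), (10, 1), (12, 5), (9, 9), (2, 8), (-1, 4)], dtype=float)
a, b = longest_chord(contour)
p1, p2 = bisector_contour_intersections(contour, a, b)[:2]
print("target line segment:", a, b)
print("bisector/contour intersections:", p1, p2)
print("first target point :", farthest_point_outside(contour, a, b, p1))
print("second target point:", farthest_point_outside(contour, a, b, p2))
```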
5. An area fingerprint identification device, characterized in that the device is applied to an electronic device, the electronic device comprises an ultrasonic fingerprint identification module, and the device comprises an obtaining unit, a determining unit and a collecting unit, wherein:
the obtaining unit is configured to acquire position information of a finger multiple times while the finger approaches a display screen of the electronic device when the electronic device is in a single-hand operation mode, and to take the last acquired position information for which the movement of the finger stays within a preset range as reference position information of the finger;
the determining unit is configured to determine a target fingerprint acquisition area according to the reference position information, wherein the target fingerprint acquisition area is a partial area of the fingerprint identification area corresponding to the ultrasonic fingerprint identification module, and the target fingerprint acquisition area comprises a target shape; acquire a reference contour image of the finger, wherein the reference contour image comprises a plurality of contour lines; if the reference integrity of the reference contour image is less than a preset integrity threshold, obtain the contour lines in the reference contour image to obtain a plurality of reference contour lines; repair at least one contour line to be repaired to obtain a target contour line corresponding to each of the at least one contour line to be repaired, wherein a contour line to be repaired is a reference contour line located at a boundary position of the reference contour image; determine a target contour image according to the plurality of reference contour lines and the target contour lines; determine a reference shape according to the contour of the target contour image; acquire an amplification area of the reference shape and an amplification coefficient corresponding to the amplification area, wherein the amplification area comprises a first amplification area that has been repaired; and amplify the amplification area with the corresponding amplification coefficient to obtain the target shape;
the collecting unit is configured to perform fingerprint acquisition on the finger through the target fingerprint acquisition area to obtain a target fingerprint image.
6. The apparatus of claim 5, wherein the target fingerprint acquisition area further comprises target position information, and wherein, in determining the target fingerprint acquisition area according to the reference position information, the determining unit is further configured to:
acquire a target holding gesture with which the electronic device is held;
determine a target initial fingerprint acquisition area corresponding to the target holding gesture according to a preset mapping relation between holding gestures and initial fingerprint acquisition areas;
and adjust the target initial fingerprint acquisition area according to the reference position information to obtain the target position information of the target fingerprint acquisition area.
7. The apparatus according to claim 6, wherein before the contour lines in the reference contour image are obtained when the reference integrity of the reference contour image is less than the preset integrity threshold, the determining unit is further configured to:
determine the reference integrity of the reference contour image according to a preset contour integrity determination method.
8. An electronic device, comprising a processor, an input device, an output device and a memory, which are interconnected, wherein the memory is configured to store a computer program comprising program instructions, and the processor is configured to invoke the program instructions to perform the method of any one of claims 1 to 4.
9. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program comprising program instructions that, when executed by a processor, cause the processor to perform the method of any one of claims 1 to 4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910568548.9A CN110287899B (en) | 2019-06-27 | 2019-06-27 | Area fingerprint identification method and related product |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110287899A CN110287899A (en) | 2019-09-27 |
CN110287899B true CN110287899B (en) | 2022-01-11 |
Family
ID=68019934
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910568548.9A Active CN110287899B (en) | 2019-06-27 | 2019-06-27 | Area fingerprint identification method and related product |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110287899B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102682428B (en) * | 2012-04-18 | 2014-11-05 | 浙江大学城市学院 | Fingerprint image computer automatic mending method based on direction fields |
CN104915652A (en) * | 2015-06-09 | 2015-09-16 | 迪安杰科技无锡有限公司 | Fingerprint image restoration method and device |
CN106778457A (en) * | 2015-12-11 | 2017-05-31 | 深圳市汇顶科技股份有限公司 | The fingerprint identification method and system of fingerprint recognition rate can be improved |
CN108121946A (en) * | 2017-11-15 | 2018-06-05 | 大唐微电子技术有限公司 | A kind of Pre-processing Method for Fingerprint Image and device |
US10002283B2 (en) * | 2015-08-13 | 2018-06-19 | Xiaomi Inc. | Mobile device and screen module thereof, method and apparatus for acquiring fingerprint and electronic device |
CN109063623A (en) * | 2018-07-26 | 2018-12-21 | 努比亚技术有限公司 | A kind of fingerprint identification method, equipment and computer can storage mediums |
CN109669544A (en) * | 2018-12-29 | 2019-04-23 | Oppo广东移动通信有限公司 | Finger-print region determines method and device, terminal and readable storage medium storing program for executing |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101661612A (en) * | 2009-07-27 | 2010-03-03 | 北京航空航天大学 | Fingerprint image restoration method based on pseudo-minutiae simplex |
JP2011152022A (en) * | 2010-01-25 | 2011-08-04 | Sony Corp | Device and system for managing power, and equipment control method |
US8311973B1 (en) * | 2011-09-24 | 2012-11-13 | Zadeh Lotfi A | Methods and systems for applications for Z-numbers |
CN103854013A (en) * | 2012-12-06 | 2014-06-11 | 武汉科技大学 | ARM fingerprint identification method and device based on sparse matrix |
CN107239737A (en) * | 2017-05-03 | 2017-10-10 | 广东欧珀移动通信有限公司 | A kind of optical finger print recognition methods and Related product |
Also Published As
Publication number | Publication date |
---|---|
CN110287899A (en) | 2019-09-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101898636B1 (en) | Fingerprint authentication using template update | |
TWI557649B (en) | Electronic device and control method for fingerprint recognition apparatus | |
EP3167358B1 (en) | Method of performing a touch action in a touch sensitive device | |
CN109145926B (en) | Similar picture identification method and computer equipment | |
CN106778707B (en) | Fingerprint identification method, display screen and mobile terminal | |
US10126858B2 (en) | Touch display device and touch method thereof | |
JP2017527907A (en) | Pseudorandomly induced fingerprint registration | |
CN110516521B (en) | Fingerprint registration method and electronic device thereof | |
JP2017529582A (en) | Touch classification | |
CN109375833B (en) | Touch instruction generation method and device | |
US20170061191A1 (en) | Fingerprint authentication | |
CN107077284B (en) | Gripping mode determining device | |
CN111492407B (en) | System and method for map beautification | |
US20210166047A1 (en) | Method of multi-finger enrollment for a user | |
CN109976615A (en) | Fingerprint image processing method and related device | |
US20220335750A1 (en) | Method for determining a match between a candidate fingerprint and a reference fingerprint | |
CN110298274B (en) | Optical fingerprint parameter upgrading method and related product | |
CN110287899B (en) | Area fingerprint identification method and related product | |
CN111309149A (en) | Gesture recognition method and gesture recognition device | |
WO2016197815A2 (en) | Method and apparatus for using fingerprint operation, and terminal | |
CN110287922B (en) | Fingerprint identification method and related product | |
CN110263757B (en) | Fingerprint identification method and related products | |
CN110287919B (en) | Ultrasonic fingerprint identification method and related equipment | |
CN110188676B (en) | Vein collection method and related products | |
TW202004554A (en) | Fingerprint enrollment method and electronic device for the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |