WO2015145591A1 - Biometric Authentication Device, Biometric Authentication Method, and Program - Google Patents
Biometric Authentication Device, Biometric Authentication Method, and Program
- Publication number
- WO2015145591A1 (PCT/JP2014/058386)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- feature
- directional
- similarity
- omnidirectional
- orthogonal
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/12—Fingerprints or palmprints
- G06V40/1365—Matching; Classification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/12—Fingerprints or palmprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/12—Fingerprints or palmprints
- G06V40/1347—Preprocessing; Feature extraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/14—Vascular patterns
Definitions
- The embodiments of the present disclosure relate to biometric authentication technology.
- In a biometric authentication device, if the biometric information extracted from a captured image matches biometric information registered in advance, the user is determined to be the genuine person.
- This biometric information includes features such as palm prints and veins.
- The features indicating palm prints must be separated from the captured image so that, as far as possible, only the features indicating veins remain.
- As a method of separating the features indicating the palm print, optical separation using, for example, a polarizing filter is known.
- A method using multiple-wavelength imaging is also known.
- When neither method can be applied, biometric authentication must be performed using biometric information that includes features indicating the palm print. Because features indicating palm prints are less diverse than features indicating veins, the more palm-print features the biometric information contains, the higher the false acceptance rate (FAR). Moreover, when the subject's palm has strong melanin deposits, features indicating the palm print are extracted more easily than features indicating veins, raising the false acceptance rate further.
- An object of the present disclosure is to provide a biometric authentication device, a biometric authentication method, and a program capable of suppressing an increase in the false acceptance rate even when a method of physically separating palm-print features from an image cannot be applied.
- The biometric authentication device according to one embodiment includes: a filter that extracts directional features corresponding to mutually different directions from an input image; an orthogonal filter that lowers the overall luminance value of the directional feature corresponding to a predetermined direction among the plurality of directional features extracted by the filter, raises the overall luminance value of the directional feature corresponding to the direction orthogonal to that predetermined direction, and outputs the other directional features unchanged; an omnidirectional feature generation processing unit that generates an omnidirectional feature from the plurality of directional features output from the orthogonal filter; a matching processing unit that obtains the similarity between the omnidirectional feature and a registered omnidirectional feature stored in a storage unit; and a determination unit that uses the similarity to determine whether the user is the genuine person.
- The biometric authentication device according to another embodiment includes: a filter that extracts directional features corresponding to mutually different directions from an input image; an omnidirectional feature generation processing unit that generates an omnidirectional feature from the plurality of directional features extracted by the filter; a selection unit that selects, from the plurality of extracted directional features, the directional feature corresponding to a predetermined direction and outputs it as a main directional feature, and selects the directional feature corresponding to the direction orthogonal to the main directional feature and outputs it as an orthogonal directional feature; an omnidirectional feature matching processing unit that obtains a first similarity between the omnidirectional feature and a registered omnidirectional feature stored in a storage unit; a directional feature matching processing unit that obtains a second similarity between the main directional feature and a registered main directional feature stored in the storage unit, and a third similarity between the orthogonal directional feature and a registered orthogonal directional feature stored in the storage unit; a similarity adjustment unit that makes the weight of the second similarity smaller than the weight of the third similarity and outputs their weighted sum as a fourth similarity; and a determination unit that uses the first similarity and the fourth similarity to determine whether the user is the genuine person.
- The biometric authentication device according to a further embodiment includes: a filter that extracts directional features corresponding to mutually different directions from an input image; an omnidirectional feature generation processing unit that generates an omnidirectional feature from the plurality of directional features extracted by the filter; a selection unit that selects, from the plurality of extracted directional features, the directional feature corresponding to a predetermined direction and outputs it as a main directional feature, and selects the directional feature corresponding to the direction orthogonal to the main directional feature and outputs it as an orthogonal directional feature; an omnidirectional feature matching processing unit that obtains a first similarity between the omnidirectional feature and a registered omnidirectional feature stored in a storage unit; a directional feature matching processing unit that obtains a second similarity between the main directional feature and a registered main directional feature stored in the storage unit, and a third similarity between the orthogonal directional feature and a registered orthogonal directional feature stored in the storage unit; a similarity adjustment unit that makes the weight of the third similarity larger than the weight of the second similarity and outputs their weighted sum as a fourth similarity; and a determination unit that uses the first similarity and the fourth similarity to determine whether the user is the genuine person.
- In the biometric authentication method according to one embodiment, a computer extracts directional features corresponding to mutually different directions from an input image, lowers the overall luminance value of the directional feature corresponding to a predetermined direction among the plurality of extracted directional features, raises the overall luminance value of the directional feature corresponding to the direction orthogonal to that predetermined direction, outputs the other directional features unchanged, generates an omnidirectional feature from the plurality of output directional features, obtains the similarity between the omnidirectional feature and a registered omnidirectional feature stored in a storage unit, and uses the similarity to determine whether the user is the genuine person.
- In the biometric authentication method according to another embodiment, a computer extracts directional features corresponding to mutually different directions from an input image, generates an omnidirectional feature from the plurality of extracted directional features, and selects, from the plurality of extracted directional features, the directional feature corresponding to a predetermined direction and outputs it as the main directional feature.
- The program according to one embodiment causes a computer to: extract directional features corresponding to mutually different directions from an input image; lower the overall luminance value of the directional feature corresponding to a predetermined direction among the plurality of extracted directional features; raise the overall luminance value of the directional feature corresponding to the direction orthogonal to that predetermined direction; output the other directional features unchanged; generate an omnidirectional feature from the plurality of output directional features; obtain the similarity between the omnidirectional feature and a registered omnidirectional feature stored in a storage unit; and use the similarity to determine whether the user is the genuine person.
- The program according to another embodiment causes a computer to: extract directional features corresponding to mutually different directions from an input image; generate an omnidirectional feature from the plurality of extracted directional features; select, from the plurality of extracted directional features, the directional feature corresponding to a predetermined direction and output it as a main directional feature; select the directional feature corresponding to the direction orthogonal to the main directional feature and output it as an orthogonal directional feature; obtain a first similarity between the omnidirectional feature and a registered omnidirectional feature stored in a storage unit; obtain a second similarity between the main directional feature and a registered main directional feature stored in the storage unit; obtain a third similarity between the orthogonal directional feature and a registered orthogonal directional feature stored in the storage unit; make the weight of the second similarity smaller than the weight of the third similarity and output their weighted sum as a fourth similarity; and use the first similarity and the fourth similarity to determine whether the user is the genuine person.
- The program according to a further embodiment causes a computer to: extract directional features corresponding to mutually different directions from an input image; generate an omnidirectional feature from the plurality of extracted directional features; select, from the plurality of extracted directional features, the directional feature corresponding to a predetermined direction and output it as a main directional feature; select the directional feature corresponding to the direction orthogonal to the main directional feature and output it as an orthogonal directional feature; obtain a first similarity between the omnidirectional feature and a registered omnidirectional feature stored in a storage unit; obtain a second similarity between the main directional feature and a registered main directional feature stored in the storage unit; obtain a third similarity between the orthogonal directional feature and a registered orthogonal directional feature stored in the storage unit; make the weight of the third similarity larger than the weight of the second similarity and output their weighted sum as a fourth similarity; and use the first similarity and the fourth similarity to determine whether the user is the genuine person.
- FIG. 1 is a diagram illustrating an example of a biometric authentication apparatus according to the first embodiment.
- The biometric authentication device 1 illustrated in FIG. 1 includes an image acquisition unit 2, a region specifying unit 3, a feature extraction unit 4, a matching processing unit 5, a score determination unit 6 (determination unit), and a storage unit 7.
- FIG. 2 is a flowchart showing the biometric authentication method of the first embodiment.
- the image acquisition unit 2 acquires an image of a subject's hand (S1).
- the image acquisition unit 2 is an imaging device, and acquires a captured image of a hand of a subject using a single-plate imaging element and RGB color filters in a Bayer array.
- The region specifying unit 3 specifies a palm region (ROI: Region Of Interest) corresponding to the palm of the subject in the image acquired by the image acquisition unit 2 (S2).
- The feature extraction unit 4 extracts an omnidirectional feature from the palm region image f specified by the region specifying unit 3 (S3).
- the matching processing unit 5 obtains the similarity between the omnidirectional feature extracted by the feature extraction unit 4 and the registered omnidirectional feature registered in advance and stored in the storage unit 7 (S4).
- the score determination unit 6 determines whether or not the person is the person based on the similarity obtained by the matching processing unit 5 (S5).
- FIG. 3 is a diagram illustrating an example of the feature extraction unit 4 according to the first embodiment.
- the feature extraction unit 4 shown in FIG. 3 includes a filter 41, an orthogonal filter 42, a point-wise maximum selection unit 43, a binarization unit 44, and a thinning unit 45.
- The filter 41 applies Gabor filtering to the input palm region image f (the luminance values of all pixels) in each of eight directions θ (0°, 22.5°, 45°, 67.5°, 90°, 112.5°, 135°, 157.5°), and extracts each filter response (luminance value) as a directional feature gθ (g0°, g22.5°, g45°, g67.5°, g90°, g112.5°, g135°, g157.5°).
- The number of directions θ set during the filtering process is not limited to eight, as long as it is two or more.
- The filtering process is not limited to Gabor filtering, as long as the filter gives a high response to linear dark parts in each direction θ of the image f.
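As a rough illustration of this filtering step, the following sketch builds a small bank of real-valued Gabor kernels and filters an image in the eight directions. It is an assumed reconstruction for readability, not the patent's implementation; the kernel parameters (ksize, sigma, lam, gamma) are illustrative choices.

```python
import numpy as np

def gabor_kernel(theta_deg, ksize=15, sigma=3.0, lam=8.0, gamma=0.5):
    """Build one real-valued Gabor kernel oriented at theta_deg (illustrative parameters)."""
    theta = np.deg2rad(theta_deg)
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)   # rotate coordinates into the filter frame
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + gamma**2 * yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)

def directional_features(f, thetas=(0, 22.5, 45, 67.5, 90, 112.5, 135, 157.5)):
    """Filter image f with one Gabor kernel per direction; returns {theta: response image}."""
    H, W = f.shape
    out = {}
    for t in thetas:
        k = gabor_kernel(t)
        kh, kw = k.shape
        pad = np.pad(f, kh // 2, mode='edge')    # pad so the response keeps the input size
        resp = np.zeros((H, W), dtype=float)
        for i in range(H):
            for j in range(W):
                resp[i, j] = np.sum(pad[i:i + kh, j:j + kw] * k)
        out[t] = resp
    return out
```

The plain-loop correlation keeps the sketch dependency-free; a real system would use an FFT or a library convolution instead.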
- Among the directional features gθ extracted by the filter 41, the orthogonal filter 42 lowers the overall luminance value of the directional feature g0° corresponding to the direction θ of 0° (the S (Significant) component), raises the overall luminance value of the directional feature g90° (the P (Perpendicular) component) corresponding to the direction orthogonal to the S component, and outputs the other directional features gθ unchanged.
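The orthogonal filter's lower/raise/pass-through behavior could be sketched as below. The dict-of-arrays representation and the scale factors are assumptions; the patent only requires that the S component be lowered and the P component raised, without fixing the amounts.

```python
import numpy as np

def orthogonal_filter(features, main_theta=0.0, attenuate=0.5, boost=1.5):
    """Scale the S component (palm-print direction) down and the P component
    (orthogonal, vein direction) up; other directions pass through unchanged.
    attenuate/boost are illustrative constants, not values from the patent."""
    ortho_theta = (main_theta + 90.0) % 180.0
    out = {}
    for theta, g in features.items():
        if theta == main_theta:
            out[theta] = g * attenuate   # S component: lower overall luminance
        elif theta == ortho_theta:
            out[theta] = g * boost       # P component: raise overall luminance
        else:
            out[theta] = g               # other directional features output as-is
    return out
```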
- The point-wise maximum selection unit 43 generates the omnidirectional feature g from the directional features gθ output from the orthogonal filter 42, and outputs it.
- As shown in Equation 1, the point-wise maximum selection unit 43 outputs, as the omnidirectional feature g(i, j), the maximum directional feature maxθ{gθ(i, j)} among the directional features gθ(i, j) output from the orthogonal filter 42.
- Here, i indicates the horizontal position and j indicates the vertical position on the two-dimensional coordinate system onto which the positions of all pixels in the palm region are mapped.
- When the omnidirectional feature g(i, j) output from the point-wise maximum selection unit 43 is a positive value, the binarization unit 44 outputs 1 as the omnidirectional surface feature b(i, j); when it is not positive, 0 is output as the omnidirectional surface feature b(i, j).
- the omnidirectional surface feature b obtained at this time is stored in the storage unit 7.
- the binarization process is performed by a threshold process using a simple constant 0.
- the binarization process may be performed using a more advanced adaptive-thresholding method.
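The simple constant-0 thresholding described above reduces to a one-line sketch:

```python
import numpy as np

def binarize(g):
    """Threshold at the constant 0: positive responses become 1, everything else 0."""
    return (g > 0).astype(np.uint8)
```

An adaptive-thresholding variant, as the text allows, would replace the constant 0 with a locally computed threshold.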
- The thinning unit 45 obtains an omnidirectional line feature LF by applying thinning (skeletonizing) to the omnidirectional surface feature b, as shown in Equation 3, where skel represents the thinning process. The omnidirectional line feature LF obtained at this time is stored in the storage unit 7.
- a line feature is an image made up of lines.
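The patent writes the thinning as LF = skel(b) without fixing an algorithm; Zhang-Suen thinning is one common choice and can be sketched as follows (an illustrative implementation, not the patent's):

```python
import numpy as np

def skeletonize(b):
    """Zhang-Suen thinning of a 0/1 image: iteratively peel boundary pixels
    until only a one-pixel-wide skeleton remains."""
    img = b.astype(np.uint8).copy()
    changed = True
    while changed:
        changed = False
        for step in (0, 1):                      # the two Zhang-Suen sub-iterations
            to_delete = []
            for i in range(1, img.shape[0] - 1):
                for j in range(1, img.shape[1] - 1):
                    if img[i, j] != 1:
                        continue
                    # 8-neighbours p2..p9, clockwise from north
                    p = [img[i-1, j], img[i-1, j+1], img[i, j+1], img[i+1, j+1],
                         img[i+1, j], img[i+1, j-1], img[i, j-1], img[i-1, j-1]]
                    B = sum(p)                   # number of foreground neighbours
                    A = sum((p[k] == 0 and p[(k + 1) % 8] == 1) for k in range(8))
                    if not (2 <= B <= 6 and A == 1):
                        continue
                    if step == 0:
                        if p[0] * p[2] * p[4] == 0 and p[2] * p[4] * p[6] == 0:
                            to_delete.append((i, j))
                    else:
                        if p[0] * p[2] * p[6] == 0 and p[0] * p[4] * p[6] == 0:
                            to_delete.append((i, j))
            for i, j in to_delete:
                img[i, j] = 0
                changed = True
    return img
```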
- The matching processing unit 5 shown in FIG. 1 obtains a similarity score between the omnidirectional line feature LF output from the thinning unit 45 and stored in the storage unit 7 and the registered omnidirectional line feature TLF registered in advance in the storage unit 7.
- The score determination unit 6 shown in FIG. 1 determines that the user is the genuine person when the similarity score is equal to or greater than a threshold value.
- The palm print mainly consists of wrinkles formed when the hand is clenched, so the direction θ corresponding to the directional feature gθ presumed to contain many palm-print components is 0°. The S-component directional feature is therefore the directional feature in which the palm print of the palm region is emphasized, and since the omnidirectional line feature LF is generated using the S-component directional feature whose overall luminance value has been lowered, the influence of the palm print on the omnidirectional line feature LF is reduced. As a result, it can be determined whether the user is the genuine person while suppressing the influence of the palm print, so the authentication accuracy can be improved.
- With the biometric authentication device 1 of the first embodiment, even when a method of physically separating palm-print features from the image cannot be applied, an increase in the false acceptance rate can be suppressed.
- Even when melanin is strongly deposited in the subject's palm and the directional features gθ contain many palm-print components, it is determined whether the user is the genuine person with the influence of the palm print on the omnidirectional line feature LF reduced, so the false acceptance rate can be lowered.
- The veins mainly run from the wrist toward the four fingers, so the direction θ corresponding to the directional feature gθ presumed to contain many vein components is 90°. The P-component directional feature is therefore the directional feature in which the veins within the palm region are emphasized, and since the omnidirectional line feature LF is generated using the P-component directional feature whose overall luminance value has been raised, the veins in the omnidirectional line feature LF are enhanced. As a result, the determination of whether the user is the genuine person emphasizes the veins, which are more diverse than the palm print, so the false rejection rate can be reduced.
- FIG. 5 is a diagram illustrating an example of a biometric authentication apparatus according to the second embodiment.
- The biometric authentication device 1 illustrated in FIG. 5 includes an image acquisition unit 2, a region specifying unit 3, a feature extraction unit 4, a matching processing unit 5, a score determination unit 6 (determination unit), and a storage unit 7.
- the feature extraction unit 4 includes an omnidirectional feature generation processing unit 8 and a directional feature generation processing unit 9.
- the matching processing unit 5 includes an omnidirectional feature matching processing unit 10 and a directional feature matching processing unit 11.
- FIG. 6 is a flowchart illustrating a biometric authentication method according to the second embodiment.
- the image acquisition unit 2 acquires an image of a subject's hand (S11).
- the image acquisition unit 2 is an imaging device, and acquires a captured image of a hand of a subject using a single-plate imaging element and RGB color filters in a Bayer array.
- The region specifying unit 3 specifies a palm region corresponding to the palm of the subject in the image acquired by the image acquisition unit 2 (S12).
- The omnidirectional feature generation processing unit 8 generates an omnidirectional feature from the palm region image f specified by the region specifying unit 3, and the directional feature generation processing unit 9 generates a directional feature from the same image f (S13). Note that "directional" here is defined as not omnidirectional.
- The omnidirectional feature matching processing unit 10 obtains the similarity between the omnidirectional feature generated by the omnidirectional feature generation processing unit 8 and the registered omnidirectional feature registered in advance in the storage unit 7, and the directional feature matching processing unit 11 obtains the similarity between the directional feature generated by the directional feature generation processing unit 9 and the registered directional feature registered in advance in the storage unit 7 (S14).
- the score determination unit 6 determines whether or not the person is the person based on the similarity obtained by the omnidirectional feature matching processing unit 10 and the similarity obtained by the directional feature matching processing unit 11 (S15).
- FIG. 7 is a diagram illustrating an example of the feature extraction unit 4 of the second embodiment.
- The same components as those shown in FIG. 3 are given the same reference numerals.
- The feature extraction unit 4 shown in FIG. 7 includes a filter 41, a point-wise maximum selection unit 43, a binarization unit 44, a thinning unit 45, a selection unit 46, and a binarization unit 47.
- As shown in Equation 1 above, the point-wise maximum selection unit 43 outputs, as the omnidirectional feature g(i, j), the maximum directional feature maxθ{gθ(i, j)} among the directional features gθ(i, j) extracted by the filter 41.
- When the omnidirectional feature g(i, j) output from the point-wise maximum selection unit 43 is a positive value, the binarization unit 44 outputs 1 as the omnidirectional surface feature b(i, j); when it is not positive, 0 is output as the omnidirectional surface feature b(i, j).
- the omnidirectional surface feature b obtained at this time is stored in the storage unit 7.
- The thinning unit 45 obtains an omnidirectional line feature LF by applying thinning (skeletonizing) to the omnidirectional surface feature b, as shown in Equation 3 above, where skel represents the thinning process. The omnidirectional line feature LF obtained at this time is stored in the storage unit 7.
- The selection unit 46 selects the directional feature g0° corresponding to the direction θ of 0° from the directional features gθ extracted by the filter 41 and outputs it as the main directional feature gs, and selects the directional feature g90° corresponding to the direction orthogonal to the main directional feature from the directional features gθ extracted by the filter 41 and outputs it as the orthogonal directional feature gp.
- The binarization unit 47 applies binarization to the main directional feature gs and the orthogonal directional feature gp selected by the selection unit 46, and outputs the results as the main directional surface feature bs and the orthogonal directional surface feature bp.
- When the main directional feature gs(i, j) is positive, the binarization unit 47 outputs 1 as the main directional surface feature bs(i, j); when it is not positive, 0 is output as the main directional surface feature bs(i, j).
- The main directional surface feature bs obtained at this time is stored in the storage unit 7.
- As shown in Equation 5, when the orthogonal directional feature gp(i, j) is positive, the binarization unit 47 outputs 1 as the orthogonal directional surface feature bp(i, j); when it is not positive, 0 is output as the orthogonal directional surface feature bp(i, j). The orthogonal directional surface feature bp obtained at this time is stored in the storage unit 7.
- The binarization unit 47 performs the binarization by a threshold process using the simple constant 0; however, a more advanced adaptive-thresholding method may also be used.
- FIG. 8 is a diagram illustrating an example of the matching processing unit 5 according to the second embodiment.
- the matching processing unit 5 illustrated in FIG. 8 includes an omnidirectional feature matching processing unit 10, a directional feature matching processing unit 11, and a similarity adjustment unit 51.
- The omnidirectional feature matching processing unit 10 obtains the similarity score1 between the omnidirectional line feature LF output from the thinning unit 45 and stored in the storage unit 7 and the registered omnidirectional line feature TLF registered in advance in the storage unit 7.
- The directional feature matching processing unit 11 obtains the similarity score2 between the main directional surface feature bs output from the binarization unit 47 and stored in the storage unit 7 and the registered main directional surface feature Tbs registered in advance in the storage unit 7, and the similarity score3 between the orthogonal directional surface feature bp output from the binarization unit 47 and stored in the storage unit 7 and the registered orthogonal directional surface feature Tbp registered in advance in the storage unit 7.
- As shown in Equation 6, the similarity adjustment unit 51 weights the similarity score2 and the similarity score3 output from the directional feature matching processing unit 11 with constants a and c, and outputs the sum of the weighted score2 and score3 as the similarity score4.
- That is, the similarity adjustment unit 51 uses the similarity score2 passively by making its weight smaller than the weight of the similarity score3.
- In other words, the similarity adjustment unit 51 uses the similarity score3 actively by making its weight larger than the weight of the similarity score2.
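The weighting of Equation 6 and the final decision might look like the following sketch. The constants a and c are illustrative assumptions; the patent only requires the weight on score2 to be smaller than the weight on score3.

```python
def adjusted_similarity(score2, score3, a=0.3, c=1.0):
    """Equation 6 sketch: weighted sum of the palm-print-direction similarity
    (score2, small weight) and the vein-direction similarity (score3, large
    weight). a and c are illustrative constants, not values from the patent."""
    return a * score2 + c * score3

def is_genuine(score1, score4, threshold):
    """Accept when the total of the omnidirectional similarity (score1) and
    the adjusted similarity (score4) reaches the threshold."""
    return (score1 + score4) >= threshold
```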
- The score determination unit 6 shown in FIG. 5 determines that the user is the genuine person when the total of the similarity score1 and the similarity score4 is equal to or greater than a threshold value.
- With the biometric authentication device 1 of the second embodiment, it is possible to suppress an increase in the false acceptance rate even when a method of physically separating palm-print features from the image cannot be applied.
- Even when melanin is strongly deposited in the subject's palm and the omnidirectional line feature LF contains many palm-print components, it is determined whether the user is the genuine person while suppressing the influence of the palm print on the omnidirectional line feature LF, so the false acceptance rate can be lowered.
- FIG. 9 is a diagram illustrating an example of hardware configuring the biometric authentication device 1 according to the embodiment of the present disclosure.
- The hardware configuring the biometric authentication device 1 includes a control unit 1201, a storage unit 1202, a recording medium reading device 1203, an input/output interface 1204, and a communication interface 1205, each connected by a bus 1206. Note that the hardware configuring the biometric authentication device 1 may also be realized using a cloud or the like.
- The control unit 1201 may be, for example, a Central Processing Unit (CPU), a multi-core CPU, or a programmable device (a Field Programmable Gate Array (FPGA), a Programmable Logic Device (PLD), etc.), and corresponds to the region specifying unit 3, the feature extraction unit 4, the matching processing unit 5, and the score determination unit 6 shown in FIG. 1 or FIG. 5.
- The storage unit 1202 corresponds to the storage unit 7 shown in FIG. 1 or FIG. 5. Note that the storage unit 1202 may be used as a work area at the time of execution. Another storage unit may also be provided outside the biometric authentication device 1.
- the recording medium reading device 1203 reads data recorded on the recording medium 1207 or writes data to the recording medium 1207 under the control of the control unit 1201.
- the removable recording medium 1207 is a non-transitory recording medium that can be read by a computer.
- As a magnetic recording device, for example, a hard disk drive (HDD) can be considered.
- As an optical disc, for example, a Digital Versatile Disc (DVD), DVD-RAM, Compact Disc Read Only Memory (CD-ROM), or CD-R (Recordable)/RW (ReWritable) can be considered.
- As a magneto-optical recording medium, for example, a Magneto-Optical disk (MO) can be considered.
- the storage unit 1202 is also included in a non-transitory recording medium.
- the input / output interface 1204 is connected to the input / output unit 1208 and sends information input from the input / output unit 1208 by the user to the control unit 1201 via the bus 1206.
- the input / output interface 1204 sends information sent from the control unit 1201 to the input / output unit 1208 via the bus 1206.
- the input / output unit 1208 corresponds to the image acquisition unit 2 shown in FIG. 1 or 5 and may be an imaging device, for example.
- The input/output unit 1208 may also be, for example, a keyboard, a pointing device (such as a mouse), a touch panel, a Cathode Ray Tube (CRT) display, or a printer.
- the communication interface 1205 is an interface for Local Area Network (LAN) connection and Internet connection. It may also be used, as necessary, as an interface for a LAN connection, Internet connection, or wireless connection with another computer.
- each processing function (for example, the area specifying unit 3, the feature extracting unit 4, the matching processing unit 5, and the score determining unit 6) is realized on a computer by executing a program that describes the contents of the various processing functions performed by the biometric authentication device 1.
- a program describing the contents of various processing functions can be stored in the storage unit 1202 or the recording medium 1207.
- when the program is distributed, for example, a recording medium 1207 such as a DVD or CD-ROM on which the program is recorded may be sold. The program may also be recorded in a storage device of a server computer and transferred from the server computer to another computer via a network.
- the computer that executes the program stores, for example, the program recorded in the recording medium 1207 or the program transferred from the server computer in the storage unit 1202.
- the computer reads the program from the storage unit 1202 and executes processing according to the program.
- the computer can also read the program directly from the recording medium 1207 and execute processing according to the program. Further, each time the program is transferred from the server computer, the computer can sequentially execute processing according to the received program.
- in the above embodiments, an image processing apparatus that performs authentication using palm veins has been described as an example.
- however, the present invention is not limited to this, and any other feature detection site of a living body may be used.
- for example, the feature detection site is not limited to veins and may be a blood vessel image of a living body, a skin pattern of a living body, a fingerprint or palm print, a sole, a finger or toe, the back of a hand or foot, a wrist, an arm, or the like.
- when veins are used, the feature detection site may be any site at which veins can be observed.
- a feature detection site from which biometric information can be identified is advantageous for authentication, since the site can be identified from the acquired image.
- the above-described embodiments can be variously modified without departing from the gist thereof. Moreover, the embodiments can be modified and changed by those skilled in the art and are not limited to the exact configurations and application examples described.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Security & Cryptography (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Computer Hardware Design (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Collating Specific Patterns (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
Description
FIG. 1 is a diagram illustrating an example of the biometric authentication device according to the first embodiment.
First, the image acquisition unit 2 acquires an image of the subject's hand (S1). For example, the image acquisition unit 2 is an imaging device that acquires a captured image of the subject's hand using a single-chip image sensor and RGB color filters in a Bayer arrangement.
The feature extracting unit 4 shown in FIG. 3 includes a filter 41, an orthogonal filter 42, a point-wise maximum selection unit 43, a binarization unit 44, and a thinning unit 45.
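The flow through units 42 to 44 can be sketched as follows. This is a minimal NumPy illustration, not the actual implementation: the gain factors, the binarization threshold, and the omission of the thinning unit 45 are all assumptions.

```python
import numpy as np

def orthogonal_filter(feats, main_dir, attenuate=0.5, boost=1.5):
    """Orthogonal filter 42: lower the overall luminance of the directional
    feature for the predetermined direction, raise that of the orthogonal
    direction, and pass the remaining features through unchanged.
    The gains attenuate/boost are hypothetical values."""
    n = len(feats)
    out = list(feats)
    out[main_dir] = feats[main_dir] * attenuate
    orth = (main_dir + n // 2) % n  # index of the orthogonal direction
    out[orth] = feats[orth] * boost
    return out

def point_wise_maximum(feats):
    """Point-wise maximum selection unit 43: the per-pixel maximum over all
    directional features yields the omnidirectional feature."""
    return np.max(np.stack(feats), axis=0)

def binarize(feat, thresh=0.5):
    """Binarization unit 44: simple thresholding (threshold is assumed).
    Thinning (unit 45) would follow but is omitted from this sketch."""
    return (feat >= thresh).astype(np.uint8)
```

With directions sampled evenly over 180 degrees, the orthogonal direction is the one half the index range away, as computed above.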
FIG. 5 is a diagram illustrating an example of the biometric authentication device according to the second embodiment.
The matching processing unit 5 includes an omnidirectional feature matching processing unit 10 and a directional feature matching processing unit 11.
First, the image acquisition unit 2 acquires an image of the subject's hand (S11). For example, the image acquisition unit 2 is an imaging device that acquires a captured image of the subject's hand using a single-chip image sensor and RGB color filters in a Bayer arrangement.
The matching processing unit 5 shown in FIG. 8 includes an omnidirectional feature matching processing unit 10, a directional feature matching processing unit 11, and a similarity adjustment unit 51.
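The role of the similarity adjustment unit 51 can be illustrated with a small sketch. The weight values and the decision threshold below are assumptions for illustration only; the document does not specify them.

```python
def fourth_similarity(s2, s3, w_main=0.3, w_orth=0.7):
    """Similarity adjustment unit 51: weight the main-directional similarity
    s2 less than the orthogonal-directional similarity s3 and output the
    weighted sum as the fourth similarity (weights are hypothetical)."""
    assert w_main < w_orth, "the second similarity must be weighted less"
    return w_main * s2 + w_orth * s3

def is_genuine(s1, s4, threshold=1.2):
    """Score determining unit 6: accept when the first and fourth
    similarities together exceed a (hypothetical) threshold."""
    return (s1 + s4) >= threshold
```

Down-weighting the main direction reduces the influence of features common to many palms, so the orthogonal, more discriminative direction dominates the fourth similarity.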
2 image acquisition unit
3 area specifying unit
4 feature extracting unit
5 matching processing unit
6 score determining unit
7 storage unit
8 omnidirectional feature generation processing unit
9 directional feature generation processing unit
10 omnidirectional feature matching processing unit
11 directional feature matching processing unit
41 filter
42 orthogonal filter
43 point-wise maximum selection unit
44 binarization unit
45 thinning unit
46 selection unit
47 binarization unit
51 similarity adjustment unit
Claims (9)
- A biometric authentication device comprising:
a filter that extracts, from an input image, directional features each corresponding to a different direction;
an orthogonal filter that, among the plurality of directional features extracted by the filter, lowers the overall luminance value of the directional feature corresponding to a predetermined direction, raises the overall luminance value of the directional feature corresponding to a direction orthogonal to the directional feature corresponding to the predetermined direction, and outputs the other directional features as they are;
an omnidirectional feature generation processing unit that generates an omnidirectional feature from the plurality of directional features output from the orthogonal filter;
a matching processing unit that obtains a similarity between the omnidirectional feature and a registered omnidirectional feature stored in a storage unit; and
a determination unit that determines, using the similarity, whether or not a subject is the registered person.
- A biometric authentication device comprising:
a filter that extracts, from an input image, directional features each corresponding to a different direction;
an omnidirectional feature generation processing unit that generates an omnidirectional feature from the plurality of directional features extracted by the filter;
a selection unit that selects, from among the plurality of directional features extracted by the filter, the directional feature corresponding to a predetermined direction and outputs it as a main directional feature, and selects, from among the plurality of directional features extracted by the filter, the directional feature corresponding to a direction orthogonal to the main directional feature and outputs it as an orthogonal directional feature;
an omnidirectional feature matching processing unit that obtains a first similarity between the omnidirectional feature and a registered omnidirectional feature stored in a storage unit;
a directional feature matching processing unit that obtains a second similarity between the main directional feature and a registered main directional feature stored in the storage unit, and obtains a third similarity between the orthogonal directional feature and a registered orthogonal directional feature stored in the storage unit;
a similarity adjustment unit that weights the second similarity less than the third similarity and outputs the sum of the weighted similarities as a fourth similarity; and
a determination unit that determines, using the first similarity and the fourth similarity, whether or not a subject is the registered person.
- A biometric authentication device comprising:
a filter that extracts, from an input image, directional features each corresponding to a different direction;
an omnidirectional feature generation processing unit that generates an omnidirectional feature from the plurality of directional features extracted by the filter;
a selection unit that selects, from among the plurality of directional features extracted by the filter, the directional feature corresponding to a predetermined direction and outputs it as a main directional feature, and selects, from among the plurality of directional features extracted by the filter, the directional feature corresponding to a direction orthogonal to the main directional feature and outputs it as an orthogonal directional feature;
an omnidirectional feature matching processing unit that obtains a first similarity between the omnidirectional feature and a registered omnidirectional feature stored in a storage unit;
a directional feature matching processing unit that obtains a second similarity between the main directional feature and a registered main directional feature stored in the storage unit, and obtains a third similarity between the orthogonal directional feature and a registered orthogonal directional feature stored in the storage unit;
a similarity adjustment unit that weights the third similarity more than the second similarity and outputs the sum of the weighted similarities as a fourth similarity; and
a determination unit that determines, using the first similarity and the fourth similarity, whether or not a subject is the registered person.
- A biometric authentication method in which a computer:
extracts, from an input image, directional features each corresponding to a different direction;
among the plurality of extracted directional features, lowers the overall luminance value of the directional feature corresponding to a predetermined direction, raises the overall luminance value of the directional feature corresponding to a direction orthogonal to the directional feature corresponding to the predetermined direction, and outputs the other directional features as they are;
generates an omnidirectional feature from the plurality of output directional features;
obtains a similarity between the omnidirectional feature and a registered omnidirectional feature stored in a storage unit; and
determines, using the similarity, whether or not a subject is the registered person.
- A biometric authentication method in which a computer:
extracts, from an input image, directional features each corresponding to a different direction;
generates an omnidirectional feature from the plurality of extracted directional features;
selects, from among the plurality of extracted directional features, the directional feature corresponding to a predetermined direction and outputs it as a main directional feature;
selects, from among the plurality of extracted directional features, the directional feature corresponding to a direction orthogonal to the main directional feature and outputs it as an orthogonal directional feature;
obtains a first similarity between the omnidirectional feature and a registered omnidirectional feature stored in a storage unit;
obtains a second similarity between the main directional feature and a registered main directional feature stored in the storage unit;
obtains a third similarity between the orthogonal directional feature and a registered orthogonal directional feature stored in the storage unit;
weights the second similarity less than the third similarity and outputs the sum of the weighted similarities as a fourth similarity; and
determines, using the first similarity and the fourth similarity, whether or not a subject is the registered person.
- A biometric authentication method in which a computer:
extracts, from an input image, directional features each corresponding to a different direction;
generates an omnidirectional feature from the plurality of extracted directional features;
selects, from among the plurality of extracted directional features, the directional feature corresponding to a predetermined direction and outputs it as a main directional feature;
selects, from among the plurality of extracted directional features, the directional feature corresponding to a direction orthogonal to the main directional feature and outputs it as an orthogonal directional feature;
obtains a first similarity between the omnidirectional feature and a registered omnidirectional feature stored in a storage unit;
obtains a second similarity between the main directional feature and a registered main directional feature stored in the storage unit;
obtains a third similarity between the orthogonal directional feature and a registered orthogonal directional feature stored in the storage unit;
weights the third similarity more than the second similarity and outputs the sum of the weighted similarities as a fourth similarity; and
determines, using the first similarity and the fourth similarity, whether or not a subject is the registered person.
- A program for causing a computer to execute a process of:
extracting, from an input image, directional features each corresponding to a different direction;
among the plurality of extracted directional features, lowering the overall luminance value of the directional feature corresponding to a predetermined direction, raising the overall luminance value of the directional feature corresponding to a direction orthogonal to the directional feature corresponding to the predetermined direction, and outputting the other directional features as they are;
generating an omnidirectional feature from the plurality of output directional features;
obtaining a similarity between the omnidirectional feature and a registered omnidirectional feature stored in a storage unit; and
determining, using the similarity, whether or not a subject is the registered person.
- A program for causing a computer to execute a process of:
extracting, from an input image, directional features each corresponding to a different direction;
generating an omnidirectional feature from the plurality of extracted directional features;
selecting, from among the plurality of extracted directional features, the directional feature corresponding to a predetermined direction and outputting it as a main directional feature;
selecting, from among the plurality of extracted directional features, the directional feature corresponding to a direction orthogonal to the main directional feature and outputting it as an orthogonal directional feature;
obtaining a first similarity between the omnidirectional feature and a registered omnidirectional feature stored in a storage unit;
obtaining a second similarity between the main directional feature and a registered main directional feature stored in the storage unit;
obtaining a third similarity between the orthogonal directional feature and a registered orthogonal directional feature stored in the storage unit;
weighting the second similarity less than the third similarity and outputting the sum of the weighted similarities as a fourth similarity; and
determining, using the first similarity and the fourth similarity, whether or not a subject is the registered person.
- A program for causing a computer to execute a process of:
extracting, from an input image, directional features each corresponding to a different direction;
generating an omnidirectional feature from the plurality of extracted directional features;
selecting, from among the plurality of extracted directional features, the directional feature corresponding to a predetermined direction and outputting it as a main directional feature;
selecting, from among the plurality of extracted directional features, the directional feature corresponding to a direction orthogonal to the main directional feature and outputting it as an orthogonal directional feature;
obtaining a first similarity between the omnidirectional feature and a registered omnidirectional feature stored in a storage unit;
obtaining a second similarity between the main directional feature and a registered main directional feature stored in the storage unit;
obtaining a third similarity between the orthogonal directional feature and a registered orthogonal directional feature stored in the storage unit;
weighting the third similarity more than the second similarity and outputting the sum of the weighted similarities as a fourth similarity; and
determining, using the first similarity and the fourth similarity, whether or not a subject is the registered person.
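As an illustration only, the process of the first program claim above can be composed end to end roughly as follows. The luminance gains, the Jaccard-style similarity measure, and the acceptance threshold are all assumptions introduced for this sketch; the claim itself does not fix them.

```python
import numpy as np

def authenticate(feats, registered_omni, main_dir, threshold=0.9):
    """Minimal sketch of the claimed process: adjust luminance per
    direction, take the point-wise maximum to form the omnidirectional
    feature, compare it with the registered feature, and decide."""
    n = len(feats)
    adjusted = list(feats)
    adjusted[main_dir] = feats[main_dir] * 0.5        # lower the predetermined direction
    orth = (main_dir + n // 2) % n
    adjusted[orth] = feats[orth] * 1.5                # raise the orthogonal direction
    omni = np.max(np.stack(adjusted), axis=0)         # omnidirectional feature
    # Similarity: overlap of binarized features (a hypothetical measure)
    a = omni >= 0.5
    b = registered_omni >= 0.5
    sim = np.logical_and(a, b).sum() / max(np.logical_or(a, b).sum(), 1)
    return sim >= threshold
```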
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016509686A JP6069582B2 (ja) | 2014-03-25 | 2014-03-25 | Biometric authentication device, biometric authentication method, and program |
PCT/JP2014/058386 WO2015145591A1 (ja) | 2014-03-25 | 2014-03-25 | Biometric authentication device, biometric authentication method, and program |
EP14887087.6A EP3125193B1 (en) | 2014-03-25 | 2014-03-25 | Biometric authentication device, biometric authentication method, and program |
US15/261,137 US10019619B2 (en) | 2014-03-25 | 2016-09-09 | Biometrics authentication device and biometrics authentication method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2014/058386 WO2015145591A1 (ja) | 2014-03-25 | 2014-03-25 | Biometric authentication device, biometric authentication method, and program |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/261,137 Continuation US10019619B2 (en) | 2014-03-25 | 2016-09-09 | Biometrics authentication device and biometrics authentication method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015145591A1 true WO2015145591A1 (ja) | 2015-10-01 |
Family
ID=54194189
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/058386 WO2015145591A1 (ja) | 2014-03-25 | 2014-03-25 | Biometric authentication device, biometric authentication method, and program |
Country Status (4)
Country | Link |
---|---|
US (1) | US10019619B2 (ja) |
EP (1) | EP3125193B1 (ja) |
JP (1) | JP6069582B2 (ja) |
WO (1) | WO2015145591A1 (ja) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017022507A (ja) * | 2015-07-09 | 2017-01-26 | Ricoh Company, Ltd. | Communication device, communication system, and program |
KR102468133B1 (ko) * | 2016-02-29 | 2022-11-18 | LG Electronics Inc. | Foot vein authentication device |
CN113722692B (zh) * | 2021-09-07 | 2022-09-02 | Moqi Technology (Beijing) Co., Ltd. | Identity recognition device and method |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000358025A (ja) * | 1999-06-15 | 2000-12-26 | Nec Corp | Information processing method, information processing device, and recording medium storing an information processing program |
JP2005149455A (ja) * | 2003-10-21 | 2005-06-09 | Sharp Corp | Image collation device, image collation method, image collation program, and computer-readable recording medium on which the image collation program is recorded |
JP2006301881A (ja) * | 2005-04-19 | 2006-11-02 | Glory Ltd | Money identification device, money identification method, and money identification program |
JP2009245347A (ja) * | 2008-03-31 | 2009-10-22 | Fujitsu Ltd | Pattern alignment method, collation method, and collation device |
JP2009301104A (ja) * | 2008-06-10 | 2009-12-24 | Chube Univ | Object detection device |
WO2012020718A1 (ja) * | 2010-08-12 | 2012-02-16 | Nec Corporation | Image processing device, image processing method, and image processing program |
JP2012073684A (ja) * | 2010-09-27 | 2012-04-12 | Fujitsu Ltd | Image recognition method, apparatus, and program |
WO2013136553A1 (ja) * | 2012-03-16 | 2013-09-19 | Universal Robot Co., Ltd. | Personal authentication method and personal authentication device |
JP2013200673A (ja) * | 2012-03-23 | 2013-10-03 | Fujitsu Ltd | Biometric information processing device, biometric information processing method, and biometric information processing program |
Family Cites Families (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6282304B1 (en) * | 1999-05-14 | 2001-08-28 | Biolink Technologies International, Inc. | Biometric system for biometric input, comparison, authentication and access control and method therefor |
US7072525B1 (en) * | 2001-02-16 | 2006-07-04 | Yesvideo, Inc. | Adaptive filtering of visual image using auxiliary image information |
US6876757B2 (en) * | 2001-05-25 | 2005-04-05 | Geometric Informatics, Inc. | Fingerprint recognition system |
US7636455B2 (en) * | 2002-06-04 | 2009-12-22 | Raytheon Company | Digital image edge detection and road network tracking method and system |
CN1238809C (zh) * | 2002-09-04 | 2006-01-25 | Changchun Hongda Optoelectronics and Biometric Identification Technology Co., Ltd. | Fingerprint identification method, and fingerprint control method and system |
US7496214B2 (en) * | 2002-09-25 | 2009-02-24 | The Hong Kong Polytechnic University | Method of palm print identification |
HK1062117A2 (en) * | 2002-09-25 | 2004-09-17 | Univ Hong Kong Polytechnic | Method of palm print identification using geometry, line and/or texture features |
US7664326B2 (en) * | 2004-07-09 | 2010-02-16 | Aloka Co., Ltd | Method and apparatus of image processing to detect and enhance edges |
US7359555B2 (en) * | 2004-10-08 | 2008-04-15 | Mitsubishi Electric Research Laboratories, Inc. | Detecting roads in aerial images using feature-based classifiers |
KR100752640B1 (ko) * | 2005-01-05 | 2007-08-29 | Samsung Electronics Co., Ltd. | Apparatus and method for segmenting fingerprint regions using a directional gradient filter |
US20070036400A1 (en) * | 2005-03-28 | 2007-02-15 | Sanyo Electric Co., Ltd. | User authentication using biometric information |
JP4871144B2 (ja) * | 2006-01-13 | 2012-02-08 | Toshiba Corporation | Image processing device, method, and program |
AU2006350242A1 (en) * | 2006-11-03 | 2008-05-08 | Snowflake Technologies Corporation | Method and apparatus for extraction and matching of biometric detail |
US20080298642A1 (en) * | 2006-11-03 | 2008-12-04 | Snowflake Technologies Corporation | Method and apparatus for extraction and matching of biometric detail |
KR101035930B1 (ko) * | 2007-01-24 | 2011-05-23 | Fujitsu Ltd | Image reading device, recording medium recording an image reading program, and image reading method |
KR101484566B1 (ko) * | 2007-03-21 | 2015-01-20 | Lumidigm Inc. | Biometrics based on locally consistent features |
KR100946155B1 (ko) * | 2007-08-20 | 2010-03-10 | Kyung Hee University Industry-Academic Cooperation Foundation | Method for improving blood-vessel sharpness using a decimation-free directional filter bank |
JP5061988B2 (ja) * | 2008-03-25 | 2012-10-31 | Nec Corporation | Ridge direction extraction device, ridge direction extraction program, and ridge direction extraction method |
US20100158329A1 (en) * | 2008-12-19 | 2010-06-24 | Shajil Asokan Thaniyath | Elegant Solutions for Fingerprint Image Enhancement |
US20120108973A1 (en) * | 2010-11-01 | 2012-05-03 | Toshiba Medical Systems Corporation | Ultrasonic diagnostic apparatus and ultrasonic image processing apparatus |
WO2012078114A1 (en) * | 2010-12-09 | 2012-06-14 | Nanyang Technological University | Method and an apparatus for determining vein patterns from a colour image |
JP2012256272A (ja) * | 2011-06-10 | 2012-12-27 | Seiko Epson Corp | Biometric identification device and biometric identification method |
US20130004028A1 (en) * | 2011-06-28 | 2013-01-03 | Jones Michael J | Method for Filtering Using Block-Gabor Filters for Determining Descriptors for Images |
KR101217214B1 (ko) * | 2012-02-15 | 2012-12-31 | Dongguk University Industry-Academic Cooperation Foundation | Method for sharpening medical blood-vessel images |
JP5971089B2 (ja) * | 2012-11-14 | 2016-08-17 | Fujitsu Ltd | Biometric information correction device, biometric information correction method, and computer program for biometric information correction |
JP6116291B2 (ja) * | 2013-02-27 | 2017-04-19 | Olympus Corporation | Image processing device, image processing method, and image processing program |
US9141872B2 (en) * | 2013-09-11 | 2015-09-22 | Digitalglobe, Inc. | Automated and scalable object and feature extraction from imagery |
2014
- 2014-03-25 WO PCT/JP2014/058386 patent/WO2015145591A1/ja active Application Filing
- 2014-03-25 EP EP14887087.6A patent/EP3125193B1/en active Active
- 2014-03-25 JP JP2016509686A patent/JP6069582B2/ja active Active
2016
- 2016-09-09 US US15/261,137 patent/US10019619B2/en active Active
Non-Patent Citations (2)
Title |
---|
ARUN ROSS ET AL.: "A hybrid fingerprint matcher", PATTERN RECOGNITION, vol. 36, no. 7, 31 July 2003 (2003-07-31), pages 1661 - 1673, XP004417159 * |
See also references of EP3125193A4 * |
Also Published As
Publication number | Publication date |
---|---|
JP6069582B2 (ja) | 2017-02-01 |
JPWO2015145591A1 (ja) | 2017-04-13 |
EP3125193A1 (en) | 2017-02-01 |
EP3125193B1 (en) | 2020-12-23 |
EP3125193A4 (en) | 2017-10-11 |
US20170206402A1 (en) | 2017-07-20 |
US10019619B2 (en) | 2018-07-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6553976B2 (ja) | Authentication device, authentication method, and recording medium | |
JP6528608B2 (ja) | Diagnostic device, learning processing method in a diagnostic device, and program | |
JP2011159035A (ja) | Biometric authentication device, biometric authentication method, and program | |
CN108399374B (zh) | Method and device for selecting candidate fingerprint images for fingerprint recognition | |
JP5656768B2 (ja) | Image feature extraction device and program therefor | |
JP6069582B2 (ja) | Biometric authentication device, biometric authentication method, and program | |
EP3217659B1 (en) | Image processing apparatus, image processing method, and program | |
JP2015204023A (ja) | Subject detection device, subject detection method, and program | |
JP6069581B2 (ja) | Biometric authentication device, biometric authentication method, and program | |
JP6629150B2 (ja) | Palm detection device, palm print authentication device, palm detection method, and program | |
KR101450247B1 (ko) | Finger vein authentication method based on SIFT feature points | |
JP6431044B2 (ja) | Biometric authentication device, biometric authentication method, and program | |
JP6117988B2 (ja) | Biometric authentication device, biometric authentication method, and program | |
CN109409322B (zh) | Liveness detection method and device, and face recognition method and face detection system | |
KR20210127257A (ko) | Method for verifying the identity of a user by identifying an object within an image that has a biometric characteristic of the user and separating a portion of the image comprising the biometric characteristic from other portions of the image | |
JP2010277196A (ja) | Information processing device, information processing method, and program | |
WO2015147088A1 (ja) | Biometric information registration method, biometric authentication method, biometric information registration device, biometric authentication device, and program | |
Sindu et al. | Extraction of Iris Crypt, Pigment Spot, and Wolfflin Nodule Biological Feature using Feature Point Selection Algorithms | |
Ahmer et al. | Human identity recognition using ear lobes geometric features |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14887087 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016509686 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REEP | Request for entry into the european phase |
Ref document number: 2014887087 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014887087 Country of ref document: EP |