CN118329894A - Review image processing method and review system - Google Patents
- Publication number
- CN118329894A (application CN202310033813.XA)
- Authority
- CN
- China
- Prior art keywords
- image
- review
- detected
- flaw
- images
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
- G01N2021/8887—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques
Abstract
A review image processing method comprises the following steps: obtaining detection information of an object to be detected and dividing the object into a plurality of sub-blocks, wherein the object has a flaw; acquiring, for each sub-block having the flaw, a plurality of review images captured at a plurality of focus positions according to the detection information; calculating the edge sharpness of each review image and selecting the review image with the highest edge sharpness; synthesizing the selected review images; and obtaining a complete, accurately focused flaw image.
Description
Technical Field
The present application relates to an image processing method, and more particularly to a review image processing method and a review system used after automatic optical inspection.
Background
In the manufacture of products such as printed circuit boards and semiconductor devices, flaw detection is a necessary step for maintaining high yield and high quality. During inspection, an automatic optical inspection (AOI) device typically photographs the object to be detected with a camera, extracts features from the captured images by image processing, and compares those features against a reference sample to identify flaws.
When the automatic optical inspection equipment identifies a suspected flaw on the object through image features, it hands the object over to a review system, where a review camera photographs the location of the suspected flaw and the image is re-examined to confirm whether the flaw is genuine. In the prior-art review procedure, the focusing mechanism of the review camera is driven to a focus position, the flaw region of the object is photographed, and the flaw review is then performed on the captured image.
When the camera in the review system uses a lens with a shallow depth of field, the demand on focusing accuracy is very high: even a slight deviation blurs the flaw image. In particular, a review image captured of an object that has height differences may exhibit local defocus, which degrades the review result.
Disclosure of Invention
In view of the need for the flaw review procedure to acquire accurately focused images, the main purpose of the present application is to provide a review image processing method comprising the following steps: obtaining detection information of an object to be detected and dividing the object into a plurality of sub-blocks, wherein the object has a flaw; acquiring, for each sub-block having the flaw, a plurality of review images captured at a plurality of focus positions according to the detection information; calculating the edge sharpness of each review image and selecting the review image with the highest edge sharpness; synthesizing the selected review images; and obtaining a complete, accurately focused flaw image.
Another object of the present application is to provide a review system that obtains detection information of an object to be detected having a flaw from an automatic optical inspection system. The review system comprises a stage, a camera, and an image processing device. The stage carries the object to be detected. The camera photographs the object. The image processing device controls the camera and executes a review image processing method comprising the following steps: dividing the object into a plurality of sub-blocks; acquiring, for each sub-block having the flaw, a plurality of review images captured at a plurality of focus positions according to the detection information; calculating the edge sharpness of each review image and selecting the review image with the highest edge sharpness; synthesizing the selected review images; and obtaining a complete, accurately focused flaw image.
Therefore, the application effectively solves the problem of review-image blur that arises when a shallow depth-of-field lens is used for optical review. At the same time, locations with height differences in the review image remain sharp, so the overall imaging effect is equivalent to increasing the depth of field of the lens.
For a further understanding of the nature and technical content of the present application, reference should be made to the following detailed description and the accompanying drawings, which are provided for reference and illustration only and are not intended to limit the application.
Drawings
FIG. 1 shows a schematic diagram of a review system according to an embodiment of the present application.
Fig. 2A is a schematic diagram showing an embodiment of the review system of the present application using a camera to capture an object to be inspected.
FIG. 2B is an enlarged view of the IIB block in FIG. 2A.
FIG. 3 is a flowchart of a review image processing method according to an embodiment of the application.
Fig. 4 is a schematic diagram showing an embodiment of the present application for dividing an object to be detected into a plurality of sub-blocks.
Fig. 5 is a schematic diagram showing an embodiment of the present application in which review images of an object to be inspected are captured at multiple focus pitches.
FIG. 6 is a schematic diagram showing an embodiment of the present application for obtaining the clearest image from a plurality of review images.
FIG. 7 is a schematic diagram showing an embodiment of the review image processing method of the present application, in which the sharpest sub-block images are selected and combined into the clearest flaw image.
Detailed Description
One of the main goals of flaw processing is to obtain an accurately focused flaw image. To obtain a flaw image of the surface of an object under test, the object may first be photographed by the camera of an automatic optical inspection (AOI) system; the captured image is then compared with a reference image by image recognition, or a trained deep-learning system is used to identify the flawed regions; finally, the flaw is confirmed through a review procedure and a clear flaw image is obtained.
In view of this, the present application provides a review image processing method and a review system. As shown in Fig. 1, in an embodiment the review system includes an image processing apparatus 100, a camera 110, and a stage 120. In an embodiment, the image processing apparatus 100 may be implemented by a computer system, with its functional modules realized in software running on the hardware, but it is not limited thereto. The image processing apparatus 100 includes a detection information acquisition unit 101, a camera control unit 103, a flaw definition judgment unit 105, a flaw image synthesis unit 107, and a review unit 109. In other embodiments, the implementation of the image processing apparatus 100 may be varied according to design requirements.
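To make the unit layout concrete, the following Python skeleton is one possible way to organize the five units 101-109 in software; the class and method names are hypothetical illustrations and do not come from the patent.

```python
class ReviewImageProcessor:
    """Hypothetical skeleton mirroring units 101-109 of the image processing apparatus 100."""

    def acquire_detection_info(self, aoi_result):
        # detection information acquisition unit 101: flaw type/size/position or a detection image
        self.detection_info = aoi_result

    def capture_review_images(self, camera, focus_positions):
        # camera control unit 103: drive the review camera to each focus position
        return [camera.capture(z) for z in focus_positions]   # camera.capture(z) is an assumed driver call

    def judge_sharpness(self, review_images):
        # flaw definition judgment unit 105: score each image and keep the sharpest one
        raise NotImplementedError

    def synthesize(self, best_block_images):
        # flaw image synthesis unit 107: merge the sharpest block images into one flaw image
        raise NotImplementedError

    def review(self, flaw_image):
        # review unit 109: confirm whether the flaw is genuine
        raise NotImplementedError
```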
In one embodiment, after the detection information acquisition unit 101 obtains detection information of the object to be detected 200 from the automatic optical inspection system, the camera control unit 103 controls the camera 110 to photograph the object 200 placed on the stage 120, so as to perform the flaw review operation. In one embodiment, the object 200 has a flaw. In one embodiment, the object 200 may be, but is not limited to, any of various industrial products such as a printed circuit board, a wafer, an integrated circuit layout, or a substrate. In one embodiment, the detection information of the object 200 may include, but is not limited to, the type, size, and position of the flaw. In another embodiment, the detection information may be a detection image, from which the image processing apparatus 100 analyzes the type, size, and position of the flaw.
In one embodiment, the flaw definition judgment unit 105 obtains the review images from the camera 110 and judges their sharpness to identify the in-focus, clear image. In one embodiment, the flaw image synthesis unit 107 combines the clear images of the different flaw image blocks to obtain a complete, accurately focused flaw image. In one embodiment, the review unit 109 performs a review judgment on the complete, accurately focused flaw image to confirm whether the flaw is genuine, thereby completing the review operation.
Figs. 2A and 2B show an embodiment of acquiring an image of an object to be inspected 200 that has a height difference. According to Fig. 2A, in one embodiment the camera 110 has a lens 111 and the object 200 has a surface 210. In one embodiment, the lens 111 is a shallow depth-of-field lens, and the surface 210 of the object 200 has a height-difference region 211. The camera 110 photographs the surface 210 of the object 200 through the lens 111. In one embodiment, the height-difference region 211 of the surface 210 has a height difference d.
When the lens 111 captures the surface 210 of the object 200 at a single focus setting, the height difference prevents every position from being accurately focused, so a blurred area appears in the image. For example, in the height-difference region 211, when the image is accurately focused at the upper edge position 211a of the height difference d, the lower edge position 211b in the image may be blurred. Therefore, to obtain a complete image that is accurately focused at multiple locations, in one embodiment the camera 110 is controlled to capture multiple images of the same area of the surface 210 at multiple focus positions, i.e., with different focus depths, from which a clearly focused image can be obtained.
To obtain an accurately focused flaw image, as shown in Fig. 3, the review image processing method of the present application includes the following steps: S701, obtaining detection information of the object to be detected, dividing the object into a plurality of sub-blocks, and determining that the object has a flaw; S703, acquiring, for each sub-block having the flaw, a plurality of review images captured at a plurality of focus positions according to the detection information; S705, calculating the edge sharpness of each review image and selecting the review image with the highest edge sharpness; S707, synthesizing the selected review images; and S709, obtaining a complete, accurately focused flaw image.
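Read as code, steps S701-S709 amount to a per-block focus stack, a sharpness argmax, and a mosaic. The following Python sketch (OpenCV/NumPy) is only one possible reading: the `capture_at` callable, the tile coordinate format, and the focus positions are assumptions supplied by the caller, not elements defined in the patent.

```python
import cv2
import numpy as np

def review_flaw(capture_at, flaw_tiles, focus_positions):
    """capture_at(z) -> BGR image of the whole object at focus position z (assumed camera driver).
    flaw_tiles: iterable of (y0, y1, x0, x1) sub-blocks covering the flaw (step S701).
    Returns a complete, accurately focused flaw image (step S709)."""
    best = {}
    for (y0, y1, x0, x1) in flaw_tiles:
        # S703: capture a focus stack for this block under test
        stack = [capture_at(z)[y0:y1, x0:x1] for z in focus_positions]
        # S705: edge sharpness of each slice (variance of the Laplacian of the gray image)
        scores = [cv2.Laplacian(cv2.cvtColor(im, cv2.COLOR_BGR2GRAY), cv2.CV_64F).var()
                  for im in stack]
        best[(y0, y1, x0, x1)] = stack[int(np.argmax(scores))]
    # S707/S709: paste every winning tile onto one canvas covering the flaw
    y0s, y1s, x0s, x1s = zip(*best.keys())
    oy, ox = min(y0s), min(x0s)
    canvas = np.zeros((max(y1s) - oy, max(x1s) - ox, 3), dtype=np.uint8)
    for (y0, y1, x0, x1), tile in best.items():
        canvas[y0 - oy:y1 - oy, x0 - ox:x1 - ox] = tile
    return canvas
```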
Fig. 4 shows an embodiment of step S701, in which the object to be detected 200 is divided into a plurality of sub-blocks. The object 200 is divided into sub-blocks, and the sub-blocks that wholly or partly cover a flaw 212 form a plurality of blocks under test. One main purpose of the division is to separate the whole object 200 into distinct blocks so that each block can be processed individually to obtain an accurately focused block image. It should be noted that the size of the sub-blocks can be chosen according to actual requirements, such as the surface flaw pattern and the required flaw resolution. In one embodiment, for example, the sub-blocks covered by the flaw 212 are the blocks under test numbered No.21, No.22, No.23, No.27, No.28 and No.29, and the subsequent review is performed on these six blocks.
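One minimal way to perform the division of step S701 is to tile the detection image on a fixed grid and keep the tiles whose rectangles intersect the reported flaw bounding box. The sketch below assumes a tile size and a (fy0, fx0, fy1, fx1) bounding-box format that are not specified by the patent.

```python
def tiles_covering_flaw(img_h, img_w, tile_h, tile_w, flaw_box):
    """flaw_box = (fy0, fx0, fy1, fx1): flaw bounding box reported by the AOI system (assumed format).
    Returns {tile_no: (y0, y1, x0, x1)} for every sub-block that wholly or partly covers the flaw."""
    fy0, fx0, fy1, fx1 = flaw_box
    tiles, tile_no = {}, 0
    for y0 in range(0, img_h, tile_h):
        for x0 in range(0, img_w, tile_w):
            tile_no += 1
            y1, x1 = min(y0 + tile_h, img_h), min(x0 + tile_w, img_w)
            if y0 < fy1 and fy0 < y1 and x0 < fx1 and fx0 < x1:   # tile rectangle overlaps the flaw
                tiles[tile_no] = (y0, y1, x0, x1)
    return tiles

# illustrative only: a 600x600 detection image, 100x100 sub-blocks, hypothetical flaw box
blocks_under_test = tiles_covering_flaw(600, 600, 100, 100, (150, 220, 340, 430))
```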
Next, referring to Fig. 1, in step S703, after the blocks under test covering the flaw 212 are obtained, the review system controls the camera 110 through the camera control unit 103 of the image processing apparatus 100 to move sequentially to a plurality of focus positions separated by a fixed pitch and to capture a plurality of review images of each block under test of the object 200.
In one embodiment, the acquisition of multiple review images can be understood with reference to the embodiment shown in Fig. 5, in which review images of the object 200 are captured at multiple pitches. The review system adjusts the camera 110 along one axis (the Z axis in one embodiment) to photograph the object 200 at a plurality of focus positions separated by a fixed pitch p, in particular at the positions of the blocks under test, thereby obtaining the review images 401, 403 and 405 shown in Fig. 5, all of which correspond to the same block under test. It should be noted that the pitch p can be chosen according to the actual depth of field and the height difference of the surface flaw, so that each pixel of the flaw image can attain the highest edge sharpness (edge acutance).
In detail, for the blocks under test No.21, No.22, No.23, No.27, No.28 and No.29 covered by the flaw 212, the review system first captures a plurality of review images of block No.21 at a plurality of focus positions separated by the fixed pitch p. After the review images of block No.21 have been captured, the review images of block No.22 are captured in the same way. The review image capture of blocks No.21, No.22, No.23, No.27, No.28 and No.29 is thus completed sequentially according to the block numbers.
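The capture order described above, i.e., finishing the whole focus stack of one block before moving on to the next, can be sketched as a pair of nested loops. The `move_focus_to` and `grab_frame` callables stand in for whatever motion-control and camera API the review system actually exposes and are assumptions made for illustration.

```python
def capture_focus_stacks(move_focus_to, grab_frame, block_ids, z_start, pitch, n_positions):
    """For each block under test (e.g. No.21 ... No.29), capture one image per focus position
    z_start, z_start + pitch, ... and return {block_id: [image, ...]}."""
    stacks = {}
    for block_id in block_ids:                      # blocks are processed one after another
        stack = []
        for k in range(n_positions):
            move_focus_to(z_start + k * pitch)      # step the Z axis by the fixed pitch p
            stack.append(grab_frame(block_id))      # photograph the block at this focus depth
        stacks[block_id] = stack
    return stacks
```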
Then, in step S705 of the review image processing method, the flaw definition judgment unit 105 of the image processing apparatus 100 obtains the pixel gray-scale values of each review image of a given block under test and compares them to find the one review image with the highest edge sharpness, which here refers mainly to the sharpness or resolution of the whole image.
In one embodiment, edge detection may be performed using a mathematical operator such as the Laplacian operator, or other operators such as Sobel or Scharr; the present application does not limit the edge detection method, the main purpose being to obtain the edge sharpness of each review image. Edge sharpness refers to the gray-scale contrast at a boundary between dark and light tones; because the pixel values in the block under test change abruptly at such boundaries, the edge sharpness reveals which of the review images captured at the multiple focus positions is accurately focused.
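One common way to reduce "edge sharpness" to a single number, not mandated by the patent (which leaves the edge-detection operator open), is the variance of the Laplacian response of the gray-scale image; a Sobel/Scharr gradient magnitude behaves similarly. A minimal OpenCV sketch:

```python
import cv2
import numpy as np

def edge_sharpness(image_bgr, operator="laplacian"):
    """Return a scalar sharpness score for one review image; larger means better focused."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    if operator == "laplacian":
        response = cv2.Laplacian(gray, cv2.CV_64F)             # second-derivative edge response
    else:
        gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)        # Sobel-style gradients
        gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
        response = np.hypot(gx, gy)
    return float(response.var())                               # abrupt, strong edges -> large variance
```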
For example, Fig. 6 shows an embodiment in which the clearest image is selected from a plurality of review images. The review images (501, 502, 503, 504 and 505) captured of the object to be inspected at a plurality of focus positions separated by the fixed pitch p are shown as a first review image 501 numbered n=1, a second review image 502 numbered n=2, a third review image 503 numbered n=3, a fourth review image 504 numbered n=4, and a fifth review image 505 numbered n=5.
Since the purpose of edge detection is to find the pixels whose intensity changes abruptly in each image, i.e., the parts with high edge sharpness that can be identified as edges, in one embodiment of edge detection the Laplacian operator is applied to each review image to compute the edge sharpness of the gray-scaled pixels by a first differentiation, producing a two-dimensional plot as shown in Fig. 6 that reveals how the edge sharpness varies across the review images of a given block under test. In one embodiment, differentiating the gray-scale values of the third review image 503 (n=3) once yields a peak 50, and the peak 50 indicates that the third review image 503 has the highest edge sharpness. In one embodiment, a threshold may also be set to determine the review image with the highest edge sharpness. Furthermore, a second differentiation with the Laplacian operator may be performed to further confirm the review image with the highest edge sharpness.
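Selecting the review image that corresponds to the peak 50 then reduces to an argmax over the per-image scores, optionally gated by a threshold as the embodiment suggests; the threshold value below is an assumption, not a figure from the patent.

```python
import numpy as np

def pick_sharpest(stack, scores, min_score=None):
    """stack: review images (e.g. 501..505) of one block; scores: their edge-sharpness values.
    Returns (index_of_peak, image); raises if no image clears the optional threshold."""
    idx = int(np.argmax(scores))                     # position of the peak, e.g. n = 3 in Fig. 6
    if min_score is not None and scores[idx] < min_score:
        raise ValueError("no review image reaches the required edge sharpness")
    return idx, stack[idx]
```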
Next, in steps S707 and S709, the review images with the highest sharpness for the blocks under test are synthesized. The flaw image synthesis unit 107 of the image processing apparatus 100 combines the accurately focused review images of the blocks under test to form the final flaw image with the highest definition.
Once the edge sharpness values of all review images in all blocks under test covering the flaw have been obtained and compared, the image with the maximum edge sharpness in each block under test is selected, and these images are combined to form the flaw image with the highest sharpness. Fig. 7 shows an embodiment of the clearest flaw image obtained by this best-sharpness selection: the blocks under test covering the flaw 212, numbered No.21, No.22, No.23, No.27, No.28 and No.29, are shown schematically, together with the flaw image 60 of highest sharpness formed by combining the block images with the greatest edge sharpness.
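The combination of steps S707/S709 can be read as pasting each block's winning image back at its original coordinates on a single canvas. A minimal sketch, assuming every tile keeps the (y0, y1, x0, x1) rectangle it was cut from and that the images are 3-channel BGR:

```python
import numpy as np

def synthesize_flaw_image(best_tiles):
    """best_tiles: {(y0, y1, x0, x1): sharpest image of that block under test}.
    Returns one mosaic covering all blocks, i.e. the flaw image of highest sharpness."""
    y0s, y1s, x0s, x1s = zip(*best_tiles.keys())
    oy, ox = min(y0s), min(x0s)                               # origin of the mosaic
    canvas = np.zeros((max(y1s) - oy, max(x1s) - ox, 3), dtype=np.uint8)
    for (y0, y1, x0, x1), tile in best_tiles.items():
        canvas[y0 - oy:y1 - oy, x0 - ox:x1 - ox] = tile       # paste the in-focus block image
    return canvas
```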
After the flaw image 60 with the highest definition is obtained, the review unit 109 of the image processing apparatus 100 confirms whether the flaw 212 in the flaw image 60 is genuine, thereby completing the review operation. In one embodiment, the review unit 109 may obtain the type and position of the flaw 212 by general image analysis methods (such as Gaussian filtering, Fourier analysis, binarization, image subtraction, and pattern analysis), or identify and locate the flaw type with a neural network trained by machine learning or deep learning; the present application is not limited in this respect.
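Of the general image-analysis options listed above (Gaussian, Fourier, binarization, image subtraction, pattern analysis), a subtraction-plus-binarization check is enough to illustrate the idea. The sketch below assumes a defect-free reference image of the same block is available and uses illustrative threshold values; neither assumption comes from the patent.

```python
import cv2

def confirm_flaw(flaw_image_bgr, reference_bgr, diff_threshold=40, min_area=25):
    """Return (is_real_flaw, binary_mask): compare the focused flaw image against a golden
    reference and report whether a sufficiently large difference region remains."""
    a = cv2.GaussianBlur(cv2.cvtColor(flaw_image_bgr, cv2.COLOR_BGR2GRAY), (5, 5), 0)  # denoise
    b = cv2.GaussianBlur(cv2.cvtColor(reference_bgr, cv2.COLOR_BGR2GRAY), (5, 5), 0)
    diff = cv2.absdiff(a, b)                                               # image subtraction
    _, mask = cv2.threshold(diff, diff_threshold, 255, cv2.THRESH_BINARY)  # binarization
    return cv2.countNonZero(mask) >= min_area, mask
```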
In summary, the review image processing method and the review system provided by the present application effectively solve the problem of review-image blur that arises when a shallow depth-of-field lens is used for optical review. At the same time, locations with height differences in the review image remain sharp, so the overall imaging effect is equivalent to increasing the depth of field of the lens.
The above disclosure is only a preferred embodiment of the present application and is not intended to limit its scope; all equivalent technical changes made according to the specification and drawings of the present application therefore fall within the scope of the application.
Claims (18)
1. A method of review image processing, the method comprising:
obtaining detection information of an object to be detected, and dividing the object to be detected into a plurality of sub-blocks, wherein the object to be detected has a flaw;
acquiring a plurality of review images captured at a plurality of focus positions for each sub-block having the flaw according to the detection information;
calculating the edge sharpness of each review image, and obtaining the review image with the highest edge sharpness;
synthesizing the review images with the highest edge sharpness; and
obtaining a complete and accurately focused flaw image.
2. The method of claim 1, wherein the detection information of the object to be detected is a detection image captured by an automatic optical inspection system.
3. The review image processing method according to claim 1, wherein the detection information of the object to be detected is information including the category, size, and/or position of the flaw.
4. The method of claim 1, wherein all or a portion of the sub-blocks cover the flaw to form a plurality of blocks under test.
5. The method of claim 4, wherein in the step of acquiring the plurality of review images captured at the plurality of focus positions, a camera is sequentially adjusted to the focus positions at a fixed pitch and photographs the position of one of the blocks under test of the object to be detected to obtain the review images.
6. The review image processing method according to claim 1, wherein after the review images are acquired, each review image is converted into a gray-scale image, edge detection is performed based on the calculated pixel gray-scale values, and the review image with the highest calculated value is determined to be the review image with the highest edge sharpness.
7. The method of claim 6, wherein the edge detection differentiates each of the review images once to obtain the edge sharpness, thereby obtaining the edge features of the abrupt pixel-value changes covering the flaw.
8. The review image processing method according to claim 1, wherein after the flaw image is obtained, the flaw image is confirmed by an image analysis method.
9. The review image processing method according to claim 8, wherein after the flaw image is obtained, the image analysis of the flaw image is performed using a neural network.
10. A review system for obtaining detection information of an object to be detected having a flaw from an automatic optical inspection system, the review system comprising:
a stage for carrying the object to be detected;
a camera for photographing the object to be detected; and
an image processing device for controlling the camera and executing a review image processing method, the method comprising:
dividing the object to be detected into a plurality of sub-blocks;
acquiring a plurality of review images captured at a plurality of focus positions for each sub-block having the flaw according to the detection information;
calculating the edge sharpness of each review image, and obtaining the review image with the highest edge sharpness;
synthesizing the review images with the highest edge sharpness; and
obtaining a complete and accurately focused flaw image.
11. The review system of claim 10, wherein all or a portion of the sub-blocks cover the defect to form a plurality of blocks under test.
12. The review system of claim 11, wherein the camera is sequentially adjusted to the focus positions at a fixed pitch and photographs the position of one of the blocks under test of the object to be detected to obtain the review images.
13. The review system of claim 10, wherein after the review images are acquired, each review image is converted into a gray-scale image, edge detection is performed based on the calculated pixel gray-scale values, and the review image with the highest calculated value is determined to be the review image with the highest edge sharpness.
14. The review system of claim 13, wherein the edge detection is performed using a Laplacian operator, which differentiates each of the review images once to obtain the edge sharpness, thereby obtaining the edge features of the abrupt pixel-value changes covering the flaw.
15. The review system of claim 10, wherein the detection information of the object to be detected is a detection image, and the image processing device analyzes the position, size, and type of the flaw from the detection image.
16. The review system of claim 10, wherein the detection information of the object to be detected is information including the type, size, and/or location of the flaw.
17. The review system of claim 10, wherein the image processing device confirms the flaw in the flaw image by an image analysis method.
18. The review system of claim 17, wherein the image processing device performs the image analysis of the flaw image using a neural network.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310033813.XA CN118329894A (en) | 2023-01-10 | 2023-01-10 | Review image processing method and review system |
TW112107210A TW202429379A (en) | | 2023-03-01 | Method for processing review images and review system
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310033813.XA CN118329894A (en) | 2023-01-10 | 2023-01-10 | Review image processing method and review system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN118329894A true CN118329894A (en) | 2024-07-12 |
Family
ID=91773232
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310033813.XA Pending CN118329894A (en) | 2023-01-10 | 2023-01-10 | Review image processing method and review system |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN118329894A (en) |
TW (1) | TW202429379A (en) |
Also Published As
Publication number | Publication date |
---|---|
TW202429379A (en) | 2024-07-16 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |