
WO2020183837A1 - Counting system, counting device, machine learning device, counting method, component arrangement method, and program

Counting system, counting device, machine learning device, counting method, component arrangement method, and program

Info

Publication number: WO2020183837A1
Application number: PCT/JP2019/049274
Authority: WIPO (PCT)
Prior art keywords: information, group, counting, unit, image
Other languages: French (fr), Japanese (ja)
Inventor: Yuya Shimazaki (島﨑 祐也)
Original Assignee: Mitsubishi Electric Corporation (三菱電機株式会社)
Application filed by Mitsubishi Electric Corporation
Priority to CN201980093531.0A (CN113518998B)
Priority to JP2021505523A (JP7134331B2)
Publication of WO2020183837A1

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06M — COUNTING MECHANISMS; COUNTING OF OBJECTS NOT OTHERWISE PROVIDED FOR
    • G06M11/00 — Counting of objects distributed at random, e.g. on a surface
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 — Image analysis
    • G06T7/60 — Analysis of geometric attributes
    • G06T7/62 — Analysis of geometric attributes of area, perimeter, diameter or volume

Definitions

  • This disclosure relates to a counting system, a counting device, a machine learning device, a counting method, a component placement method, and a program.
  • Patent Document 1 discloses a counting device that photographs articles transferred continuously, binarizes the photographed image to obtain the area of the articles, and counts the articles from that area.
  • The counting device described in Patent Document 1 photographs articles transferred continuously by a conveyor and counts the articles lying between dividing lines, i.e. regions in which no articles exist. The articles on the conveyor must therefore be arranged so that such dividing lines occur. In addition, the number of articles that can be transferred per unit time is limited, so counting takes time. Even if the device is configured to count articles within a predetermined measurement range, accurate counting is impossible when an article lies across the boundary of that range.
  • The present disclosure has been made in view of the above problems, and its purpose is to enable a counting device that counts articles from a photographed image to count accurately and in a shorter time, without needing to store a reference image of the article.
  • the counting system includes a photographing device for photographing a stationary article and a counting device for counting the articles photographed by the photographing device.
  • the counting device has an image acquisition unit, a binarization unit, an area calculation unit, an article number calculation unit, and an information output unit.
  • the image acquisition unit acquires a captured image from the photographing device.
  • the binarization unit binarizes the captured image acquired by the image acquisition unit.
  • The area calculation unit calculates the area of each group into which the pixel distribution of the captured image binarized by the binarization unit is classified.
  • The article number calculation unit calculates the number of articles in each group from the group areas calculated by the area calculation unit, totals the numbers of articles of all groups, and generates article number information indicating the total number of articles shown in the captured image.
  • The information output unit outputs the article number information generated by the article number calculation unit.
  • In this way, a counting device that counts articles from a captured image binarizes a captured image of stationary articles, classifies the pixel distribution of the binarized image into groups, and counts the articles based on the group areas. It therefore does not need to store a reference image of the article and can count accurately in a shorter time.
  • A diagram showing an example of the part number information according to Embodiment 1, and a flowchart showing the counting process executed by the counting device according to Embodiment 1.
  • A diagram showing a functional configuration example of the counting device according to Embodiment 3, and a flowchart showing the counting process executed by the counting device according to Embodiment 3.
  • A diagram showing a functional configuration example of the counting device according to Embodiment 5, and a flowchart showing the counting process executed by the counting device according to Embodiment 5.
  • Hereinafter, the counting system, counting device, machine learning device, counting method, component placement method, and program according to the embodiments will be described in detail with reference to the drawings.
  • the same or corresponding parts are designated by the same reference numerals in the drawings.
  • This embodiment is an example of a counting system that counts parts.
  • the counting system 100 includes a counting device 1 for counting parts P, a photographing device 2 for photographing a photographing range C, and a user terminal 3 used by a user.
  • the counting device 1, the photographing device 2, and the user terminal 3 are connected by wire or wirelessly.
  • the photographing device 2 photographs the photographing range C while the component P is stationary, and transmits the photographed image to the counting device 1.
  • the timing at which the photographing device 2 shoots may be the timing at which the user inputs a shooting instruction to the photographing device 2, or may be the timing at which the photographing device 2 detects that the component P to be counted has been put into the shooting range C.
  • the counting device 1 binarizes the captured image received from the photographing device 2.
  • the counting device 1 classifies the pixel distribution of the binarized captured image into a group.
  • the counting device 1 calculates and totals the number of parts P for each group from the area of the group, and calculates the total number of parts P shown in the captured image.
  • the counting device 1 generates component number information indicating the calculated total number of components P and transmits it to the user terminal 3.
  • the number of parts information is an example of the number of articles information.
  • When there is a non-countable group, i.e. a group whose number of parts P cannot be calculated from its area, the counting device 1 generates non-countable information indicating that group and transmits it to the user terminal 3.
  • the user terminal 3 displays the non-countable information received from the counting device 1.
  • the user visually counts the number of parts P of the uncountable group indicated by the uncountable information displayed on the user terminal 3.
  • the user inputs numerical information indicating the number of parts P of the uncountable group to the user terminal 3.
  • the user terminal 3 transmits the input numerical information to the counting device 1.
  • When the counting device 1 receives the numerical information from the user terminal 3, it adopts the number of parts P of the non-countable group indicated by the numerical information and calculates the total number of parts P shown in the captured image. When the parts P do not overlap, for example when the parts P are spheres, no non-countable group occurs.
  • the counting device 1 generates component number information indicating the calculated total number of components P and transmits it to the user terminal 3.
  • the user terminal 3 outputs the component number information received from the counting device 1 by a method such as screen display or voice output. As a result, the user can grasp the number of inserted parts P.
  • The counting device 1 includes, as its functional configuration, an image acquisition unit 11 that receives a captured image from the photographing device 2, a binarization unit 12 that binarizes the captured image, a reduction processing unit 13 that reduces the binarized captured image, an area calculation unit 14 that calculates the area of each group into which the pixel distribution of the reduced captured image is classified, a part number calculation unit 15 that calculates the total number of parts P from the group areas, and an information output unit 16 that outputs information to the user terminal 3.
  • the image acquisition unit 11 receives the captured image from the photographing device 2.
  • the captured image is digital data.
  • the binarization unit 12 converts the captured image received by the image acquisition unit 11 into grayscale, and removes noise by a smoothing filter such as a Gaussian filter or a moving average filter.
  • the binarization unit 12 calculates a threshold value for a grayscale photographed image from which noise has been removed.
  • Methods of calculating the binarization threshold include iterative two-class clustering and the discriminant analysis method (Otsu's method), which selects the threshold that maximizes the degree of separation between the two classes. The discriminant analysis method requires no iterative calculation, so the binarization threshold can be calculated at high speed.
  • the binarization unit 12 binarizes the grayscale photographed image from which noise has been removed by using the calculated threshold value.
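  • The discriminant analysis method described above is commonly known as Otsu's method. A minimal NumPy sketch (illustrative function names, not the patent's actual implementation) shows why no iterative calculation is needed: the threshold is found in a single pass over a 256-bin histogram.

```python
import numpy as np

def otsu_threshold(gray: np.ndarray) -> int:
    """Discriminant analysis (Otsu): pick the threshold maximizing the
    between-class variance (the 'degree of separation')."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    best_t, best_sep = 0, -1.0
    for t in range(1, 256):
        w0, w1 = hist[:t].sum(), hist[t:].sum()
        if w0 == 0 or w1 == 0:
            continue  # one class empty: separation undefined
        m0 = (np.arange(t) * hist[:t]).sum() / w0          # mean of dark class
        m1 = (np.arange(t, 256) * hist[t:]).sum() / w1     # mean of bright class
        sep = w0 * w1 * (m0 - m1) ** 2                     # between-class variance
        if sep > best_sep:
            best_sep, best_t = sep, t
    return best_t

def binarize(gray: np.ndarray) -> np.ndarray:
    """Binarize a grayscale image with the Otsu threshold (1 = foreground)."""
    return (gray >= otsu_threshold(gray)).astype(np.uint8)
```

  • Because each candidate threshold is scored directly from the histogram, the whole computation is one loop over 256 bins, in contrast to clustering methods that iterate until convergence.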
  • the reduction processing unit 13 reduces the captured image binarized by the binarization unit 12.
  • Reducing the binarized captured image removes the fine structure of the component P shown in the image, which suppresses erroneous detection of the contour of the component P. If the component P has no fine structure, the counting device 1 need not include the reduction processing unit 13.
  • the area calculation unit 14 classifies the pixel distribution of the captured image reduced by the reduction processing unit 13 into groups, and calculates the number of pixels of each group as the area.
  • One method of classifying the pixel distribution into groups is, for example, the K-means method.
  • the area calculation unit 14 counts the number of edges in the binarized captured image and sets it as the value of the group number k.
  • Alternatively, the findContours function of OpenCV (Open Source Computer Vision Library) can be used to extract the groups.
  • The area calculation unit 14 may also classify the pixel distribution of the captured image binarized by the binarization unit 12 into groups and calculate the number of pixels of each group as its area.
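  • With OpenCV, the groups would typically be obtained via findContours or connectedComponentsWithStats. As a self-contained stand-in (a simplified sketch, not the unit's actual code), the same grouping can be expressed as 4-connected component labeling over the binarized image, with each group's pixel count serving as its area:

```python
from collections import deque

def group_areas(binary):
    """Label 4-connected groups of 1-pixels in a 2D binary grid and
    return the list of group areas (pixel counts)."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    areas = []
    for y in range(h):
        for x in range(w):
            if binary[y][x] == 1 and not seen[y][x]:
                # flood-fill one group with BFS, counting its pixels
                area, q = 0, deque([(y, x)])
                seen[y][x] = True
                while q:
                    cy, cx = q.popleft()
                    area += 1
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and binary[ny][nx] == 1 and not seen[ny][nx]):
                            seen[ny][nx] = True
                            q.append((ny, nx))
                areas.append(area)
    return areas
```

  • Each connected blob of foreground pixels is one group; touching or overlapping parts P merge into a single group whose area is a multiple of the single-part area, which is exactly what the later area-based counting exploits.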
  • The part number calculation unit 15 regards the smallest group area, excluding noise, among the areas calculated by the area calculation unit 14 as the area of one part P and uses it as the reference area. Noise can be removed, for example, as follows. A plurality of experiments are performed in advance using the counting device 1 to determine the lower and upper limits of the area that a group of parts P can take. Alternatively, the part number calculation unit 15 arranges the group areas calculated by the area calculation unit 14 in ascending or descending order, compares adjacent area values from the center toward the smaller end, and, if adjacent values differ by more than a predetermined amount, sets the larger of the two as the lower limit.
  • Likewise, the part number calculation unit 15 compares adjacent area values from the center toward the larger end, and, if adjacent values differ by more than the predetermined amount, sets the smaller of the two as the upper limit.
  • the component number calculation unit 15 removes a group having an area smaller than the lower limit and a group having an area larger than the upper limit as noise.
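  • The gap-based limits described above might be sketched as follows (a hedged reading of the described procedure; `gap`, the predetermined difference between adjacent values, is an assumed parameter):

```python
def noise_limits(areas, gap):
    """Scan sorted areas outward from the median; the first jump larger
    than `gap` fixes the lower/upper limit of plausible group areas."""
    s = sorted(areas)
    mid = len(s) // 2
    lower, upper = s[0], s[-1]
    # toward smaller values: a large drop marks the lower limit
    for i in range(mid, 0, -1):
        if s[i] - s[i - 1] > gap:
            lower = s[i]
            break
    # toward larger values: a large jump marks the upper limit
    for i in range(mid, len(s) - 1):
        if s[i + 1] - s[i] > gap:
            upper = s[i]
            break
    return lower, upper

def remove_noise(areas, gap):
    """Drop groups whose area falls outside the [lower, upper] band."""
    lo, hi = noise_limits(areas, gap)
    return [a for a in areas if lo <= a <= hi]
```

  • Scanning outward from the median assumes most groups are genuine parts, so specks (tiny areas) and image-wide artifacts (huge areas) sit beyond a visible gap on either side.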
  • The part number calculation unit 15 counts the number of parts P as one for a group whose area falls within the first range, which includes 1 times the reference area.
  • It counts the number of parts P as two for a group whose area falls within the second range, which includes 2 times the reference area.
  • It counts the number of parts P as three for a group whose area falls within the third range, which includes 3 times the reference area.
  • In general, it counts the number of parts P as N for a group whose area falls within the Nth range, which includes N times the reference area, and adds these counts together.
  • N is a positive integer.
  • the Nth range is the minimum range that absorbs errors due to the shape and orientation of the component P.
  • The part number calculation unit 15 determines that a group whose area falls outside every Nth range contains overlapping parts P and cannot be counted.
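  • The Nth-range rule above can be sketched as follows: a group counts as N parts if its area lies within a tolerance band around N times the reference area, and is otherwise flagged non-countable. The tolerance `tol` that absorbs shape/orientation error is an assumed parameter, not a value from the disclosure.

```python
def count_group(area, reference, tol=0.25):
    """Return N if area is within the Nth range (N*reference +/- tol*reference),
    or None if the group is non-countable (parts presumed overlapping)."""
    n = round(area / reference)
    if n >= 1 and abs(area - n * reference) <= tol * reference:
        return n
    return None

def total_parts(areas, reference, tol=0.25):
    """Total the countable groups; collect the areas of non-countable ones,
    to be resolved later by user input or a leveling operation."""
    total, uncountable = 0, []
    for a in areas:
        n = count_group(a, reference, tol)
        if n is None:
            uncountable.append(a)
        else:
            total += n
    return total, uncountable
```

  • For example, with a reference area of 100 pixels, a 205-pixel group counts as two parts, while a 160-pixel group falls between the first and second ranges and is flagged non-countable.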
  • the part number calculation unit 15 generates non-countable information indicating a non-countable group.
  • the information output unit 16 transmits the non-countable information generated by the component number calculation unit 15 to the user terminal 3.
  • the component number calculation unit 15 receives numerical information indicating the number of components P in a group that cannot be counted from the user terminal 3.
  • the part number calculation unit 15 adopts the number of parts P of the uncountable group indicated by the numerical information, and calculates the total number of parts P shown in the photographed image.
  • the component number calculation unit 15 generates component number information indicating the total number of calculated components P.
  • the number of parts calculation unit 15 is an example of the number of articles calculation unit.
  • the information output unit 16 transmits the component number information generated by the component number calculation unit 15 to the user terminal 3.
  • FIG. 3A shows an example of non-countable information.
  • Images of all groups except noise are displayed.
  • the group that the number of parts calculation unit 15 has determined to be uncountable is surrounded by a square frame, and the characters "uncountable” are displayed.
  • the characters "Uncountable” are also displayed in the "Number of parts" column.
  • The non-countable information is not limited to the example of FIG. 3A; for example, the group that the part number calculation unit 15 determined to be non-countable may be displayed in a color different from the other groups, or may blink. A "number of parts" column is not required.
  • the non-countable information may be information indicating the coordinates of the non-countable group on the photographing range C.
  • The user visually counts the number of parts P in the group surrounded by the square frame and inputs numerical information indicating that number, "5", to the user terminal 3.
  • the user terminal 3 transmits the input numerical information to the counting device 1.
  • The part number calculation unit 15 of the counting device 1 adds the number of parts P "5" indicated by the numerical information to the total "46" obtained by summing the numbers of parts P calculated from the group areas, and thereby calculates the total number "51" of parts P shown in the captured image.
  • the component number calculation unit 15 generates component number information indicating the total number of components P “51” as shown in FIG. 3B.
  • the information output unit 16 transmits the component number information generated by the component number calculation unit 15 to the user terminal 3.
  • the user terminal 3 displays the received component number information. As a result, the user knows that the number of input parts P is 51.
  • images of all groups excluding noise are displayed.
  • a group in which the number of parts P is other than one is surrounded by a square frame, and the number is displayed.
  • the number "5" of the parts P indicated by the numerical information input by the user is also reflected in the group displayed as the group that cannot be counted in FIG. 3A.
  • "51" which is the total number of parts P, is displayed.
  • The part number information is not limited to the example of FIG. 3B; for example, only the total number of parts P may be displayed, or the previous counting result, the cumulative number of parts P counted over a predetermined period, and the like may also be displayed. Alternatively, it may be voice data announcing the total number of parts P.
  • the information output unit 16 of the counting device 1 may output the non-counting information and the number of parts information by a method such as screen display or voice output without transmitting the information to the user terminal 3.
  • the counting system 100 does not have to include the user terminal 3.
  • In that case, the user visually counts the number of parts P of the non-countable group indicated by the non-countable information output by the information output unit 16, and inputs numerical information indicating that number to the part number calculation unit 15.
  • the counting process shown in FIG. 4 starts when the power of the counting device 1 is turned on.
  • While the image acquisition unit 11 of the counting device 1 has not received a captured image from the photographing device 2 (step S11; NO), it repeats step S11 and waits for a captured image.
  • the binarization unit 12 converts the captured image received by the image acquisition unit 11 into grayscale, and removes noise by a smoothing filter.
  • the binarization unit 12 calculates the binarization threshold value for the grayscale photographed image from which noise has been removed (step S12). At this time, if the discriminant analysis method is used, the threshold value for binarization can be calculated at high speed.
  • the binarization unit 12 binarizes the grayscale photographed image from which noise has been removed by using the calculated threshold value (step S13).
  • The reduction processing unit 13 reduces the captured image binarized by the binarization unit 12 (step S14). If the counting device 1 does not include the reduction processing unit 13, step S14 is omitted.
  • The area calculation unit 14 classifies the pixel distribution of the captured image reduced by the reduction processing unit 13 into groups and calculates the area of each group (step S15). When step S14 is omitted, the area calculation unit 14 instead classifies the pixel distribution of the captured image binarized by the binarization unit 12 into groups and calculates the area of each group.
  • the component number calculation unit 15 calculates the smallest area of each group calculated by the area calculation unit 14 excluding noise as a reference area which is the area of one component P (step S16). At this time, the component number calculation unit 15 determines a lower limit and an upper limit, and removes a group having an area smaller than the lower limit and a group having an area larger than the upper limit as noise.
  • the number of parts calculation unit 15 determines whether or not the area of each group classified by the area calculation unit 14 is within the Nth range including N times the reference area (step S17). When the area is within the Nth range including N times the reference area (step S17; YES), the number of parts P of the group is set to N and added to the number of parts P (step S20). When the area is outside the Nth range including N times the reference area (step S17; NO), the component number calculation unit 15 determines that the components P overlap and cannot count. The part number calculation unit 15 generates non-countable information indicating a non-countable group. The information output unit 16 transmits the non-countable information generated by the component number calculation unit 15 to the user terminal 3 (step S18). When the counting system 100 does not include the user terminal 3, the information output unit 16 displays the non-countable information in step S18.
  • In the non-countable information transmitted in step S18, images of all groups except noise are displayed, and the characters "Uncountable" are also displayed in the "Number of parts" column.
  • When the part number calculation unit 15 has not received numerical information indicating the number of parts P of the non-countable group from the user terminal 3 (step S19; NO), it repeats step S19 and waits for the numerical information.
  • When the part number calculation unit 15 receives numerical information indicating the number of parts P of the non-countable group from the user terminal 3 (step S19; YES), it adds the number of parts P indicated by the numerical information to the running total (step S20). In step S19, the part number calculation unit 15 thus determines whether or not the numerical information has been input.
  • If there is a next group (step S21; YES), the process returns to step S17, and steps S17 to S21 are repeated. When steps S17 to S21 have been executed for all groups classified by the area calculation unit 14 and there is no next group (step S21; NO), the part number calculation unit 15 totals the numbers of parts P of all groups and generates part number information indicating the total number of parts P.
  • the information output unit 16 transmits the component number information generated by the component number calculation unit 15 to the user terminal 3 (step S22). If the counting system 100 does not include the user terminal 3, the information output unit 16 displays the number of parts information in step S22.
  • In the part number information shown in FIG. 3B, images of all groups except noise are displayed. The group whose area was determined in step S17 to be within the second range is surrounded by a square frame and the number of parts P "2" is displayed; the group determined to be within the third range is surrounded by a square frame and the number of parts P "3" is displayed. The group whose area was determined in step S17 to be outside every Nth range, and for which the non-countable information shown in FIG. 3A was transmitted in step S18, is also surrounded by a square frame, and the number of parts P "5" indicated by the numerical information received in step S19 is displayed. "51", the total number of parts P, is also displayed.
  • When the power of the counting device 1 is not turned off (step S23; NO), the process returns to step S11 and steps S11 to S23 are repeated. When the power of the counting device 1 is turned off (step S23; YES), the process ends.
  • steps S18 and S19 may be omitted.
  • As described above, the counting device 1, which counts parts P from a captured image, binarizes a captured image of the stationary parts P, classifies the pixel distribution of the binarized image into groups, and counts the parts P based on the group areas, so it can count accurately in a shorter time without storing a reference image of the part P.
  • In this embodiment, the counting device 1 stores group image information in which an image of a group is associated with the number of parts P of that group, and adopts, as the number of parts P of a non-countable group, the number of parts P associated with a stored group image similar to the image of the non-countable group.
  • The counting device 1 has, as its functional configuration, an image acquisition unit 11, a binarization unit 12, a reduction processing unit 13, an area calculation unit 14, a part number calculation unit 15, and an information output unit 16, and additionally includes a group image storage unit 17 that stores group image information. The part number calculation unit 15 generates group image information by associating the image of a non-countable group with the number of parts P indicated by the numerical information.
  • the group image storage unit 17 stores the group image information generated by the component number calculation unit 15. An example of group image information is shown in FIG. For example, in the first group image information, the image of the uncountable group shown in the non-countable information of FIG. 3A is associated with the number “5” of the parts P indicated by the numerical information. That is, the group image information is information in which the image of the group that could not be counted is associated with the number of parts P of the group.
  • The part number calculation unit 15 refers to the group image storage unit 17 and determines whether there is group image information whose group image is similar to the image of the non-countable group.
  • As a method of determining whether two images are similar, for example, feature points of the two images are extracted, a similarity is calculated, and the images are determined to be similar if the similarity exceeds a threshold value.
  • When such group image information exists, the part number calculation unit 15 adopts the number of parts P associated with the group image similar to the image of the non-countable group as the number of parts P of that non-countable group.
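  • Feature-point matching (e.g. ORB or SIFT in OpenCV) is a typical way to compute such a similarity. As a dependency-free stand-in, the lookup can be sketched with a simple pixel-agreement score over equal-size binary group images; the similarity measure and the 0.9 threshold are illustrative assumptions, not from the disclosure.

```python
def similarity(img_a, img_b):
    """Fraction of pixels on which two equal-size binary images agree."""
    total = sum(len(row) for row in img_a)
    same = sum(1 for ra, rb in zip(img_a, img_b)
                 for pa, pb in zip(ra, rb) if pa == pb)
    return same / total

def lookup_part_count(group_img, stored, threshold=0.9):
    """stored: list of (group_image, part_count) pairs, i.e. group image
    information. Return the count paired with the most similar stored image
    whose similarity reaches the threshold, else None (fall back to the user)."""
    best_count, best_sim = None, threshold
    for img, count in stored:
        sim = similarity(group_img, img)
        if sim >= best_sim:
            best_sim, best_count = sim, count
    return best_count
```

  • A `None` result corresponds to the case where no similar group image exists and the device must generate non-countable information as before.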
  • When no such group image information exists, the part number calculation unit 15 generates non-countable information as in the first embodiment, and the information output unit 16 transmits the non-countable information to the user terminal 3.
  • the part number calculation unit 15 adopts the number of parts P of the uncountable group indicated by the numerical information received from the user terminal 3.
  • the component number calculation unit 15 generates group image information each time numerical information indicating the number of uncountable group components P is received from the user terminal 3. Group image information is cumulatively stored in the group image storage unit 17. Other functions of the counting device 1 are the same as those of the first embodiment.
  • the counting process shown in FIG. 7 starts when the power of the counting device 1 is turned on. Since steps S31 to S36 are the same as steps S11 to S16 of the flowchart shown in FIG. 4, the description thereof will be omitted.
  • the component number calculation unit 15 determines whether or not the area of each group classified by the area calculation unit 14 is within the Nth range including N times the reference area (step S37). When the area is within the Nth range including N times the reference area (step S37; YES), the number of parts P of the group is set to N and added to the number of parts P (step S42).
  • When the area is outside every Nth range (step S37; NO), the part number calculation unit 15 refers to the group image storage unit 17 and determines whether there is group image information whose group image is similar to the image of the non-countable group (step S38).
  • When such group image information exists (step S38; YES), the part number calculation unit 15 adopts the number of parts P associated with the group image similar to the image of the non-countable group as the number of parts P of that group, and adds it to the running total (step S42).
  • When there is no group image information whose group image is similar to the image of the non-countable group (step S38; NO), the part number calculation unit 15 generates non-countable information indicating the non-countable group.
  • the information output unit 16 transmits the non-countable information generated by the component number calculation unit 15 to the user terminal 3 (step S39).
  • When the part number calculation unit 15 has not received numerical information indicating the number of parts P of the non-countable group from the user terminal 3 (step S40; NO), it repeats step S40 and waits for the numerical information.
  • When the part number calculation unit 15 receives numerical information indicating the number of parts P of the non-countable group from the user terminal 3 (step S40; YES), it generates group image information associating the image of the non-countable group with the number of parts P indicated by the numerical information, and stores that group image information in the group image storage unit 17 (step S41). For example, the image of the non-countable group shown in the non-countable information of FIG. 3A is associated with the number of parts P "5" indicated by the numerical information.
  • The part number calculation unit 15 adds the number of parts P indicated by the received numerical information to the running total (step S42). If there is a next group (step S43; YES), the process returns to step S37 and steps S37 to S43 are repeated. When steps S37 to S43 have been executed for all groups classified by the area calculation unit 14 and there is no next group (step S43; NO), the part number calculation unit 15 totals the numbers of parts P of all groups and generates part number information indicating the total number of parts P.
  • the information output unit 16 transmits the component number information generated by the component number calculation unit 15 to the user terminal 3 (step S44). When the power of the counting device 1 is not turned off (step S45; NO), the process returns to step S31 and repeats steps S31 to S45. When the power of the counting device 1 is turned off (step S45; YES), the process ends.
  • As described above, the counting device 1 according to this embodiment binarizes a captured image of the stationary parts P, classifies the pixel distribution of the binarized image into groups, and counts the parts P based on the group areas; in addition, by storing group image information, it can determine the number of parts P of a non-countable group without user input when a similar group has been counted before.
  • In this embodiment, instead of the user visually counting the number of parts P of a non-countable group and inputting numerical information to the user terminal 3, a leveling operation device levels the overlapping parts P so that they can be counted.
  • the counting system 100 includes a counting device 1, a photographing device 2, and a user terminal 3, as well as a leveling operation device 4 that performs a leveling operation.
  • the counting device 1 and the leveling operation device 4 are connected by wire or wirelessly.
  • the counting device 1 transmits the leveling operation instruction information instructing the leveling operation to the leveling operation device 4.
  • When the leveling operation device 4 receives the leveling operation instruction information from the counting device 1, it performs the leveling operation.
  • the leveling operation device 4 breaks the overlap of the parts P by, for example, vibrating the table on which the parts P are loaded.
  • The leveling operation instruction information includes the coordinates of the uncountable group within the photographing range C, and the leveling operation device 4 may be a device that breaks up the overlapping parts P by blowing air at the coordinates of the uncountable group, or a device that breaks up the overlapping parts P with a robot arm.
  • the photographing device 2 photographs the photographing range C after the leveling operation device 4 breaks the overlap of the parts P.
  • As for the timing of shooting by the photographing device 2, the user may input a shooting instruction to the photographing device 2, or the photographing device 2 may detect the leveling operation of the leveling operation device 4 and then shoot.
  • Other configurations of the counting system 100 are the same as those in the first embodiment.
  • As functional configurations, the counting device 1 includes an image acquisition unit 11, a binarization unit 12, a reduction processing unit 13, an area calculation unit 14, a component number calculation unit 15, and an information output unit 16, and further includes a leveling operation instruction unit 18.
  • the image acquisition unit 11, the binarization unit 12, the reduction processing unit 13, and the area calculation unit 14 perform the same processing as in the first embodiment.
  • The component number calculation unit 15 calculates the reference area regarded as the area of one part P, and counts a group whose area is within the N-th range, which includes N times the reference area, as N parts.
  • The component number calculation unit 15 determines that a group whose area is outside the N-th range contains overlapping components P, and that the group cannot be counted.
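  • The N-th range decision above can be sketched as follows; this is a minimal sketch, and the tolerance of ±30% of the reference area is an assumed value, not one taken from the specification.

```python
def parts_in_group(area, ref_area, tol=0.3):
    """Return N when the group's area falls within the N-th range
    (N * ref_area, within an assumed tolerance), or None when the
    group is treated as uncountable (overlapping parts P)."""
    n = max(1, round(area / ref_area))          # nearest multiple of the reference area
    if abs(area - n * ref_area) <= tol * ref_area:
        return n
    return None
```

  • For example, with a reference area of 100 pixels, a group of 205 pixels is counted as 2 parts, while a group of 155 pixels falls between the first and second ranges and is treated as uncountable.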
  • the component number calculation unit 15 generates leveling operation instruction information that instructs the leveling operation device 4 to perform the leveling operation.
  • The leveling operation instruction information includes the coordinates of the uncountable group within the photographing range C. When there are a plurality of uncountable groups, the component number calculation unit 15 includes the coordinates of all the uncountable groups within the photographing range C in the leveling operation instruction information.
  • the information output unit 16 transmits the leveling operation instruction information generated by the component number calculation unit 15 to the leveling operation device 4.
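  • Selecting the coordinates that go into the leveling operation instruction information can be sketched as below. This is an illustrative sketch; the representation of a group as a (coordinate, area) pair and the tolerance value are assumptions.

```python
def leveling_targets(groups, ref_area, tol=0.3):
    """Return the coordinates of groups judged uncountable, i.e. whose
    area falls outside the nearest N-th range; these coordinates would
    be placed in the leveling operation instruction information."""
    targets = []
    for (x, y), area in groups:
        n = max(1, round(area / ref_area))
        if abs(area - n * ref_area) > tol * ref_area:
            targets.append((x, y))   # overlapping parts P: needs leveling
    return targets
```

  • A group of exactly one reference area is left alone, while a group whose area fits no N-th range is reported for the leveling operation.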
  • the leveling operation device 4 performs a leveling operation when it receives the leveling operation instruction information from the counting device 1.
  • the photographing device 2 photographs the photographing range C after the leveling operation device 4 breaks the overlap of the parts P by the leveling operation.
  • the image acquisition unit 11, the binarization unit 12, the reduction processing unit 13, and the area calculation unit 14 perform the same processing as in the first embodiment.
  • the number of parts calculation unit 15 totals the number of parts P of all the groups and calculates the total number of parts P shown in the captured image.
  • the component number calculation unit 15 generates component number information indicating the total number of calculated components P.
  • the information output unit 16 transmits the component number information generated by the component number calculation unit 15 to the user terminal 3.
  • Other functions of the counting device 1 are the same as those of the first embodiment.
  • the counting process shown in FIG. 10 starts when the power of the counting device 1 is turned on. Since steps S51 to S56 are the same as steps S11 to S16 of the flowchart shown in FIG. 4, the description thereof will be omitted.
  • The component number calculation unit 15 determines whether or not the area of each group classified by the area calculation unit 14 is within the N-th range including N times the reference area (step S57). When the area is outside the N-th range including N times the reference area (step S57; NO), the component number calculation unit 15 generates leveling operation instruction information for instructing the leveling operation device 4 to perform the leveling operation.
  • the information output unit 16 transmits the leveling operation instruction information generated by the component number calculation unit 15 to the leveling operation device 4 (step S58).
  • the component number calculation unit 15 determines in step S57 whether or not all the groups are within the Nth range. If there is a group outside the Nth range, leveling operation instruction information including the coordinates of all the uncountable groups on the photographing range C is generated.
  • the information output unit 16 transmits the leveling operation instruction information generated by the component number calculation unit 15 to the leveling operation device 4.
  • In this case, the component number calculation unit 15 totals the numbers of components P of all the groups and generates component number information indicating the total number of components P shown in the captured image, and step S60 is omitted.
  • the leveling operation device 4 performs a leveling operation when it receives the leveling operation instruction information from the counting device 1.
  • the photographing device 2 photographs the photographing range C after the leveling operation device 4 breaks the overlap of the parts P.
  • the process returns to step S51 and repeats steps S51 to S57.
  • When the area is within the N-th range including N times the reference area (step S57; YES), the component number calculation unit 15 sets the number of parts P of the group to N and adds it to the running total of parts P (step S59).
  • If there is a next group (step S60; YES), the process returns to step S57, and steps S57 to S60 are repeated. Steps S57 to S60 are repeated for all the groups classified by the area calculation unit 14. When there is no next group (step S60; NO), the component number calculation unit 15 totals the numbers of components P of all the groups and generates component number information indicating the total number of components P.
  • the information output unit 16 transmits the component number information generated by the component number calculation unit 15 to the user terminal 3 (step S61). If the power of the counting device 1 is not turned off (step S62; NO), the process returns to step S51 and repeats steps S51 to S62. When the power of the counting device 1 is turned off (step S62; YES), the process ends.
  • As described above, the counting device 1, which counts the components P from a captured image, binarizes the captured image obtained by photographing the stationary components P and obtains the pixel distribution of the binarized captured image.
  • the counting device 1 counts the number of parts to be delivered.
  • The counting device 1 stores shooting record information in which information for identifying a part is associated with a captured image of the part, and determines whether or not the part to be delivered matches the part P shown in the captured image. Further, the counting device 1 reflects the number of delivered parts P in the inventory information indicating the stock quantity of the parts P.
  • The counting system 100 includes a counting device 1, a photographing device 2, a user terminal 3, a production management system 5 that instructs the delivery of parts, and an inventory management system 6 that manages the inventory of parts.
  • the counting device 1 is connected to the production management system 5 and the inventory management system 6 via a network.
  • the production management system 5 transmits the delivery information indicating the parts to be delivered to the user terminal 3 and the counting device 1.
  • the user terminal 3 outputs the delivery information received from the production management system 5 by a method such as screen display or voice output.
  • When the user terminal 3 outputs the delivery information, the user puts the parts indicated by the delivery information into the photographing range C.
  • In the following, the work of putting the parts indicated by the delivery information into the photographing range C is referred to as the delivery work.
  • the photographing device 2 photographs the photographing range C, and transmits the photographed image to the counting device 1.
  • When the counting device 1 receives the delivery information from the production management system 5 and a captured image from the photographing device 2, it determines whether or not there is shooting record information whose part-identifying information matches that of the part to be delivered. If there is matching shooting record information, the counting device 1 refers to it and determines whether or not the part to be delivered matches the part P shown in the captured image. If they do not match, the counting device 1 outputs error information indicating a warning. If there is no matching shooting record information, the counting device 1 associates the captured image received from the photographing device 2 with the part-identifying information included in the delivery information received from the production management system 5, and generates and stores shooting record information. The counting device 1 may be configured to delete the corresponding shooting record information when, after it has been generated and stored, the part P put into the photographing range C turns out not to be a part to be delivered.
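  • The check-or-register flow for shooting record information can be sketched as follows. This is a minimal sketch under assumed data structures: a dictionary stands in for the shooting record storage, and `same_part` is a placeholder for the image-comparison method described later.

```python
def check_delivery(records, part_id, image, same_part):
    """If a shooting record already exists for part_id, verify that the
    newly captured image shows the same part; otherwise register the
    image as a new shooting record. Returns True when the delivery is
    accepted, False when a warning (error information) is needed."""
    if part_id in records:
        return same_part(records[part_id], image)
    records[part_id] = image   # no record yet: store the first captured image
    return True
```

  • The first delivery of a part registers its image; later deliveries of the same part identifier are checked against that stored image.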
  • the inventory management system 6 stores inventory information indicating the number of parts P in stock.
  • the counting device 1 calculates the total number of parts P shown in the captured image, that is, the total number of parts P to be delivered in one delivery operation.
  • the inventory information is updated by subtracting the total number of parts P to be delivered in one delivery operation from the number of parts P in the inventory information stored in the inventory management system 6.
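  • The inventory update described above is a simple subtraction; a sketch under an assumed dictionary representation of the inventory information:

```python
def update_inventory(inventory, part_id, delivered_total):
    """Subtract the total number of parts P delivered in one delivery
    operation from the stock quantity in the inventory information."""
    inventory[part_id] -= delivered_total
    return inventory[part_id]
```
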
  • Other configurations of the counting system 100 are the same as those in the first embodiment.
  • As functional configurations, the counting device 1 includes an image acquisition unit 11, a binarization unit 12, a reduction processing unit 13, an area calculation unit 14, a component number calculation unit 15, and an information output unit 16, and further includes a correctness determination unit 19 that determines whether or not the part to be delivered matches the part P shown in the captured image, a shooting record storage unit 20 that stores the shooting record information, and an inventory update unit 21 that updates the inventory information.
  • When the correctness determination unit 19 receives the delivery information from the production management system 5 and the image acquisition unit 11 receives a captured image from the photographing device 2, the correctness determination unit 19 refers to the shooting record storage unit 20 and determines whether or not there is shooting record information whose part-identifying information matches the part-identifying information included in the delivery information.
  • If there is no matching shooting record information, the correctness determination unit 19 associates the captured image received by the image acquisition unit 11 from the photographing device 2 with the part-identifying information included in the delivery information received from the production management system 5, generates shooting record information, and stores it in the shooting record storage unit 20.
  • If there is matching shooting record information, the correctness determination unit 19 compares the captured image of the part indicated by the shooting record information with the captured image received by the image acquisition unit 11, and determines whether or not they show the same part.
  • As a method of determining whether or not the parts are the same, for example, feature points may be extracted from the two captured images, a similarity may be calculated, and the parts may be determined to be the same if the similarity exceeds a threshold value.
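  • The threshold-on-similarity idea can be sketched as follows. As a deliberately simple stand-in for the feature-point matching mentioned above, the sketch compares normalized intensity histograms; the descriptor choice, bin count, and threshold are all assumptions for illustration.

```python
def histogram(img, bins=8):
    """Toy descriptor: normalized intensity histogram of a 0-255 image
    given as a flat list of pixel values."""
    h = [0] * bins
    for v in img:
        h[v * bins // 256] += 1
    total = sum(h)
    return [c / total for c in h]

def is_same_part(img_a, img_b, threshold=0.9):
    """Declare the two captured images to show the same part when the
    descriptor similarity (histogram intersection) exceeds the threshold."""
    sim = sum(min(a, b) for a, b in zip(histogram(img_a), histogram(img_b)))
    return sim > threshold
```

  • In practice a feature-point method (keypoint descriptors and matching) would replace the histogram, but the decision structure, similarity score compared against a threshold, is the same.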
  • the correctness determination unit 19 generates error information indicating a warning when it is determined that the parts are not the same.
  • the information output unit 16 transmits the error information generated by the correctness determination unit 19 to the user terminal 3.
  • the user terminal 3 displays the error information received from the counting device 1. As a result, the user can know that the inserted component P is incorrect.
  • When the correctness determination unit 19 determines that the parts are the same, the binarization unit 12, the reduction processing unit 13, and the area calculation unit 14 perform the same processing as in the first embodiment.
  • The delivery information need not be transmitted to the counting device 1 only by the production management system 5. The user may input the delivery information directly into the correctness determination unit 19, or the correctness determination unit 19 may acquire from the user terminal 3 the delivery information that the user input into the user terminal 3.
  • The component number calculation unit 15 calculates the total number of parts P shown in the captured image and generates component number information indicating the calculated total number of parts P, as shown in FIG.
  • the information output unit 16 transmits the component number information generated by the component number calculation unit 15 to the user terminal 3.
  • the user terminal 3 displays the received component number information.
  • The user terminal 3 displays the "cumulative number of parts" 114, which is the cumulative number of parts P counted after the delivery information was received, and the "number of delivery instructions" 200, which is the number of parts P to be delivered indicated by the delivery information. As a result, the user can grasp the number of parts P already put in and that the remaining 86 parts P still need to be delivered.
  • The inventory update unit 21 updates the inventory information by subtracting the total number of parts P delivered in one delivery operation from the stock quantity of parts P indicated by the inventory information stored in the inventory management system 6.
  • The inventory information need not be stored only in the inventory management system 6; the counting device 1 may store the inventory information, or an external storage device may store it. Other functions of the counting device 1 are the same as those of the first embodiment.
  • the counting process shown in FIG. 14 starts when the power of the counting device 1 is turned on.
  • the correctness determination unit 19 of the counting device 1 receives the delivery information from the production management system 5 (step S71).
  • While the image acquisition unit 11 has not received a captured image from the photographing device 2 (step S72; NO), it repeats step S72 and waits for a captured image.
  • When the image acquisition unit 11 receives a captured image, the correctness determination unit 19 refers to the shooting record storage unit 20 and determines whether or not there is shooting record information whose part-identifying information matches the part-identifying information included in the delivery information (step S73).
  • When there is no matching shooting record information (step S73; NO), the correctness determination unit 19 associates the captured image received by the image acquisition unit 11 from the photographing device 2 with the part-identifying information included in the delivery information received from the production management system 5, and generates shooting record information. The correctness determination unit 19 stores the generated shooting record information in the shooting record storage unit 20 (step S76), and the process proceeds to step S77.
  • When there is matching shooting record information (step S73; YES), the correctness determination unit 19 compares the captured image of the part indicated by the shooting record information with the captured image received by the image acquisition unit 11, and determines whether or not they show the same part (step S74). When it determines that the parts are not the same (step S74; NO), the correctness determination unit 19 generates error information indicating a warning.
  • the information output unit 16 transmits the error information generated by the correctness determination unit 19 to the user terminal 3 (step S75).
  • the user terminal 3 displays the error information received from the counting device 1.
  • The process then returns to step S72, and steps S72 to S74 are repeated. If it is determined that the parts are the same (step S74; YES), the process proceeds to step S77. Since steps S77 to S87 are the same as steps S12 to S22 of the flowchart shown in FIG. 4, their description is omitted.
  • The inventory update unit 21 updates the inventory information by subtracting the total number of parts P delivered in one delivery operation from the stock quantity of parts P indicated by the inventory information stored in the inventory management system 6 (step S88). If the power of the counting device 1 is not turned off (step S89; NO), the process returns to step S71, and steps S71 to S89 are repeated. When the power of the counting device 1 is turned off (step S89; YES), the process ends.
  • As described above, the counting device 1, which counts the components P from a captured image, binarizes the captured image obtained by photographing the stationary components P and obtains the pixel distribution of the binarized captured image.
  • The counting device 1 stores shooting record information in which part-identifying information is associated with a captured image of the part, determines whether or not the part to be delivered matches the part P shown in the captured image, and outputs error information if they do not match, preventing the wrong parts from being delivered. Further, by updating the inventory information every time a delivery operation is performed, it is possible to shorten the time during which the stock quantity of parts P indicated by the inventory information differs from the actual stock quantity of parts P.
  • the counting device 1 detects a marker that defines the counting range from the captured image and counts the number of parts P in the counting range.
  • the counting system 100 includes a counting device 1, a photographing device 2, and a user terminal 3.
  • the photographing device 2 photographs the photographing range C and transmits the photographed image to the counting device 1.
  • A marker M is placed in the photographing range C, and the counting device 1 calculates the number of parts P in the counting range R surrounded by the markers M. The marker M may be placed in advance or may be placed by the user.
  • Other configurations of the counting system 100 are the same as those in the first embodiment.
  • As functional configurations, the counting device 1 includes an image acquisition unit 11, a binarization unit 12, a reduction processing unit 13, an area calculation unit 14, a component number calculation unit 15, and an information output unit 16, and further includes a marker storage unit 22 that stores marker information indicating the marker M.
  • the binarization unit 12 refers to the marker information and detects the marker M in the captured image received by the image acquisition unit 11.
  • the binarization unit 12 identifies the counting range R surrounded by the markers M from the captured image received by the image acquisition unit 11.
  • The binarization unit 12, the reduction processing unit 13, the area calculation unit 14, the component number calculation unit 15, and the information output unit 16 perform the same processing as in the first embodiment on the captured image of the counting range R.
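  • Restricting the count to the range surrounded by the markers can be sketched as below. This is an illustrative sketch under assumptions: the detected markers are given as (x, y) coordinates, and the counting range R is taken as their axis-aligned bounding box.

```python
def counting_range(markers):
    """Bounding box (x0, y0, x1, y1) of the region enclosed by the
    detected marker coordinates."""
    xs = [x for x, _ in markers]
    ys = [y for _, y in markers]
    return min(xs), min(ys), max(xs), max(ys)

def crop(grid, box):
    """Cut the counting range R out of the captured (binarized) image so
    that the subsequent processing sees only pixels inside the markers."""
    x0, y0, x1, y1 = box
    return [row[x0:x1 + 1] for row in grid[y0:y1 + 1]]
```

  • The cropped grid is then handed to the same binarization/area pipeline as in the first embodiment.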
  • the counting process shown in FIG. 17 starts when the power of the counting device 1 is turned on.
  • While the image acquisition unit 11 of the counting device 1 has not received a captured image from the photographing device 2 (step S91; NO), it repeats step S91 and waits for a captured image.
  • When the image acquisition unit 11 receives a captured image, the binarization unit 12 refers to the marker information and detects the markers M in the captured image. The binarization unit 12 then identifies the counting range R surrounded by the markers M in the captured image received by the image acquisition unit 11 (step S92).
  • the binarization unit 12, the reduction processing unit 13, the area calculation unit 14, the number of parts calculation unit 15, and the information output unit 16 perform the processes of steps S93 to S104 on the captured image of the counting range R. Since steps S93 to S104 are the same as steps S12 to S23 of the flowchart shown in FIG. 4, the description thereof will be omitted.
  • As described above, the counting device 1, which counts the components P from a captured image, binarizes the captured image obtained by photographing the stationary components P and obtains the pixel distribution of the binarized captured image.
  • the counting system 100 includes a counting device 1, a photographing device 2, a user terminal 3, and a machine learning device 7 that learns the counting result of the counting device 1.
  • the machine learning device 7 connects to the counting device 1 and the user terminal 3 via a network.
  • the machine learning device 7 may be incorporated in the counting device 1 or may exist on the cloud server.
  • the photographing device 2 photographs the photographing range C while the component P is stationary, and transmits the photographed image to the counting device 1.
  • the counting device 1 calculates the total number of parts P appearing in the captured image based on the captured image received from the photographing device 2.
  • the counting device 1 generates component number information indicating the calculated total number of components P and transmits it to the user terminal 3. Further, the counting device 1 transmits the calculated number of parts information indicating the total number of parts P and the photographed image used for calculating the total number of parts P in association with each other to the machine learning device 7.
  • When the user terminal 3 outputs the component number information, the user counts the actual total number of components P and inputs, into the user terminal 3, correctness information indicating whether or not the counting result of the counting device 1 is correct. The user terminal 3 associates the input correctness information with the component number information and transmits them to the machine learning device 7.
  • the machine learning device 7 learns the counting result of the counting device 1 from the data set generated based on the captured image and the number of parts information received from the counting device 1 and the correctness information received from the user terminal 3. As a result of learning the counting result of the counting device 1, the machine learning device 7 generates a learned model that outputs the total number of parts P shown in the input captured image.
  • Other configurations of the counting system 100 are the same as those in the first embodiment.
  • the image acquisition unit 11 of the counting device 1 sends the captured image received from the photographing device 2 to the information output unit 16.
  • the information output unit 16 transmits the captured image received from the image acquisition unit 11 and the component number information generated by the component number calculation unit 15 to the machine learning device 7.
  • Other functional configurations of the counting device 1 are the same as those in the first embodiment.
  • the machine learning device 7 learns the data acquisition unit 71 that receives the captured image and the number of parts information from the counting device 1, the correctness information acquisition unit 72 that receives the correctness information from the user terminal 3, and the counting result of the counting device 1. It includes a learning unit 73 that generates a trained model, and a storage unit 74 that stores the trained model generated by the learning unit 73.
  • the data acquisition unit 71 sends the captured image and the number of parts information received from the counting device 1 to the learning unit 73.
  • the correctness information acquisition unit 72 sends the correctness information received from the user terminal 3 to the learning unit 73.
  • the learning unit 73 generates a data set for machine learning based on the captured image and the number of parts information received from the data acquisition unit 71 and the correctness information received from the correctness information acquisition unit 72. The learning unit 73 learns the counting result of the counting device 1 from the generated data set.
  • the learning unit 73 learns the counting result of the counting device 1 by supervised learning according to, for example, a neural network model.
  • Supervised learning is a model in which a large number of data sets of inputs and results (labels) are given to a learning device, which learns the features in those data sets and estimates the result from an input.
  • The data set generated by the learning unit 73 for supervised learning uses, as input data, a captured image for which the counting result of the counting device 1 indicated by the correctness information is correct, and associates with it, as label data, the total number of parts P indicated by the corresponding component number information.
  • In the following, a captured image for which the counting result of the counting device 1 indicated by the correctness information is correct is simply referred to as a captured image with a correct counting result, and the total number of parts P indicated by the component number information corresponding to such a captured image is simply referred to as the total number of parts P with a correct counting result.
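  • Assembling the supervised data set described above can be sketched as follows; the tuple representation of a sample (captured image, counted total, correctness flag) is an assumption for illustration.

```python
def build_training_set(samples):
    """Keep only the (captured image, total) pairs whose counting result
    was marked correct by the correctness information; the total becomes
    the label data for supervised learning."""
    return [(img, total) for img, total, correct in samples if correct]
```
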
  • a neural network is composed of an input layer composed of a plurality of neurons, an intermediate layer composed of a plurality of neurons, and an output layer composed of a plurality of neurons.
  • the intermediate layer may be one layer or two or more layers.
  • When values are input to the input-layer neurons, they are multiplied by the weights w11 to w16 and input to the intermediate-layer neurons Y1 and Y2. The results output from the neurons Y1 and Y2 are further multiplied by the weights w21 to w26 and output from the output-layer neurons Z1 to Z3.
  • the results output from neurons Z1 to Z3 vary depending on the values of weights w11 to w16 and weights w21 to w26.
  • When the learning unit 73 learns the counting result of the counting device 1 according to the neural network model, the captured image with the correct counting result is input to the input layer of the neural network.
  • the neural network adjusts the weights and learns to bring the result output from the output layer closer to the total number of parts P for which the counting result is correct, that is, the label data.
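  • The forward pass of the small network described above (three input neurons, intermediate neurons Y1 and Y2 via weights w11 to w16, output neurons Z1 to Z3 via weights w21 to w26) can be sketched as follows. Activation functions and the weight-adjusting backpropagation step are omitted for brevity, so this is a structural sketch, not a complete training implementation.

```python
def forward(x, w1, w2):
    """Forward pass: x is the 3-value input; w1 holds one weight column
    per intermediate neuron (w11..w16), w2 one column per output neuron
    (w21..w26). Returns the outputs of Z1..Z3."""
    y = [sum(xi * wij for xi, wij in zip(x, col)) for col in w1]  # Y1, Y2
    z = [sum(yi * wij for yi, wij in zip(y, col)) for col in w2]  # Z1..Z3
    return z
```

  • Learning would then adjust w1 and w2 so that the output approaches the label data, i.e. the total number of parts P with a correct counting result.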
  • the learning unit 73 stores in the storage unit 74 a learned model of the neural network that outputs the total number of parts P shown in the input captured image. The learning is completed, for example, when the data set for the test is input and the correct answer rate of the output exceeds the threshold value.
  • The learning unit 73 may also learn the counting result by treating the supervised learning as a classification problem.
  • the learning unit 73 classifies the input data into classes corresponding to the label data. That is, the captured image with the correct counting result is classified into the class corresponding to the total number of parts P with the correct counting result.
  • the learning unit 73 learns the characteristics of the captured image for which the counting result is correct for each class, that is, for each total number of parts P for which the counting result is correct.
  • The learning unit 73 stores in the storage unit 74 a learned model that classifies an input captured image into one of the classes and outputs the total number of parts P that is the label corresponding to the classified class.
  • the learning unit 73 may learn the counting result according to the data set generated for the plurality of parts.
  • the data acquisition unit 71 receives the information for identifying the parts from the counting device 1 in addition to the captured image and the information on the number of parts.
  • The learning unit 73 may generate a data set based on captured images, component number information, part-identifying information, and correctness information of a plurality of parts collected from the same counting device 1, or may generate a data set based on captured images, component number information, part-identifying information, and correctness information of a plurality of parts collected from a plurality of counting devices 1.
  • Parts for which data sets are collected may be added or deleted partway through.
  • the machine learning device 7 that has learned the counting result of a certain part of a certain counting device 1 may be attached to another counting device 1 to relearn the counting result of another part.
  • As the learning algorithm used in the learning unit 73, deep learning, which learns the extraction of feature amounts itself, can also be used.
  • the learning unit 73 may execute machine learning according to other known methods such as genetic programming, functional logic programming, and support vector machines.
  • the correctness information acquisition unit 72 of the machine learning device 7 may acquire correctness information from other than the user terminal 3.
  • For example, the counting system 100 may include, in addition to the counting device 1, the photographing device 2, the user terminal 3, and the machine learning device 7, a weight measuring device 8 that measures the total weight of the parts P put into the photographing range C.
  • the machine learning device 7 and the weight measuring device 8 are connected by wire or wirelessly.
  • the weight measuring device 8 transmits the total part weight information indicating the total weight of the parts P put into the photographing range C to the machine learning device 7.
  • the correctness / rejection information acquisition unit 72 stores in advance unit weight information indicating the weight of one component P.
  • the correctness / rejection information acquisition unit 72 divides the total weight of the parts P indicated by the total weight information received from the weight measuring device 8 by the weight of one part P, and calculates the total number of parts P put into the photographing range C.
  • the correctness / rejection information acquisition unit 72 compares the total number of parts P indicated by the number-of-parts information received from the counting device 1 via the data acquisition unit 71 with the total number of parts P calculated from the total weight information received from the weight measuring device 8. If they match, it generates correctness information indicating that the counting result of the counting device 1 is correct; if they do not match, it generates correctness information indicating that the counting result of the counting device 1 is incorrect.
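  The weight-based cross-check above can be sketched as follows (a minimal illustration; the function names and the example unit weight are assumptions, not from the patent):

```python
def count_from_weight(total_weight: float, unit_weight: float) -> int:
    # Divide the total weight of the parts P by the weight of one part P
    # and round to the nearest integer to get the weight-based part count.
    return round(total_weight / unit_weight)

def counting_result_is_correct(counted_total: int,
                               total_weight: float,
                               unit_weight: float) -> bool:
    # True  -> the counting device's result matches the weight-based count
    # False -> mismatch, i.e. the counting result is judged incorrect
    return counted_total == count_from_weight(total_weight, unit_weight)

# Example: 25 parts of 1.2 g each weigh 30.0 g in total.
print(counting_result_is_correct(25, 30.0, 1.2))  # True
print(counting_result_is_correct(24, 30.0, 1.2))  # False
```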
  • the counting device 1 that counts the component P from the captured image binarizes the captured image obtained by capturing the stationary component P, and counts the component P based on the areas of the groups into which the pixel distribution of the binarized captured image is classified.
  • by using the trained model generated by the machine learning device 7, the total number of parts P shown in the captured image can be calculated faster than by the counting device 1.
  • a device similar to the counting device 1 can be easily realized by storing the trained model generated by the machine learning device 7 in a computer-readable recording medium, distributing it, and installing it in a computer capable of acquiring captured images.
  • the component P counted by the counting device 1 is arranged on the substrate.
  • the counting system 100 includes a counting device 1, a photographing device 2, a user terminal 3, and a component arranging device 9 for arranging the component P on a substrate.
  • the component placement device 9 is connected to the counting device 1 and the user terminal 3.
  • the counting device 1 may be incorporated in the component arranging device 9.
  • the component arranging device 9 stores the component P counted by the counting device 1 in a component supply unit (not shown).
  • the component arranging device 9 includes a component supply unit for each component.
  • when there are a plurality of components that the counting device 1 can count and that the component arranging device 9 is to arrange on the substrate, for example, the user inputs information for identifying the components put into the photographing range C into the user terminal 3.
  • the user terminal 3 transmits the information for identifying the input component to the counting device 1.
  • the counting device 1 associates the information for identifying the parts received from the user terminal 3 with the parts number information.
  • the component arranging device 9 arranges the components housed in the component supply unit on the substrate.
  • the substrate on which the component P is arranged is sent to the solder flow tank, so that the component P is soldered and mounted on the substrate.
  • the component P arranged on the substrate is soldered and mounted by the soldering device.
  • the component P placed on the board by the operator is manually soldered and mounted.
  • the information output unit 16 of the counting device 1 transmits the number of parts information to the parts arranging device 9 instead of the user terminal 3.
  • Other functional configurations of the counting device 1 are the same as those in the first embodiment.
  • when the component arranging device 9 receives the component number information from the counting device 1, it stores the components P counted by the counting device 1 in the component supply unit while adding up the total number of components P indicated by the component number information.
  • when the number of parts P reaches a certain number, the component arranging device 9 transmits to the user terminal 3 arrival information indicating that the number of parts P has reached the certain number.
  • a certain number may be, for example, the number of parts P that the component arranging device 9 can arrange on the board during the operating time of one day, that is, the required number of components P of the component arranging device 9 per day.
  • the number may be determined based on the upper limit of the number of parts P that can be stored in the parts supply unit.
  • the user terminal 3 outputs the arrival information received from the component arranging device 9 by a method such as screen display or voice output.
  • the user stops putting the component P into the shooting range C.
  • when there are other components that the counting device 1 can count and that the component arranging device 9 is to arrange on the substrate, the user puts the next component into the photographing range C.
  • the arrival information may include information that identifies the next component. In this case, the user puts the next component into the shooting range C based on the information for identifying the next component included in the arrival information output by the user terminal 3.
  • in step S22, the information output unit 16 transmits the component number information not only to the user terminal 3 but also to the component arranging device 9.
  • the component placement process shown in FIG. 23 starts when the power of the component placement device 9 is turned on.
  • while no component number information is received (step S111; NO), the component arranging device 9 repeats step S111 and waits for the reception of the component number information.
  • when the component number information is received, the component arranging device 9 adds the total number of parts indicated by the number-of-parts information (step S112) and stores the parts counted by the counting device 1 in the component supply unit (step S113).
  • the component arranging device 9 determines whether or not the total number of added components, that is, the number of components stored in the component supply unit has reached a certain number (step S114). If a certain number has not been reached (step S114; NO), the process returns to step S111, and steps S111 to S114 are repeated.
  • when the certain number is reached (step S114; YES), the component arranging device 9 transmits arrival information indicating that the number of components has reached the certain number to the user terminal 3 (step S115).
  • the user terminal 3 outputs the arrival information received from the component arranging device 9.
  • the user stops putting the parts into the shooting range C.
  • when there are other components that the counting device 1 can count and that the component arranging device 9 is to arrange on the substrate (step S116; YES), the user puts the next component into the photographing range C.
  • the arrival information may include information that identifies the next component. In this case, the user sees the information for identifying the next component output by the user terminal 3 and puts the component into the shooting range C.
  • the counting device 1 calculates the total number of the next parts put into the photographing range C, and transmits the part number information to the parts arranging device 9.
  • the process returns to step S111, and the component arranging device 9 receives component number information from the counting device 1 (step S111; YES).
  • the component arranging device 9 repeats steps S111 to S116.
  • when there are no other components that the counting device 1 can count and that the component arranging device 9 is to arrange on the board (step S116; NO), the component arranging device 9 arranges the components stored in the component supply unit on the board (step S117), and the process ends.
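  The flow of steps S111 to S115 above can be sketched as the following accumulation loop (hypothetical function name; the fixed number is an assumed configuration value, and storing parts in the supply unit is not modeled):

```python
def placement_cycle(count_messages, required_number):
    """Accumulate part counts (S112) until the fixed number is reached (S114),
    then report arrival (S115). count_messages is an iterable of the totals
    carried by successive component-number messages (S111)."""
    stored = 0
    for n in count_messages:           # S111: receive component number information
        stored += n                    # S112: add the total number of parts
        # S113: store the counted parts in the component supply unit (not modeled)
        if stored >= required_number:  # S114: fixed number reached?
            return stored              # S115: send arrival information
    return stored

print(placement_cycle([40, 35, 30], 100))  # 105: arrival reported on the 3rd message
```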
  • the counting device 1 that counts the component P from the captured image binarizes the captured image obtained by capturing the stationary component P, and counts the component P based on the areas of the groups into which the pixel distribution of the binarized captured image is classified.
  • the component arranging device 9 stores the components P counted by the counting device 1 in the component supply unit until a certain number is reached. As a result, the component arranging device 9 can smoothly arrange the components on the substrate without running out of the components stored in the component supply unit.
  • the counting device 1 includes a temporary storage unit 101, a storage unit 102, a calculation unit 103, an operation unit 104, an input / output unit 105, and a display unit 106.
  • the temporary storage unit 101, the storage unit 102, the operation unit 104, the input / output unit 105, and the display unit 106 are all connected to the calculation unit 103 via the BUS.
  • the calculation unit 103 is, for example, a CPU (Central Processing Unit). According to the control program stored in the storage unit 102, the calculation unit 103 executes the processes of the binarization unit 12, the reduction processing unit 13, the area calculation unit 14, the component number calculation unit 15, the leveling operation instruction unit 18, the correctness determination unit 19, and the inventory update unit 21.
  • the temporary storage unit 101 is, for example, a RAM (Random-Access Memory).
  • the temporary storage unit 101 loads the control program stored in the storage unit 102 and is used as a work area of the calculation unit 103.
  • the storage unit 102 is a non-volatile memory such as a flash memory, a hard disk, a DVD-RAM (Digital Versatile Disc-Random Access Memory), and a DVD-RW (Digital Versatile Disc-ReWritable).
  • the storage unit 102 stores in advance a program for causing the calculation unit 103 to perform the processing of the counting device 1, supplies the data stored by this program to the calculation unit 103 according to the instructions of the calculation unit 103, and stores the data supplied from the calculation unit 103.
  • the group image storage unit 17, the shooting record storage unit 20, and the marker storage unit 22 are configured in the storage unit 102.
  • the operation unit 104 is an interface device that connects an input device such as a keyboard or a pointing device to the BUS. For example, in a configuration in which information is directly input to the counting device 1, the input information is supplied to the calculation unit 103 via the operation unit 104.
  • the input / output unit 105 is a network termination device or wireless communication device connected to the network, and a serial interface or LAN (Local Area Network) interface connected to them.
  • the input / output unit 105 functions as an image acquisition unit 11, a component number calculation unit 15, an information output unit 16, a leveling operation instruction unit 18, a correctness determination unit 19, and an inventory update unit 21.
  • the display unit 106 is a display device such as a CRT (Cathode Ray Tube) or an LCD (Liquid Crystal Display). For example, in a configuration in which information is directly input to the counting device 1, the display unit 106 displays an operation screen. Further, when the non-countable information and the number-of-parts information are displayed, the display unit 106 functions as the information output unit 16.
  • the control program executes the processing of the group image storage unit 17, the leveling operation instruction unit 18, the shooting record storage unit 20, the correctness determination unit 19, the inventory update unit 21, and the marker storage unit 22 by using the calculation unit 103, the storage unit 102, the operation unit 104, the input / output unit 105, the display unit 106, and the like as resources.
  • the central part of the counting device 1, consisting of the calculation unit 103, the temporary storage unit 101, the storage unit 102, the operation unit 104, the input / output unit 105, and the display unit 106, can be realized using a normal computer system rather than a dedicated system.
  • a computer program for executing the above operations may be stored and distributed on a computer-readable recording medium such as a flexible disk, a CD-ROM (Compact Disc-Read Only Memory), or a DVD-ROM (Digital Versatile Disc-Read Only Memory), and the counting device 1 that executes the above-described processing may be configured by installing the computer program in a computer. Further, the counting device 1 may be configured by storing the computer program in a storage device of a server device on a communication network such as the Internet and downloading it to a normal computer system.
  • when the function of the counting device 1 is realized by sharing between the OS (Operating System) and an application program, or by cooperation between the OS and the application program, only the application program part may be stored in a recording medium, a storage device, or the like.
  • the computer program may be posted on a bulletin board system (BBS, Bulletin Board System) on a communication network and provided via the communication network. The above processing may then be executed by starting this computer program and executing it in the same manner as other application programs under the control of the OS.
  • the counting device 1 has, as functional configurations, an image acquisition unit 11, a binarization unit 12, a reduction processing unit 13, an area calculation unit 14, a component number calculation unit 15, an information output unit 16, a correctness determination unit 19, a shooting record storage unit 20, and an inventory update unit 21, but the present invention is not limited to this.
  • the counting device 1 may be configured to include the image acquisition unit 11, the binarization unit 12, the reduction processing unit 13, the area calculation unit 14, the component number calculation unit 15, the information output unit 16, the correctness determination unit 19, and the shooting record storage unit 20 but not the inventory update unit 21, or to include the inventory update unit 21 but not the correctness determination unit 19 and the shooting record storage unit 20.
  • the counting device 1 of Embodiments 4 and 5 described above adds a function to the counting device 1 of Embodiment 1, but the present invention is not limited to this. Both of the functions of Embodiments 4 and 5 may be added to the counting device 1 of Embodiment 1, or one or both of the functions of Embodiments 4 and 5 may be added to the counting device 1 of Embodiment 2 or Embodiment 3.
  • the machine learning device 7 of the sixth embodiment learns the counting result of the counting device 1, but the present invention is not limited to this, and the correctness of the counting result of the counting device 1 may be learned.
  • the data set generated by the learning unit 73 for supervised learning is a data set in which the captured image and the total number of parts P indicated by the number-of-parts information are used as input data, and the correctness of the counting result of the counting device 1 indicated by the corresponding correctness information is associated as label data.
  • the learning unit 73 learns the correctness of the counting result of the counting device 1 from the generated data set.
  • as a result of learning the correctness of the counting result of the counting device 1, the learning unit 73 generates a learned model that outputs the correctness of the counting result for an input captured image and total number of parts P. For example, by inputting the captured image and the number-of-parts information of the counting device 1 into this trained model and, when the model outputs that the counting result is incorrect, using another counting method such as visual counting by the user, the accuracy rate of the counting result of the total number of parts P can be increased.
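  The fallback flow above can be sketched as follows (a minimal illustration; `model_is_correct` stands in for the trained model and `visual_count` for the user's manual count, both hypothetical names):

```python
def final_count(image, device_count, model_is_correct, visual_count):
    # Feed the captured image and the device's counting result to the
    # trained model. If the model judges the result correct, accept it;
    # otherwise fall back to another counting method such as visual
    # counting by the user.
    if model_is_correct(image, device_count):
        return device_count
    return visual_count()

# The device counted 24 parts but the model flags the result as incorrect,
# so the user's visual count of 25 is adopted instead.
print(final_count("img", 24, lambda img, n: False, lambda: 25))  # 25
print(final_count("img", 24, lambda img, n: True, lambda: 25))   # 24
```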
  • the counting system 100 of the above-described embodiments 6 and 7 adds a machine learning device 7 or a component arranging device 9 to the counting system 100 of the first embodiment, but the present invention is not limited to this.
  • the machine learning device 7 or the component placement device 9 may be added to the counting system 100 of the first to fifth embodiments, or both the machine learning device 7 and the component placement device 9 may be added.
  • the counting device 1 of Embodiment 7 described above transmits the component number information to the component arranging device 9 instead of the user terminal 3, and the component arranging device 9 transmits to the user terminal 3 arrival information indicating that the number of components P has reached a certain number.
  • when the arrival information is displayed on the user terminal 3, the user stops putting the component P into the shooting range C.
  • alternatively, the counting device 1 may transmit the component number information to both the user terminal 3 and the component arranging device 9, and the user terminal 3 may add up the numbers of components P indicated by the received component number information and display the arrival information when a certain number is reached. In this case, the component arranging device 9 does not have to transmit the arrival information to the user terminal 3.
  • the counting system 100 may be any counting system that counts articles.


Abstract

An image acquisition unit (11) of this counting device (1) acquires a captured image from an image-capturing device (2) that captures a stationary component. A binarization unit (12) binarizes the captured image acquired by the image acquisition unit (11). An area calculation unit (14) calculates the area of a group into which the pixel distribution of the captured image binarized by the binarization unit (12) is classified. A number-of-components calculation unit (15) calculates the number of components for each group from the area of the group calculated by the area calculation unit (14), sums the number of components for each group, and generates number-of-components information indicating the total number of components captured in the captured image. An information output unit (16) outputs the number-of-components information generated by the number-of-components calculation unit (15).

Description

Counting system, counting device, machine learning device, counting method, component placement method, and program
The present disclosure relates to a counting system, a counting device, a machine learning device, a counting method, a component placement method, and a program.
As a technique for counting articles, there is a technique of comparing a photographed image of the articles with a reference image of the articles and detecting and counting the articles from the photographed image. However, this technique requires a reference image of the articles to be stored. As an article counting device that does not need to store a reference image of the articles, Patent Document 1 discloses a counting device that photographs articles transferred continuously, binarizes the photographed image, and counts the articles from their area.
Patent Document 1: JP-A-9-124142
The counting device described in Patent Document 1 photographs articles continuously transferred on a conveyor and counts the number of articles existing between dividing lines in which no articles are present. Therefore, the articles on the conveyor must be arranged so that such dividing lines exist. In addition, the number of articles that can be transferred per unit time is limited, and counting takes time. Even with a configuration that counts articles within a predetermined measurement range, there is a problem that counting cannot be performed accurately when an article straddles the boundary of the measurement range.
The present disclosure has been made in view of the above problems, and aims to enable a counting device that counts articles from a photographed image to count accurately in a shorter time without needing to store a reference image of the articles.
To achieve the above object, the counting system according to the present disclosure includes a photographing device that photographs stationary articles and a counting device that counts the articles photographed by the photographing device. The counting device has an image acquisition unit, a binarization unit, an area calculation unit, an article number calculation unit, and an information output unit. The image acquisition unit acquires a captured image from the photographing device. The binarization unit binarizes the captured image acquired by the image acquisition unit. The area calculation unit calculates the area of each group into which the pixel distribution of the captured image binarized by the binarization unit is classified. The article number calculation unit calculates the number of articles for each group from the area of the group calculated by the area calculation unit, totals the numbers of articles of the groups, and generates article number information indicating the total number of articles shown in the captured image. The information output unit outputs the article number information generated by the article number calculation unit.
According to the present disclosure, a counting device that counts articles from a captured image binarizes a captured image of stationary articles and counts the articles based on the areas of the groups into which the pixel distribution of the binarized image is classified. This eliminates the need to store a reference image of the articles and enables accurate counting in a shorter time.
The configuration of the counting system according to Embodiment 1
A functional configuration example of the counting device according to Embodiment 1
An example of the non-countable information according to Embodiment 1
An example of the part number information according to Embodiment 1
A flowchart of the counting process executed by the counting device according to Embodiment 1
A functional configuration example of the counting device according to Embodiment 2
An example of the group image information according to Embodiment 1
A flowchart of the counting process executed by the counting device according to Embodiment 2
The configuration of the counting system according to Embodiment 3
A functional configuration example of the counting device according to Embodiment 3
A flowchart of the counting process executed by the counting device according to Embodiment 3
The configuration of the counting system according to Embodiment 4
A functional configuration example of the counting device according to Embodiment 4
An example of the part number information according to Embodiment 4
A flowchart of the counting process executed by the counting device according to Embodiment 4
The configuration of the counting system according to Embodiment 5
A functional configuration example of the counting device according to Embodiment 5
A flowchart of the counting process executed by the counting device according to Embodiment 5
The configuration of the counting system according to Embodiment 6
A functional configuration example of the counting device and the machine learning device according to Embodiment 6
An example of the neural network generated by the machine learning device according to Embodiment 6
Another example of the configuration of the counting system according to Embodiment 6
Another example of the configuration of the counting system according to Embodiment 7
A flowchart of the counting process executed by the component arranging device according to Embodiment 7
An example of the hardware configuration of the counting device according to Embodiments 1 to 7
Hereinafter, the counting system, counting device, machine learning device, counting method, component placement method, and program according to the present embodiments will be described in detail with reference to the drawings. The same or corresponding parts are denoted by the same reference numerals in the drawings. The present embodiments are examples of a counting system that counts parts.
(Embodiment 1)
As shown in FIG. 1, the counting system 100 includes a counting device 1 for counting parts P, a photographing device 2 for photographing a photographing range C, and a user terminal 3 used by a user. The counting device 1 is connected to the photographing device 2 and the user terminal 3 by wire or wirelessly.
When the user puts parts P into the photographing range C, the photographing device 2 photographs the photographing range C while the parts P are stationary and transmits the captured image to the counting device 1. The photographing device 2 may photograph at the timing when the user inputs a photographing instruction to the photographing device 2, or at the timing when the photographing device 2 detects that the parts P to be counted have been put into the photographing range C. The counting device 1 binarizes the captured image received from the photographing device 2, classifies the pixel distribution of the binarized image into groups, calculates the number of parts P for each group from the area of the group, and totals the numbers to obtain the total number of parts P shown in the captured image. The counting device 1 generates part number information indicating the calculated total number of parts P and transmits it to the user terminal 3. The part number information is an example of article number information.
When there is a non-countable group whose number of parts P cannot be calculated from its area, the counting device 1 generates non-countable information indicating the non-countable group and transmits it to the user terminal 3. The user terminal 3 displays the non-countable information received from the counting device 1. The user visually counts the number of parts P in the non-countable group indicated by the displayed non-countable information and inputs numerical information indicating that number into the user terminal 3, which transmits it to the counting device 1. Upon receiving the numerical information, the counting device 1 adopts the number of parts P of the non-countable group indicated by the numerical information and calculates the total number of parts P shown in the captured image. When the parts P do not overlap, as in the case where the parts P are spherical, no non-countable group occurs. The counting device 1 generates part number information indicating the calculated total number of parts P and transmits it to the user terminal 3. The user terminal 3 outputs the received part number information by a method such as screen display or voice output. This allows the user to grasp the number of parts P that were put in.
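  The group-and-area counting described above can be sketched as follows (a minimal pure-Python illustration using flood-fill grouping; the function name, the 4-neighbor connectivity, and the unit-area parameter are assumptions for illustration, not taken from the patent):

```python
from collections import deque

def count_parts(binary, unit_area):
    """Group the foreground pixels (1s) of a binarized image into connected
    groups, then estimate each group's part count as the group area divided
    by unit_area (the area of one part, in pixels) rounded to the nearest
    integer, and return the total over all groups."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    total = 0
    for y in range(h):
        for x in range(w):
            if binary[y][x] == 1 and not seen[y][x]:
                # Flood-fill one group and measure its area in pixels.
                area, queue = 0, deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    area += 1
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and binary[ny][nx] == 1 and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                total += round(area / unit_area)  # parts in this group
    return total

img = [[1, 1, 0, 0, 1],
       [1, 1, 0, 0, 1],
       [0, 0, 0, 0, 0]]
print(count_parts(img, 2))  # 3: a 4-pixel group (2 parts) + a 2-pixel group (1 part)
```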
 Here, the functional configuration of the counting device 1 according to Embodiment 1 will be described with reference to FIG. 2. As shown in FIG. 2, the counting device 1 includes, as its functional configuration, an image acquisition unit 11 that receives a captured image from the photographing device 2, a binarization unit 12 that binarizes the captured image, a reduction processing unit 13 that reduces the binarized captured image, an area calculation unit 14 that calculates the area of each group into which the pixel distribution of the reduced captured image is classified, a component number calculation unit 15 that calculates the total number of components P from the areas of the groups, and an information output unit 16 that outputs information to the user terminal 3.
 The image acquisition unit 11 receives the captured image, which is digital data, from the photographing device 2. The binarization unit 12 converts the captured image received by the image acquisition unit 11 into grayscale and removes noise with a smoothing filter such as a Gaussian filter or a moving-average filter. The binarization unit 12 then calculates a binarization threshold for the denoised grayscale image. Methods for calculating the binarization threshold include iteratively computing the threshold by two-class clustering, and the discriminant analysis method (Otsu's method), which selects the threshold that maximizes the degree of separation between the two classes. The discriminant analysis method requires no iterative computation, so the binarization threshold can be calculated quickly. The binarization unit 12 binarizes the denoised grayscale image using the calculated threshold.
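As an illustration of the discriminant analysis thresholding described above, the following is a minimal NumPy sketch (not the patent's implementation; the function name and the toy image are illustrative). It selects, in a single pass over the gray-level histogram, the level that maximizes the between-class variance, with no iterative clustering:

```python
import numpy as np

def otsu_threshold(gray):
    """Discriminant analysis (Otsu) threshold: pick the gray level that
    maximizes between-class variance in one histogram pass."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    sum_all = np.dot(np.arange(256), hist)
    best_t, best_var = 0, -1.0
    w0 = 0.0    # cumulative pixel count of the dark class
    sum0 = 0.0  # cumulative intensity sum of the dark class
    for t in range(256):
        w0 += hist[t]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0 = sum0 / w0
        mu1 = (sum_all - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Toy denoised grayscale image: dark background, a few bright pixels.
gray = np.array([[10, 12, 11, 200],
                 [13, 210, 205, 11],
                 [12, 11, 198, 202]], dtype=np.uint8)
t = otsu_threshold(gray)
binary = (gray > t).astype(np.uint8)  # foreground = pixels above the threshold
```

Because the two intensity classes here are well separated, any threshold between 13 and 197 yields the same binarization; the discriminant analysis method finds one without iterating.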
 The reduction processing unit 13 reduces the captured image binarized by the binarization unit 12. When a component P has a fine structure, such as the lead wires of an electronic component, reducing the binarized captured image removes that fine structure from the image, which suppresses erroneous detection of the contour of the component P. If the components P have no fine structure, the counting device 1 need not include the reduction processing unit 13.
 The area calculation unit 14 classifies the pixel distribution of the captured image reduced by the reduction processing unit 13 into groups and calculates the number of pixels in each group as its area. One method of classifying pixels into groups is the K-means method. When the K-means method is used, the area calculation unit 14 counts the number of edges in the binarized captured image and sets that count as the number of groups k. The edges can be counted with, for example, the findContours function of OpenCV (Open Source Computer Vision Library). If the counting device 1 does not include the reduction processing unit 13, the area calculation unit 14 classifies the pixel distribution of the captured image binarized by the binarization unit 12 into groups and calculates the number of pixels in each group as its area.
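The K-means grouping of the pixel distribution can be sketched as follows. This is an illustrative NumPy implementation, not the patent's code: the number of groups k is passed in directly, whereas the text derives k from the edge count obtained with OpenCV's findContours, and the farthest-point seeding is an added assumption that keeps the toy example deterministic:

```python
import numpy as np

def group_areas(binary, k, iters=20):
    """Cluster foreground pixel coordinates into k groups with plain
    K-means and return the pixel count (area) of each group, sorted."""
    pts = np.argwhere(binary > 0).astype(float)  # (row, col) of each foreground pixel
    # Farthest-point seeding (assumption): deterministic, works for separated blobs.
    centers = [pts[0]]
    for _ in range(1, k):
        dists = np.min([np.linalg.norm(pts - c, axis=1) for c in centers], axis=0)
        centers.append(pts[dists.argmax()])
    centers = np.array(centers)
    for _ in range(iters):
        # Assign every pixel to its nearest center, then recompute the centers.
        d = np.linalg.norm(pts[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pts[labels == j].mean(axis=0)
    return sorted(int(np.sum(labels == j)) for j in range(k))

# Two separated blobs: a 3x3 square (area 9) and a 2x2 square (area 4).
img = np.zeros((12, 12), dtype=np.uint8)
img[1:4, 1:4] = 1
img[8:10, 8:10] = 1
areas = group_areas(img, k=2)
```

Each returned area is simply the number of pixels assigned to a cluster, which is what the area calculation unit 14 passes on to the component number calculation unit 15.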
 Among the areas of the groups calculated by the area calculation unit 14, the component number calculation unit 15 takes the area of the smallest group, excluding noise, to be the area of one component P and uses it as the reference area. Noise can be excluded, for example, as follows. A lower limit and an upper limit of the area that a group of components P can take may be calculated in advance from multiple experiments using the counting device 1. Alternatively, the component number calculation unit 15 sorts the area values of the groups calculated by the area calculation unit 14 in ascending or descending order. The component number calculation unit 15 compares the area values from the middle toward the smaller end, and if two adjacent values differ by at least a predetermined amount, sets the larger of the two as the lower limit. Likewise, it compares the area values from the middle toward the larger end, and if two adjacent values differ by at least a predetermined amount, sets the smaller of the two as the upper limit. The component number calculation unit 15 removes groups whose area is below the lower limit or above the upper limit as noise.
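A sketch of the second noise-exclusion method, scanning the sorted area values outward from the middle for the first large gap on each side, might look like the following; the gap value and the sample areas are illustrative assumptions:

```python
def noise_bounds(areas, gap):
    """Scan the sorted area values outward from the middle; the first jump
    of at least `gap` on each side fixes the lower / upper bound.
    `gap` stands in for the predetermined difference in the text."""
    s = sorted(areas)
    mid = len(s) // 2
    lower = s[0]
    for i in range(mid, 0, -1):        # middle -> smaller end
        if s[i] - s[i - 1] >= gap:
            lower = s[i]               # larger value of the adjacent pair
            break
    upper = s[-1]
    for i in range(mid, len(s) - 1):   # middle -> larger end
        if s[i + 1] - s[i] >= gap:
            upper = s[i]               # smaller value of the adjacent pair
            break
    return lower, upper

# Group areas with a tiny speck (3) and a huge blob (900) as noise.
areas = [3, 95, 100, 104, 98, 102, 900]
lower, upper = noise_bounds(areas, gap=60)
kept = [a for a in areas if lower <= a <= upper]
```

After removal, the smallest remaining area (here 95) becomes the reference area for one component P.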
 The component number calculation unit 15 counts a group whose area falls within a first range containing one times the reference area as one component P, a group whose area falls within a second range containing twice the reference area as two components P, and a group whose area falls within a third range containing three times the reference area as three components P. In general, the component number calculation unit 15 counts a group whose area falls within an N-th range containing N times the reference area as N components P, where N is a positive integer, and adds these counts together. Each N-th range is the minimum range that absorbs errors due to the shape, orientation, and so on of the components P. The component number calculation unit 15 judges that a group whose area falls outside every N-th range contains overlapping components P and treats the group as non-countable. The component number calculation unit 15 generates non-countable information indicating the non-countable group, and the information output unit 16 transmits the generated non-countable information to the user terminal 3.
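The mapping from a group's area to a component count can be sketched as below. The symmetric 25% tolerance is an assumed stand-in for the minimum N-th range described above, and a return value of None marks a non-countable group:

```python
def count_in_group(area, base, tol=0.25):
    """Map a group's area to a component count.  The N-th range is taken
    as [N*base*(1-tol), N*base*(1+tol)] (tol is an assumed value).
    Returns N when the area falls inside its nearest range, or None
    when it falls outside every range (non-countable group)."""
    n = max(1, round(area / base))          # nearest multiple of the reference area
    if abs(area - n * base) <= tol * n * base:
        return n
    return None

base = 100  # reference area: pixels covered by one component P
# 98 -> 1 part, 205 -> 2 parts, 290 -> 3 parts, 140 -> non-countable.
counts = [count_in_group(a, base) for a in (98, 205, 290, 140)]
```

The countable results are summed into the running total, while None triggers the non-countable information described above.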
 The component number calculation unit 15 receives, from the user terminal 3, numerical information indicating the number of components P in the non-countable group. The component number calculation unit 15 adopts the number of components P in the non-countable group indicated by the numerical information and calculates the total number of components P shown in the captured image. The component number calculation unit 15 generates component number information indicating the calculated total number of components P; the component number calculation unit 15 is an example of an article number calculation unit. The information output unit 16 transmits the component number information generated by the component number calculation unit 15 to the user terminal 3.
 Next, the non-countable information and the component number information will be described with reference to FIG. 3. FIG. 3A shows an example of the non-countable information. In the example of FIG. 3A, images of all groups except noise are displayed. The group that the component number calculation unit 15 judged to be non-countable is surrounded by a square frame with the label "uncountable", and "uncountable" is also shown in the "number of components" field. The non-countable information is not limited to the example of FIG. 3A; for example, the group judged non-countable may be displayed in a color different from the other groups or may blink, and the "number of components" field may be omitted. Alternatively, the non-countable information may be information indicating the coordinates of the non-countable group within the photographing range C.
 When the non-countable information shown in FIG. 3A is displayed on the user terminal 3, the user visually counts the number of components P in the group surrounded by the square frame and inputs numerical information indicating the number "5" to the user terminal 3. The user terminal 3 transmits the input numerical information to the counting device 1. The component number calculation unit 15 of the counting device 1 adds the number "5" of components P indicated by the numerical information to the total "46" obtained by summing the per-group counts calculated from the areas, and thus calculates "51" as the total number of components P shown in the captured image. The component number calculation unit 15 generates component number information indicating the total number "51" of components P, as shown in FIG. 3B. The information output unit 16 transmits the component number information generated by the component number calculation unit 15 to the user terminal 3, and the user terminal 3 displays the received component number information. The user thereby learns that 51 components P were put in.
 In the example of FIG. 3B, images of all groups except noise are displayed. Each group containing other than one component P is surrounded by a square frame, and its count is displayed. For the group displayed as non-countable in FIG. 3A, the number "5" of components P indicated by the numerical information input by the user is reflected. The "number of components" field shows "51", the total number of components P. The component number information is not limited to the example of FIG. 3B; for example, only the total number of components P may be displayed, or the previous counting result, the cumulative number of components P counted over a predetermined period, and the like may be displayed. Alternatively, the component number information may be voice data announcing the total number of components P.
 Note that the information output unit 16 of the counting device 1 may output the non-countable information and the component number information by a method such as screen display or voice output instead of transmitting them to the user terminal 3. In this case, the counting system 100 need not include the user terminal 3. The user visually counts the number of components P in the non-countable group indicated by the non-countable information output by the information output unit 16 and inputs numerical information indicating that number to the component number calculation unit 15.
 Here, the flow of the counting process executed by the counting device 1 will be described. The counting process shown in FIG. 4 starts when the counting device 1 is powered on. While the image acquisition unit 11 of the counting device 1 receives no captured image from the photographing device 2 (step S11; NO), it repeats step S11 and waits for a captured image. When a captured image is received from the photographing device 2 (step S11; YES), the binarization unit 12 converts the captured image received by the image acquisition unit 11 into grayscale and removes noise with a smoothing filter. The binarization unit 12 calculates the binarization threshold for the denoised grayscale image (step S12); using the discriminant analysis method here allows the threshold to be calculated quickly. The binarization unit 12 binarizes the denoised grayscale image using the calculated threshold (step S13).
 The reduction processing unit 13 reduces the captured image binarized by the binarization unit 12 (step S14). If the counting device 1 does not include the reduction processing unit 13, step S14 is omitted. The area calculation unit 14 classifies the pixel distribution of the captured image reduced by the reduction processing unit 13 into groups and calculates the area of each group (step S15). When step S14 is omitted, the area calculation unit 14 instead classifies the pixel distribution of the captured image binarized by the binarization unit 12 into groups and calculates the area of each group. The component number calculation unit 15 calculates the smallest group area, excluding noise, among the areas calculated by the area calculation unit 14 as the reference area, that is, the area of one component P (step S16). At this point, the component number calculation unit 15 determines a lower limit and an upper limit and removes groups whose area is below the lower limit or above the upper limit as noise.
 For each group classified by the area calculation unit 14, the component number calculation unit 15 judges whether the group's area falls within an N-th range containing N times the reference area (step S17). If the area falls within such an N-th range (step S17; YES), the group is counted as N components P, which are added to the number of components P (step S20). If the area falls outside every N-th range (step S17; NO), the component number calculation unit 15 judges that components P overlap in the group and treats it as non-countable. The component number calculation unit 15 generates non-countable information indicating the non-countable group, and the information output unit 16 transmits the generated non-countable information to the user terminal 3 (step S18). If the counting system 100 does not include the user terminal 3, the information output unit 16 displays the non-countable information in step S18.
 In the example of the non-countable information in FIG. 3A, images of all groups except noise are displayed. The group whose area was judged in step S17 to be outside every N-th range, that is, the non-countable group, is surrounded by a square frame with the label "uncountable", and "uncountable" is also shown in the "number of components" field.
 Returning to FIG. 4, while the component number calculation unit 15 receives no numerical information indicating the number of components P in the non-countable group from the user terminal 3 (step S19; NO), it repeats step S19 and waits for the numerical information. When the component number calculation unit 15 receives the numerical information from the user terminal 3 (step S19; YES), it adds the number of components P indicated by the numerical information to the number of components P (step S20). If the counting system 100 does not include the user terminal 3, the component number calculation unit 15 judges in step S19 whether numerical information has been input.
 If there is a next group (step S21; YES), the process returns to step S17, and steps S17 to S21 are repeated. When steps S17 to S21 have been repeated for all groups classified by the area calculation unit 14 and there is no next group (step S21; NO), the component number calculation unit 15 generates component number information indicating the total number of components P obtained by summing the counts of all groups. The information output unit 16 transmits the component number information generated by the component number calculation unit 15 to the user terminal 3 (step S22). If the counting system 100 does not include the user terminal 3, the information output unit 16 displays the component number information in step S22.
 In the example of the component number information in FIG. 3B, images of all groups except noise are displayed. The group whose area was judged in step S17 to be within the second range is surrounded by a square frame with the count "2", and the group judged to be within the third range is surrounded by a square frame with the count "3". The group whose area was judged in step S17 to be outside every N-th range, for which the non-countable information of FIG. 3A was transmitted in step S18, is likewise surrounded by a square frame and shows the number "5" of components P indicated by the numerical information received in step S19. The "number of components" field shows "51", the total number of components P.
 Returning to FIG. 4, if the counting device 1 is not powered off (step S23; NO), the process returns to step S11, and steps S11 to S23 are repeated. When the counting device 1 is powered off (step S23; YES), the process ends. Note that when the components P do not overlap, as when each component P is a sphere, no non-countable group occurs, so steps S18 and S19 may be omitted.
 According to the counting system 100 of Embodiment 1, the counting device 1, which counts the components P from the captured image, binarizes a captured image of the stationary components P and counts the components P based on the areas of the groups into which the pixel distribution of the binarized image is classified. There is therefore no need to store a reference image of the component P, and accurate counting becomes possible in a shorter time.
 (Embodiment 2)
 In Embodiment 2, the counting device 1 stores group image information that associates an image of a group with the number of components P in that group, and adopts, as the number of components P in a non-countable group, the number of components P associated with a group image similar to the image of the non-countable group.
 The functional configuration of the counting device 1 according to Embodiment 2 will be described with reference to FIG. 5. As shown in FIG. 5, the counting device 1 includes, as its functional configuration, the image acquisition unit 11, the binarization unit 12, the reduction processing unit 13, the area calculation unit 14, the component number calculation unit 15, and the information output unit 16, and in addition a group image storage unit 17 that stores group image information.
 When the component number calculation unit 15 receives, from the user terminal 3, numerical information indicating the number of components P in a non-countable group, it associates the image of the non-countable group with the number of components P indicated by the numerical information to generate group image information. The group image storage unit 17 stores the group image information generated by the component number calculation unit 15. FIG. 6 shows an example of group image information. For example, in the first entry of group image information, the image of the non-countable group shown in the non-countable information of FIG. 3A is associated with the number "5" of components P indicated by the numerical information. That is, the group image information associates the image of a group that could not be counted with the number of components P in that group.
 Returning to FIG. 5, when there is a non-countable group, the component number calculation unit 15 refers to the group image storage unit 17 and judges whether there is group image information whose group image is similar to the image of the non-countable group. One way to judge whether two images are similar is to extract feature points from the two images, calculate their degree of similarity, and judge them similar if the similarity exceeds a threshold.
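A minimal sketch of the lookup against the stored group image information is shown below. For simplicity it scores similarity by intersection-over-union of binary group images rather than by feature-point matching, so both the similarity measure and the 0.8 threshold are assumptions for illustration:

```python
import numpy as np

def similarity(mask_a, mask_b):
    """Intersection-over-union of two binary group images -- a simple
    stand-in for the feature-point similarity score in the text."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 1.0

def lookup_count(query, stored, threshold=0.8):
    """Return the component count of the most similar stored group image,
    or None when no similarity exceeds the threshold (in that case the
    device falls back to asking the user for numerical information)."""
    best_n, best_s = None, threshold
    for mask, n in stored:  # stored group image information: (image, count) pairs
        s = similarity(query, mask)
        if s > best_s:
            best_n, best_s = n, s
    return best_n

stored = [(np.eye(4, dtype=np.uint8), 5)]  # one stored group image with count 5
hit = lookup_count(np.eye(4, dtype=np.uint8), stored)       # identical image -> 5
miss = lookup_count(np.ones((4, 4), dtype=np.uint8), stored)  # dissimilar -> None
```

A None result corresponds to step S38; NO in FIG. 7, after which the non-countable information is sent to the user terminal 3.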
 If there is group image information whose group image is similar to the image of the non-countable group, the component number calculation unit 15 adopts the number of components P associated with the similar group image as the number of components P in the non-countable group. If there is no such group image information, then as in Embodiment 1, the component number calculation unit 15 generates non-countable information, the information output unit 16 transmits it to the user terminal 3, and the component number calculation unit 15 adopts the number of components P in the non-countable group indicated by the numerical information received from the user terminal 3. The component number calculation unit 15 generates group image information each time it receives numerical information indicating the number of components P in a non-countable group from the user terminal 3, so group image information accumulates in the group image storage unit 17. The other functions of the counting device 1 are the same as in Embodiment 1.
 Here, the flow of the counting process executed by the counting device 1 will be described. The counting process shown in FIG. 7 starts when the counting device 1 is powered on. Steps S31 to S36 are the same as steps S11 to S16 of the flowchart shown in FIG. 4, so their description is omitted. For each group classified by the area calculation unit 14, the component number calculation unit 15 judges whether the group's area falls within an N-th range containing N times the reference area (step S37). If the area falls within such an N-th range (step S37; YES), the group is counted as N components P, which are added to the number of components P (step S42). If the area falls outside every N-th range (step S37; NO), the component number calculation unit 15 refers to the group image storage unit 17 and judges whether there is group image information whose group image is similar to the image of the non-countable group (step S38).
 If there is group image information whose group image is similar to the image of the non-countable group (step S38; YES), the component number calculation unit 15 adopts the number of components P associated with the similar group image as the number of components P in the non-countable group and adds it to the number of components P (step S42).
 If there is no group image information whose group image is similar to the image of the non-countable group (step S38; NO), the component number calculation unit 15 generates non-countable information indicating the non-countable group, and the information output unit 16 transmits the generated non-countable information to the user terminal 3 (step S39). While the component number calculation unit 15 receives no numerical information indicating the number of components P in the non-countable group from the user terminal 3 (step S40; NO), it repeats step S40 and waits for the numerical information. When the component number calculation unit 15 receives the numerical information from the user terminal 3 (step S40; YES), it associates the image of the non-countable group with the number of components P indicated by the numerical information to generate group image information and stores the group image information in the group image storage unit 17 (step S41).
 For example, in the first group image information shown in FIG. 6, the image of the non-countable group shown in the non-countable information of FIG. 3A is associated with the number "5" of components P indicated by the numerical information.
 Returning to FIG. 7, the component number calculation unit 15 adds the number of components P indicated by the received numerical information to the number of components P (step S42). If there is a next group (step S43; YES), the process returns to step S37, and steps S37 to S43 are repeated. When steps S37 to S43 have been repeated for all groups classified by the area calculation unit 14 and there is no next group (step S43; NO), the component number calculation unit 15 generates component number information indicating the total number of components P obtained by summing the counts of all groups. The information output unit 16 transmits the component number information generated by the component number calculation unit 15 to the user terminal 3 (step S44). If the counting device 1 is not powered off (step S45; NO), the process returns to step S31, and steps S31 to S45 are repeated. When the counting device 1 is powered off (step S45; YES), the process ends.
 According to the counting system 100 of the second embodiment, the counting device 1, which counts components P from a captured image, binarizes a captured image of stationary components P and counts the components P based on the areas of the groups into which the pixel distribution of the binarized image is classified. This eliminates the need to store a reference image of the component P and enables accurate counting in a shorter time. In addition, by adopting the number of components P associated with a stored group image similar to the image of an uncountable group as the number of components P in that uncountable group, the work of the user visually counting the components P in an uncountable group and entering numerical information into the user terminal 3 can be reduced.
(Embodiment 3)
 In the third embodiment, instead of the user visually counting the number of components P in an uncountable group and entering numerical information into the user terminal 3, a leveling operation device performs a leveling operation that spreads out overlapping components P so that they can be counted.
 As shown in FIG. 8, the counting system 100 includes, in addition to the counting device 1, the photographing device 2, and the user terminal 3, a leveling operation device 4 that performs the leveling operation. The counting device 1 and the leveling operation device 4 are connected by wire or wirelessly.
 When there is an uncountable group whose number of components P cannot be calculated from the area of the group, the counting device 1 transmits leveling operation instruction information, which instructs a leveling operation, to the leveling operation device 4. On receiving the leveling operation instruction information from the counting device 1, the leveling operation device 4 performs the leveling operation. The leveling operation device 4 breaks up the overlap of components P, for example, by vibrating the table onto which the components P have been placed. Alternatively, the leveling operation instruction information may include the coordinates of the uncountable group within the photographing range C, and the leveling operation device 4 may be a device that breaks up the overlap of components P by blowing air at the coordinates of the uncountable group, or by using a robot arm.
 The photographing device 2 photographs the photographing range C after the leveling operation device 4 has broken up the overlap of the components P. As for the timing of photographing, the user may input a photographing instruction to the photographing device 2, or the photographing device 2 may detect the leveling operation of the leveling operation device 4 and photograph. The rest of the configuration of the counting system 100 is the same as in the first embodiment.
 Here, the functional configuration of the counting device 1 according to the third embodiment will be described with reference to FIG. 9. As shown in FIG. 9, the counting device 1 includes, as functional components, an image acquisition unit 11, a binarization unit 12, a reduction processing unit 13, an area calculation unit 14, a component number calculation unit 15, and an information output unit 16, as well as a leveling operation instruction unit 18. The image acquisition unit 11, the binarization unit 12, the reduction processing unit 13, and the area calculation unit 14 perform the same processing as in the first embodiment.
 The component number calculation unit 15 calculates a reference area regarded as the area of one component P, and determines that a group whose area falls within an N-th range containing N times the reference area contains N components P. The component number calculation unit 15 determines that a group whose area falls outside every N-th range contains overlapping components P, and treats that group as uncountable. When there is an uncountable group, the component number calculation unit 15 generates leveling operation instruction information instructing the leveling operation device 4 to perform the leveling operation. When the leveling operation instruction information includes the coordinates of uncountable groups within the photographing range C, the component number calculation unit 15 generates leveling operation instruction information containing the coordinates of all the uncountable groups within the photographing range C. The information output unit 16 transmits the leveling operation instruction information generated by the component number calculation unit 15 to the leveling operation device 4.
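A minimal sketch of the N-th-range rule described above, in Python. The 30% tolerance around each multiple of the reference area and the cap on N are illustrative assumptions, since the text does not specify how wide each N-th range is.

```python
# Hypothetical sketch of the area-based counting rule: a group whose
# area falls within the N-th range (around N times the reference area
# of a single component P) is counted as N components; a group that
# matches no N-th range is treated as uncountable (overlapping parts).

def count_group(area, reference_area, tolerance=0.3, max_n=20):
    """Return the estimated component count N, or None if uncountable."""
    for n in range(1, max_n + 1):
        lower = n * reference_area * (1 - tolerance)
        upper = n * reference_area * (1 + tolerance)
        if lower <= area <= upper:
            return n
    return None  # area matches no N-th range: components overlap


# Example: with a reference area of 100 pixels, a group covering
# 310 pixels falls in the third range and is counted as 3 components.
```

When `count_group` returns `None`, the flow above would generate leveling operation instruction information instead of adding to the count.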
 On receiving the leveling operation instruction information from the counting device 1, the leveling operation device 4 performs the leveling operation. The photographing device 2 photographs the photographing range C after the leveling operation device 4 has broken up the overlap of the components P by the leveling operation. The image acquisition unit 11, the binarization unit 12, the reduction processing unit 13, and the area calculation unit 14 perform the same processing as in the first embodiment. When there are no longer any uncountable groups, the component number calculation unit 15 sums the numbers of components P of all the groups to calculate the total number of components P shown in the captured image, and generates component number information indicating the calculated total. The information output unit 16 transmits the component number information generated by the component number calculation unit 15 to the user terminal 3. The other functions of the counting device 1 are the same as in the first embodiment.
 Here, the flow of the counting process executed by the counting device 1 will be described. The counting process shown in FIG. 10 starts when the power of the counting device 1 is turned on. Steps S51 to S56 are the same as steps S11 to S16 of the flowchart shown in FIG. 4, so their description is omitted. For each group classified by the area calculation unit 14, the component number calculation unit 15 determines whether the area falls within an N-th range containing N times the reference area (step S57). When the area falls outside every N-th range (step S57; NO), the component number calculation unit 15 generates leveling operation instruction information instructing the leveling operation device 4 to perform the leveling operation. The information output unit 16 transmits the leveling operation instruction information generated by the component number calculation unit 15 to the leveling operation device 4 (step S58).
 When the leveling operation instruction information includes the coordinates of uncountable groups within the photographing range C, the component number calculation unit 15 determines in step S57 whether every group is within an N-th range, and if any group falls outside, generates leveling operation instruction information containing the coordinates of all the uncountable groups within the photographing range C. In step S58, the information output unit 16 transmits the leveling operation instruction information generated by the component number calculation unit 15 to the leveling operation device 4. In this case, in step S59 the component number calculation unit 15 sums the numbers of components P of all the groups and generates component number information indicating the total number of components P shown in the captured image, and step S60 is omitted.
 On receiving the leveling operation instruction information from the counting device 1, the leveling operation device 4 performs the leveling operation. The photographing device 2 photographs the photographing range C after the leveling operation device 4 has broken up the overlap of the components P. The process returns to step S51 and repeats steps S51 to S57. When the area falls within an N-th range containing N times the reference area (step S57; YES), the number of components P in that group is taken to be N and is added to the count of components P (step S59).
 If there is a next group (step S60; YES), the process returns to step S57 and repeats steps S57 to S60. Steps S57 to S60 are repeated for all the groups classified by the area calculation unit 14, and when there is no next group (step S60; NO), the component number calculation unit 15 generates component number information indicating the total number of components P obtained by summing the numbers of components P of all the groups. The information output unit 16 transmits the component number information generated by the component number calculation unit 15 to the user terminal 3 (step S61). If the power of the counting device 1 has not been turned off (step S62; NO), the process returns to step S51 and repeats steps S51 to S62. When the power of the counting device 1 is turned off (step S62; YES), the process ends.
 According to the counting system 100 of the third embodiment, the counting device 1, which counts components P from a captured image, binarizes a captured image of stationary components P and counts the components P based on the areas of the groups into which the pixel distribution of the binarized image is classified. This eliminates the need to store a reference image of the component P and enables accurate counting in a shorter time. In addition, since the components P in an uncountable group are automatically leveled so that they can be counted, the user no longer needs to visually count the components P in the uncountable group and enter numerical information into the user terminal 3.
(Embodiment 4)
 In the fourth embodiment, the counting device 1 counts the number of components to be delivered from stock. The counting device 1 stores photographing record information in which information identifying a component is associated with a captured image of that component, and determines whether the component to be delivered matches the component P shown in the captured image. The counting device 1 also reflects the number of delivered components P in inventory information indicating the stock quantity of the components P.
 As shown in FIG. 11, the counting system 100 includes, in addition to the counting device 1, the photographing device 2, and the user terminal 3, a production management system 5 that instructs the delivery of components and an inventory management system 6 that manages the inventory of components. The counting device 1 is connected to the production management system 5 and the inventory management system 6 via a network.
 The production management system 5 transmits delivery information indicating the components to be delivered to the user terminal 3 and the counting device 1. The user terminal 3 outputs the delivery information received from the production management system 5 by a method such as screen display or audio output. When the user terminal 3 outputs the delivery information, the user places the components indicated by the delivery information into the photographing range C. Hereinafter, the work of placing the components indicated by the delivery information into the photographing range C is referred to as delivery work. The photographing device 2 photographs the photographing range C and transmits the captured image to the counting device 1.
 Each time the counting device 1 receives delivery information from the production management system 5 and a captured image from the photographing device 2, it determines whether there is photographing record information whose component-identifying information matches that of the components to be delivered. If there is photographing record information with matching component-identifying information, the counting device 1 refers to it and determines whether the component to be delivered matches the component P shown in the captured image. If they do not match, the counting device 1 outputs error information indicating a warning. If there is no photographing record information with matching component-identifying information, the counting device 1 generates photographing record information by associating the captured image received from the photographing device 2 with the component-identifying information included in the delivery information received from the production management system 5, and stores it. When the counting device 1 has generated and stored photographing record information but the components P that the user placed in the photographing range C turn out not to be the components to be delivered, the corresponding photographing record information may be deleted.
 The inventory management system 6 stores inventory information indicating the stock quantity of components P. The counting device 1 calculates the total number of components P shown in the captured image, that is, the total number of components P delivered in one delivery operation. The inventory information is updated by subtracting the total number of components P delivered in one delivery operation from the stock quantity of components P in the inventory information stored by the inventory management system 6. The rest of the configuration of the counting system 100 is the same as in the first embodiment.
 Here, the functional configuration of the counting device 1 according to the fourth embodiment will be described with reference to FIG. 12. As shown in FIG. 12, the counting device 1 includes, as functional components, an image acquisition unit 11, a binarization unit 12, a reduction processing unit 13, an area calculation unit 14, a component number calculation unit 15, and an information output unit 16, as well as a correctness determination unit 19 that determines whether the component to be delivered matches the component P shown in the captured image, a photographing record storage unit 20 that stores photographing record information, and an inventory update unit 21 that updates the inventory information.
 When the correctness determination unit 19 receives delivery information from the production management system 5 and the image acquisition unit 11 receives a captured image from the photographing device 2, the correctness determination unit 19 refers to the photographing record storage unit 20 and determines whether there is photographing record information whose component-identifying information matches the component-identifying information included in the delivery information.
 If there is no photographing record information with matching component-identifying information, the correctness determination unit 19 generates photographing record information by associating the captured image that the image acquisition unit 11 received from the photographing device 2 with the component-identifying information included in the delivery information received from the production management system 5, and stores it in the photographing record storage unit 20.
 If there is photographing record information with matching component-identifying information, the correctness determination unit 19 compares the captured image of the component indicated by the photographing record information with the captured image received by the image acquisition unit 11, and determines whether they show the same component. One way to determine whether they show the same component is, for example, to extract feature points from the two captured images, calculate a similarity, and judge them to be the same component if the similarity exceeds a threshold.
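A minimal sketch of the threshold-based similarity judgment mentioned above. A real implementation would extract feature points with an image-processing library; the grayscale histogram intersection used here is only a stand-in similarity score, and the 0.9 threshold is an assumed value, not one specified in the text.

```python
# Hypothetical sketch: images are lists of pixel rows with values 0..255.

def histogram(image, bins=16):
    """Normalized grayscale histogram of an image."""
    counts = [0] * bins
    for row in image:
        for pixel in row:
            counts[pixel * bins // 256] += 1
    total = sum(counts)
    return [c / total for c in counts]


def same_component(image_a, image_b, threshold=0.9):
    """Judge two images to show the same component if the histogram
    intersection (a crude similarity score) exceeds the threshold."""
    ha, hb = histogram(image_a), histogram(image_b)
    similarity = sum(min(a, b) for a, b in zip(ha, hb))
    return similarity > threshold
```

In the flow above, a `False` result corresponds to the NO branch of step S74, where error information is generated.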
 When the correctness determination unit 19 determines that the components are not the same, it generates error information indicating a warning. The information output unit 16 transmits the error information generated by the correctness determination unit 19 to the user terminal 3, and the user terminal 3 displays the error information received from the counting device 1. This allows the user to know that the wrong components P have been placed. When the correctness determination unit 19 determines that the components are the same, the binarization unit 12, the reduction processing unit 13, and the area calculation unit 14 perform the same processing as in the first embodiment. The delivery information need not be transmitted to the counting device 1 by the production management system 5 alone. For example, the user may input the delivery information into the correctness determination unit 19, or the correctness determination unit 19 may acquire, from the user terminal 3, delivery information that the user has input into the user terminal 3.
 When the component number calculation unit 15 calculates the total number of components P shown in the captured image, it generates component number information indicating the calculated total, as shown in FIG. 13. The information output unit 16 transmits the component number information generated by the component number calculation unit 15 to the user terminal 3, and the user terminal 3 displays the received component number information. In the example of FIG. 13, in addition to the component number information indicating the total number of components P, "51", shown in FIG. 3B, the display shows a "cumulative component count" of 114, the cumulative number of components P counted since the delivery information was received, and an "instructed delivery count" of 200, the number of components P to be delivered as indicated by the delivery information. This allows the user to grasp the number of components P placed so far and that 86 more components P still need to be delivered.
 Returning to FIG. 12, when the component number calculation unit 15 calculates the total number of components P shown in the captured image, that is, the total number of components P delivered in one delivery operation, the inventory update unit 21 updates the inventory information by subtracting that total from the stock quantity of components P in the inventory information stored by the inventory management system 6. The inventory information need not be stored by the inventory management system 6 alone. For example, the counting device 1 may store the inventory information, or an external storage device may store it. The other functions of the counting device 1 are the same as in the first embodiment.
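The update performed by the inventory update unit 21 can be sketched as a single subtraction per delivery operation. The dictionary store keyed by component identifier and the overdraw check are illustrative assumptions of this sketch.

```python
# Hypothetical sketch: subtract one delivery operation's counted total
# from the stock quantity recorded for that component.

def apply_delivery(inventory, component_id, counted_total):
    """Reflect one delivery operation in the inventory information."""
    if counted_total > inventory[component_id]:
        raise ValueError("counted more components than recorded in stock")
    inventory[component_id] -= counted_total
    return inventory[component_id]


inventory = {"P": 500}                              # assumed starting stock
remaining = apply_delivery(inventory, "P", 51)      # FIG. 13 example: 51 counted
```

Updating the stock after every delivery operation in this way shortens the time during which the recorded and actual stock quantities disagree, as noted in the summary of this embodiment.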
 Here, the flow of the counting process executed by the counting device 1 will be described. The counting process shown in FIG. 14 starts when the power of the counting device 1 is turned on. The correctness determination unit 19 of the counting device 1 receives delivery information from the production management system 5 (step S71). If the image acquisition unit 11 has not received a captured image from the photographing device 2 (step S72; NO), it repeats step S72 and waits for a captured image. When the image acquisition unit 11 receives a captured image from the photographing device 2 (step S72; YES), the correctness determination unit 19 refers to the photographing record storage unit 20 and determines whether there is photographing record information whose component-identifying information matches the component-identifying information included in the delivery information (step S73).
 If there is no matching photographing record information (step S73; NO), the correctness determination unit 19 generates photographing record information by associating the captured image that the image acquisition unit 11 received from the photographing device 2 with the component-identifying information included in the delivery information received from the production management system 5. The correctness determination unit 19 stores the generated photographing record information in the photographing record storage unit 20 (step S76), and the process proceeds to step S77.
 If there is matching photographing record information (step S73; YES), the correctness determination unit 19 compares the captured image of the component indicated by the photographing record information with the captured image received by the image acquisition unit 11, and determines whether they show the same component (step S74). If it determines that they are not the same component (step S74; NO), the correctness determination unit 19 generates error information indicating a warning, and the information output unit 16 transmits the error information to the user terminal 3 (step S75). The user terminal 3 displays the error information received from the counting device 1. The process returns to step S72, and steps S72 to S74 are repeated. If it determines that they are the same component (step S74; YES), the process proceeds to step S77. Steps S77 to S87 are the same as steps S12 to S22 of the flowchart shown in FIG. 4, so their description is omitted.
 When the component number calculation unit 15 has calculated the total number of components P shown in the captured image, that is, the total number of components P delivered in one delivery operation, the inventory update unit 21 updates the inventory information by subtracting that total from the stock quantity of components P in the inventory information stored by the inventory management system 6 (step S88). If the power of the counting device 1 has not been turned off (step S89; NO), the process returns to step S71 and repeats steps S71 to S89. When the power of the counting device 1 is turned off (step S89; YES), the process ends.
 According to the counting system 100 of the fourth embodiment, the counting device 1, which counts components P from a captured image, binarizes a captured image of stationary components P and counts the components P based on the areas of the groups into which the pixel distribution of the binarized image is classified. This eliminates the need to store a reference image of the component P and enables accurate counting in a shorter time. In addition, since the counting device 1 stores photographing record information associating component-identifying information with a captured image of that component, determines whether the component to be delivered matches the component P shown in the captured image, and outputs error information when they do not match, delivery of the wrong components can be prevented. Furthermore, by updating the inventory information each time delivery work is performed, the time during which the stock quantity of components P indicated by the inventory information does not match the actual stock quantity can be shortened.
(Embodiment 5)
 In the fifth embodiment, the counting device 1 detects markers that define a counting range from the captured image and counts the number of components P within the counting range.
 As shown in FIG. 15, the counting system 100 includes the counting device 1, the photographing device 2, and the user terminal 3. When the user places components P into the photographing range C, the photographing device 2 photographs the photographing range C and transmits the captured image to the counting device 1. Markers M are placed in the photographing range C, and the counting device 1 calculates the number of components P in the counting range R enclosed by the markers M. The markers M may be placed in advance or by the user. The rest of the configuration of the counting system 100 is the same as in the first embodiment.
 Here, the functional configuration of the counting device 1 according to the fifth embodiment will be described with reference to FIG. 16. As shown in FIG. 16, the counting device 1 includes, as functional components, an image acquisition unit 11, a binarization unit 12, a reduction processing unit 13, an area calculation unit 14, a component number calculation unit 15, and an information output unit 16, as well as a marker storage unit 22 that stores marker information indicating the markers M.
 The binarization unit 12 refers to the marker information and detects the markers M in the captured image received by the image acquisition unit 11. The binarization unit 12 then identifies the counting range R enclosed by the markers M in the captured image received by the image acquisition unit 11. The binarization unit 12, the reduction processing unit 13, the area calculation unit 14, the component number calculation unit 15, and the information output unit 16 perform the same processing as in the first embodiment on the captured image within the counting range R.
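Identifying the counting range R enclosed by the markers M can be sketched as a crop to the markers' bounding box. Representing the image as a list of pixel rows and using an axis-aligned box spanned by the detected marker coordinates are assumptions of this illustration; the text does not prescribe a particular range shape.

```python
# Hypothetical sketch: crop the captured image to the counting range R
# enclosed by detected marker coordinates before binarization and
# counting run on it.

def crop_to_counting_range(image, marker_coords):
    """Crop image (list of pixel rows) to the box spanned by the markers.

    marker_coords is a list of (x, y) positions of detected markers M.
    """
    xs = [x for x, _ in marker_coords]
    ys = [y for _, y in marker_coords]
    left, right = min(xs), max(xs)
    top, bottom = min(ys), max(ys)
    return [row[left:right + 1] for row in image[top:bottom + 1]]
```

The subsequent binarization, reduction, area calculation, and counting steps would then operate only on the cropped image, so components outside the markers never enter the count.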
 Here, the flow of the counting process executed by the counting device 1 will be described. The counting process shown in FIG. 17 starts when the counting device 1 is powered on. When the image acquisition unit 11 of the counting device 1 has not received a captured image from the photographing device 2 (step S91; NO), the image acquisition unit 11 repeats step S91 and waits for a captured image. When the image acquisition unit 11 receives a captured image from the photographing device 2 (step S91; YES), the binarization unit 12 refers to the marker information and detects the markers M in the captured image received by the image acquisition unit 11. The binarization unit 12 identifies the counting range R surrounded by the markers M in the captured image (step S92). The binarization unit 12, the reduction processing unit 13, the area calculation unit 14, the component number calculation unit 15, and the information output unit 16 perform the processes of steps S93 to S104 on the portion of the captured image within the counting range R. Steps S93 to S104 are the same as steps S12 to S23 of the flowchart shown in FIG. 4, and their description is therefore omitted.
 According to the counting system 100 of the fifth embodiment, the counting device 1, which counts components P from a captured image, binarizes a captured image of stationary components P and counts the components P based on the areas of the groups into which the pixel distribution of the binarized image is classified. This eliminates the need to store a reference image of the component P and enables accurate counting in a shorter time. In addition, since the user can set the counting range, it is possible to prevent components outside the range intended by the user from being counted, and to prevent loaded components P from ending up outside the counting range.
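 The area-based counting reused from the first embodiment (binarize, group the foreground pixels into connected clusters, estimate the count from cluster areas) can be sketched as follows. This is only an illustrative reconstruction: the 4-connectivity, the per-group rounding rule, and the function names are assumptions, not the specification of the embodiment.

```python
import numpy as np
from collections import deque

# Illustrative sketch of area-based counting: flood-fill each connected
# group of foreground pixels, then estimate how many parts each group
# contains from its area relative to the area of a single part P.


def count_parts(binary, unit_area):
    """binary: H x W boolean ndarray (True = part pixel).
    unit_area: area in pixels occupied by a single part P.
    Returns the estimated total number of parts."""
    visited = np.zeros_like(binary, dtype=bool)
    h, w = binary.shape
    total = 0
    for i in range(h):
        for j in range(w):
            if binary[i, j] and not visited[i, j]:
                # flood-fill one group and measure its area
                area = 0
                queue = deque([(i, j)])
                visited[i, j] = True
                while queue:
                    y, x = queue.popleft()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and binary[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            queue.append((ny, nx))
                # a group larger than one unit area holds multiple parts
                total += max(1, round(area / unit_area))
    return total
```

 A group whose area is roughly a multiple of the single-part area is counted as that many touching parts, which is why no reference image of the part is needed.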
(Embodiment 6)
 In the sixth embodiment, as shown in FIG. 18, the counting system 100 includes a machine learning device 7 that learns the counting results of the counting device 1, in addition to the counting device 1, the photographing device 2, and the user terminal 3. The machine learning device 7 is connected to the counting device 1 and the user terminal 3 via a network. The machine learning device 7 may be incorporated in the counting device 1 or may reside on a cloud server.
 When the user puts components P into the photographing range C, the photographing device 2 photographs the photographing range C while the components P are stationary and transmits the captured image to the counting device 1. The counting device 1 calculates the total number of components P appearing in the captured image based on the captured image received from the photographing device 2. The counting device 1 generates component number information indicating the calculated total number of components P and transmits it to the user terminal 3. The counting device 1 also transmits the component number information and the captured image used to calculate the total number of components P to the machine learning device 7 in association with each other. When the user terminal 3 outputs the component number information, the user counts the actual total number of components P and inputs, to the user terminal 3, correctness information indicating whether the counting result of the counting device 1 is correct. The user terminal 3 associates the input correctness information with the component number information and transmits it to the machine learning device 7.
 The machine learning device 7 learns the counting results of the counting device 1 from a data set generated based on the captured images and component number information received from the counting device 1 and the correctness information received from the user terminal 3. As a result of learning the counting results of the counting device 1, the machine learning device 7 generates a trained model that outputs the total number of components P appearing in an input captured image. The other configurations of the counting system 100 are the same as those in the first embodiment.
 Here, the functional configurations of the counting device 1 and the machine learning device 7 will be described with reference to FIG. 19. The image acquisition unit 11 of the counting device 1 sends the captured image received from the photographing device 2 to the information output unit 16. The information output unit 16 transmits the captured image received from the image acquisition unit 11 and the component number information generated by the component number calculation unit 15 to the machine learning device 7. The other functional configurations of the counting device 1 are the same as those in the first embodiment.
 The machine learning device 7 includes a data acquisition unit 71 that receives captured images and component number information from the counting device 1, a correctness information acquisition unit 72 that receives correctness information from the user terminal 3, a learning unit 73 that learns the counting results of the counting device 1 and generates a trained model, and a storage unit 74 that stores the trained model generated by the learning unit 73. The data acquisition unit 71 sends the captured images and component number information received from the counting device 1 to the learning unit 73. The correctness information acquisition unit 72 sends the correctness information received from the user terminal 3 to the learning unit 73.
 The learning unit 73 generates a data set for machine learning based on the captured images and component number information received from the data acquisition unit 71 and the correctness information received from the correctness information acquisition unit 72. The learning unit 73 learns the counting results of the counting device 1 from the generated data set.
 The learning unit 73 learns the counting results of the counting device 1 by supervised learning according to, for example, a neural network model. Supervised learning is a scheme in which a large number of data sets of inputs and results (labels) are given to a learning device, which learns the features present in those data sets and builds a model that estimates the result from the input. The data set that the learning unit 73 generates for supervised learning associates, as input data, the captured images for which the correctness information indicates that the counting result of the counting device 1 was correct with, as label data, the total number of components P indicated by the corresponding component number information. The total number of components P indicated by the component number information corresponding to such a captured image is the total number of components P for which the counting result of the counting device 1 was correct. Hereinafter, a captured image for which the correctness information indicates that the counting result of the counting device 1 was correct is simply referred to as a captured image with a correct counting result, and the total number of components P for which the counting result was correct is simply referred to as the total number of components P with a correct counting result.
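 The data set construction described above can be sketched as follows. The record layout (image, count, correctness flag) and the function name are illustrative assumptions; only samples whose correctness information marks the count as correct are kept as (input, label) pairs.

```python
# Minimal sketch of the supervised data set assembled by the learning unit 73:
# keep only the (captured image, total count) pairs whose correctness
# information says the counting result was correct.


def build_dataset(records):
    """records: iterable of (image, part_count, is_correct) triples.

    Returns (inputs, labels): images as input data, counts as label data.
    """
    inputs, labels = [], []
    for image, part_count, is_correct in records:
        if is_correct:  # discard samples judged incorrect by the user
            inputs.append(image)
            labels.append(part_count)
    return inputs, labels
```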
 A neural network is composed of an input layer made up of a plurality of neurons, an intermediate layer made up of a plurality of neurons, and an output layer made up of a plurality of neurons. The intermediate layer may be a single layer or two or more layers. For example, in the three-layer neural network shown in FIG. 20, when a plurality of input data are input to the input-layer neurons X1 to X3, their values are multiplied by weights w11 to w16 and input to the intermediate-layer neurons Y1 and Y2. The results output from the neurons Y1 and Y2 are further multiplied by weights w21 to w26 and output from the output-layer neurons Z1 to Z3. The results output from the neurons Z1 to Z3 vary depending on the values of the weights w11 to w16 and the weights w21 to w26.
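 A forward pass through the network of FIG. 20 can be sketched numerically as follows. The weight values are arbitrary placeholders, and activation functions and biases are omitted because the text does not specify them; only the wiring (three inputs, two hidden neurons, three outputs, weights w11 to w16 and w21 to w26) follows the description above.

```python
import numpy as np

# Sketch of FIG. 20's three-layer network: X1-X3 -> (w11..w16) -> Y1-Y2
# -> (w21..w26) -> Z1-Z3. The weight values below are placeholders.

W1 = np.array([[0.1, 0.2],        # w11, w12 (X1 -> Y1, Y2)
               [0.3, 0.4],        # w13, w14 (X2 -> Y1, Y2)
               [0.5, 0.6]])       # w15, w16 (X3 -> Y1, Y2)
W2 = np.array([[0.1, 0.2, 0.3],   # w21, w22, w23 (Y1 -> Z1, Z2, Z3)
               [0.4, 0.5, 0.6]])  # w24, w25, w26 (Y2 -> Z1, Z2, Z3)


def forward(x):
    """x: length-3 input vector (X1, X2, X3); returns (Z1, Z2, Z3)."""
    y = x @ W1   # intermediate layer: Y1, Y2
    return y @ W2  # output layer: Z1, Z2, Z3


z = forward(np.array([1.0, 0.0, 0.0]))
```

 Training then amounts to adjusting W1 and W2 so that the outputs Z1 to Z3 approach the label data, as described below.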
 When the learning unit 73 learns the counting results of the counting device 1 according to the neural network model, captured images with correct counting results are input to the input layer of the neural network. The neural network adjusts its weights so that the results output from the output layer approach the total number of components P with a correct counting result, that is, the label data. When learning is completed, the learning unit 73 stores, in the storage unit 74, a trained neural network model that outputs the total number of components P appearing in an input captured image. Learning is deemed complete, for example, when a test data set is input and the accuracy rate of the output exceeds a threshold.
 The learning unit 73 may also learn the counting results by performing class classification with supervised learning. In this case, the learning unit 73 classifies the input data into classes corresponding to the label data. That is, captured images with correct counting results are classified into classes corresponding to the total number of components P with a correct counting result. The learning unit 73 learns the features of the captured images with correct counting results for each class, that is, for each total number of components P with a correct counting result. When learning is completed, the learning unit 73 stores, in the storage unit 74, a trained model that classifies an input captured image into one of the classes and outputs the total number of components P that is the label corresponding to the classified class.
 Note that the learning unit 73 may learn the counting results according to data sets generated for a plurality of component types. In this case, the data acquisition unit 71 receives, from the counting device 1, information identifying the component in addition to the captured image and the component number information. The learning unit 73 may generate a data set based on the captured images, component number information, and component-identifying information of a plurality of component types collected from the same counting device 1 together with the correctness information, or based on those collected from a plurality of counting devices 1 together with the correctness information. Components for which data sets are collected may also be added or removed partway through. Furthermore, a machine learning device 7 that has learned the counting results for a certain component of a certain counting device 1 may be attached to another counting device 1 and made to relearn the counting results for another component.
 As the learning algorithm used in the learning unit 73, deep learning, which learns to extract the features themselves, can also be used. The learning unit 73 may also execute machine learning according to other known methods, for example genetic programming, functional logic programming, or support vector machines.
 Note that the correctness information acquisition unit 72 of the machine learning device 7 may acquire the correctness information from a source other than the user terminal 3. For example, as shown in FIG. 21, the counting system 100 may include, in addition to the counting device 1, the photographing device 2, the user terminal 3, and the machine learning device 7, a weight measuring device 8 that measures the total weight of the components P placed in the photographing range C. The machine learning device 7 and the weight measuring device 8 are connected by wire or wirelessly.
 In this case, the weight measuring device 8 transmits, to the machine learning device 7, total component weight information indicating the total weight of the components P placed in the photographing range C. The correctness information acquisition unit 72 stores in advance unit weight information indicating the weight of a single component P. The correctness information acquisition unit 72 divides the total weight of the components P indicated by the total weight information received from the weight measuring device 8 by the weight of a single component P to calculate the total number of components P placed in the photographing range C. When the total number of components P indicated by the component number information that the data acquisition unit 71 received from the counting device 1 matches the total number of components P calculated from the total weight information received from the weight measuring device 8, the correctness information acquisition unit 72 generates correctness information indicating that the counting result of the counting device 1 is correct. When they do not match, the correctness information acquisition unit 72 generates correctness information indicating that the counting result of the counting device 1 is incorrect.
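 The weight-based check above reduces to a small calculation, sketched below. Rounding the quotient to the nearest integer is an assumption added here to absorb small scale noise; the description itself only specifies a division and a comparison.

```python
# Sketch of the weight-based correctness check: derive the part count from
# the measured total weight and the unit weight of one component P, then
# compare it with the count reported by the counting device 1.


def verify_count(total_weight, unit_weight, counted_total):
    """Return True if the weight-derived count matches counted_total."""
    weight_derived_total = round(total_weight / unit_weight)
    return weight_derived_total == counted_total
```

 For example, 25.0 g of parts at 2.5 g per part yields a weight-derived count of 10, confirming a reported count of 10 and flagging any other value as incorrect.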
 According to the counting system 100 of the sixth embodiment, the counting device 1, which counts components P from a captured image, binarizes a captured image of stationary components P and counts the components P based on the areas of the groups into which the pixel distribution of the binarized image is classified. This eliminates the need to store a reference image of the component P and enables accurate counting in a shorter time. Furthermore, by using the trained model generated by the machine learning device 7, the total number of components P appearing in a captured image can be calculated faster than with the counting device 1. In addition, by storing the trained model generated by the machine learning device 7 in a computer-readable recording medium, distributing it, and installing it on a computer capable of acquiring captured images, a device equivalent to the counting device 1 can be easily realized.
(Embodiment 7)
 In the seventh embodiment, the components P counted by the counting device 1 are arranged on a substrate. As shown in FIG. 22, the counting system 100 includes, in addition to the counting device 1, the photographing device 2, and the user terminal 3, a component arranging device 9 that arranges the components P on the substrate. The component arranging device 9 is connected to the counting device 1 and the user terminal 3. The counting device 1 may be incorporated in the component arranging device 9.
 The component arranging device 9 stores the components P counted by the counting device 1 in a component supply unit (not shown). When there are a plurality of component types that the counting device 1 can count and that the component arranging device 9 arranges on the substrate, the component arranging device 9 includes a component supply unit for each component type. In that case, for example, the user inputs, to the user terminal 3, information identifying the component placed in the photographing range C. The user terminal 3 transmits the input component-identifying information to the counting device 1. The counting device 1 associates the component-identifying information received from the user terminal 3 with the component number information. The component arranging device 9 arranges the components stored in the component supply unit on the substrate. Thereafter, the substrate on which the components P are arranged is fed into a solder flow bath, whereby the components P are soldered and mounted on the substrate. Alternatively, the components P arranged on the substrate are soldered and mounted by a soldering device, or are soldered and mounted by hand by an operator.
 The information output unit 16 of the counting device 1 transmits the component number information to the component arranging device 9 instead of the user terminal 3. The other functional configurations of the counting device 1 are the same as those in the first embodiment. Upon receiving component number information from the counting device 1, the component arranging device 9 adds up the totals of components P indicated by the component number information while storing the components P counted by the counting device 1 in the component supply unit. When the accumulated total of components P, that is, the number of components P stored in the component supply unit, reaches a fixed number, the component arranging device 9 transmits, to the user terminal 3, arrival information indicating that the number of components P has reached the fixed number. The fixed number may be, for example, the number of components P that the component arranging device 9 can arrange on substrates during one day of operation, that is, the required number of components P per day for the component arranging device 9, or a number determined based on the upper limit of the number of components P that the component supply unit can hold.
 The user terminal 3 outputs the arrival information received from the component arranging device 9 by screen display, audio output, or another method. When the user terminal 3 outputs the arrival information, the user stops placing components P in the photographing range C. When there is another component type that the counting device 1 can count and that the component arranging device 9 arranges on the substrate, the user places the next component in the photographing range C. The arrival information may include information identifying the next component. In this case, the user places the next component in the photographing range C based on the next-component-identifying information included in the arrival information output by the user terminal 3.
 Next, the flows of the counting process executed by the counting device 1 and the component arrangement process executed by the component arranging device 9 will be described. The flow of the counting process is the same as the flowchart of the counting process of the first embodiment shown in FIG. 4, except that in step S22 the information output unit 16 transmits the component number information not only to the user terminal 3 but also to the component arranging device 9. The component arrangement process shown in FIG. 23 starts when the component arranging device 9 is powered on. When the component arranging device 9 has not received component number information from the counting device 1 (step S111; NO), it repeats step S111 and waits for component number information.
 When component number information is received from the counting device 1 (step S111; YES), the component arranging device 9 adds the total number of components indicated by the component number information to its running total (step S112) and stores the components counted by the counting device 1 in the component supply unit (step S113). The component arranging device 9 determines whether the accumulated total of components, that is, the number of components stored in the component supply unit, has reached the fixed number (step S114). If the fixed number has not been reached (step S114; NO), the process returns to step S111, and steps S111 to S114 are repeated. If the fixed number has been reached (step S114; YES), the component arranging device 9 transmits, to the user terminal 3, arrival information indicating that the number of components has reached the fixed number (step S115). The user terminal 3 outputs the arrival information received from the component arranging device 9. When the user terminal 3 outputs the arrival information, the user stops placing components in the photographing range C.
 When there is another component type that the counting device 1 can count and that the component arranging device 9 arranges on the substrate (step S116; YES), the user places the next component in the photographing range C. The arrival information may include information identifying the next component; in that case, the user looks at the next-component-identifying information output by the user terminal 3 and places that component in the photographing range C. The counting device 1 calculates the total number of the next components placed in the photographing range C and transmits the component number information to the component arranging device 9. The process returns to step S111, and the component arranging device 9 receives the component number information from the counting device 1 (step S111; YES). The component arranging device 9 repeats steps S111 to S116. When there is no other component that the counting device 1 can count and that the component arranging device 9 arranges on the substrate (step S116; NO), the component arranging device 9 arranges the components stored in the component supply unit on the substrate (step S117) and ends the process.
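 The accumulation loop of steps S111 to S115 for a single component type can be sketched as follows. The iterable-of-counts interface is an illustrative assumption standing in for the stream of component number information messages; the actual device would also store the parts and notify the user terminal 3.

```python
# Sketch of steps S111-S115: accumulate the totals received from the
# counting device 1 until the fixed number is reached, at which point
# arrival information would be sent to the user terminal 3.


def accumulate_until_fixed_number(count_messages, fixed_number):
    """count_messages: iterable of totals, one per component number
    information message. Returns (accumulated_total, batches_consumed)."""
    total = 0
    batches = 0
    for count in count_messages:
        total += count              # step S112: add to the running total
        batches += 1                # step S113: parts stored in supply unit
        if total >= fixed_number:   # step S114: fixed number reached?
            break                   # step S115: arrival information sent
    return total, batches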
 According to the counting system 100 of the seventh embodiment, the counting device 1, which counts components P from a captured image, binarizes a captured image of stationary components P and counts the components P based on the areas of the groups into which the pixel distribution of the binarized image is classified. This eliminates the need to store a reference image of the component P and enables accurate counting in a shorter time. In addition, the component arranging device 9 stores the components P counted by the counting device 1 in the component supply unit until the fixed number is reached. This allows the component arranging device 9 to arrange components on the substrate smoothly without running short of the components stored in the component supply unit.
 The hardware configuration of the counting device 1 will be described with reference to FIG. 24. As shown in FIG. 24, the counting device 1 includes a temporary storage unit 101, a storage unit 102, a calculation unit 103, an operation unit 104, an input/output unit 105, and a display unit 106. The temporary storage unit 101, the storage unit 102, the operation unit 104, the input/output unit 105, and the display unit 106 are all connected to the calculation unit 103 via a bus.
 The calculation unit 103 is, for example, a CPU (Central Processing Unit). In accordance with a control program stored in the storage unit 102, the calculation unit 103 executes the processes of the binarization unit 12, the reduction processing unit 13, the area calculation unit 14, the component number calculation unit 15, the leveling operation instruction unit 18, the correctness determination unit 19, and the inventory update unit 21 of the counting device 1.
 The temporary storage unit 101 is, for example, a RAM (Random-Access Memory). The temporary storage unit 101 loads the control program stored in the storage unit 102 and is used as a work area of the calculation unit 103.
 The storage unit 102 is a non-volatile memory such as a flash memory, a hard disk, a DVD-RAM (Digital Versatile Disc - Random Access Memory), or a DVD-RW (Digital Versatile Disc - ReWritable). The storage unit 102 stores in advance a program for causing the calculation unit 103 to perform the processing of the counting device 1, supplies the data stored by this program to the calculation unit 103 in accordance with instructions from the calculation unit 103, and stores data supplied from the calculation unit 103. The group image storage unit 17, the photographing record storage unit 20, and the marker storage unit 22 are configured in the storage unit 102.
 The operation unit 104 includes input devices such as a keyboard and a pointing device, and an interface device that connects these input devices to the bus. For example, in a configuration in which information is input directly to the counting device 1, the input information is supplied to the calculation unit 103 via the operation unit 104.
 The input/output unit 105 is a network terminating device or wireless communication device connected to a network, together with a serial interface or LAN (Local Area Network) interface connected to it. The input/output unit 105 functions as the image acquisition unit 11, the component number calculation unit 15, the information output unit 16, the leveling operation instruction unit 18, the correctness determination unit 19, and the inventory update unit 21.
 The display unit 106 is a display device such as a CRT (Cathode Ray Tube) or an LCD (Liquid Crystal Display). For example, in a configuration in which information is input directly to the counting device 1, the display unit 106 displays an operation screen. When the information output unit 16 displays the uncountable information and the component number information, the display unit 106 functions as the information output unit 16.
 The processes of the image acquisition unit 11, the binarization unit 12, the reduction processing unit 13, the area calculation unit 14, the component number calculation unit 15, the information output unit 16, the group image storage unit 17, the leveling operation instruction unit 18, the shooting record storage unit 20, the correctness determination unit 19, the inventory update unit 21, and the marker storage unit 22 of the counting device 1 shown in FIGS. 2, 5, 9, 19, and 16 are executed by the control program using the temporary storage unit 101, the calculation unit 103, the storage unit 102, the operation unit 104, the input/output unit 105, the display unit 106, and the like as resources.
 The above hardware configuration and flowcharts are merely examples, and can be changed and modified arbitrarily.
 The central part that performs the processing of the counting device 1 (the calculation unit 103, the temporary storage unit 101, the storage unit 102, the operation unit 104, the input/output unit 105, the display unit 106, and so on) can be realized with an ordinary computer system rather than a dedicated system. For example, the counting device 1 that executes the above processing may be configured by storing a computer program for executing the above operations on a computer-readable recording medium such as a flexible disk, a CD-ROM (Compact Disc - Read Only Memory), or a DVD-ROM (Digital Versatile Disc - Read Only Memory), distributing it, and installing the computer program on a computer. Alternatively, the counting device 1 may be configured by storing the computer program in a storage device of a server on a communication network such as the Internet and having an ordinary computer system download it.
 When the functions of the counting device 1 are realized by dividing them between an OS (Operating System) and an application program, or through cooperation between the OS and the application program, only the application program portion may be stored in a recording medium, a storage device, or the like.
 It is also possible to superimpose the computer program on a carrier wave and provide it via a communication network. For example, the computer program may be posted on a bulletin board system (BBS, Bulletin Board System) on a communication network and provided via that network. The above processing may then be executed by starting this computer program and running it under the control of the OS in the same way as other application programs.
 In the fourth embodiment described above, the counting device 1 includes, as its functional configuration, the correctness determination unit 19, the shooting record storage unit 20, and the inventory update unit 21 in addition to the image acquisition unit 11, the binarization unit 12, the reduction processing unit 13, the area calculation unit 14, the component number calculation unit 15, and the information output unit 16, but the configuration is not limited to this. In addition to the image acquisition unit 11, the binarization unit 12, the reduction processing unit 13, the area calculation unit 14, the component number calculation unit 15, and the information output unit 16, the counting device 1 may include the correctness determination unit 19 and the shooting record storage unit 20 but not the inventory update unit 21, or may include the inventory update unit 21 but neither the correctness determination unit 19 nor the shooting record storage unit 20.
 The counting devices 1 of the fourth and fifth embodiments described above add functions to the counting device 1 of the first embodiment, but the configuration is not limited to this. The functions of both the fourth and fifth embodiments may be added to the counting device 1 of the first embodiment, and one or both of the functions of the fourth and fifth embodiments may be added to the counting device 1 of the second or third embodiment.
 The machine learning device 7 of the sixth embodiment described above learns the counting results of the counting device 1, but it is not limited to this and may instead learn whether the counting results of the counting device 1 are correct. In this case, the dataset that the learning unit 73 generates for supervised learning takes the captured image and the total number of components P indicated by the component number information as input data, associated with the correctness of the counting result of the counting device 1 indicated by the corresponding correctness information as label data. The learning unit 73 learns the correctness of the counting results of the counting device 1 from the generated dataset. As a result, the learning unit 73 generates a trained model that outputs, for an input captured image and total number of components P, whether the counting result is correct. For example, if the captured image and component number information of the counting device 1 are input to this trained model and the model outputs that the counting result of the counting device 1 is incorrect, another counting method, such as visual counting by the user, can be used to raise the accuracy rate of the counted total number of components P.
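As a deliberately simplified sketch of this supervised-learning variant: the embodiment fixes only the dataset shape (input data: captured image and counted total; label: correctness) and leaves the concrete learner open. The sketch below substitutes an illustrative range-based rule over a single hand-picked feature (foreground pixels per counted item); the function names, the feature choice, and the sample tuple layout are all assumptions made for illustration, not part of the embodiment.

```python
def train_correctness_model(samples):
    """Toy stand-in for the trained model of this variant.

    Each sample is (foreground_pixels, reported_count, is_correct): a
    captured image reduced to one feature, the total from the component
    number information, and the correctness label.  From the correctly
    counted samples we learn the plausible range of pixels per item; a
    new counting result is judged correct when its ratio falls inside
    that learned range.
    """
    ratios = [p / c for p, c, ok in samples if ok and c > 0]
    lo, hi = min(ratios), max(ratios)

    def predict(foreground_pixels, reported_count):
        # Mirrors the trained model's interface: image-derived input and
        # the counted total go in, a correctness judgment comes out.
        if reported_count <= 0:
            return False
        return lo <= foreground_pixels / reported_count <= hi

    return predict
```

A result predicted to be incorrect would then be routed to another counting method, such as visual counting by the user, as described above.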
 The counting systems 100 of the sixth and seventh embodiments described above add the machine learning device 7 or the component placement device 9 to the counting system 100 of the first embodiment, but the configuration is not limited to this. The machine learning device 7 or the component placement device 9, or both, may be added to the counting systems 100 of the first to fifth embodiments.
 The counting device 1 of the seventh embodiment described above transmits the component number information not to the user terminal 3 but to the component placement device 9, and the component placement device 9 transmits to the user terminal 3 arrival information indicating that the number of components P has reached a fixed number. When the arrival information is displayed on the user terminal 3, the user stops putting components P into the imaging range C. The configuration is not limited to this: the counting device 1 may transmit the component number information to both the user terminal 3 and the component placement device 9, and the user terminal 3 may add up the numbers of components P indicated by the received component number information and display the arrival information when the fixed number is reached. In this case, the component placement device 9 does not have to transmit the arrival information to the user terminal 3.
 In the above embodiments, an example of a counting system 100 that counts components has been described, but the system is not limited to this. The counting system 100 may be any counting system that counts articles.
 The present disclosure allows various embodiments and modifications without departing from its broad spirit and scope. The embodiments described above are intended to explain this disclosure, not to limit its scope. That is, the scope of the present disclosure is indicated not by the embodiments but by the claims, and various modifications made within the scope of the claims and within the meaning of disclosure equivalent thereto are considered to be within the scope of this disclosure.
 This application is based on Japanese Patent Application No. 2019-042438, filed on March 8, 2019. The specification, claims, and entire drawings of Japanese Patent Application No. 2019-042438 are incorporated herein by reference.
1 counting device, 2 imaging device, 3 user terminal, 4 leveling operation device, 5 production management system, 6 inventory management system, 7 machine learning device, 8 weight measuring device, 9 component placement device, 11 image acquisition unit, 12 binarization unit, 13 reduction processing unit, 14 area calculation unit, 15 component number calculation unit, 16 information output unit, 17 group image storage unit, 18 leveling operation instruction unit, 19 correctness determination unit, 20 shooting record storage unit, 21 inventory update unit, 22 marker storage unit, 71 data acquisition unit, 72 correctness information acquisition unit, 73 learning unit, 74 storage unit, 100 counting system, 101 temporary storage unit, 102 storage unit, 103 calculation unit, 104 operation unit, 105 input/output unit, 106 display unit, C imaging range, M marker, P component, R counting range.

Claims (16)

  1.  A counting system comprising an imaging device that photographs stationary articles and a counting device that counts the articles photographed by the imaging device, wherein
     the counting device includes:
     an image acquisition unit that acquires a captured image from the imaging device;
     a binarization unit that binarizes the captured image acquired by the image acquisition unit;
     an area calculation unit that calculates the area of each group into which the pixel distribution of the captured image binarized by the binarization unit is classified;
     an article number calculation unit that calculates the number of articles in each group from the group areas calculated by the area calculation unit, totals the numbers of articles of the groups, and generates article number information indicating the total number of articles shown in the captured image; and
     an information output unit that outputs the article number information generated by the article number calculation unit.
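The flow of claim 1 can be sketched end to end on an already binarized image: classify foreground pixels into connected groups, measure each group's area, and derive a per-group count from a per-item reference area. The function name, the 2-D list image representation, and the fixed reference area `unit_area` are illustrative assumptions, not part of the claim.

```python
from collections import deque

def count_items(binary, unit_area):
    """Estimate the total number of items in a binarized image.

    binary: 2-D list of 0/1 values (1 = foreground pixel of some item)
    unit_area: assumed pixel area of a single item
    """
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    total = 0
    for y in range(h):
        for x in range(w):
            if binary[y][x] == 1 and not seen[y][x]:
                # Flood-fill one group (4-connected) and measure its area.
                area = 0
                queue = deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    area += 1
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and binary[ny][nx] == 1 and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                # Items in this group = group area / single-item area.
                total += max(1, round(area / unit_area))
    return total
```

A touching pair of items forms one group of roughly twice the unit area and is counted as two items, which is the point of counting by group area rather than by number of groups.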
  2.  The counting system according to claim 1, wherein the binarization unit calculates the binarization threshold by a discriminant analysis method.
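The discriminant analysis method of claim 2 is commonly realized as Otsu's method: choose the threshold that maximizes the between-class variance of the two pixel classes it separates. A minimal histogram-based sketch follows; the function name and the flat pixel-list input format are assumptions.

```python
def otsu_threshold(gray):
    """Binarization threshold by discriminant analysis (Otsu's method).

    gray: flat iterable of 8-bit pixel values (0..255)
    Returns a threshold t; pixels with value > t form one class.
    """
    hist = [0] * 256
    n = 0
    for v in gray:
        hist[v] += 1
        n += 1
    total_sum = sum(i * hist[i] for i in range(256))
    best_t, best_var = 0, -1.0
    w0 = 0      # pixel count of the class at or below t
    sum0 = 0    # intensity sum of that class
    for t in range(256):
        w0 += hist[t]
        if w0 == 0:
            continue
        w1 = n - w0
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0 = sum0 / w0
        mu1 = (total_sum - sum0) / w1
        # Between-class variance (up to a constant factor of 1/n**2).
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```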
  3.  The counting system according to claim 1 or 2, wherein
     the counting device further includes a reduction processing unit that reduces the captured image binarized by the binarization unit, and
     the area calculation unit calculates the area of each group into which the pixel distribution of the captured image reduced by the reduction processing unit is classified.
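The reduction step of claim 3 trades resolution for speed before group areas are measured. One illustrative reduction (the function name and the block-OR rule are assumptions) shrinks the binary image by an integer factor k; since group areas then shrink by roughly k*k, any per-item reference area must be rescaled the same way before counting.

```python
def shrink(binary, k):
    """Reduce a binary image by an integer factor k.

    Each k-by-k block becomes a single pixel that is 1 if any pixel in
    the block is foreground (so thin items are not lost entirely).
    Trailing rows/columns that do not fill a complete block are dropped.
    """
    h, w = len(binary), len(binary[0])
    out = []
    for by in range(0, h - h % k, k):
        row = []
        for bx in range(0, w - w % k, k):
            block_has_fg = any(binary[y][x]
                               for y in range(by, by + k)
                               for x in range(bx, bx + k))
            row.append(1 if block_has_fg else 0)
        out.append(row)
    return out
```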
  4.  The counting system according to any one of claims 1 to 3, further comprising a user terminal used by a user, wherein
     when there is an uncountable group whose number of articles cannot be calculated from the group areas calculated by the area calculation unit, the article number calculation unit generates uncountable information indicating the uncountable group,
     the information output unit transmits the uncountable information generated by the article number calculation unit to the user terminal, and
     upon receiving from the user terminal numerical information indicating the number of articles in the uncountable group, the article number calculation unit adopts the number of articles in the uncountable group indicated by the numerical information and calculates the total number of articles shown in the captured image.
  5.  The counting system according to claim 4, wherein
     the counting device further includes a group image storage unit that stores group image information associating an image of a group with the number of articles in that group, and
     when there is an uncountable group and there is group image information whose group image is similar to the image of the uncountable group, the article number calculation unit adopts the number of articles associated with the similar group image as the number of articles in the uncountable group and calculates the total number of articles shown in the captured image; when there is no group image information whose group image is similar to the image of the uncountable group, the article number calculation unit adopts the number of articles in the uncountable group indicated by the numerical information, calculates the total number of articles shown in the captured image, and generates group image information associating the image of the uncountable group with the number of articles in the uncountable group indicated by the numerical information.
  6.  The counting system according to any one of claims 1 to 3, further comprising a leveling operation device that performs a leveling operation that breaks up overlaps among the stationary articles so that they can be counted, wherein
     the counting device further includes a leveling operation instruction unit that instructs the leveling operation device to perform the leveling operation when there is an uncountable group whose number of articles cannot be calculated from the group areas calculated by the area calculation unit.
  7.  The counting system according to any one of claims 1 to 6, wherein
     the counting device further includes:
     a shooting record storage unit that stores shooting record information associating information identifying an article with the captured image of that article; and
     a correctness determination unit that acquires outgoing-goods information including information identifying an article to be delivered; when there is shooting record information whose article-identifying information matches the information identifying the article to be delivered, the correctness determination unit compares the captured image associated with the matching article-identifying information with the captured image acquired by the image acquisition unit, determines whether the article to be delivered matches the article shown in the captured image, and generates error information indicating a warning when they do not match; when there is no shooting record information whose article-identifying information matches the information identifying the article to be delivered, the correctness determination unit generates shooting record information associating the information identifying the article to be delivered with the captured image acquired by the image acquisition unit, and
     the information output unit outputs the error information generated by the correctness determination unit.
  8.  The counting system according to any one of claims 1 to 7, wherein the counting device further includes an inventory update unit that subtracts the total number of articles calculated by the article number calculation unit from the stock quantity of the articles to update inventory information indicating the stock quantity of the articles.
  9.  The counting system according to any one of claims 1 to 8, wherein
     the counting device further includes a marker storage unit that stores marker information indicating markers that define a counting range, and
     the binarization unit detects the markers indicated by the marker information in the captured image, specifies the counting range, and binarizes the captured image within the counting range.
  10.  The counting system according to any one of claims 1 to 9, further comprising:
     a data acquisition unit that acquires data including the captured image and the article number information;
     a correctness information acquisition unit that acquires correctness information indicating whether the total number of articles shown in the captured image indicated by the article number information is correct; and
     a learning unit that learns the counting results of the counting device, or the correctness of the counting results of the counting device, from a dataset generated based on the data including the captured image and the article number information and on the correctness information.
  11.  The counting system according to any one of claims 1 to 10, wherein
     the articles counted by the counting device are components to be mounted on a board,
     the counting system further comprises a component placement device that stores the components counted by the counting device in a component supply unit and places the components stored in the component supply unit on the board,
     the information output unit outputs the article number information to the component placement device, and
     the component placement device stores a fixed number of the components in the component supply unit based on the total number of components shown in the captured image indicated by the article number information.
  12.  A counting device comprising:
     an image acquisition unit that acquires a captured image from an imaging device that photographs stationary articles;
     a binarization unit that binarizes the captured image acquired by the image acquisition unit;
     an area calculation unit that calculates the area of each group into which the pixel distribution of the captured image binarized by the binarization unit is classified;
     an article number calculation unit that calculates the number of articles in each group from the group areas calculated by the area calculation unit, totals the numbers of articles of the groups, and generates article number information indicating the total number of articles shown in the captured image; and
     an information output unit that outputs the article number information generated by the article number calculation unit.
  13.  A machine learning device that learns the counting results of the counting device according to claim 12, or the correctness of the counting results of the counting device, comprising:
     a data acquisition unit that acquires data including the captured image and the article number information;
     a correctness information acquisition unit that acquires correctness information indicating whether the total number of articles shown in the captured image indicated by the article number information is correct; and
     a learning unit that learns the counting results of the counting device, or the correctness of the counting results of the counting device, from a dataset generated based on the data including the captured image and the article number information and on the correctness information.
  14.  A counting method comprising:
     a photographing step, executed by an imaging device, of photographing stationary articles;
     a binarization step, executed by a counting device, of binarizing the captured image photographed by the imaging device;
     an area calculation step, executed by the counting device, of calculating the area of each group into which the pixel distribution of the captured image binarized in the binarization step is classified;
     an article number calculation step, executed by the counting device, of calculating the number of articles in each group from the group areas calculated in the area calculation step, totaling the numbers of articles of the groups, and generating article number information indicating the total number of articles shown in the captured image; and
     an information output step, executed by the counting device, of outputting the article number information generated in the article number calculation step.
  15.  A component placement method comprising:
     a photographing step, executed by an imaging device, of photographing stationary components;
     a binarization step, executed by a counting device, of binarizing the captured image photographed by the imaging device;
     an area calculation step, executed by the counting device, of calculating the area of each group into which the pixel distribution of the captured image binarized in the binarization step is classified;
     a component number calculation step, executed by the counting device, of calculating the number of components in each group from the group areas calculated in the area calculation step, totaling the numbers of components of the groups, and calculating the total number of components shown in the captured image;
     a storing step, executed by a component placement device, of storing a fixed number of the components in a component supply unit based on the total number of components shown in the captured image calculated in the component number calculation step; and
     a placement step, executed by the component placement device, of placing the components stored in the component supply unit on a board.
  16.  A program that causes a computer to function as:
     a binarization unit that binarizes a captured image of stationary articles;
     an area calculation unit that calculates the area of each group into which the pixel distribution of the captured image binarized by the binarization unit is classified;
     an article number calculation unit that calculates the number of articles in each group from the group areas calculated by the area calculation unit, totals the numbers of articles of the groups, and generates article number information indicating the total number of articles shown in the captured image; and
     an information output unit that outputs the article number information generated by the article number calculation unit.
PCT/JP2019/049274 2019-03-08 2019-12-17 Counting system, counting device, machine learning device, counting method, component arrangement method, and program WO2020183837A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201980093531.0A CN113518998B (en) 2019-03-08 2019-12-17 Counting system, counting device, machine learning device, counting method, component arrangement method, and recording medium
JP2021505523A JP7134331B2 (en) 2019-03-08 2019-12-17 Counting system, counting device, machine learning device, counting method, parts arrangement method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-042438 2019-03-08
JP2019042438 2019-03-08

Publications (1)

Publication Number Publication Date
WO2020183837A1

Family

ID=72427922

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/049274 WO2020183837A1 (en) 2019-03-08 2019-12-17 Counting system, counting device, machine learning device, counting method, component arrangement method, and program

Country Status (3)

Country Link
JP (1) JP7134331B2 (en)
CN (1) CN113518998B (en)
WO (1) WO2020183837A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112581016A (en) * 2020-12-28 2021-03-30 深圳硅纳智慧科技有限公司 Material management system and material management method adopting same
US20220375185A1 (en) * 2021-08-06 2022-11-24 Beijing Baidu Netcom Science Technology Co., Ltd. Method of recognizing image, electronic device, and storage medium
WO2024024090A1 (en) * 2022-07-29 2024-02-01 ヤマハ発動機株式会社 Component count device and robot system

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
CN115619811B (en) * 2022-12-16 2023-04-14 北京远舢智能科技有限公司 Cigarette quantity determining method and device, electronic equipment and storage medium

Citations (7)

Publication number Priority date Publication date Assignee Title
JPH1019755A (en) * 1996-04-30 1998-01-23 Kunio Funemi Automatic pollen collecting/analyzing system
JPH10214322A (en) * 1997-01-29 1998-08-11 Masatake Akagawa Commodity detecting counter
JPH11306314A (en) * 1998-04-24 1999-11-05 Ishida Co Ltd Method and device for counting articles, article carrying device provided with the device, and combination counter
JP2005242896A (en) * 2004-02-27 2005-09-08 Oki Electric Ind Co Ltd Display system for handling queue and apparatus for analyzing queue
JP2007073710A (en) * 2005-09-06 2007-03-22 Matsushita Electric Ind Co Ltd Device and method for counting part
JP2012173901A (en) * 2011-02-21 2012-09-10 Midori Seimitsu:Kk Method and device for counting number of steel material in bound steel material bundle
WO2019045091A1 (en) * 2017-09-04 2019-03-07 日本電気株式会社 Information processing device, counter system, counting method, and program storage medium

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4096729B2 (en) * 2002-12-24 2008-06-04 カシオ計算機株式会社 Order confirmation system and program
CN1609894A (en) * 2004-09-10 2005-04-27 浙江大学 Steel products on-line counting system and method based on virtual multisensor fusion
US8116564B2 (en) * 2006-11-22 2012-02-14 Regents Of The University Of Minnesota Crowd counting and monitoring
CN101777140B (en) * 2010-02-08 2012-03-21 宁波大学 Method for counting number of complex overlapping cells in microscopic image
JP5414917B2 (en) * 2011-07-13 2014-02-12 パナソニック株式会社 Tablet inspection device and tablet inspection method
JP5811923B2 (en) * 2012-03-28 2015-11-11 富士通株式会社 Information processing apparatus, image processing method, and program
JP2015228094A (en) * 2014-05-30 2015-12-17 シライ電子工業株式会社 Substrate counting method, substrate counting program, and substrate counter
TW201705047A (en) * 2015-07-24 2017-02-01 Cliff Young Trading Co Ltd Image-type board counting device and method rapidly providing accurate quantity information through the real-time operation on the image information and the distance information


Also Published As

Publication number Publication date
JPWO2020183837A1 (en) 2021-10-28
JP7134331B2 (en) 2022-09-09
CN113518998A (en) 2021-10-19
CN113518998B (en) 2024-04-16

Similar Documents

Publication Publication Date Title
WO2020183837A1 (en) Counting system, counting device, machine learning device, counting method, component arrangement method, and program
CN109300263B (en) Settlement method and device for image recognition technology based on convolutional neural network
CN109840504B (en) Article taking and placing behavior identification method and device, storage medium and equipment
US10963947B2 (en) Method, a device and a system for checkout
JP6488647B2 (en) Object tracking device, object tracking system, object tracking method, display control device, object detection device, program, and recording medium
CN107679475B (en) Store monitoring and evaluating method and device and storage medium
CN110726724A (en) Defect detection method, system and device
JPWO2014087725A1 (en) Product information processing apparatus, data processing method thereof, and program
CN112100425B (en) Label labeling method and device based on artificial intelligence, electronic equipment and medium
JP2018512567A (en) Barcode tag detection in side view sample tube images for laboratory automation
JP2017171448A (en) Projection instruction device, goods assort system and projection instruction method
US11354549B2 (en) Method and system for region proposal based object recognition for estimating planogram compliance
CN107918767B (en) Object detection method, device, electronic equipment and computer-readable medium
US20220028514A1 (en) System and method for augmented reality detection of loose pharmacy items
JP2017171444A (en) Projection instruction device, goods assort system and projection instruction method
CN111539468A (en) Goods counting method of automatic vending equipment
JPWO2018179360A1 (en) Image processing apparatus, image processing method, and program
WO2021233058A1 (en) Method for monitoring articles on shop shelf, computer and system
JP2017171445A (en) Projection instruction device, goods assort system and projection instruction method
CN112163600B (en) Commodity identification method based on machine vision
JP6989178B2 (en) Transport item tracking device, transport item counting device, transport item tracking method, transport item counting method, transport item tracking system, and transport item counting system.
KR20220083347A (en) Method, apparatus, and computer program for measuring volume of objects by using image
JP7503731B2 (en) Conveyor equipment
CN109447000A (en) Biopsy method, spot detection method, electronic equipment and recording medium
JP6857373B1 (en) Information processing equipment, information processing methods, and programs

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 19919271; Country of ref document: EP; Kind code of ref document: A1

ENP Entry into the national phase
Ref document number: 2021505523; Country of ref document: JP; Kind code of ref document: A

NENP Non-entry into the national phase
Ref country code: DE

122 Ep: pct application non-entry in european phase
Ref document number: 19919271; Country of ref document: EP; Kind code of ref document: A1