WO2018134866A1 - Camera calibration device - Google Patents
- Publication number
- WO2018134866A1 (PCT/JP2017/001337)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- movement
- marker
- sensor
- camera
- fixed camera
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
Definitions
- the present invention relates to a camera calibration device, and more particularly to a camera calibration device that automatically creates a movement plan for a calibration jig during calibration work.
- according to the cited document, this information processing apparatus achieves the effect that "the mutual orientation and absolute orientation processes in camera calibration can be integrated" (paragraph 0031 of the same document).
- a camera calibration apparatus to which a fixed camera and a sensor-equipped movement marker are connected comprises: a movement marker calibration unit that estimates the position/posture of the movement marker from the measured values of the sensor of the sensor-equipped movement marker; a fixed camera calibration unit that estimates the position/orientation of the fixed camera from the captured image of the fixed camera and the estimated position/orientation of the movement marker; a movement planning unit that creates a movement plan including a target movement position of the movement marker according to the estimated position/orientation of the fixed camera; a movement instruction unit that instructs movement based on the movement plan; and an output unit that outputs the estimated position and posture of the fixed camera.
- the positions and postures of a plurality of fixed cameras can be calibrated with high accuracy in a short time regardless of the skill level of the calibration operator.
- A figure showing the block configuration of the camera calibration apparatus 100
- A figure showing an example of the fixed camera 200 and the sensor-equipped movement marker 300
- A flowchart showing the processing performed in the fixed camera calibration unit 102
- A figure showing an example of the image 210 and the feature points 311 detected from the image 210
- A flowchart showing the processing performed in the movement planning unit 103
- A figure showing an example of the three-dimensional histogram 220
- A figure showing an example of the two-dimensional histogram 230
- A figure showing an example of a bin 231 of the two-dimensional histogram 230
- A figure showing an example of what the movement instruction unit 104 displays
- A figure showing an example of the two-dimensional map output by the output unit 105
- A figure showing the block configuration of the camera calibration apparatus 400
- A flowchart showing the processing performed in the stop determination unit 401
- FIG. 1 is a diagram illustrating a block configuration of a camera calibration apparatus 100 according to the present embodiment.
- the camera calibration apparatus 100 is connected to a plurality of fixed cameras 200 and a movement marker 300 with a sensor, and calibrates the position and orientation of the fixed camera 200.
- let i be the identifier of each fixed camera 200, and let N_i be the total number of fixed cameras 200. That is, N_i fixed cameras 200 are connected to the camera calibration apparatus 100.
- the fixed camera 200 is a camera whose position and posture are fixed, such as a surveillance camera installed on the ceiling.
- the sensor-equipped movement marker 300 includes, for example, a sensor such as a camera that can measure its own movement and a marker such as a checkerboard pattern that can be easily detected from a captured image of the fixed camera 200, and it moves in the environment where the fixed camera 200 is installed.
- the camera calibration apparatus 100 includes a movement marker calibration unit 101, a fixed camera calibration unit 102, a movement planning unit 103, a movement instruction unit 104, and an output unit 105. Note that these can be realized, for example, by operating an arithmetic device such as a CPU in accordance with a program stored in a storage device such as a semiconductor memory in the camera calibration device 100, and each need not necessarily be provided as dedicated hardware.
- the moving marker calibration unit 101 estimates the position / posture of the sensor-equipped moving marker 300 in the world coordinate system from the measurement values measured by the sensor of the sensor-equipped moving marker 300.
- the fixed camera calibration unit 102 uses the image including the moving marker 300 with the sensor imaged by the fixed camera 200 and the position / posture in the world coordinate system of the moving marker 300 with the sensor estimated by the moving marker calibration unit 101. Estimate the position and orientation in 200 world coordinate systems.
- the movement planning unit 103 plans the movement of the sensor-equipped moving marker 300 according to the position / orientation of each fixed camera 200 estimated by the fixed camera calibration unit 102.
- the movement instruction unit 104 issues an instruction for causing the movement marker with sensor 300 to execute the movement plan planned by the movement planning unit 103.
- the output unit 105 outputs the position / posture of the fixed camera 200 estimated by the fixed camera calibration unit 102.
- the fixed camera 200, the sensor-equipped moving marker 300, and the camera calibration apparatus 100 may be connected by a wired connection such as USB or Ethernet (registered trademark), or wirelessly via a wireless network. Further, data recorded on recording media in the fixed camera 200 and the sensor-equipped moving marker 300 may be input to the camera calibration apparatus 100 via a storage medium such as an SD card.
- FIG. 2 is a diagram illustrating an example of the fixed camera 200 and the movement marker 300 with a sensor.
- the fixed camera 200 fixed to the ceiling or the like captures images at a predetermined cycle and outputs captured images to the camera calibration device 100.
- the sensor-equipped moving marker 300 can move in the environment where the fixed camera 200 is installed.
- the sensor-equipped moving marker 300 is composed of a marker 310, which facilitates detection from the captured image of the fixed camera 200, and a moving camera 320 fixed on the marker 310.
- in this example the sensor-equipped moving marker 300 is mounted on a mobile robot 350 that moves in the environment, but the calibration operator may instead move the sensor-equipped moving marker 300 by hand, or move a cart or tripod on which it is placed. It is assumed that the position/orientation of the moving camera 320 in the coordinate system attached to the marker 310 (hereinafter, "marker coordinate system") is fixed and known to the camera calibration apparatus 100.
- a checkerboard pattern is used as the marker 310, but another pattern such as a circular pattern that can be easily detected from an image may be used.
- the marker 310 is not limited to a planar pattern, and may be a three-dimensional object such as a cube or a sphere, or a pattern that exists in advance in the mobile robot 350 may be used as the marker 310.
- the moving camera 320 is used as a sensor, but other sensors that can measure its own motion, such as an IMU (Inertial Measurement Unit), a wheel encoder, a steering angle meter, GPS, and a laser range finder, are used. Also good.
- the sensor of the sensor-equipped moving marker 300, typified by the moving camera 320, measures at a predetermined cycle and outputs its measurement result (e.g., a captured image) to the camera calibration device 100.
- the fixed camera 200 and the moving camera 320 are time-synchronized, and photographing and measurement are performed at the same time.
- the camera calibration device 100 executes the process of the fixed camera calibration unit 102 every time an image is input from the fixed camera 200 or a certain number of images are input. Further, the process of the movement marker calibration unit 101 is performed every time a measurement result is input from the movement marker 300 with sensor or a certain number of measurement results are input.
- the moving marker calibration unit 101 estimates the position/orientation in the world coordinate system of the sensor-equipped moving marker 300 at each measurement time of the moving camera 320. Since the position/posture of the moving camera 320 in the marker coordinate system is fixed and known, the position/posture of the marker 310 in the world coordinate system can be estimated by estimating the position/posture of the moving camera 320 in the world coordinate system. As shown in FIG. 2, when the moving camera 320 is used as the sensor of the sensor-equipped moving marker 300, a Structure from Motion method, which estimates the position/orientation of the moving camera 320 at each capture time from a plurality of captured images, or a visual Simultaneous Localization and Mapping (vSLAM) method can be used; for example, the method of G. Klein and D. Murray, "Parallel Tracking and Mapping for Small AR Workspaces," Proc. IEEE and ACM Int. Symp. on Mixed and Augmented Reality, pp. 225-234, 2007.
- when an IMU is used, the position/orientation can be estimated by integrating the measured acceleration and angular velocity.
- when a wheel encoder or steering angle meter is used, the position/posture can be estimated by dead reckoning.
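As a concrete illustration of the dead-reckoning case above, the sketch below advances a planar pose from wheel-encoder increments. The differential-drive kinematic model, function name, and parameters are assumptions for illustration; the patent does not fix a specific model.

```python
import math

def dead_reckon(pose, d_left, d_right, wheel_base):
    """Advance a 2-D pose (x, y, theta) by one encoder step of an assumed
    differential-drive robot. d_left/d_right are the wheel travel distances
    in meters; wheel_base is the distance between the wheels."""
    x, y, theta = pose
    d = (d_left + d_right) / 2.0               # forward travel of the midpoint
    dtheta = (d_right - d_left) / wheel_base   # heading change
    # Integrate using the mid-point heading for a second-order-accurate step.
    x += d * math.cos(theta + dtheta / 2.0)
    y += d * math.sin(theta + dtheta / 2.0)
    return (x, y, theta + dtheta)

# Driving both wheels 1 m keeps the heading and moves 1 m forward.
pose = dead_reckon((0.0, 0.0, 0.0), 1.0, 1.0, 0.5)
```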
- a plurality of sensors such as a camera, an IMU, and a wheel encoder may be used as the sensors, and the position / posture may be estimated by using the measured values of the sensors together.
- the fixed camera calibration unit 102 estimates the position and posture of the fixed camera 200 using the image containing the sensor-equipped moving marker 300 captured by the fixed camera 200 and the position/posture of the sensor-equipped moving marker 300 estimated by the moving marker calibration unit 101 at the corresponding shooting time. Details of the processing will be described later. Note that, as described above, since the fixed camera 200 and the moving camera 320 shoot in time synchronization, the estimation processes in the moving marker calibration unit 101 and the fixed camera calibration unit 102 use captured images from the same time.
- the movement planning unit 103 creates a movement plan for the sensor-equipped movement marker 300 according to the position / orientation of the fixed camera 200 estimated by the fixed camera calibration unit 102. Details of the processing will be described later.
- the movement instruction unit 104 issues an instruction for causing the movement marker 300 with a sensor to execute the movement plan created by the movement planning unit 103. Details of the processing will be described later.
- the output unit 105 outputs the position / posture of the fixed camera 200 estimated by the fixed camera calibration unit 102. Details of the processing will be described later. (Operation of Fixed Camera Calibration Unit 102) Next, details of the processing of the fixed camera calibration unit 102 will be described with reference to FIGS. 3 and 4.
- FIG. 3 is a flowchart showing a process that the fixed camera calibration unit 102 repeats during the operation of the camera calibration apparatus 100.
- FIG. 4 is an example of an image 210 taken by the fixed camera 200.
- in step S500, the marker 310 and the feature points 311 on the marker 310 are detected from the image 210 captured by the fixed camera 200.
- the feature point 311 is a point where it is easy to detect the position from the image, such as the corners of each square included in the checkerboard pattern included in the image 210 illustrated in FIG.
- in FIG. 4, only some feature points 311 are illustrated for convenience. Since detection of the checkerboard pattern and the feature points on the pattern from an image is a known technique, a detailed description is omitted.
- let j be the identifier when the images 210 are arranged in time series.
- let N_ij be the total number of images captured by the i-th fixed camera 200 at the time the fixed camera calibration unit 102 runs.
- let k be the identifier of a feature point 311.
- let N_ijk be the total number of feature points 311 detected in the j-th image 210 captured by the i-th fixed camera 200.
- let (u'_ijk, v'_ijk)^T be the two-dimensional position, in the image coordinate system of the image 210, of the k-th feature point 311 detected in the j-th image 210 captured by the i-th fixed camera 200.
- in step S510, the three-dimensional positions in the world coordinate system of the feature points 311 on the marker 310 detected in step S500 are calculated using the position/orientation of the marker 310 estimated by the moving marker calibration unit 101, and the process proceeds to step S520.
- the three-dimensional position p_ijk^M of the feature point 311 in the marker coordinate system is known from the specification of the checkerboard pattern. From the position/posture of the marker 310 at the shooting time of the j-th image 210 captured by the i-th fixed camera 200, estimated by the moving marker calibration unit 101, the three-dimensional position p_ijk^W of the feature point in the world coordinate system is calculated by (Equation 1).
- R_ij^MW and t_ij^MW are the rotation matrix and translation vector from the marker coordinate system to the world coordinate system.
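(Equation 1) itself is not reproduced in this text; a minimal sketch of the marker-to-world transform it describes, assuming the conventional rigid-body form p^W = R^MW p^M + t^MW, is:

```python
import numpy as np

def marker_to_world(p_M, R_MW, t_MW):
    """Sketch of (Equation 1): map a feature-point position p_M, given in
    the marker coordinate system, into the world coordinate system using
    the rotation matrix R_MW and translation vector t_MW estimated by the
    moving marker calibration unit 101. Names are illustrative."""
    return R_MW @ p_M + t_MW

# Example: a 90-degree rotation about z plus a translation.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
t = np.array([1.0, 2.0, 0.0])
p_W = marker_to_world(np.array([1.0, 0.0, 0.0]), R, t)
```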
- in step S520, the position/orientation of the fixed camera 200 in the world coordinate system is estimated from the two-dimensional positions in the image coordinate system of the feature points 311 on the marker 310 detected in step S500 and the three-dimensional positions in the world coordinate system of those feature points 311 calculated in step S510.
- the position and orientation of the i-th fixed camera 200 in the world coordinate system are calculated by solving (Equation 2) with a known nonlinear least squares method such as the Levenberg-Marquardt method or the Gauss-Newton method.
- R_i^WCi and t_i^WCi are the rotation matrix and translation vector from the world coordinate system to the fixed camera coordinate system of the i-th fixed camera 200, and R'_i^WCi and t'_i^WCi are their estimates.
- E_i is the total reprojection error.
- the reprojection error is the distance between the projected position, obtained by projecting the three-dimensional position of a feature point 311 onto the image 210 using the position/posture of the fixed camera 200 and camera internal parameters such as focal length and lens distortion, and the detected position of that feature point 311 in the image 210.
- the total reprojection error E_i is calculated by (Equation 3).
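The body of (Equation 3) is not reproduced here, but a common form of the total reprojection error it describes — the sum of squared distances between detections and projections in normalized image coordinates — can be sketched as follows. The function and argument names are assumptions for illustration; in practice this residual would be minimized over (R_WC, t_WC) with a Levenberg-Marquardt solver, as the text states.

```python
import numpy as np

def total_reprojection_error(points_W, detections_n, R_WC, t_WC):
    """Sketch of (Equation 3): sum of squared distances between each
    detected feature point (normalized image coordinates, one row per
    point) and the projection of its world-coordinate 3-D position
    through a candidate camera pose (R_WC, t_WC)."""
    p_C = (R_WC @ points_W.T).T + t_WC   # world frame -> camera frame
    proj = p_C[:, :2] / p_C[:, 2:3]      # perspective division onto z = 1
    return float(np.sum((proj - detections_n) ** 2))

# Two points at depth 5 m, seen by a camera at the world origin (identity
# pose); the detections match the projections exactly, so the error is 0.
err = total_reprojection_error(
    np.array([[0.0, 0.0, 5.0], [1.0, 1.0, 5.0]]),
    np.array([[0.0, 0.0], [0.2, 0.2]]),
    np.eye(3), np.zeros(3))
```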
- (x_ijk, y_ijk)^T is the position in the normalized image coordinate system of the k-th feature point 311 in the j-th image 210 captured by the i-th fixed camera 200.
- (x'_ijk, y'_ijk)^T is the coordinate obtained by projecting the world-coordinate position of the k-th feature point 311 in the j-th image 210 captured by the i-th fixed camera 200 onto the i-th fixed camera 200.
- if, for example, a perspective projection model is used as the camera model, the position (x_ijk, y_ijk)^T of the k-th feature point 311 in the j-th image 210 captured by the i-th fixed camera 200 is calculated by (Equation 4).
- the camera model is not limited to the perspective projection model, and other camera models such as a camera model for an omnidirectional camera may be used.
- (c_ix, c_iy)^T is the position of the optical center of the i-th fixed camera 200, and (f_ix, f_iy)^T is its focal length.
- (u_ijk, v_ijk)^T is the position of the k-th feature point 311 in the j-th image 210 captured by the i-th fixed camera 200, in image coordinates with lens distortion removed.
- when radial lens distortion is modeled, it is calculated according to (Equation 5).
- the lens distortion model is not limited to distortion in the radial direction of the lens, and other lens models such as distortion in the tangential direction orthogonal to the radial direction of the lens may be used.
- ⁇ 1 and ⁇ 2 are lens distortion parameters.
- FIG. 5 is a flowchart showing a process executed by the movement planning unit 103 during the operation of the camera calibration apparatus 100.
- a fixed camera 200 having a high calibration priority is selected and a target movement position is calculated.
- in step S600, the movable area of the sensor-equipped moving marker 300 is calculated.
- the movable area is obtained in advance because, if there is an area where the sensor-equipped movement marker 300 cannot be placed due to an obstacle or the like, the movement plan must be created excluding that area.
- the three-dimensional positions in the world coordinate system of the feature points 311 on the marker 310 at each time are calculated by the same calculation as step S510 in FIG. 3, and a plane is fitted to the calculated three-dimensional positions. Since plane fitting to three-dimensional positions is a known technique, the details are omitted; a least squares method or principal component analysis can be used.
- the signed distance between each three-dimensional position and the fitted plane is calculated.
- the distances are sorted in ascending order, and the q quantiles are calculated as w_1 and w_2, respectively.
- q is a preset value of 0 ⁇ q ⁇ 1.
- the area sandwiched between the two planes obtained by shifting the fitted plane by w_1 and w_2 along its normal direction is defined as the movable area of the sensor-equipped movement marker 300.
- w_1 and w_2 may be set to 0.
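The plane fitting and quantile band of step S600 can be sketched as below, using principal component analysis as the text suggests (the plane normal is the direction of least variance). The function name and the choice of the q and (1 − q) quantiles for w_1 and w_2 are assumptions for illustration.

```python
import numpy as np

def movable_band(points_W, q):
    """Sketch of step S600: fit a plane to feature-point world positions
    by PCA, then take quantiles of the signed point-to-plane distances as
    the band [w1, w2] bounding the marker's movable area. The (q, 1-q)
    quantile pairing is an assumption."""
    centroid = points_W.mean(axis=0)
    # The plane normal is the right-singular vector for the smallest
    # singular value of the centered point cloud.
    _, _, vt = np.linalg.svd(points_W - centroid)
    normal = vt[-1]
    dist = (points_W - centroid) @ normal   # signed distances to the plane
    w1, w2 = np.quantile(dist, [q, 1.0 - q])
    return normal, w1, w2

# Points lying exactly on z = 0: the normal is +/- e_z and the band is 0.
pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0], [1.0, 1.0, 0.0]])
normal, w1, w2 = movable_band(pts, 0.05)
```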
- next, in steps S610 to S640, the calibration priority is calculated for each fixed camera 200.
- in step S610, the distance d_i between the i-th fixed camera 200 and the sensor-equipped moving marker 300 is calculated.
- d_i is calculated by (Equation 7.1) and (Equation 7.2), using the latest rotation matrix R_ij^MW and translation vector t_ij^MW from the marker coordinate system to the world coordinate system estimated by the moving marker calibration unit 101, and the rotation matrix R'_i^WCi and translation vector t'_i^WCi from the world coordinate system to the fixed camera coordinate system of the i-th fixed camera 200 estimated by the fixed camera calibration unit 102.
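(Equations 7.1 and 7.2) are not reproduced here; a sketch consistent with the conventions above is to recover the camera center in world coordinates from the world-to-camera transform and take its distance to the marker origin. The camera-center formula assumes the transform acts as p_C = R_WC p_W + t_WC.

```python
import numpy as np

def camera_marker_distance(R_WC, t_WC, t_MW):
    """Sketch of (Equations 7.1/7.2): with p_C = R_WC p_W + t_WC, the
    camera center in world coordinates is -R_WC^T t_WC; the marker origin
    in world coordinates is t_MW; d_i is their Euclidean distance."""
    cam_center_W = -R_WC.T @ t_WC
    return float(np.linalg.norm(cam_center_W - t_MW))

# Camera at world position (0, 0, 3), marker at the origin -> distance 3.
d_i = camera_marker_distance(np.eye(3), np.array([0.0, 0.0, -3.0]),
                             np.zeros(3))
```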
- in step S620, the three-dimensional distribution a_i^3D in the world coordinate system of the feature points 311 on the marker 310 captured by the i-th fixed camera 200 is calculated.
- a three-dimensional histogram 220 of the number of feature points 311 is created using the position of the feature point 311 on the marker 310 in the image 210 and the depth from the i-th fixed camera 200.
- FIG. 6 is a diagram illustrating an example of the three-dimensional histogram 220.
- the three-dimensional histogram 220 is a histogram of the number of feature points 311 in a space composed of the position of the feature point 311 on the marker 310 in the image 210 and the depth from the i-th fixed camera 200.
- a bin 221 holds the number of feature points 311 in the region defined by the image x-coordinate minimum x_min and maximum x_max, the image y-coordinate minimum y_min and maximum y_max, and the depth minimum z_min and maximum z_max; these bins constitute the three-dimensional histogram 220.
- FIG. 7 is a diagram illustrating an example of the bin 221 of the three-dimensional histogram 220 in the three-dimensional space.
- for example, dividing the 640x480-pixel image 210 into 4x4 areas yields areas of 160x120 pixels, and each area is further divided in the depth direction to define the bins 221.
- 10 bins 221 having a depth of 1 m can be defined by dividing the depth from 0 m to 10 m into 10 equal parts.
- the size of the bin 221 shown here is merely an example, and the bin 221 having an arbitrary size may be set according to the environment. Then, by counting the number of feature points 311 present in each bin, the three-dimensional histogram 220 illustrated in FIG. 6 is created.
- the feature points 311 detected in all N_ij images 210 captured by the i-th fixed camera 200 are counted. The depth Z'_ijk^Ci of each feature point 311 is calculated by (Equation 8).
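The binning described around FIG. 6 and FIG. 7 can be sketched directly with a multidimensional histogram over (image x, image y, depth). The bin sizes follow the example in the text (640x480 image split 4x4, depth 0–10 m in 1 m slices); the function name and the use of `numpy.histogramdd` are illustrative choices, and the depth values are assumed to come from the camera-frame z component as in (Equation 8).

```python
import numpy as np

def feature_histogram_3d(uv, depth, img_w=640, img_h=480,
                         n_xy=4, z_max=10.0, n_z=10):
    """Sketch of the three-dimensional histogram 220: count feature
    points in bins over (image x, image y, depth from the camera).
    uv is an (N, 2) array of pixel positions; depth is length N."""
    sample = np.column_stack([uv[:, 0], uv[:, 1], depth])
    bins = [np.linspace(0, img_w, n_xy + 1),   # 4 bins of 160 px
            np.linspace(0, img_h, n_xy + 1),   # 4 bins of 120 px
            np.linspace(0, z_max, n_z + 1)]    # 10 bins of 1 m
    hist, _ = np.histogramdd(sample, bins=bins)
    return hist

# Two detections: near the top-left at 0.5 m, near the bottom-right at 9.5 m.
hist = feature_histogram_3d(np.array([[10.0, 10.0], [630.0, 470.0]]),
                            np.array([0.5, 9.5]))
```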
- a bin 221 that includes part of the movable region of the sensor-equipped moving marker 300 calculated in step S600 is a bin that the sensor-equipped moving marker 300 can reach; a bin that does not is a bin it cannot reach.
- when a sensor capable of measuring surrounding three-dimensional information is used, the measured three-dimensional information is also used to determine whether the sensor-equipped moving marker 300 can reach each destination. Specifically, when the number of three-dimensional measurement points inside a bin 221 is equal to or greater than a preset threshold, an obstacle is considered to be present and the bin is determined to be unreachable. In addition, the measured three-dimensional information is used to determine whether the fixed camera 200 can capture each bin 221 of the three-dimensional histogram 220.
- a bin 221 at a greater depth than a bin 221 determined unreachable due to an obstacle is determined to be a bin 221 that the fixed camera 200 cannot capture because of that obstacle, and is likewise treated as a bin the sensor-equipped moving marker 300 cannot reach.
- the three-dimensional distribution a_i^3D is calculated from the number N_i3eb of bins 221 that the sensor-equipped moving marker 300 can reach and the number of those reachable bins 221 whose count of feature points 311 exceeds a preset threshold th_3D.
- in step S630, the two-dimensional distribution a_i^2D in the image coordinate system of the feature points 311 on the marker 310 captured by the i-th fixed camera 200 is calculated.
- FIG. 8 is a diagram illustrating an example of the two-dimensional histogram 230.
- the two-dimensional histogram 230 is a histogram of the number of feature points 311 in the space formed by the positions of the feature points 311 on the marker 310 in the image 210.
- a bin 231 holds the number of feature points 311 in the region defined by the image x-coordinate minimum x_min and maximum x_max and the image y-coordinate minimum y_min and maximum y_max; these bins constitute the two-dimensional histogram 230.
- the two-dimensional histogram 230 is created by using the three-dimensional histogram 220 created in step S620 and adding the number of feature points 311 included in all depth bins 221 for each region on the image.
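The step just described — collapsing the three-dimensional histogram 220 into the two-dimensional histogram 230 by summing all depth bins of each image region — is a one-line reduction. The function name and the assumption that depth is the last array axis are illustrative.

```python
import numpy as np

def collapse_depth(hist_3d):
    """Sketch of step S630's histogram construction: sum, for each image
    region, the counts of all its depth bins 221 (assumed to be the last
    axis) to obtain the two-dimensional histogram 230."""
    return hist_3d.sum(axis=-1)

# Three feature points in the same image region, at two different depths,
# merge into a single 2-D count of 3.
h3 = np.zeros((4, 4, 10))
h3[1, 2, 3] = 2.0
h3[1, 2, 7] = 1.0
h2 = collapse_depth(h3)
```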
- for each bin 231 of the two-dimensional histogram 230, it is determined whether the sensor-equipped movement marker 300 can reach it.
- for each bin 231 of the two-dimensional histogram 230, if at least one reachable bin 221 of the three-dimensional histogram 220 created in step S620 exists for the same image area, the bin 231 is judged reachable; if no reachable bin 221 exists, it is judged unreachable.
- the two-dimensional distribution a_i^2D is calculated from the number N_i2eb of bins 231 that the sensor-equipped moving marker 300 can reach and the number of those reachable bins 231 whose count of feature points 311 exceeds a preset threshold th_2D.
- in step S640, the calibration priority a_i of each fixed camera 200 is calculated by (Equation 9) from the distance d_i, the three-dimensional distribution a_i^3D, and the two-dimensional distribution a_i^2D obtained in steps S610 to S630.
- ⁇ d , ⁇ 2D , and ⁇ 3D are preset weights for the distance d i from the sensor-equipped moving marker 300, the three-dimensional distribution a i 3D , and the two-dimensional distribution a i 2D .
- the fixed camera 200 having the highest priority a i is selected from all the fixed cameras 200 in step S650.
- in step S660, which follows step S650, the target movement position of the sensor-equipped movement marker 300 required to calibrate the highest-priority fixed camera 200 selected in step S650 is calculated. Details of this calculation will be described with reference to the flowchart of FIG.
- Steps S661 to S664 are processed for each bin 221 of the three-dimensional histogram 220 created in Step S620 for the fixed camera 200 selected in Step S650.
- in step S661, the distance d_b between the center position of a bin 221 of the three-dimensional histogram 220 and the sensor-equipped moving marker 300 is calculated.
- the distance d_b between the bin's center position p_b^Ci and the sensor-equipped moving marker 300 is calculated by (Equation 11.1) and (Equation 11.2).
- in step S662, the three-dimensional sufficiency b_b^3D of the bin 221 of the three-dimensional histogram 220 is calculated.
- in step S663, the two-dimensional sufficiency b_b^2D of the bin 221 of the three-dimensional histogram 220 is calculated.
- the two-dimensional sufficiency b_b^2D of a bin 221 is set to 1 when the number of feature points 311 in the bin 231 of the two-dimensional histogram 230 created in step S630 that shares the same image area as the bin 221 is equal to or greater than the threshold th_2D, and to 0 when it is less than th_2D.
- in step S664, the priority b_b of the bin 221 of the three-dimensional histogram 220 is calculated by (Equation 12) from the distance d_b from the sensor-equipped moving marker 300, the three-dimensional sufficiency b_b^3D, and the two-dimensional sufficiency b_b^2D.
- ⁇ ′ d , ⁇ ′ 3D , and ⁇ ′ 2D are preset weights for the distance d b from the sensor-equipped moving marker 300, the three-dimensional sufficiency b b 3D , and the two-dimensional sufficiency b b 2D . is there.
- in step S665, using the reachability of the sensor-equipped movement marker 300 for each bin 221 of the three-dimensional histogram 220 calculated in step S620, the center position p_b^W in the world coordinate system of the reachable bin 221 with the highest priority b_b is output as the target movement position.
- there are cases where the sensor-equipped movement marker 300 cannot move to the target movement position output by the movement planning unit 103; this is reported through the interface of the movement instruction unit 104, which instructs the sensor-equipped movement marker 300 with the target movement position.
- upon receiving such a signal, the movement planning unit 103 excludes the highest-priority bin 221 of the three-dimensional histogram 220 from the selection candidates in step S665 and outputs the center position of the bin 221 with the next highest priority as the target movement position.
- if a signal indicating that movement is impossible is received for the same fixed camera 200 consecutively more than a preset number of times, that fixed camera 200 is removed from the selection candidates in step S650, and the process of step S660 is executed for the fixed camera 200 with the next highest priority. (Operation of the movement instruction unit 104) Next, the operation of the movement instruction unit 104 will be described with reference to FIG.
- the movement instruction unit 104 issues an instruction for moving the sensor-equipped movement marker 300 to the target movement position output by the movement planning unit 103.
- when the sensor-equipped movement marker 300 is mounted on the mobile robot 350, the movement instruction unit 104 moves the mobile robot 350 by outputting a control signal that moves it to the target movement position.
- FIG. 11 is a diagram illustrating an example of a three-dimensional map created by the movement instruction unit 104.
- the current position / posture of the moving marker with sensor 300 estimated by the moving marker calibration unit 101 and the locus 360 are displayed in computer graphics.
- the position/orientation of the fixed camera 200 estimated by the fixed camera calibration unit 102 is displayed in computer graphics, and the fixed camera 200 on the right, selected as the calibration target in step S650, is colored and highlighted.
- when a sensor capable of measuring surrounding three-dimensional information is used, the measured three-dimensional information may also be displayed; alternatively, a two-dimensional map may be displayed in computer graphics.
- the movement instruction unit 104 may instruct movement based on the amount of movement from the current position / posture of the movement marker 300 with sensor.
- In this case, the coordinates p_b^M of the target movement position in the marker coordinate system of the sensor-equipped movement marker 300 are calculated by (Equation 13).
- p_b^M corresponds to the amount of movement from the current position/posture of the sensor-equipped moving marker 300, for example, 10 m forward, 5 m to the right, and 0.5 m upward.
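As a rough, illustrative sketch of the idea behind (Equation 13), whose exact form is not reproduced here: a target position given in world coordinates can be expressed in the marker coordinate system by applying the inverse of the marker's current estimated pose. The function name and the sample pose values below are hypothetical.

```python
import numpy as np

def target_in_marker_frame(p_target_w, R_wm, t_wm):
    """Express a world-frame target position in the marker frame:
    p^M = R^T (p^W - t), where (R, t) is the marker's pose in world
    coordinates as estimated by the moving marker calibration unit."""
    return R_wm.T @ (np.asarray(p_target_w, float) - np.asarray(t_wm, float))

# Hypothetical pose: marker at (2, 0, 0), rotated 90 degrees about z,
# so the marker's x-axis points along the world y-axis.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
p_local = target_in_marker_frame([2.0, 10.0, 0.5], R, [2.0, 0.0, 0.0])
# 10 m along world y is 10 m along the marker's own x-axis.
```

The resulting vector is exactly the "amount of movement from the current position/posture" that the text describes (forward / right / up in the marker's own frame).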
- When the mobile robot 350 moves the sensor-equipped movement marker 300, the mobile robot 350 is moved by outputting a control signal for moving it by this movement amount.
- When the calibration operator moves the sensor-equipped movement marker 300, the movement amount is indicated, for example, by sound output from a speaker or by screen output on a display. (Operation of output unit 105)
- the operation of the output unit 105 will be described with reference to FIG.
- the output unit 105 outputs the position / posture of the fixed camera 200 estimated by the fixed camera calibration unit 102 to a RAM in the fixed camera 200, a management server of the fixed camera 200, and the like.
- the output unit 105 may display the position / orientation of the fixed camera 200 estimated by the fixed camera calibration unit 102 using a three-dimensional or two-dimensional map.
- FIG. 12 is a diagram illustrating an example of a two-dimensional map output by the output unit 105.
- FIG. 12 is drawn as viewed from the normal direction of the plane fitted in step S600 of the movement planning unit 103 to the movable region of the sensor-equipped movement marker 300.
- The position/orientation of the fixed camera 200 estimated by the fixed camera calibration unit 102 is displayed by computer graphics together with its shootable range 240.
- a trajectory 360 indicates the position of the moving marker 300 with the sensor estimated by the moving marker calibration unit 101.
- the shootable range 240 is a shootable range of the fixed camera 200, and can be calculated from the focal length and the angle of view of the fixed camera 200.
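The shootable-range computation can be sketched as a viewing-frustum test under a simple pinhole-camera assumption (z-axis along the optical axis); the function name, field-of-view values, and maximum range below are illustrative, not taken from the embodiment.

```python
import numpy as np

def in_shootable_range(p_cam, fov_h_deg, fov_v_deg, max_range):
    """Check whether a point given in the camera coordinate system
    (z pointing forward along the optical axis) lies inside the
    camera's viewing frustum and within max_range."""
    x, y, z = p_cam
    if z <= 0:
        return False  # behind the camera
    if np.sqrt(x * x + y * y + z * z) > max_range:
        return False  # too far away
    ang_h = np.degrees(np.arctan2(abs(x), z))
    ang_v = np.degrees(np.arctan2(abs(y), z))
    return ang_h <= fov_h_deg / 2 and ang_v <= fov_v_deg / 2

# Horizontal angle of view from focal length f and sensor width w:
# fov = 2 * atan(w / (2 f)); e.g. a 36 mm-wide sensor with a 50 mm lens.
fov_h = np.degrees(2 * np.arctan2(36.0 / 2, 50.0))  # about 39.6 degrees
```

The last line shows how the angle of view follows from the focal length, as the text notes; a fielded system would take these values from the camera's intrinsic calibration.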
- Points 370 are three-dimensional measurement points, presumed to be obstacles, measured by the external sensor when an external sensor capable of measuring surrounding three-dimensional information, such as a camera or a laser range finder, is used as the sensor of the sensor-equipped moving marker 300.
- The display method of the measurement results of the external sensor is not limited to points.
- For example, a plurality of planes may be fitted to the three-dimensional measurement points measured by the external sensor, and the fitted planes may be displayed. (Effect) According to the first embodiment described above, the following effects are obtained.
- The movement planning unit 103 plans the target movement position of the sensor-equipped movement marker 300 based on the position and orientation of the fixed camera in the world coordinate system estimated by the fixed camera calibration unit 102, and the movement instruction unit 104 instructs the sensor-equipped movement marker 300 to move to that target movement position. Therefore, regardless of the skill level of the calibration operator, the positions and postures of the plurality of fixed cameras 200 can be calibrated with high accuracy in a short time.
- The movement planning unit 103 calculates the distance between the sensor-equipped moving marker 300 and the fixed camera 200 from the position/posture of the sensor-equipped moving marker 300 in the world coordinate system estimated by the moving marker calibration unit 101 and the position/posture of the fixed camera 200 in the world coordinate system estimated by the fixed camera calibration unit 102, and preferentially selects a fixed camera 200 at a short distance as the calibration target (steps S610 and S650 in FIG. 5). Further, the distance between the sensor-equipped movement marker 300 and the center position of each bin 221 of the three-dimensional histogram 220 of the selected fixed camera 200 is calculated, and the center position of a bin 221 at a short distance is set as the target movement position (steps S661 and S665 in FIG. 10). Therefore, by selecting a target movement position close to the sensor-equipped movement marker 300, the time required to move to the target movement position is shortened, and the plurality of fixed cameras 200 can be calibrated in a short time.
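The distance-based preference in steps S610/S650 and S661/S665 is essentially a nearest-candidate selection; a minimal sketch (the helper name and sample coordinates are hypothetical):

```python
import numpy as np

def nearest_target(marker_pos, candidates):
    """Pick the candidate position closest to the marker.

    candidates: dict mapping an id to a 3-D world position; these may be
    fixed-camera positions (steps S610/S650) or bin 221 center positions
    (steps S661/S665).
    """
    dists = {k: np.linalg.norm(np.asarray(p, float) - marker_pos)
             for k, p in candidates.items()}
    return min(dists, key=dists.get)

marker = np.array([0.0, 0.0, 0.0])
cams = {"cam1": (5.0, 0.0, 3.0), "cam2": (1.0, 1.0, 2.5)}
# cam2 is closer, so it would be preferred as the calibration target.
```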
- The movement planning unit 103 calculates the three-dimensional distribution degree and the two-dimensional distribution degree of the feature points 311 on the marker 310 based on the position/posture of the sensor-equipped moving marker 300 in the world coordinate system estimated by the moving marker calibration unit 101 and the result of the fixed camera calibration unit 102, and preferentially selects a fixed camera 200 with a small distribution degree as the calibration target (steps S620 to S650 in FIG. 5). Further, for each bin 221 of the three-dimensional histogram 220 of the selected fixed camera 200, the three-dimensional sufficiency degree and the two-dimensional sufficiency degree of the feature points 311 on the marker 310 are calculated, and the center position of a bin 221 with a small sufficiency degree is set as the target movement position (steps S662 to S665 in FIG. 10). Therefore, by using information at various positions in the three-dimensional and two-dimensional spaces for calibration of the fixed camera 200, the position and orientation of the fixed camera 200 can be calibrated with high accuracy.
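The sufficiency degree of a bin can be pictured as its fill count in a 3-D histogram of observed feature-point positions, with the emptiest bin steering the marker toward under-observed regions; a sketch under that assumption (the bin edges and sample points are invented for illustration):

```python
import numpy as np

def least_filled_bin(points, edges):
    """Build a 3-D histogram of observed feature-point positions and
    return the index of the emptiest (lowest-sufficiency) bin.

    points: (N, 3) array of feature-point coordinates, e.g. image x,
    image y, and depth; edges: one array of bin edges per axis.
    """
    hist, _ = np.histogramdd(points, bins=edges)
    return tuple(np.unravel_index(np.argmin(hist), hist.shape))

pts = np.array([[0.1, 0.1, 0.1],
                [0.2, 0.3, 0.2],
                [0.9, 0.8, 0.9]])
edges = [np.linspace(0.0, 1.0, 3)] * 3  # 2 bins per axis -> 8 bins
idx = least_filled_bin(pts, edges)      # some bin with no observations yet
```

Moving the marker toward the center of the returned bin would raise that bin's count, which matches the text's goal of gathering observations at varied positions.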
- The movement planning unit 103 simultaneously uses the distance between the sensor-equipped moving marker 300 and the fixed camera 200 and the three-dimensional and two-dimensional distribution degrees of the feature points 311 on the marker 310 to select the fixed camera 200 with the highest priority as the calibration target (steps S610 to S650 in FIG. 5). Further, the distance between the sensor-equipped moving marker 300 and the center position of each bin 221 and the three-dimensional and two-dimensional sufficiency degrees of the feature points 311 on the marker 310 are used simultaneously, and the center position of the bin 221 of the three-dimensional histogram 220 with the highest priority is set as the target movement position (steps S661 to S665 in FIG. 10). Therefore, by selecting a target movement position close to the sensor-equipped movement marker 300 and by using information at various positions in the three-dimensional and two-dimensional spaces for calibration of the fixed camera 200, the positions/postures of a plurality of fixed cameras 200 can be calibrated with high accuracy in a short time.
- The movement planning unit 103 calculates the movable region of the sensor-equipped movement marker 300 to determine whether the marker can reach each bin 221 of the three-dimensional histogram 220 and each bin 231 of the two-dimensional histogram 230, and selects the fixed camera 200 and calculates the target movement position using only reachable bins (step S600 and steps S620 to S650 in FIG. 5, step S665 in FIG. 10). Therefore, the sensor-equipped movement marker 300 is never directed to a target movement position it cannot reach, and the plurality of fixed cameras 200 can be calibrated in a short time.
- The movement planning unit 103 uses the three-dimensional information of each bin 221 of the three-dimensional histogram 220 and each bin 231 of the two-dimensional histogram 230 to determine whether the sensor-equipped moving marker 300 can reach the bin and whether the fixed camera 200 can photograph it, and selects the fixed camera 200 and calculates the target movement position using only bins that are both reachable and photographable (step S600 and steps S620 to S650 in FIG. 5, step S665 in FIG. 10). Therefore, target movement positions that the sensor-equipped movement marker 300 cannot reach or that the fixed camera 200 cannot photograph are avoided, and the plurality of fixed cameras 200 can be calibrated in a short time.
- When the mobile robot 350 moves the sensor-equipped movement marker 300, the movement instruction unit 104 moves the mobile robot 350 by outputting a control signal for moving it to the target movement position. For this reason, the plurality of fixed cameras 200 can be calibrated automatically.
- The movement instruction unit 104 instructs movement by displaying, on a two-dimensional or three-dimensional map, the position/posture in the world coordinate system of the sensor-equipped movement marker 300 estimated by the movement marker calibration unit 101, the position/posture of the fixed camera 200 in the world coordinate system estimated by the fixed camera calibration unit 102, and the target movement position output by the movement planning unit 103 (FIG. 11). Therefore, since the calibration operator who moves the sensor-equipped movement marker 300 can easily grasp the target movement position, the time required to move to the target movement position is shortened, and the plurality of fixed cameras 200 can be calibrated quickly.
- The output unit 105 displays, on a two-dimensional or three-dimensional map, the position/posture in the world coordinate system of the sensor-equipped movement marker 300 estimated by the movement marker calibration unit 101, the position/posture in the world coordinate system of the fixed camera 200 estimated by the fixed camera calibration unit 102, and the shootable range of the fixed camera 200 calculated from its focal length and angle of view (FIG. 12). Therefore, the calibration operator can easily confirm the calibration result of the fixed camera 200.
- The movement planning unit 103 calculates the priority of each bin 221 of the three-dimensional histogram 220 and sets the center position of the bin 221 having the highest priority as the target movement position (step S665 in FIG. 10).
- the output of the movement planning unit 103 is not limited to this.
- For example, the movement planning unit 103 may output, as the target movement position, the entire region of the highest-priority bin 221, defined by the minimum value x_min and maximum value x_max of the image x coordinate, the minimum value y_min and maximum value y_max of the image y coordinate, and the minimum value z_min and maximum value z_max of the depth.
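Under this variant, judging arrival at the target reduces to a box-membership test against the bin's per-axis bounds; a minimal illustrative sketch (the bounds and sample positions are hypothetical):

```python
import numpy as np

def in_target_bin(p, mins, maxs):
    """Check whether position p lies inside the target bin's region,
    given its per-axis minimum (x_min, y_min, z_min) and maximum
    (x_max, y_max, z_max) bounds."""
    p = np.asarray(p, float)
    return bool(np.all(p >= mins) and np.all(p <= maxs))

# Hypothetical bounds of the highest-priority bin:
mins = np.array([0.0, 0.0, 1.0])
maxs = np.array([2.0, 1.0, 2.0])
inside = in_target_bin([1.0, 0.5, 1.5], mins, maxs)   # within the region
outside = in_target_bin([3.0, 0.5, 1.5], mins, maxs)  # x exceeds x_max
```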
- In this case, the movement instruction unit 104 instructs movement by displaying, on a two-dimensional or three-dimensional map, the current position/posture of the sensor-equipped moving marker 300 estimated by the moving marker calibration unit 101, the past positions of the sensor-equipped moving marker 300, the position/posture of the fixed camera 200 estimated by the fixed camera calibration unit 102, and the entire region of the bin output by the movement planning unit 103.
- In this way, the movement planning unit 103 outputs the entire region of the selected bin 221 as the target movement position, and the movement instruction unit 104 displays that region as the target movement position. Since the target movement position can thus be grasped easily, the time required to move to it is shortened, and the plurality of fixed cameras 200 can be calibrated in a short time.
- The movement planning unit 103 calculates the priority of each bin 221 of the three-dimensional histogram 220 and outputs the center position or the entire region of the bin 221 having the highest priority as the target movement position (step S665 in FIG. 10).
- the output of the movement planning unit 103 is not limited to this.
- For example, the movement planning unit 103 may output, as target movement positions, the center positions or entire regions of all bins 221 whose priority is equal to or higher than a preset threshold (for example, the second- and third-highest priorities), or of a preset number of bins 221 taken in descending order of priority.
- the movement instruction unit 104 instructs movement by displaying all target movement positions output by the movement planning unit 103 on a two-dimensional or three-dimensional map.
- Alternatively, the movement planning unit 103 may output the center positions or entire regions of all the bins 221 together with their priorities.
- In this case, the movement instruction unit 104 instructs movement by displaying the center positions or regions of all the bins 221 output from the movement planning unit 103 on the two-dimensional or three-dimensional map, color-coded according to bin priority.
- In this way, the movement planning unit 103 outputs a plurality of high-priority target movement positions, and the movement instruction unit 104 displays them. Therefore, when the calibration operator moves the sensor-equipped movement marker 300, the operator can move so as to pass efficiently through the plurality of high-priority target movement positions, and the positions/postures of the plurality of fixed cameras 200 can be calibrated with high accuracy in a short time.
- The movement planning unit 103 calculates the priority of each bin 221 of the three-dimensional histogram 220 and outputs the center position or region of the bin 221 having the highest priority as the target movement position (step S665 in FIG. 10).
- the output of the movement planning unit 103 is not limited to this.
- For example, the movement planning unit 103 may calculate, from the priority of each bin 221 of the three-dimensional histogram 220, a route that passes efficiently through high-priority bins 221, and output it as the target movement position. Specifically, taking the bin 221 in which the sensor-equipped movement marker 300 currently exists as the start position, and counting a move to an adjacent bin 221 in image coordinates or depth as one step, the sum of the priorities of the bins 221 reachable by the sensor-equipped movement marker on each movement path of a preset number of steps is calculated, and the movement path with the largest priority sum is output as the target movement position.
- the moving route calculation method is not limited to this, and other known route planning methods can be used.
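A brute-force version of such a priority-sum route search, counting each move to an axis-adjacent bin as one step, might look as follows; the grid size, priorities, and step count are invented for illustration, and as the text notes, a real system could use any known route-planning method instead:

```python
import numpy as np

# The 6 axis-adjacent neighbours in (image x, image y, depth) bin space.
MOVES = [(dx, dy, dz)
         for dx in (-1, 0, 1) for dy in (-1, 0, 1) for dz in (-1, 0, 1)
         if abs(dx) + abs(dy) + abs(dz) == 1]

def best_route(priority, start, n_steps):
    """Search all n_steps-long paths through adjacent bins and return
    (best priority sum, best path).  priority is a 3-D array of per-bin
    priorities; unreachable bins can be masked to 0 beforehand."""
    best = (float("-inf"), [start])

    def dfs(pos, path, total, steps):
        nonlocal best
        if steps == n_steps:
            if total > best[0]:
                best = (total, path[:])
            return
        for m in MOVES:
            nxt = tuple(p + d for p, d in zip(pos, m))
            if all(0 <= c < s for c, s in zip(nxt, priority.shape)):
                path.append(nxt)
                dfs(nxt, path, total + priority[nxt], steps + 1)
                path.pop()

    dfs(start, [start], priority[start], 0)
    return best

prio = np.zeros((3, 3, 3))
prio[2, 0, 0] = 5.0  # a single high-priority bin two steps away in x
total, path = best_route(prio, (0, 0, 0), 2)
# The best 2-step route walks along x to reach the high-priority bin.
```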
- When the mobile robot 350 moves the sensor-equipped movement marker 300, the movement instruction unit 104 moves the mobile robot 350 by outputting control signals so that it follows the route output as the target movement position by the movement planning unit 103. On the other hand, when the calibration operator moves the sensor-equipped movement marker 300, movement is instructed by displaying the route output as the target movement position by the movement planning unit 103 on a two-dimensional or three-dimensional map.
- In this way, the movement planning unit 103 outputs, as the target movement position, a route that passes efficiently through a plurality of high-priority movement targets, and the movement instruction unit 104 instructs movement along that route. Therefore, the marker can be moved so as to pass through the plurality of movement targets efficiently, and the positions and postures of the plurality of fixed cameras 200 can be calibrated with high accuracy in a short time.
- Next, the camera calibration device 400 according to the second embodiment, which handles the case where the measurement times of the sensors of the fixed camera 200 and the sensor-equipped moving marker 300 are not synchronized, will be described with reference to FIGS. 13 and 14.
- the same components as those in the first embodiment are denoted by the same reference numerals, and redundant description is omitted. Differences will be mainly described.
- FIG. 13 is a diagram illustrating a block configuration of the camera calibration apparatus 400.
- the camera calibration device 400 calibrates the position and orientation of the connected fixed camera 200.
- These can be realized, for example, by operating an arithmetic device such as a CPU in accordance with a program stored in a storage device such as a semiconductor memory in the camera calibration device 400, and it is not necessary to provide each of them as hardware.
- the stop determination unit 401 combines the image coordinates of the feature point 311 on the marker 310 and the three-dimensional coordinates in a state where the moving marker 300 with the sensor is stopped, and outputs the combination to the fixed camera calibration unit 102.
- the stop planning unit 402 plans the stop of the sensor-equipped moving marker 300 according to the results of the stop determination unit 401, the moving marker calibration unit 101, and the movement planning unit 103.
- The stop instruction unit 403 instructs the sensor-equipped movement marker 300 to stop according to the plan of the stop planning unit 402.
- the measurement times of the sensors of the fixed camera 200 and the sensor-equipped moving marker 300 are not synchronized.
- the movement marker 300 with sensor repeats movement and stop according to the instruction of the stop instruction unit 403.
- The sensor-equipped movement marker 300 is mounted on the mobile robot 350, a carriage, a tripod, or the like. (Operation of stop determination unit) Details of the processing of the stop determination unit 401 will be described with reference to FIG. 14. FIG. 14 is a flowchart illustrating the processing executed by the stop determination unit 401 during the operation of the camera calibration apparatus 400.
- Steps S800 to S802 are processed for each fixed camera 200.
- In step S800, the feature points 311 on the marker 310 of the sensor-equipped moving marker 300 are detected from the image 210 captured by the fixed camera 200, and the process proceeds to step S801.
- the process of step S800 is the same as step S500 of the fixed camera calibration unit 102.
- In step S801, if the marker 310 was detected in step S800, the process proceeds to step S802; if not, the process proceeds to the processing of the next fixed camera 200.
- In step S802, it is determined from the image coordinates of the feature points 311 detected in step S800 whether the sensor-equipped moving marker 300 is stopped. Specifically, for each feature point 311, the distance between its position in the latest image and its position in the previous image is calculated. The average of the distances over all feature points 311 is computed, and when this average distance is smaller than a preset threshold, the sensor-equipped moving marker 300 is determined to be stopped.
- In step S803, it is determined whether the sensor-equipped moving marker 300 is stopped based on the position/orientation of the sensor-equipped moving marker 300 estimated by the moving marker calibration unit 101, and the process proceeds to step S804.
- Specifically, the distance between the latest position of the sensor-equipped moving marker 300 and its position calculated from the previous sensor measurement is computed; if this distance is smaller than a preset threshold, the sensor-equipped movement marker 300 is determined to be stopped.
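Both stop tests (steps S802 and S803) are threshold checks on a displacement; a minimal sketch with hypothetical thresholds and sample values:

```python
import numpy as np

def is_stopped_image(prev_pts, curr_pts, thresh_px=1.0):
    """Step S802 analogue: the marker is judged stopped when the mean
    per-feature displacement between consecutive images is below a
    threshold (in pixels)."""
    d = np.linalg.norm(np.asarray(curr_pts, float) - np.asarray(prev_pts, float), axis=1)
    return d.mean() < thresh_px

def is_stopped_sensor(prev_pos, curr_pos, thresh_m=0.01):
    """Step S803 analogue: the marker is judged stopped when its
    estimated position moved less than a threshold (in metres) since
    the previous sensor measurement."""
    return np.linalg.norm(np.asarray(curr_pos, float) - np.asarray(prev_pos, float)) < thresh_m

prev = [[100.0, 50.0], [200.0, 80.0]]
curr = [[100.3, 50.2], [200.1, 79.9]]
# Mean displacement is about 0.25 px, below the 1 px threshold.
```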
- In step S804, the image coordinates and the three-dimensional coordinates of the feature points 311 are synchronized. Specifically, after a stop instruction is issued by the stop instruction unit 403, the image coordinates of the feature points 311 for which the sensor-equipped moving marker 300 is first determined to be stopped in step S802 and the three-dimensional coordinates of the feature points 311 for which it is first determined to be stopped in step S803 are regarded as coordinates obtained while the sensor-equipped moving marker 300 was stopped at the same position, and the combination of the image coordinates and the three-dimensional coordinates of the feature points 311 is output to the fixed camera calibration unit 102. (Operation of stop planning unit) Next, details of the processing of the stop planning unit 402 will be described.
- The stop planning unit 402 issues an instruction to stop or move the sensor-equipped movement marker 300 based on the detection result of the marker 310 in the image 210 captured by the fixed camera 200 obtained by the stop determination unit 401, the position of the sensor-equipped moving marker 300 estimated by the moving marker calibration unit 101, and the target movement position output by the movement planning unit 103.
- the stop planning unit 402 instructs the moving marker with sensor to stop via the stop instructing unit 403 when the moving marker with sensor 300 is first detected in each fixed camera 200.
- The stop planning unit 402 also instructs the sensor-equipped movement marker 300 to stop, via the stop instruction unit 403, when the distance between the position of the sensor-equipped movement marker 300 estimated by the movement marker calibration unit 101 and the target movement position output by the movement planning unit 103 becomes equal to or less than a preset threshold.
- After the coordinates are synchronized in step S804 of the stop determination unit 401, the stop planning unit 402 cancels the stop of the sensor-equipped moving marker 300 via the stop instruction unit 403 and instructs it to move. (Operation of stop instruction unit) Next, details of the processing of the stop instruction unit 403 will be described.
- When the mobile robot 350 moves the sensor-equipped movement marker 300, the stop instruction unit 403 stops or moves the mobile robot 350 by outputting a control signal to the mobile robot 350.
- When the calibration operator moves the sensor-equipped movement marker 300, the stop or movement is instructed by sound output from a speaker or by screen output on a display.
- As the interface such as the speaker or display, the same interface as that of the movement instruction unit 104 may be used. (Effect) According to the second embodiment described above, the following effects are obtained. That is, in the camera calibration apparatus 400, the stop planning unit 402 instructs the sensor-equipped movement marker 300 to stop via the stop instruction unit 403, and the stop determination unit 401 outputs to the fixed camera calibration unit 102 the combination of the image coordinates and the three-dimensional coordinates of the feature points 311 on the marker 310 obtained while the sensor-equipped movement marker 300 was stopped at the same position/posture. Therefore, the fixed camera 200 can be calibrated even when the measurement times of the sensors of the fixed camera 200 and the sensor-equipped moving marker 300 are not synchronized.
- The present invention is not limited to the above-described embodiments, and various modifications are included.
- the above-described embodiments have been described in detail for easy understanding of the present invention, and are not necessarily limited to those having all the configurations described.
- Other embodiments conceivable within the scope of the technical idea of the present invention are also included in the scope of the present invention.
- a part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
- Each of the above-described configurations, functions, processing units, processing means, and the like may be realized by hardware by designing a part or all of them with, for example, an integrated circuit.
- Each of the above-described configurations, functions, and the like may be realized by software by interpreting and executing a program that realizes each function by the processor.
- Information such as programs, tables, and files for realizing each function can be stored in a recording device such as a memory, a hard disk, an SSD (Solid State Drive), or a recording medium such as an IC card, an SD card, or a DVD.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
When a calibration jig is moved comprehensively in order to improve calibration precision, the amount of time required for measurement becomes longer, and when the calibration jig is moved in a limited fashion in order to reduce measurement time, calibration precision decreases. To fix this, a camera calibration device according to the present invention comprises: a mobile marker calibration unit in which a fixed camera and a sensor-equipped mobile marker are connected and the position and attitude of the mobile marker are estimated on the basis of measurement values from the sensor of the sensor-equipped mobile marker; a fixed camera calibration unit in which the position and attitude of the fixed camera are estimated on the basis of a photographed image of the fixed camera and the estimated position and attitude of the mobile marker; a movement planning unit which creates a movement plan including a target movement position of the mobile marker in accordance with the estimated position and attitude of the fixed camera; a movement instruction unit which gives instructions for movement on the basis of the movement plan; and an output unit which outputs the estimated position and attitude of the fixed camera.
Description
The present invention relates to a camera calibration device, and more particularly to a camera calibration device that automatically creates a movement plan for a calibration jig during calibration work.
In recent years, in the surveillance field, there is an increasing need for image recognition technology that detects the position and size of an object from video captured by an imaging device. In order to realize such an image recognition technique, the installation position/posture of the camera in a coordinate system set in real space (hereinafter referred to as the "world coordinate system") is necessary.
As a conventional technique for calibrating the installation position and orientation of a camera, for example, paragraph 0022 of Patent Document 1 describes an information processing apparatus characterized by comprising: first acquisition means for acquiring a plurality of first images captured at a plurality of times by a first imaging apparatus when a calibration jig, composed of a second imaging apparatus that captures a real space in which a second index is arranged and a first index arranged on the second imaging apparatus, is moved within the imaging range of the first imaging apparatus; first extraction means for extracting the image coordinates of the first index from each of the first images to obtain the image coordinates of the first index at the plurality of times; second acquisition means for acquiring a plurality of second images captured by the second imaging apparatus at the plurality of times; second extraction means for extracting the image coordinates of the second index from each of the second images to obtain the image coordinates of the second index at the plurality of times; and calculation means for obtaining, as unknown parameters, the camera parameters of the first imaging apparatus by simultaneously using the image coordinates of the first index and the second index at the plurality of times extracted by the first extraction means and the second extraction means.
With this information processing apparatus, the effect described in paragraph 0031 of the same document can be obtained: "the relative orientation and absolute orientation processes in camera calibration can be integrated."
However, in the calibration work of Patent Document 1, a trade-off arises between measurement time and calibration accuracy depending on how the calibration jig is moved. That is, if the calibration jig is moved comprehensively in order to improve the calibration accuracy, the time required for measurement becomes long, and if the calibration jig is moved in a limited manner in order to shorten the measurement time, the calibration accuracy decreases.
Further, since the calibration work of Patent Document 1 is executed according to a movement of the calibration jig planned by the calibration operator, creating the plan requires time and knowledge, and in addition, the quality of the movement plan depends heavily on the skill level of the calibration operator.
In order to solve the above problems, a camera calibration apparatus according to the present invention, to which a fixed camera and a sensor-equipped movement marker are connected, comprises: a movement marker calibration unit that estimates the position/posture of the movement marker from the measured values of the sensor of the sensor-equipped movement marker; a fixed camera calibration unit that estimates the position/posture of the fixed camera from the image captured by the fixed camera and the estimated position/posture of the movement marker; a movement planning unit that creates a movement plan including a target movement position of the movement marker according to the estimated position/posture of the fixed camera; a movement instruction unit that instructs movement based on the movement plan; and an output unit that outputs the estimated position/posture of the fixed camera.
According to the camera calibration apparatus of the present invention, the positions and postures of a plurality of fixed cameras can be calibrated with high accuracy in a short time regardless of the skill level of the calibration operator.
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
Hereinafter, the camera calibration apparatus 100 according to the first embodiment will be described with reference to FIGS. 1 to 12.
(Block configuration)
FIG. 1 is a diagram illustrating the block configuration of the camera calibration apparatus 100 according to the present embodiment. The camera calibration apparatus 100 is connected to a plurality of fixed cameras 200 and a sensor-equipped movement marker 300, and calibrates the positions and orientations of the fixed cameras 200. In the following description, i denotes the identifier of each fixed camera 200, and N_i denotes the total number of fixed cameras 200. That is, N_i fixed cameras 200 are connected to the camera calibration apparatus 100.
Here, the fixed camera 200 is a camera whose position and orientation are fixed, such as a surveillance camera installed on a ceiling. The sensor-equipped moving marker 300 consists of, for example, a sensor capable of measuring its own motion, such as a camera, and a marker that is easy to detect in images captured by the fixed camera 200, such as a checkerboard pattern; it moves through the environment in which the fixed cameras 200 are installed according to a movement plan described later.
As shown in FIG. 1, the camera calibration apparatus 100 comprises a moving-marker calibration unit 101, a fixed-camera calibration unit 102, a movement planning unit 103, a movement instruction unit 104, and an output unit 105. These can be realized, for example, by an arithmetic device such as a CPU operating according to a program stored in a storage device such as a semiconductor memory in the camera calibration apparatus 100, and need not be provided individually as hardware.
The moving-marker calibration unit 101 estimates the position and orientation of the sensor-equipped moving marker 300 in the world coordinate system from the measurement values produced by its sensor. The fixed-camera calibration unit 102, in turn, estimates the position and orientation of the fixed camera 200 in the world coordinate system from an image that the fixed camera 200 has captured of the sensor-equipped moving marker 300 and the position and orientation of the marker 300 in the world coordinate system estimated by the moving-marker calibration unit 101.
The movement planning unit 103 plans the movement of the sensor-equipped moving marker 300 according to the position and orientation of each fixed camera 200 estimated by the fixed-camera calibration unit 102. The movement instruction unit 104 issues instructions for causing the sensor-equipped moving marker 300 to execute the movement plan created by the movement planning unit 103. The output unit 105 outputs the positions and orientations of the fixed cameras 200 estimated by the fixed-camera calibration unit 102.
Although omitted from FIG. 1, the connections between the fixed cameras 200, the sensor-equipped moving marker 300, and the camera calibration apparatus 100 may be wired connections such as USB or Ethernet (registered trademark), or wireless connections via a wireless network. Alternatively, data recorded on recording media inside the fixed camera 200 or the sensor-equipped moving marker 300 may be input to the camera calibration apparatus 100 via a storage medium such as an SD card.
FIG. 2 is a diagram illustrating an example of the fixed camera 200 and the sensor-equipped moving marker 300. The fixed camera 200, fixed to a ceiling or the like, captures images at a predetermined cycle and outputs them to the camera calibration apparatus 100. The sensor-equipped moving marker 300 can move through the environment in which the fixed cameras 200 are installed. The figure shows an example in which the sensor-equipped moving marker 300, consisting of a marker 310 that facilitates detection in images captured by the fixed camera 200 and a moving camera 320 fixed on the marker 310, is mounted on a mobile robot 350 that moves through the environment; alternatively, a calibration operator may carry the sensor-equipped moving marker 300 by hand, or may move a cart or tripod on which it is placed. The fixed position and orientation of the moving camera 320 in the coordinate system set on the marker 310 (hereinafter, the "marker coordinate system") are assumed to be known to the camera calibration apparatus 100.
In FIG. 2, a checkerboard pattern is used as the marker 310, but another pattern that is easy to detect in an image, such as a circular pattern, may be used. The marker 310 is not limited to a planar pattern; it may be a three-dimensional object such as a cube or a sphere, or a pattern already present on the mobile robot 350 may serve as the marker 310. Furthermore, although FIG. 2 uses the moving camera 320 as the sensor, other sensors capable of measuring their own motion may be used, such as an IMU (Inertial Measurement Unit), a wheel encoder, a steering-angle meter, GPS, or a laser range finder.
The sensor of the sensor-equipped moving marker 300, typified by the moving camera 320, performs measurements at a predetermined cycle and outputs the measurement results (captured images) to the camera calibration apparatus 100. In this embodiment, the fixed cameras 200 and the moving camera 320 are time-synchronized, and capture and measure at the same instants. The camera calibration apparatus 100 executes the processing of the fixed-camera calibration unit 102 each time an image, or a fixed number of images, is input from a fixed camera 200, and executes the processing of the moving-marker calibration unit 101 each time a measurement result, or a fixed number of measurement results, is input from the sensor-equipped moving marker 300.
The moving-marker calibration unit 101 estimates the position and orientation of the sensor-equipped moving marker 300 in the world coordinate system at each measurement time of the moving camera 320. Since the position and orientation of the moving camera 320 in the marker coordinate system are fixed and known, estimating the position and orientation of the moving camera 320 in the world coordinate system yields the position and orientation of the marker 310 in the world coordinate system. When the moving camera 320 is used as the sensor of the sensor-equipped moving marker 300, as in FIG. 2, Structure from Motion or Visual Simultaneous Localization and Mapping (vSLAM) methods, which estimate the position and orientation of the moving camera 320 at each capture time from a sequence of captured images, can be applied. For example, as a vSLAM method, G. Klein and D. Murray, "Parallel Tracking and Mapping for Small AR Workspaces," Proc. IEEE and ACM Int. Symp. on Mixed and Augmented Reality, pp. 225-234, 2007, can be used.
When an IMU is used as the sensor, the position and orientation can be estimated by integrating the accelerations and angular velocities measured by the IMU. When a wheel encoder and a steering-angle meter are used as the sensors, the position and orientation can be estimated by dead reckoning. Alternatively, several sensors such as a camera, an IMU, and a wheel encoder may be used together, with their measurement values combined to estimate the position and orientation.
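The dead-reckoning option can be sketched as follows; the planar bicycle model, the function name, and the 1 m wheelbase are illustrative assumptions, not taken from this publication:

```python
import math

def dead_reckoning(pose, wheel_distance, steering_angle, wheelbase=1.0):
    """Advance a planar pose (x, y, heading) by one wheel-encoder step.

    pose: (x, y, theta) in world coordinates
    wheel_distance: distance travelled since the last step (wheel encoder)
    steering_angle: current steering angle (steering-angle meter)
    wheelbase: assumed axle distance of the bicycle model
    """
    x, y, theta = pose
    # Bicycle model: heading change is proportional to the travelled
    # distance and the tangent of the steering angle.
    dtheta = wheel_distance * math.tan(steering_angle) / wheelbase
    x += wheel_distance * math.cos(theta + 0.5 * dtheta)
    y += wheel_distance * math.sin(theta + 0.5 * dtheta)
    return (x, y, theta + dtheta)

# Driving straight (steering angle 0): x advances by ~1 m, y and heading stay 0.
pose = (0.0, 0.0, 0.0)
for _ in range(10):
    pose = dead_reckoning(pose, 0.1, 0.0)
print(pose)
```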
The fixed-camera calibration unit 102 estimates the position and orientation of a fixed camera 200 from an image, captured by that camera, that contains the sensor-equipped moving marker 300, together with the position and orientation of the sensor-equipped moving marker 300 estimated by the moving-marker calibration unit 101 for the corresponding capture time. Details of the processing are described later. As noted above, since the fixed cameras 200 and the moving camera 320 capture in time synchronization, the estimation in the moving-marker calibration unit 101 and the estimation in the fixed-camera calibration unit 102 each use images captured at the same instant.
The movement planning unit 103 creates a movement plan for the sensor-equipped moving marker 300 according to the positions and orientations of the fixed cameras 200 estimated by the fixed-camera calibration unit 102. Details of the processing are described later.
The movement instruction unit 104 issues instructions for causing the sensor-equipped moving marker 300 to execute the movement plan created by the movement planning unit 103. Details of the processing are described later.
The output unit 105 outputs the positions and orientations of the fixed cameras 200 estimated by the fixed-camera calibration unit 102. Details of the processing are described later.

(Operation of the fixed-camera calibration unit 102)
Next, the processing of the fixed-camera calibration unit 102 is described in detail with reference to FIGS. 3 and 4. FIG. 3 is a flowchart of the process that the fixed-camera calibration unit 102 repeats while the camera calibration apparatus 100 operates, and FIG. 4 shows an example of an image 210 captured by a fixed camera 200.
First, in step S500, the marker 310 and the feature points 311 on the marker 310 are detected in an image 210 captured by a fixed camera 200. A feature point 311 is a point whose position is easy to detect in the image, such as a corner of one of the squares making up the checkerboard pattern in the image 210 illustrated in FIG. 4. For convenience of illustration, only some of the feature points 311 are drawn in FIG. 4. Detecting a checkerboard pattern and the feature points on it in an image is a known technique, so its detailed description is omitted.
In the following description, let j be the index of the images 210 arranged in time series, and let Nij be the total number of images captured by the i-th fixed camera 200 at the time the fixed-camera calibration unit 102 runs. Let k be the index of the feature points 311, and let Nijk be the total number of feature points 311 detected in the j-th image 210 captured by the i-th fixed camera 200. Let (u'ijk, v'ijk)T denote the two-dimensional position, in the image coordinate system of the image 210, of the k-th feature point 311 detected in the j-th image 210 captured by the i-th fixed camera 200.
In step S510, the three-dimensional positions in the world coordinate system of the feature points 311 on the marker 310 detected in step S500 are calculated using the position and orientation of the marker 310 estimated by the moving-marker calibration unit 101, and the process proceeds to step S520. The three-dimensional position pijkM of a feature point 311 in the marker coordinate system is known from the specification of the checkerboard pattern. From the position and orientation of the marker 310 at the capture time of the j-th image 210 captured by the i-th fixed camera 200, as estimated by the moving-marker calibration unit 101, the three-dimensional position pijkW of the feature point in the world coordinate system is calculated by (Equation 1), pijkW = RijMW pijkM + tijMW.
Here, RijMW and tijMW are the rotation matrix and translation vector from the marker coordinate system to the world coordinate system.
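(Equation 1) is the standard rigid-body transform taking marker-frame points into the world frame; a minimal sketch with illustrative numbers:

```python
import numpy as np

# Rotation R_MW and translation t_MW from the marker coordinate system to the
# world coordinate system (here a 90-degree rotation about the z-axis plus a
# shift; the numbers are illustrative only).
R_MW = np.array([[0.0, -1.0, 0.0],
                 [1.0,  0.0, 0.0],
                 [0.0,  0.0, 1.0]])
t_MW = np.array([2.0, 0.0, 1.0])

# Feature-point positions in the marker coordinate system, known from the
# checkerboard-pattern specification.
p_M = np.array([[0.0, 0.0, 0.0],
                [0.1, 0.0, 0.0],
                [0.0, 0.1, 0.0]])

# Equation 1 applied to every feature point at once.
p_W = (R_MW @ p_M.T).T + t_MW
print(p_W)
```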
In step S520, the position and orientation of the fixed camera 200 in the world coordinate system are estimated from the two-dimensional positions, in the image coordinate system, of the feature points 311 on the marker 310 detected in step S500 and their three-dimensional positions, in the world coordinate system, calculated in step S510. Specifically, the position and orientation of the i-th fixed camera 200 in the world coordinate system are computed by solving (Equation 2) with a known nonlinear least-squares method such as the Levenberg-Marquardt method or the Gauss-Newton method.
Here, RiWCi and tiWCi are the rotation matrix and translation vector from the world coordinate system to the fixed-camera coordinate system of the i-th fixed camera 200, and R'iWCi and t'iWCi are their estimates. Ei is the sum of the reprojection errors. A reprojection error is the distance between the position at which the three-dimensional position of a feature point 311 is projected onto the image 210, using the position and orientation of the fixed camera 200 and camera-intrinsic parameters such as the focal length and lens distortion, and the position at which that feature point 311 was detected in the image 210. The sum of the reprojection errors Ei is calculated by (Equation 3).
Here, (xijk, yijk)T is the position, in the normalized image coordinate system, of the k-th feature point 311 in the j-th image 210 captured by the i-th fixed camera 200, and (x'ijk, y'ijk)T is the coordinate obtained by projecting the world-coordinate-system position of that feature point onto the i-th fixed camera 200.
The position (xijk, yijk)T in the normalized image coordinate system of the k-th feature point 311 in the j-th image 210 captured by the i-th fixed camera 200 is calculated by (Equation 4) when, for example, a perspective projection model is used as the camera model. The camera model is not limited to the perspective projection model, however; other camera models, such as one for an omnidirectional camera, may be used.
Here, (cix, ciy)T is the position of the optical center of the i-th fixed camera 200, and (fix, fiy)T is its focal length. (uijk, vijk)T is the position of the k-th feature point 311 in the j-th image 210 captured by the i-th fixed camera 200, in image coordinates with lens distortion removed; it is calculated by (Equation 5) when, for example, radial distortion is used as the lens distortion model. The lens distortion model is not limited to radial distortion; other lens models, such as tangential distortion orthogonal to the radial direction, may be used.
Here, κ1 and κ2 are the lens distortion parameters.
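The normalization of (Equation 4) and a radial distortion model with parameters κ1 and κ2 can be sketched as follows; the polynomial form and the fixed-point inversion are common conventions assumed here, since the equations themselves appear only in the drawings:

```python
def distort(x, y, k1, k2):
    """Radial distortion: points are pushed along the radius by a
    polynomial in r^2 (assumed forward form of Equation 5)."""
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor

def undistort(xd, yd, k1, k2, iterations=20):
    """Invert the radial model by fixed-point iteration, a common numerical
    choice; the patent states the model, not the solver."""
    x, y = xd, yd
    for _ in range(iterations):
        r2 = x * x + y * y
        factor = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / factor, yd / factor
    return x, y

def normalize(u, v, cx, cy, fx, fy):
    """(Equation 4): pixel coordinates to normalized image coordinates."""
    return (u - cx) / fx, (v - cy) / fy

# Round trip: undistorting the distorted observation recovers the point.
xd, yd = distort(0.2, -0.1, k1=-0.3, k2=0.1)
x, y = undistort(xd, yd, k1=-0.3, k2=0.1)
```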
The coordinates (x'ijk, y'ijk)T obtained by projecting the world-coordinate-system position of the k-th feature point 311 in the j-th image 210 captured by the i-th fixed camera 200 onto the i-th fixed camera 200 are calculated by (Equations 6.1) and (6.2).
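The minimization of step S520 can be illustrated on synthetic data. The sketch below projects world points into normalized coordinates and refines only the translation with a few Gauss-Newton iterations and a numerical Jacobian; estimating the rotation as well, as the text does, needs a rotation parametrization and is omitted here for brevity:

```python
import numpy as np

def project(p_W, R_WC, t_WC):
    """World point -> normalized image coordinates (X/Z, Y/Z)."""
    X, Y, Z = R_WC @ p_W + t_WC
    return np.array([X / Z, Y / Z])

def reprojection_error(t_WC, points_W, observations, R_WC):
    """(Equation 3)-style sum of squared distances between projected and
    observed normalized coordinates (rotation held fixed here)."""
    return sum(np.sum((project(p, R_WC, t_WC) - obs) ** 2)
               for p, obs in zip(points_W, observations))

# Synthetic setup: identity rotation, known true translation, noiseless
# observations generated from four non-degenerate points.
R_WC = np.eye(3)
t_true = np.array([0.1, -0.2, 2.0])
points_W = np.array([[0.0, 0.0, 0.0], [0.5, 0.0, 0.0],
                     [0.0, 0.5, 0.0], [0.5, 0.5, 0.3]])
observations = [project(p, R_WC, t_true) for p in points_W]

# Gauss-Newton with a numerical Jacobian, standing in for the
# Levenberg-Marquardt solver named in the text.
t = np.array([0.0, 0.0, 1.5])
eps = 1e-6
for _ in range(20):
    residuals = np.concatenate(
        [project(p, R_WC, t) - obs for p, obs in zip(points_W, observations)])
    J = np.zeros((len(residuals), 3))
    for a in range(3):
        dt = np.zeros(3)
        dt[a] = eps
        perturbed = np.concatenate(
            [project(p, R_WC, t + dt) - obs
             for p, obs in zip(points_W, observations)])
        J[:, a] = (perturbed - residuals) / eps
    t = t - np.linalg.solve(J.T @ J, J.T @ residuals)
```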
(Operation of the movement planning unit 103)
Next, the processing of the movement planning unit 103 is described in detail with reference to FIGS. 5 to 10. FIG. 5 is a flowchart of the process executed by the movement planning unit 103 while the camera calibration apparatus 100 operates; a fixed camera 200 with high calibration priority is selected and a target movement position is calculated.
In step S600, the movable region of the sensor-equipped moving marker 300 is calculated. The movable region is determined in advance because, if there are regions where the sensor-equipped moving marker 300 cannot be placed due to obstacles or other factors, the movement plan must be created so as to exclude them.
First, based on the position and orientation of the marker 310 at each time estimated by the moving-marker calibration unit 101, the three-dimensional positions in the world coordinate system of the feature points 311 on the marker 310 at each time are calculated in the same way as in step S510 of FIG. 3, and a plane is fitted to the calculated three-dimensional positions. Plane fitting to three-dimensional positions is a known technique, so the details are omitted; the least-squares method or principal component analysis can be used.
Next, the signed distance between each three-dimensional position and the fitted plane is calculated. For each sign, the distances are sorted in ascending order and the q-quantile is computed, giving w1 and w2 respectively, where q is a preset value with 0 ≤ q ≤ 1. Using the q-quantile allows the movement range in three-dimensional space to be calculated while excluding the influence of outliers.
Finally, the region sandwiched between the two planes obtained by shifting the fitted plane by w1 and w2 along its normal direction is taken as the movable region of the sensor-equipped moving marker 300. If it is known in advance that the movement of the sensor-equipped moving marker 300 is restricted to a plane, for example when it is carried on a robot or cart that moves on a flat floor, w1 and w2 should be set to 0.
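Step S600 can be sketched as follows, assuming principal component analysis for the plane fit (one of the two options the text names); all numbers are illustrative:

```python
import numpy as np

def movable_region(points_W, q):
    """Fit a plane by PCA and bound the movable region by the q-quantiles
    of the signed plane distances, as in step S600."""
    centroid = points_W.mean(axis=0)
    # The plane normal is the direction of least variance.
    _, _, vt = np.linalg.svd(points_W - centroid)
    normal = vt[-1]
    signed = (points_W - centroid) @ normal
    pos = signed[signed >= 0.0]
    neg = -signed[signed < 0.0]
    # Taking q-quantiles on each side of the plane suppresses outliers.
    w1 = np.quantile(pos, q) if pos.size else 0.0
    w2 = np.quantile(neg, q) if neg.size else 0.0
    return normal, w1, w2

# Feature points near the z = 1 plane plus one gross outlier at z = 4;
# the quantile keeps the outlier from inflating the region thickness.
xs, ys = np.meshgrid(np.linspace(0.0, 5.0, 10), np.linspace(0.0, 5.0, 10))
pts = np.column_stack([xs.ravel(), ys.ravel(),
                       1.0 + 0.05 * np.sin(np.arange(100))])
pts = np.vstack([pts, [2.5, 2.5, 4.0]])
normal, w1, w2 = movable_region(pts, q=0.9)
```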
In the loop of steps S610 to S640 following step S600, the calibration priority is calculated for each fixed camera 200.
In step S610, the distance di between the i-th fixed camera 200 and the sensor-equipped moving marker 300 is calculated. Here, di is computed by (Equations 7.1) and (7.2), using the latest rotation matrix and translation vector RijMW and tijMW from the marker coordinate system to the world coordinate system estimated by the moving-marker calibration unit 101, and the rotation matrix and translation vector R'iWCi and t'iWCi from the world coordinate system to the fixed-camera coordinate system of the i-th fixed camera 200 estimated by the fixed-camera calibration unit 102.
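(Equations 7.1) and (7.2) themselves appear only in the drawings; one natural reading, assumed here, recovers the camera centre from the world-to-camera transform and takes the Euclidean distance to the marker origin:

```python
import numpy as np

# Assumed reading of (Equations 7.1, 7.2): the i-th camera centre in world
# coordinates is -R'^T t', and d_i is its distance to the marker origin t_MW
# (illustrative numbers).
R_WC = np.eye(3)
t_WC = np.array([0.0, 0.0, -3.0])   # world-to-camera translation
t_MW = np.array([0.0, 4.0, 3.0])    # marker origin in world coordinates

camera_centre_W = -R_WC.T @ t_WC    # = (0, 0, 3)
d_i = np.linalg.norm(t_MW - camera_centre_W)
print(d_i)  # → 4.0
```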
In step S620, the three-dimensional distribution degree ai3D, in the world coordinate system, of the feature points 311 on the marker 310 captured by the i-th fixed camera 200 is calculated.
For this calculation, first, a three-dimensional histogram 220 of feature-point counts is created using the positions of the feature points 311 on the marker 310 in the images 210 and their depths from the i-th fixed camera 200. FIG. 6 shows an example of the three-dimensional histogram 220: a histogram of the number of feature points 311 over the space formed by the image position of a feature point 311 and its depth from the i-th fixed camera 200. A bin 221 of the three-dimensional histogram 220 holds the number of feature points 311 within the region defined by the minimum and maximum image x-coordinates xmin and xmax, the minimum and maximum image y-coordinates ymin and ymax, and the minimum and maximum depths zmin and zmax. FIG. 7 shows an example of a bin 221 of the three-dimensional histogram 220 in three-dimensional space.
In creating the three-dimensional histogram 220, first, as illustrated in FIG. 7, the 640x480-pixel image 210 is divided into a 4x4 grid of 160x120-pixel regions, and each region is further divided in the depth direction to define the bins 221. For example, dividing depths from 0 m to 10 m into ten equal parts defines ten bins 221 of 1 m depth each. The bin sizes shown here are only an example; bins 221 of any size may be set according to the environment. The three-dimensional histogram 220 illustrated in FIG. 6 is then created by counting the number of feature points 311 falling in each bin. All images 210 captured by the i-th fixed camera 200, that is, Nij images, contribute their detected feature points 311 to the histogram. The depth Z'ijkCi of a feature point 311 is calculated by (Equation 8).
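The histogram construction can be sketched as follows, using the 4x4 image grid and ten 1 m depth bins from the example above; the function name and array layout are assumptions:

```python
import numpy as np

def build_histogram_3d(points_px, depths, width=640, height=480,
                       nx=4, ny=4, max_depth=10.0, nz=10):
    """Count feature points per (image region, depth) bin, as in step S620.

    points_px: (N, 2) pixel positions of detected feature points
    depths:    (N,) camera-frame depths from (Equation 8)
    """
    hist = np.zeros((ny, nx, nz), dtype=int)
    for (u, v), z in zip(points_px, depths):
        if 0 <= u < width and 0 <= v < height and 0 <= z < max_depth:
            bx = int(u * nx / width)       # 160-pixel-wide columns
            by = int(v * ny / height)      # 120-pixel-tall rows
            bz = int(z * nz / max_depth)   # 1 m depth slices
            hist[by, bx, bz] += 1
    return hist

# Two points in the top-left region at ~1.5 m, one bottom-right at 9 m.
pts = np.array([[10.0, 10.0], [20.0, 30.0], [630.0, 470.0]])
dep = np.array([1.5, 1.5, 9.0])
hist = build_histogram_3d(pts, dep)
```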
Next, for each bin 221 of the three-dimensional histogram 220, it is determined whether the sensor-equipped moving marker 300 can reach it. Specifically, a bin 221 that contains part of the movable region of the sensor-equipped moving marker 300 calculated in step S600 is judged reachable, and a bin that does not is judged unreachable.
When the sensor of the sensor-equipped moving marker 300 is an exteroceptive sensor capable of measuring surrounding three-dimensional information, such as the moving camera 320 or a laser range finder, the measured three-dimensional information is also used in this reachability judgment. Specifically, if a bin 221 contains at least a preset threshold number of three-dimensional measurement points, an obstacle is considered to be present and the bin is judged unreachable. The measured three-dimensional information is further used to judge, for each bin 221 of the three-dimensional histogram 220, whether the fixed camera 200 can observe it: for each region of the image 210, a bin 221 lying at a greater depth than a bin 221 judged unreachable due to an obstacle is judged unobservable by the fixed camera 200, because the obstacle occludes it. Since a bin 221 that the fixed camera 200 cannot observe cannot be used for calibrating that camera, it is treated as a bin that the sensor-equipped moving marker 300 cannot reach.
Finally, from the number Ni3eb of bins 221 that the sensor-equipped moving marker 300 can reach, and the number Ni3b of reachable bins 221 whose feature-point count exceeds a preset threshold th3D, the three-dimensional distribution degree of the feature points 311 on the marker 310 in the world coordinate system is calculated as ai3D = Ni3b / Ni3eb.
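The final ratio ai3D = Ni3b / Ni3eb can be sketched directly; the helper below works on any flattened histogram, so the same computation also serves the two-dimensional case of step S630:

```python
import numpy as np

def distribution_degree(hist, reachable, threshold):
    """N_b / N_eb: among bins the sensor-equipped moving marker can reach,
    the fraction whose feature-point count exceeds the threshold."""
    n_eb = int(np.count_nonzero(reachable))
    n_b = int(np.count_nonzero(reachable & (hist > threshold)))
    return n_b / n_eb if n_eb else 0.0

hist = np.array([5, 0, 12, 3, 7])                 # counts per bin
reachable = np.array([True, True, True, False, True])
a_3d = distribution_degree(hist, reachable, threshold=4)
print(a_3d)  # → 0.75  (3 of the 4 reachable bins exceed the threshold)
```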
In step S630 following step S620, the two-dimensional distribution degree ai2D, in the image coordinate system, of the feature points 311 on the marker 310 captured by the i-th fixed camera 200 is calculated.
For this calculation, first, a two-dimensional histogram 230 of feature-point counts is created using the positions of the feature points 311 on the marker 310 in the images 210. FIG. 8 shows an example of the two-dimensional histogram 230: a histogram of the number of feature points 311 over the space formed by the image positions of the feature points 311. A bin 231 of the two-dimensional histogram 230 holds the number of feature points 311 within the region defined by the minimum and maximum image x-coordinates xmin and xmax and the minimum and maximum image y-coordinates ymin and ymax. FIG. 9 shows an example of a bin 231 of the two-dimensional histogram 230. In this step, the two-dimensional histogram 230 is created from the three-dimensional histogram 220 of step S620 by summing, for each image region, the feature-point counts of the bins 221 at all depths.
Next, for each bin 231 of the two-dimensional histogram 230, it is determined whether the sensor-equipped moving marker 300 can reach it. Using the three-dimensional histogram 220 created in step S620, a bin 231 is judged reachable if at least one reachable bin 221 of the three-dimensional histogram 220 exists for the same image region, and unreachable if no reachable bin 221 exists.
Finally, from the number N_i2eb of bins 231 that the sensor-equipped moving marker 300 can reach, and the number N_i2b of bins 231 that the marker can reach and whose count of feature points 311 exceeds a preset threshold th_2D, the two-dimensional distribution a_i^2D of the feature points 311 on the marker 310 in the image coordinate system is calculated as a_i^2D = N_i2b / N_i2eb.
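The collapse of the three-dimensional histogram into the two-dimensional one, and the resulting distribution degree, can be sketched as follows. Bin keys, counts, and the threshold are illustrative assumptions, not values from the patent:

```python
from collections import defaultdict

# Sketch: collapse a 3D histogram keyed by (x_region, y_region, depth_region)
# into a 2D histogram by summing over depth, then compute
# a_i^2D = N_i2b / N_i2eb. A 2D bin is reachable if any of its depth bins is.

def to_2d(counts_3d, reachable_3d):
    counts_2d = defaultdict(int)
    reachable_2d = defaultdict(bool)
    for (x, y, z), c in counts_3d.items():
        counts_2d[(x, y)] += c                            # sum all depth bins
        reachable_2d[(x, y)] |= reachable_3d[(x, y, z)]   # any reachable depth bin
    return dict(counts_2d), dict(reachable_2d)

def distribution_2d(counts_2d, reachable_2d, th_2d):
    n_eb = sum(1 for b in counts_2d if reachable_2d[b])   # N_i2eb
    n_b = sum(1 for b in counts_2d
              if reachable_2d[b] and counts_2d[b] > th_2d)  # N_i2b
    return n_b / n_eb if n_eb else 0.0

counts_3d = {(0, 0, 0): 2, (0, 0, 1): 4, (1, 0, 0): 1, (1, 0, 1): 0}
reachable_3d = {(0, 0, 0): True, (0, 0, 1): False, (1, 0, 0): False, (1, 0, 1): False}
c2, r2 = to_2d(counts_3d, reachable_3d)
print(distribution_2d(c2, r2, th_2d=5))  # → 1.0
```

Note that the depth sum can push a 2D bin over the threshold even when every individual 3D bin is below it, which is exactly why the two distribution degrees are evaluated separately.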
In step S640, following step S630, the calibration priority a_i of each fixed camera is calculated, based on (Equation 9), from the distance d_i, the three-dimensional distribution a_i^3D, and the two-dimensional distribution a_i^2D obtained in steps S610 to S630.
Here, λ_d, λ_2D, and λ_3D are preset weights for the distance d_i from the sensor-equipped moving marker 300, the three-dimensional distribution a_i^3D, and the two-dimensional distribution a_i^2D, respectively.
After the priorities a_i of all the fixed cameras 200 have been calculated, in step S650 the fixed camera 200 with the highest priority a_i is selected from among all the fixed cameras 200.
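(Equation 9) itself is not reproduced in this excerpt. Purely as an assumed illustration, a linear combination in which nearer cameras and sparser feature-point coverage raise the priority could look like this, with arbitrary weights λ:

```python
# Assumed shape of the camera priority: near cameras (small d_i) and poorly
# covered cameras (small a_i^3D, a_i^2D) get higher priority. This is an
# illustrative guess at (Equation 9), not the patent's actual formula.

def camera_priority(d_i, a_3d, a_2d, lam_d=1.0, lam_3d=1.0, lam_2d=1.0):
    return -lam_d * d_i + lam_3d * (1.0 - a_3d) + lam_2d * (1.0 - a_2d)

cameras = {
    "cam0": {"d": 2.0, "a3d": 0.9, "a2d": 0.8},  # near but already well covered
    "cam1": {"d": 3.0, "a3d": 0.1, "a2d": 0.2},  # farther but poorly covered
}
best = max(cameras, key=lambda c: camera_priority(
    cameras[c]["d"], cameras[c]["a3d"], cameras[c]["a2d"]))
print(best)  # → cam1
```

With these weights, the poorly covered camera wins even though it is farther away; changing λ_d relative to λ_3D and λ_2D shifts that trade-off.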
In step S660, following step S650, the target movement position of the sensor-equipped moving marker 300 required to calibrate the highest-priority fixed camera 200 selected in step S650 is calculated. The details of this calculation are described with reference to the flowchart of FIG. 10.
Steps S661 to S664 are executed, for the fixed camera 200 selected in step S650, for each bin 221 of the three-dimensional histogram 220 created in step S620.
In step S661, the distance d_b between the center position of a bin 221 of the three-dimensional histogram 220 and the sensor-equipped moving marker 300 is calculated. The center position p_b^Ci of the bin 221 in the fixed camera coordinate system of the i-th fixed camera 200 is calculated by (Equation 10).
The distance d_b between the center position p_b^Ci of the bin 221 and the sensor-equipped moving marker 300 is calculated by (Equation 11.1) and (Equation 11.2).
In step S662, the three-dimensional sufficiency b_b^3D of the bin 221 of the three-dimensional histogram 220 is calculated. The three-dimensional sufficiency b_b^3D of the bin 221 is 1 if the number of feature points 311 contained in the bin 221 is greater than or equal to the threshold th_3D, and 0 if it is less than th_3D.
In step S663, the two-dimensional sufficiency b_b^2D of the bin 221 of the three-dimensional histogram 220 is calculated. The two-dimensional sufficiency b_b^2D of the bin 221 is 1 if the number of feature points 311 contained in the bin 231 of the two-dimensional histogram 230 created in step S630 covering the same image region as the bin 221 is greater than or equal to the threshold th_2D, and 0 if it is less than th_2D.
In step S664, the priority b_b of the bin 221 of the three-dimensional histogram 220 is calculated from the distance d_b from the sensor-equipped moving marker 300, the three-dimensional sufficiency b_b^3D, and the two-dimensional sufficiency b_b^2D. The priority b_b of the bin 221 is calculated by (Equation 12).
Here, λ'_d, λ'_3D, and λ'_2D are preset weights for the distance d_b from the sensor-equipped moving marker 300, the three-dimensional sufficiency b_b^3D, and the two-dimensional sufficiency b_b^2D, respectively.
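Steps S661 to S664 can be sketched per bin as follows. Like (Equation 9), (Equation 12) is not reproduced in this excerpt, so the linear form below, which favours near, under-filled bins, is an assumption; the weights λ' and thresholds are placeholders:

```python
import math

# Sketch of steps S661-S664 for one bin: distance to the marker (S661),
# binary 3D/2D sufficiencies (S662, S663), and an assumed linear form of
# (Equation 12) in which low sufficiency and small distance raise priority.

def bin_priority(center_w, marker_w, n_3d, n_2d, th_3d, th_2d,
                 lam_d=1.0, lam_3d=1.0, lam_2d=1.0):
    d_b = math.dist(center_w, marker_w)        # step S661
    b_3d = 1 if n_3d >= th_3d else 0           # step S662
    b_2d = 1 if n_2d >= th_2d else 0           # step S663
    # step S664 (assumed form of Equation 12)
    return -lam_d * d_b + lam_3d * (1 - b_3d) + lam_2d * (1 - b_2d)

p_full = bin_priority((1.0, 0.0, 0.0), (0.0, 0.0, 0.0),
                      n_3d=10, n_2d=10, th_3d=5, th_2d=5)
p_empty = bin_priority((1.0, 0.0, 0.0), (0.0, 0.0, 0.0),
                       n_3d=0, n_2d=0, th_3d=5, th_2d=5)
print(p_empty > p_full)  # → True: the unfilled bin is prioritised
```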
In step S665, using the reachability of the sensor-equipped moving marker 300 with respect to the bins 221 of the three-dimensional histogram 220 determined in step S620, the center position p_b^W, in the world coordinate system, of the bin 221 with the highest priority b_b among the reachable bins 221 is output as the target movement position.
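Step S665 reduces to an argmax over the reachable bins; a minimal sketch with made-up bin data:

```python
# Sketch of step S665: among the reachable bins, pick the one with the
# highest priority b_b and output its world-coordinate centre p_b^W.
# Centres and priorities are hypothetical values.
bins = [
    {"center_w": (0.0, 0.0, 1.0), "priority": 0.4, "reachable": True},
    {"center_w": (2.0, 1.0, 1.0), "priority": 0.9, "reachable": False},
    {"center_w": (1.0, 0.5, 1.0), "priority": 0.7, "reachable": True},
]
target = max((b for b in bins if b["reachable"]),
             key=lambda b: b["priority"])["center_w"]
print(target)  # → (1.0, 0.5, 1.0)
```

The highest-priority bin overall is skipped because it is unreachable, which mirrors the fallback behaviour described in the following paragraph.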
Here, the case where the sensor-equipped moving marker 300 cannot move to the target movement position output by the movement planning unit 103 is conceivable. In such a case, the sensor-equipped moving marker 300 transmits a signal indicating that the movement cannot be carried out, via the interface of the movement instruction unit 104 that instructs the sensor-equipped moving marker 300 about the target movement position output by the movement planning unit 103. When the movement planning unit 103 receives this signal, in step S665 it excludes the bin 221 of the three-dimensional histogram 220 with the highest priority from the selection candidates and outputs the center position of the bin 221 with the next-highest priority as the target movement position. Furthermore, when signals indicating that the movement cannot be carried out are received for the same fixed camera 200 more than a preset number of times in succession, the fixed camera 200 with the highest priority is excluded from the selection candidates in step S650, and the processing of step S660 is executed for the fixed camera 200 with the next-highest priority.

(Operation of the movement instruction unit 104)

Next, the operation of the movement instruction unit 104 is described with reference to FIG. 11. The movement instruction unit 104 issues an instruction for moving the sensor-equipped moving marker 300 to the target movement position output by the movement planning unit 103.
As shown in FIG. 2, when the mobile robot 350 moves the sensor-equipped moving marker 300, the movement instruction unit 104 moves the mobile robot 350 by outputting a control signal for moving the mobile robot 350 to the target movement position.
On the other hand, when a calibration operator moves the sensor-equipped moving marker 300, the movement instruction unit 104 instructs the operator by displaying the target movement position 250 on a three-dimensional or two-dimensional map. FIG. 11 is a diagram illustrating an example of a three-dimensional map created by the movement instruction unit 104. In this example, the current position and posture of the sensor-equipped moving marker 300 estimated by the moving marker calibration unit 101 and its trajectory 360 are displayed in computer graphics. The positions and postures of the fixed cameras 200 estimated by the fixed camera calibration unit 102 are also displayed in computer graphics, and the fixed camera 200 on the right, selected as the calibration target in step S650, is highlighted in color.
Here, when an external sensor capable of measuring surrounding three-dimensional information, such as the moving camera 320 or a laser range finder, is used as the sensor of the sensor-equipped moving marker 300, the measured three-dimensional information may be displayed in computer graphics on the three-dimensional or two-dimensional map.
The movement instruction unit 104 may also instruct the movement as a displacement from the current position and posture of the sensor-equipped moving marker 300. The coordinates p_b^M of the target movement position in the marker coordinate system of the sensor-equipped moving marker 300 are calculated by (Equation 13).
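(Equation 13) is not reproduced in this excerpt, but converting a world-frame point into the marker frame is the standard inverse rigid transform p_b^M = R^T (p_b^W − t), where R and t are the marker's estimated orientation and position in the world frame. A sketch with made-up pose values:

```python
# Assumed content of (Equation 13): express the world-frame target position
# p_b^W in the marker frame via the marker's estimated world pose (R, t):
# p_b^M = R^T (p_b^W - t). All numbers below are illustrative.

def to_marker_frame(R, t, p_w):
    d = [p_w[i] - t[i] for i in range(3)]
    # multiply by R transpose: row j of R^T is column j of R
    return tuple(sum(R[i][j] * d[i] for i in range(3)) for j in range(3))

R = [[1.0, 0.0, 0.0],  # identity: marker axes aligned with the world axes
     [0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0]]
t = [1.0, 2.0, 0.0]        # marker position in the world frame
p_b_w = [11.0, 7.0, 0.5]   # target position in the world frame
print(to_marker_frame(R, t, p_b_w))  # → (10.0, 5.0, 0.5)
```

The result reads directly as a movement instruction relative to the marker, matching the "10 m forward, 5 m to the right, 0.5 m upward" example in the text.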
p_b^M corresponds to the amount of movement from the current position and posture of the sensor-equipped moving marker 300, for example, 10 m forward, 5 m to the right, and 0.5 m upward. When the mobile robot 350 moves the sensor-equipped moving marker 300, the mobile robot 350 is moved by outputting a control signal for moving it by this amount. When a calibration operator moves the sensor-equipped moving marker 300, the amount of movement is indicated, for example, by audio output from a speaker or by screen output on a display.

(Operation of the output unit 105)

Next, the operation of the output unit 105 is described with reference to FIG. 12. The output unit 105 outputs the position and posture of the fixed camera 200 estimated by the fixed camera calibration unit 102 to a RAM in the fixed camera 200, a management server of the fixed camera 200, or the like. The output unit 105 may also display the position and posture of the fixed camera 200 estimated by the fixed camera calibration unit 102 on a three-dimensional or two-dimensional map.
FIG. 12 is a diagram illustrating an example of a two-dimensional map output by the output unit 105. FIG. 12 is drawn from the normal direction of the plane, fitted in step S600 of the movement planning unit 103, that represents the movable region of the sensor-equipped moving marker 300. In FIG. 12, the positions and postures of the fixed cameras 200 estimated by the fixed camera calibration unit 102 are displayed in computer graphics together with their imageable ranges 240. The trajectory 360 indicates the positions of the sensor-equipped moving marker 300 estimated by the moving marker calibration unit 101. The imageable range 240 is the range that a fixed camera 200 can capture, and can be calculated from the focal length, the angle of view, and other parameters of the fixed camera 200. The points 370 are three-dimensional measurement points, presumed to be obstacles, measured by the external sensor when an external sensor capable of measuring surrounding three-dimensional information, such as a camera or a laser range finder, is used as the sensor of the sensor-equipped moving marker 300. The method of displaying the measurement results of the external sensor is not limited to points; for example, a plurality of planes may be fitted to the measured three-dimensional measurement points and the fitted planes displayed.

(Effect)

According to the first embodiment described above, the following effects are obtained.
(1) In the camera calibration apparatus 100, the movement planning unit 103 plans the target movement position of the sensor-equipped moving marker 300 based on the positions and postures of the fixed cameras in the world coordinate system estimated by the fixed camera calibration unit 102, and the movement instruction unit 104 instructs the sensor-equipped moving marker 300 about the target movement position. Therefore, regardless of the skill level of the calibration operator, the positions and postures of the plurality of fixed cameras 200 can be calibrated with high accuracy in a short time.
(2) The movement planning unit 103 calculates the distance between the sensor-equipped moving marker 300 and each fixed camera 200 from the position and posture of the sensor-equipped moving marker 300 in the world coordinate system estimated by the moving marker calibration unit 101 and the position and posture of the fixed camera 200 in the world coordinate system estimated by the fixed camera calibration unit 102, and preferentially selects a nearby fixed camera 200 as the calibration target (FIG. 5, steps S610 and S650). Furthermore, it calculates the distance between the sensor-equipped moving marker 300 and the center position of each bin 221 of the three-dimensional histogram 220 of the selected fixed camera 200, and sets the center position of a nearby bin 221 as the target movement position (FIG. 10, steps S661 and S665). By selecting a target movement position close to the sensor-equipped moving marker 300, the time required to move to the target movement position is shortened, and the plurality of fixed cameras 200 can be calibrated in a short time.
(3) The movement planning unit 103 calculates the three-dimensional and two-dimensional distributions of the feature points 311 on the marker 310 from the position and posture of the sensor-equipped moving marker 300 in the world coordinate system estimated by the moving marker calibration unit 101 and the results of the fixed camera calibration unit 102, and preferentially selects a fixed camera 200 with small distributions as the calibration target (FIG. 5, steps S620 to S650). Furthermore, for each bin 221 of the three-dimensional histogram 220 of the selected fixed camera 200, it calculates the three-dimensional and two-dimensional sufficiencies of the feature points 311 on the marker 310, and sets the center position of a bin 221 with small sufficiencies as the target movement position (FIG. 10, steps S662 to S665). By using information from various positions in the three-dimensional and two-dimensional spaces for the calibration of the fixed cameras 200, the positions and postures of the fixed cameras 200 can be calibrated with high accuracy.
(4) The movement planning unit 103 calculates the priority of each fixed camera 200 using, simultaneously, the distance between the sensor-equipped moving marker 300 and the fixed camera 200 and the three-dimensional and two-dimensional distributions of the feature points 311 on the marker 310, and selects a fixed camera 200 with high priority as the calibration target (FIG. 5, steps S610 to S650). Furthermore, for each bin 221 of the three-dimensional histogram 220 of the selected fixed camera 200, it calculates the priority of the bin 221 using, simultaneously, the distance between the sensor-equipped moving marker 300 and the center position of the bin 221 and the three-dimensional and two-dimensional sufficiencies of the feature points 311 on the marker 310, and sets the center position of a bin 221 with high priority as the target movement position (FIG. 10, steps S661 to S665). By selecting a target movement position close to the sensor-equipped moving marker 300 while also using information from various positions in the three-dimensional and two-dimensional spaces for the calibration, the positions and postures of the plurality of fixed cameras 200 can be calibrated with high accuracy in a short time.
(5) By calculating the movable region of the sensor-equipped moving marker 300, the movement planning unit 103 determines whether the sensor-equipped moving marker 300 can reach each bin 221 of the three-dimensional histogram 220 and each bin 231 of the two-dimensional histogram 230, and uses the reachable bins to select the fixed camera 200 and calculate the target movement position (FIG. 5, step S600 and steps S620 to S650; FIG. 10, step S665). This prevents a target movement position that the sensor-equipped moving marker 300 cannot reach from being indicated, so the plurality of fixed cameras 200 can be calibrated in a short time.
(6) When the sensor of the sensor-equipped moving marker 300 is an external sensor capable of measuring three-dimensional information, the movement planning unit 103 uses the three-dimensional information to determine, for each bin 221 of the three-dimensional histogram 220 and each bin 231 of the two-dimensional histogram 230, whether the sensor-equipped moving marker 300 can reach the bin and whether the fixed camera 200 can photograph it, and uses the reachable and photographable bins to select the fixed camera 200 and calculate the target movement position (FIG. 5, step S600 and steps S620 to S650; FIG. 10, step S665). This prevents a target movement position that the sensor-equipped moving marker 300 cannot reach, or that the fixed camera 200 cannot photograph, from being indicated, so the plurality of fixed cameras 200 can be calibrated in a short time.
(7) When the mobile robot 350 moves the sensor-equipped moving marker 300, the movement instruction unit 104 moves the mobile robot 350 by outputting a control signal for the mobile robot 350 to move to the target movement position. The plurality of fixed cameras 200 can therefore be calibrated automatically.
(8) When a calibration operator moves the sensor-equipped moving marker 300, the movement instruction unit 104 instructs the movement by displaying, on a two-dimensional or three-dimensional map, the position and posture of the sensor-equipped moving marker 300 in the world coordinate system estimated by the moving marker calibration unit 101, the positions and postures of the fixed cameras 200 in the world coordinate system estimated by the fixed camera calibration unit 102, and the target movement position output by the movement planning unit 103 (FIG. 11). Since the calibration operator moving the sensor-equipped moving marker 300 can easily grasp the target movement position, the time required to move to it is shortened, and the plurality of fixed cameras 200 can be calibrated in a short time.
(9) In the camera calibration apparatus 100, the output unit 105 displays, on a two-dimensional or three-dimensional map, the position and posture of the sensor-equipped moving marker 300 in the world coordinate system estimated by the moving marker calibration unit 101, the positions and postures of the fixed cameras 200 in the world coordinate system estimated by the fixed camera calibration unit 102, and the imageable ranges of the fixed cameras 200 calculated from their focal lengths, angles of view, and other parameters (FIG. 12). The calibration operator can therefore easily confirm the calibration results of the fixed cameras 200.

(Modification 1)

In the first embodiment described above, the movement planning unit 103 calculates the priority of each bin 221 of the three-dimensional histogram 220 and sets the center position of the bin 221 with the highest priority as the target movement position (FIG. 10, step S665). However, the output of the movement planning unit 103 is not limited to this.
For example, the movement planning unit 103 may output, as the target movement position, the entire region of the highest-priority bin 221 defined by the minimum value x_min and maximum value x_max of the image x coordinate, the minimum value y_min and maximum value y_max of the image y coordinate, and the minimum value z_min and maximum value z_max of the depth. The movement instruction unit 104 then instructs the movement by displaying, on a two-dimensional or three-dimensional map, the current position and posture of the sensor-equipped moving marker 300 estimated by the moving marker calibration unit 101, the past positions of the sensor-equipped moving marker 300, the positions and postures of the fixed cameras 200 estimated by the fixed camera calibration unit 102, and the entire bin region output by the movement planning unit 103.
According to Modification 1, the following effects are obtained. That is, the movement planning unit 103 outputs the entire region of the selected bin 221 as the target movement position, and the movement instruction unit 104 displays that region as the target movement position. Since the target movement position can therefore be grasped easily, the time required to move to it is shortened, and the plurality of fixed cameras 200 can be calibrated in a short time.

(Modification 2)

In the first embodiment and Modification 1 described above, the movement planning unit 103 calculates the priority of each bin 221 of the three-dimensional histogram 220 and outputs the center position or the entire region of the bin 221 with the highest priority as the target movement position (FIG. 10, step S665). However, the output of the movement planning unit 103 is not limited to this.
For example, in addition to the center position or the entire region of the bin 221 with the highest priority, the movement planning unit 103 may output, as target movement positions, the center positions or entire regions of all bins 221 whose priority is greater than or equal to a preset threshold, such as the second- and third-highest-priority bins. Alternatively, it may output the center positions or regions of a preset number of high-priority bins 221 as target movement positions. The movement instruction unit 104 instructs the movement by displaying all the target movement positions output by the movement planning unit 103 on a two-dimensional or three-dimensional map.
Furthermore, the movement planning unit 103 may output the center positions or entire regions of all bins 221 together with their priorities. The movement instruction unit 104 then instructs the movement by displaying the center positions or regions of all bins 221 output by the movement planning unit 103 on a two-dimensional or three-dimensional map, color-coded by bin priority.
According to Modification 2, the following effects are obtained. That is, the movement planning unit 103 outputs a plurality of high-priority target movement positions, and the movement instruction unit 104 displays them. Therefore, when a calibration operator moves the sensor-equipped moving marker 300, the marker can be moved so as to pass efficiently through the plurality of high-priority target movement positions, and the positions and postures of the plurality of fixed cameras 200 can be calibrated with high accuracy in a short time.

(Modification 3)

In the first embodiment and Modification 1 described above, the movement planning unit 103 calculates the priority of each bin 221 of the three-dimensional histogram 220 and outputs the center position or region of the bin 221 with the highest priority as the target movement position (FIG. 10, step S665). However, the output of the movement planning unit 103 is not limited to this.
For example, based on the priority of each bin 221 of the three-dimensional histogram 220, the movement planning unit 103 may calculate a route that passes efficiently through high-priority bins 221 and output it as the target movement position. Specifically, taking the bin 221 in which the sensor-equipped moving marker 300 currently lies as the start position, a transition to a bin 221 adjacent in image coordinates or depth is counted as one move; for every route consisting of a preset number of moves through bins reachable by the sensor-equipped moving marker, the sum of the priorities of the bins 221 on the route is calculated, and the route with the highest priority sum is output as the target movement position. However, the method of calculating the route is not limited to this, and other known path-planning methods may be used.
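The brute-force route search described above can be sketched on a small grid of bins; the grid, priorities, and move set are toy values, and for simplicity only image-coordinate neighbours (not depth) are enumerated:

```python
from itertools import product

# Sketch of Modification 3: starting from the marker's current bin,
# enumerate every route of `n_moves` single-bin moves through reachable
# bins and keep the route whose summed bin priorities are highest.
# Revisited bins are counted again, matching the naive description above.

def best_route(start, priority, reachable, n_moves):
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]   # neighbours in image coords
    best = (float("-inf"), [start])
    for steps in product(moves, repeat=n_moves):
        route, pos, total = [start], start, priority[start]
        ok = True
        for dx, dy in steps:
            pos = (pos[0] + dx, pos[1] + dy)
            if not reachable.get(pos, False):
                ok = False
                break
            route.append(pos)
            total += priority[pos]
        if ok and total > best[0]:
            best = (total, route)
    return best[1]

priority = {(0, 0): 0.1, (1, 0): 0.5, (2, 0): 0.9, (0, 1): 0.2}
reachable = {b: True for b in priority}
print(best_route((0, 0), priority, reachable, n_moves=2))  # → [(0, 0), (1, 0), (2, 0)]
```

Exhaustive enumeration is exponential in the number of moves, which is why the text also allows other known path-planning methods.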
When the mobile robot 350 moves the sensor-equipped movement marker 300, the movement instruction unit 104 moves the mobile robot 350 by outputting a control signal so that the mobile robot 350 follows the route output as the target movement position by the movement planning unit 103. When a calibration operator moves the sensor-equipped movement marker 300, the movement instruction unit 104 instructs the movement by displaying that route on a two-dimensional or three-dimensional map.
According to the third modification, the following operational effects are obtained. The movement planning unit 103 outputs, as the target movement position, a route that passes efficiently through a plurality of high-priority movement targets, and the movement instruction unit 104 indicates that route. The marker can therefore be moved through the plurality of movement targets efficiently, and the positions and postures of the plurality of fixed cameras 200 can be calibrated with high accuracy in a short time.
Hereinafter, the camera calibration device 400 according to the second embodiment will be described with reference to FIGS. 13 and 14, for the case where the measurement times of the fixed cameras 200 and of the sensor of the sensor-equipped movement marker 300 are not synchronized. In the following description, the same components as in the first embodiment are given the same reference numerals and redundant description is omitted; the differences are mainly described.
In the first embodiment, the fixed cameras 200 and the moving camera 320 capture images with synchronized timestamps, so a movement plan could be created using pairs of images captured by a fixed camera 200 and the moving camera 320 at the same time. In this embodiment, images captured by a fixed camera 200 and the moving camera 320 at the same time cannot be extracted based on time information. Therefore, by adding the components described below, pairs of captured images taken while the relative position of the fixed camera 200 and the moving camera 320 remains the same are extracted, and the movement plan is created from these.
(Block configuration)
FIG. 13 is a diagram illustrating the block configuration of the camera calibration device 400. The camera calibration device 400 calibrates the positions and postures of the connected fixed cameras 200 and, in addition to the functions of the camera calibration device 100 of the first embodiment, further comprises a stop determination unit 401, a stop planning unit 402, and a stop instruction unit 403. These can be realized, for example, by an arithmetic device such as a CPU operating in accordance with a program stored in a storage device such as a semiconductor memory in the camera calibration device 400, and need not each be provided as dedicated hardware.
The stop determination unit 401 combines the image coordinates and the three-dimensional coordinates of the feature points 311 on the marker 310 while the sensor-equipped movement marker 300 is stopped, and outputs the combination to the fixed camera calibration unit 102. The stop planning unit 402 plans stops of the sensor-equipped movement marker 300 according to the results of the stop determination unit 401, the movement marker calibration unit 101, and the movement planning unit 103. The stop instruction unit 403 instructs the sensor-equipped movement marker 300 to perform the stops planned by the stop planning unit 402.
In the second embodiment, the measurement times of the fixed cameras 200 and of the sensor of the sensor-equipped movement marker 300 are not synchronized. The sensor-equipped movement marker 300 repeatedly moves and stops according to the instructions of the stop instruction unit 403. To realize stops in three-dimensional space, the sensor-equipped movement marker 300 is preferably mounted on the mobile robot 350, a cart, a tripod, or the like.
(Operation of the stop determination unit)
Details of the processing of the stop determination unit 401 will be described with reference to FIG. 14, a flowchart of the processing that the stop determination unit 401 executes while the camera calibration device 400 operates.
Steps S800 to S802 are executed for each fixed camera 200.
In step S800, the feature points 311 on the marker 310 of the sensor-equipped movement marker 300 are detected from the image 210 captured by the fixed camera 200, and the process proceeds to step S801. The processing of step S800 is the same as step S500 of the fixed camera calibration unit 102.
In step S801, if the marker 310 was detected in step S800, the process proceeds to step S802; if not, processing moves on to the next fixed camera 200.
In step S802, whether the sensor-equipped movement marker 300 is stopped is determined from the image coordinates of the feature points 311 detected in step S800. Specifically, for each feature point 311, the distance between its position in the latest image and its position in the previous image is calculated. The average of these distances over all feature points 311 is computed, and when the average is smaller than a preset threshold, the sensor-equipped movement marker 300 is determined to be stopped.
Step S803 determines whether the sensor-equipped movement marker 300 is stopped based on its position and posture estimated by the movement marker calibration unit 101, and the process proceeds to step S804. The distance between the latest position of the sensor-equipped movement marker 300 and its position computed from the previous sensor measurement is calculated, and when this distance is smaller than a preset threshold, the sensor-equipped movement marker 300 is determined to be stopped.
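Both stop tests, in steps S802 and S803, reduce to thresholding a displacement between consecutive measurements. A minimal sketch, assuming plain-tuple points and illustrative threshold values (the function names are not from the patent):

```python
import math

def is_stopped_in_image(prev_pts, curr_pts, threshold=1.0):
    """Step S802: average image-space distance between matched feature
    points in the previous and latest frames, compared to a threshold."""
    dists = [math.dist(p, c) for p, c in zip(prev_pts, curr_pts)]
    return sum(dists) / len(dists) < threshold

def is_stopped_in_space(prev_pos, curr_pos, threshold=0.01):
    """Step S803: displacement of the estimated marker position between
    the two most recent sensor measurements, compared to a threshold."""
    return math.dist(prev_pos, curr_pos) < threshold
```

The image-side threshold is in pixels and the spatial threshold in the sensor's length unit; both would be tuned to the camera resolution and sensor noise in practice.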
In step S804, the image coordinates and the three-dimensional coordinates of the feature points 311 are synchronized. Specifically, after a stop instruction has been issued by the stop instruction unit 403, the image coordinates of the feature points 311 for which step S802 first determines that the sensor-equipped movement marker 300 is stopped, and the three-dimensional coordinates of the feature points 311 for which step S803 first determines that it is stopped, are regarded as coordinates taken with the sensor-equipped movement marker 300 stopped at the same position, and the combination of the image coordinates and three-dimensional coordinates of the feature points 311 is output to the fixed camera calibration unit 102.
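The pairing in step S804 can be sketched as scanning the two unsynchronized observation streams recorded after a stop instruction and taking the first entry of each that was judged stopped; the `(stopped, coords)` record format is an assumption for illustration:

```python
def synchronize_after_stop(image_obs, sensor_obs):
    """image_obs / sensor_obs: lists of (stopped, coords) records taken
    after a stop instruction, in measurement order.  Returns the image
    coordinates and 3-D coordinates first observed while stopped, or
    None when either stream never reports a stop."""
    first_image = next((c for stopped, c in image_obs if stopped), None)
    first_3d = next((c for stopped, c in sensor_obs if stopped), None)
    if first_image is None or first_3d is None:
        return None
    return first_image, first_3d
```

Because the marker holds one pose for the whole stop, the two first-stopped records can be paired without comparing timestamps, which is the point of the stop-and-go protocol.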
(Operation of the stop planning unit)
Next, details of the processing of the stop planning unit 402 will be described.
Based on the detection result of the marker 310 in the image 210 captured by the fixed camera 200, as judged by the stop determination unit 401, the position of the sensor-equipped movement marker 300 estimated by the movement marker calibration unit 101, and the target movement position output by the movement planning unit 103, the stop planning unit 402 instructs the sensor-equipped movement marker 300 to stop or to move.
When the sensor-equipped movement marker 300 is first detected by a fixed camera 200, the stop planning unit 402 instructs the sensor-equipped movement marker 300 to stop via the stop instruction unit 403.
Further, when the distance between the position of the sensor-equipped movement marker 300 estimated by the movement marker calibration unit 101 and the target movement position output by the movement planning unit 103 falls to or below a preset threshold, the stop planning unit 402 instructs the sensor-equipped movement marker 300 to stop via the stop instruction unit 403.
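Taken together, the two stop conditions amount to a simple decision rule; a sketch under assumed names (the boolean first-detection flag, the threshold value, and the function name are illustrative):

```python
import math

def should_stop(first_detection, marker_pos, target_pos, threshold=0.05):
    """Return True when the sensor-equipped movement marker should be
    instructed to stop: either a fixed camera sees the marker for the
    first time, or the estimated marker position has come within
    `threshold` of the target movement position."""
    if first_detection:
        return True
    return math.dist(marker_pos, target_pos) <= threshold
```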
When the image coordinates and the three-dimensional coordinates of the feature points 311 have been synchronized in step S804 of the stop determination unit 401, the stop planning unit 402 instructs the sensor-equipped movement marker 300, via the stop instruction unit 403, to release the stop and move.
(Operation of the stop instruction unit)
Next, details of the processing of the stop instruction unit 403 will be described.
When the mobile robot 350 moves the sensor-equipped movement marker 300, the stop instruction unit 403 stops or moves the mobile robot 350 by outputting a control signal to it. When a calibration operator moves the sensor-equipped movement marker 300, the stop or movement is indicated, for example, by audio output from a speaker or screen output on a display. The same interfaces, such as the speaker and display, may be shared with the movement instruction unit 104.
(Effect)
According to the second embodiment, the following operational effects are obtained. In the camera calibration device 400, the stop planning unit 402 instructs the sensor-equipped movement marker 300 to stop via the stop instruction unit 403, and the stop determination unit 401 combines the image coordinates and the three-dimensional coordinates of the feature points 311 on the marker 310 while the sensor-equipped movement marker 300 is stopped at the same position and posture, and outputs them to the fixed camera calibration unit 102. The fixed cameras 200 can therefore be calibrated even when the measurement times of the fixed cameras 200 and of the sensor of the sensor-equipped movement marker 300 are not synchronized.
The present invention is not limited to the embodiments described above, and various modifications are included. For example, the embodiments above are described in detail to explain the invention clearly, and the invention is not necessarily limited to configurations having all of the described elements. Other aspects conceivable within the scope of the technical idea of the invention are also included in its scope. Part of the configuration of one embodiment can be replaced with the configuration of another, and the configuration of another embodiment can be added to that of one embodiment. For part of the configuration of each embodiment, other configurations can be added, deleted, or substituted. Each of the configurations, functions, processing units, processing means, and the like above may be realized in hardware, for example by designing some or all of them as integrated circuits, or in software, by a processor interpreting and executing programs that realize the respective functions. Information such as the programs, tables, and files realizing each function can be stored in a memory, a recording device such as a hard disk or SSD (Solid State Drive), or a recording medium such as an IC card, SD card, or DVD.
DESCRIPTION OF REFERENCE NUMERALS: 100, 400 ... camera calibration device; 101 ... movement marker calibration unit; 102 ... fixed camera calibration unit; 103 ... movement planning unit; 104 ... movement instruction unit; 105 ... output unit; 200 ... fixed camera; 210 ... image; 300 ... sensor-equipped movement marker; 310 ... marker; 311 ... feature point; 320 ... moving camera; 350 ... mobile robot; 401 ... stop determination unit; 402 ... stop planning unit; 403 ... stop instruction unit
Claims (10)
- A camera calibration device to which a fixed camera and a sensor-equipped movement marker are connected, comprising:
a movement marker calibration unit that estimates the position and posture of the movement marker from measurement values of the sensor of the sensor-equipped movement marker;
a fixed camera calibration unit that estimates the position and posture of the fixed camera from an image captured by the fixed camera and the estimated position and posture of the movement marker;
a movement planning unit that creates a movement plan including a target movement position of the movement marker according to the estimated position and posture of the fixed camera;
a movement instruction unit that instructs movement based on the movement plan; and
an output unit that outputs the estimated position and posture of the fixed camera.
- The camera calibration device according to claim 1, wherein the movement planning unit sets the target movement position using a distance from the sensor-equipped movement marker.
- The camera calibration device according to claim 1, wherein the movement planning unit sets the target movement position using a distribution of two-dimensional positions of feature points on the movement marker in images captured by the fixed camera and a distribution of three-dimensional positions of the feature points on the movement marker in three-dimensional space.
- The camera calibration device according to claim 1, wherein the movement planning unit sets the target movement position by simultaneously using a distance from the sensor-equipped movement marker, a distribution of two-dimensional positions of feature points on the movement marker in images captured by the fixed camera, and a distribution of three-dimensional positions of the feature points on the movement marker in three-dimensional space.
- The camera calibration device according to any one of claims 1 to 4, wherein the movement planning unit sets the target movement position based on whether the sensor-equipped movement marker can reach it, as determined from the movable area of the sensor-equipped movement marker.
- The camera calibration device according to claim 5, wherein, when the sensor of the sensor-equipped movement marker is an external sensor capable of measuring surrounding three-dimensional information, the movement planning unit sets the target movement position based on whether the sensor-equipped movement marker can reach it, as determined from the three-dimensional information measured by the external sensor, and on whether the fixed camera can capture it, as determined from the same three-dimensional information.
- The camera calibration device according to any one of claims 1 to 6, wherein the movement instruction unit outputs, to a mobile robot carrying the sensor-equipped movement marker, a control signal for moving to the target movement position.
- The camera calibration device according to any one of claims 1 to 6, wherein the movement instruction unit displays the estimated position and posture of the sensor-equipped movement marker, the estimated position and posture of the fixed camera, and the target movement position on a two-dimensional or three-dimensional map.
- The camera calibration device according to any one of claims 1 to 8, wherein the output unit displays the estimated position and posture of the sensor-equipped movement marker, the estimated position and posture of the fixed camera, and the imaging range of the fixed camera on a two-dimensional or three-dimensional map.
- The camera calibration device according to any one of claims 1 to 9, further comprising:
a stop determination unit that combines image coordinates and three-dimensional coordinates of the feature points on the movement marker while the sensor-equipped movement marker is stopped;
a stop planning unit that plans stops of the sensor-equipped movement marker; and
a stop instruction unit that instructs the stops planned by the stop planning unit.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2017/001337 WO2018134866A1 (en) | 2017-01-17 | 2017-01-17 | Camera calibration device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018134866A1 true WO2018134866A1 (en) | 2018-07-26 |
Family
ID=62908484
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/001337 WO2018134866A1 (en) | 2017-01-17 | 2017-01-17 | Camera calibration device |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2018134866A1 (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003050107A (en) * | 2001-08-07 | 2003-02-21 | Matsushita Electric Ind Co Ltd | Camera calibration device |
JP2010172986A (en) * | 2009-01-28 | 2010-08-12 | Fuji Electric Holdings Co Ltd | Robot vision system and automatic calibration method |
JP2010276603A (en) * | 2009-05-29 | 2010-12-09 | Mori Seiki Co Ltd | Calibration method and calibration device |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPWO2019193859A1 (en) * | 2018-04-04 | 2021-05-13 | コニカミノルタ株式会社 | Camera calibration method, camera calibration device, camera calibration system and camera calibration program |
JP7173133B2 (en) | 2018-04-04 | 2022-11-16 | コニカミノルタ株式会社 | Camera calibration method, camera calibration device, camera calibration system and camera calibration program |
CN113066134A (en) * | 2021-04-23 | 2021-07-02 | 深圳市商汤科技有限公司 | Calibration method and device of visual sensor, electronic equipment and storage medium |
DE102021204363A1 (en) | 2021-04-30 | 2022-11-03 | Robert Bosch Gesellschaft mit beschränkter Haftung | Method for calibrating a sensor using a means of transportation |
WO2024217572A1 (en) * | 2023-04-21 | 2024-10-24 | 北京极智嘉科技股份有限公司 | Device adjustment method and apparatus based recognition identifiers, and computing device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10825198B2 (en) | 3 dimensional coordinates calculating apparatus, 3 dimensional coordinates calculating method, 3 dimensional distance measuring apparatus and 3 dimensional distance measuring method using images | |
CN112258567B (en) | Visual positioning method and device for object grabbing point, storage medium and electronic equipment | |
US9953461B2 (en) | Navigation system applying augmented reality | |
JP6658001B2 (en) | Position estimation device, program, position estimation method | |
US9927222B2 (en) | Position/orientation measurement apparatus, measurement processing method thereof, and non-transitory computer-readable storage medium | |
JP5746477B2 (en) | Model generation device, three-dimensional measurement device, control method thereof, and program | |
JP5992184B2 (en) | Image data processing apparatus, image data processing method, and image data processing program | |
US20170337701A1 (en) | Method and system for 3d capture based on structure from motion with simplified pose detection | |
JP6503906B2 (en) | Image processing apparatus, image processing method and image processing program | |
US20040176925A1 (en) | Position/orientation measurement method, and position/orientation measurement apparatus | |
JP6321202B2 (en) | Method, apparatus and system for determining movement of a mobile platform | |
JP6589636B2 (en) | 3D shape measuring apparatus, 3D shape measuring method, and 3D shape measuring program | |
WO2018134866A1 (en) | Camera calibration device | |
KR102263152B1 (en) | Method and apparatus for object detection in 3d point clouds | |
WO2020195875A1 (en) | Information processing device, information processing method, and program | |
JP4227037B2 (en) | Imaging system and calibration method | |
US11758100B2 (en) | Portable projection mapping device and projection mapping system | |
EP3392748B1 (en) | System and method for position tracking in a virtual reality system | |
JP2003006618A (en) | Method and device for generating three-dimensional model and computer program | |
WO2019186677A1 (en) | Robot position/posture estimation and 3d measurement device | |
JPWO2021111613A1 (en) | 3D map creation device, 3D map creation method, and 3D map creation program | |
KR102555269B1 (en) | Posture estimation fusion method and system using omnidirectional image sensor and inertial measurement sensor | |
EP4292777A1 (en) | Assistance system, image processing device, assistance method and program | |
KR101746792B1 (en) | Method and apparatus for estimating transformation between distance sensor and rotating platform | |
KR20230130024A (en) | Method and system for determining the status of a camera |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17893037; Country of ref document: EP; Kind code of ref document: A1
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 17893037; Country of ref document: EP; Kind code of ref document: A1
| NENP | Non-entry into the national phase | Ref country code: JP