SE544405C2 - Method and control arrangement for extrinsic calibration of a camera arranged at a vehicle - Google Patents
Info
- Publication number: SE544405C2 (application SE2051140A)
- Authority: SE (Sweden)
- Prior art keywords: vehicle, calibration, pose, camera, control arrangement
Classifications
- G06T7/74 — Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
- G06T7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06V20/56 — Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06T2207/30204 — Marker
- G06T2207/30244 — Camera pose
Abstract
The present disclosure relates to a method and a control arrangement for extrinsic calibration of a camera (11) arranged at a vehicle (1). According to a first aspect, the disclosure relates to a method for extrinsic calibration of a camera (11) arranged at a vehicle (1). The method comprises capturing S1, using the camera (11), one or more images of a known pattern at a calibration surface (21) of a calibration board (2) and obtaining S2, from a first orientation sensor (23), attached to the calibration board (2), data indicative of a pose of the calibration surface (21) when capturing the respective one or more images. The method further comprises obtaining S3, from a second orientation sensor (12) arranged at the vehicle (1), orientation data indicative of a pose of the vehicle (1), and determining S4 a pose of the camera (11) in relation to the vehicle (1), based on at least one of the one or more captured images, the pose of the calibration surface (21) and the pose of the vehicle (1). The disclosure also relates to a corresponding computer program, computer-readable medium and to a vehicle comprising the control arrangement.
Description
Method and control arrangement for extrinsic calibration of a camera arranged at a vehicle

Technical field

The present disclosure relates to a method and a control arrangement for extrinsic calibration of a camera arranged at a vehicle. The disclosure also relates to a corresponding computer program, computer-readable medium and to a vehicle comprising the control arrangement.
Background

Autonomous mobile platforms, such as autonomous driving vehicles, require real-time information about static and dynamic features and objects in their surrounding environment. A sensor system comprising various types of sensors (e.g. cameras, lidars and radars) is used for collecting this information to cover the whole area around the vehicle. The sensors are typically mounted in different poses to get a good view. To make the best out of all the information, the data from all sensors is translated to the vehicle coordinate system. To do this transformation correctly it is vital to know the relative poses of all the sensors with respect to the vehicle, so that the sensor data can be accurately translated into the vehicle coordinate system. Hence, accurate knowledge about the sensors' poses is typically required for autonomous driving. Therefore, sophisticated calibration methods are typically employed to estimate the sensors' relative poses with respect to the vehicle.
The sensor system comprises different sensors, including cameras for environment perception, mounted at different positions at the vehicle. The calibration of the cameras comprises an intrinsic and an extrinsic calibration step. The intrinsic calibration step involves finding parameters for the mathematical model describing the sensor internally (e.g. focal length, distortion parameters etc.) and the extrinsic calibration step involves determination of each camera's pose in vehicle coordinates. In particular, the extrinsic calibration step may be rather complex. The complexity usually depends on, among other things, the number of cameras, the type of the camera, the field of view, mounting position and orientation with respect to the vehicle, collectively called the pose of the camera, and the size of the vehicle.

Another factor that affects complexity is the overlap of the fields of view of the involved cameras in the setup.
Some existing calibration approaches require cameras to be mounted perpendicular to or aligned with the vehicle sides and exclude other, more complicated camera poses. Some of the more complicated approaches even involve building a calibration room or driving the vehicle around, sometimes incorporating orientation sensors and lidars to calculate the pose (so-called online calibration). Online calibration is for example described in the article "Automatic intrinsic and extrinsic calibration of a rig with multiple generic cameras and odometry" by Lionel Heng, Bo Li and Marc Pollefeys, published in the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems.
A relatively simple approach to the extrinsic camera calibration step involves placing a calibration board with a known pattern (e.g. a checkerboard) on the ground and then finding the camera's pose by first finding the pattern's pose in the vehicle frame (pattern~vehicle), and then finding the camera's pose in the pattern frame (camera~pattern). This approach introduces an intermediate step of finding the pose of the pattern on the way towards finding the camera's pose, which makes extrinsic calibration very time-consuming and makes it difficult for someone inexperienced to accurately calibrate the cameras.

Furthermore, during the operation or required service of a vehicle, the cameras can move slightly, making accurate aftermarket calibration a requirement. Consequently, there is a need for improved methods for extrinsic camera calibration.
Summary

It is an object of the disclosure to alleviate at least some of the drawbacks of the prior art. Thus, it is an object of this disclosure to provide a less complicated and less time-consuming method to perform extrinsic camera calibration. It is a further objective to provide a method for extrinsic calibration that can be performed by someone with little experience in calibration and that is suitable for aftermarket calibration.
To meet these objectives, this disclosure proposes techniques where intrinsic and extrinsic calibration are combined. According to a first aspect, the disclosure relates to a method for extrinsic calibration of a camera arranged at a vehicle. The method comprises capturing, using the camera, one or more images of a known pattern at a calibration surface of a calibration board and obtaining, from a first orientation sensor attached to the calibration board, data indicative of a pose of the calibration surface when capturing the respective one or more images. The method further comprises obtaining, from a second orientation sensor arranged at the vehicle, orientation data indicative of a pose of the vehicle, and determining a pose of the camera in relation to the vehicle, based on at least one of the one or more captured images, the pose of the calibration surface and the pose of the vehicle. The calibration process will be very fast and simple to perform, and will require only a small amount of training. Furthermore, no constraints are posed on the mounting poses of the cameras to ease the extrinsic calibration process, which opens up the possibility of having fewer sensors with less cost and effort in installation and cabling.

In some embodiments, the method comprises performing intrinsic calibration of the camera based on at least one of the one or more images. By combining intrinsic and extrinsic calibration, it is possible to calibrate a vehicle in a fraction of the time.

In some embodiments, the method comprises providing the determined pose to a control arrangement of the vehicle for use in autonomous driving. In the case of sensors mounted on autonomous vehicles, one can eliminate the need to build large, sophisticated calibration rooms. This, together with the simplicity of the process, can solve the problem of calibration on the aftermarket. The proposed method will also work for the much more complicated camera poses typical for heavy-duty vehicles.
In some embodiments, the data comprises orientation data and position data. Thereby, the calibration may have better accuracy.

In some embodiments, a relation between orientation data obtained from the first and second orientation sensors is known. Thereby, it is easier to compare two orientation sensor readings to find the pose than to devise a sophisticated extrinsic calibration algorithm, which in turn translates to less effort spent in development.

In some embodiments, the capturing comprises capturing a plurality of images from similar and/or different directions. Accuracy will typically improve if more images are used.

In some embodiments, the determining comprises identifying a pose of the known pattern in a coordinate system of the vehicle and calibrating the pose of the camera in the coordinate system of the vehicle. In this way the pose of the camera can be determined in a simple way.
According to a second aspect, the disclosure relates to a calibration board for enabling extrinsic calibration of a camera arranged at a vehicle. The calibration board comprises a calibration surface having a known pattern thereon, a first orientation sensor attached to the calibration board and a first control arrangement configured to provide orientation data indicative of a pose of the calibration surface to a second control arrangement.

In some embodiments, the calibration board comprises a user interface device configured to communicate user data between a user and the second control arrangement, via the first control arrangement. Thereby, the calibration board may also be used to instruct a user during the calibration procedure, to, for example, ensure that the calibration is performed in a correct way.
According to a third aspect, the disclosure relates to a control arrangement configured to perform the method according to the first aspect.

According to a fourth aspect, the disclosure relates to a vehicle comprising the control arrangement of the third aspect.

According to a fifth aspect, the disclosure relates to a computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method according to the first aspect.

According to a sixth aspect, the disclosure relates to a computer-readable medium comprising instructions which, when executed by a computer, cause the computer to carry out the method according to the first aspect.
Brief description of the drawings

Fig. 1 illustrates a vehicle 1, where the proposed method for extrinsic calibration of a camera may be implemented.

Fig. 2 illustrates a side-view of the vehicle of Fig. 1.

Fig. 3 illustrates a computer implemented method for extrinsic calibration of a camera arranged at a vehicle.

Fig. 4 illustrates a first control arrangement of a calibration board according to an example embodiment.

Fig. 5 illustrates a second control arrangement of the vehicle of Fig. 1 according to an example embodiment.
Detailed description

This disclosure proposes an orientation sensor-based method which utilizes an orientation sensor arranged at a calibration board to determine the calibration pattern's pose in the vehicle frame (pattern~vehicle relation). The proposed method may eliminate the need for an additional extrinsic calibration step completely by computing poses of cameras mounted on a vehicle during the intrinsic camera calibration procedure.
More specifically, it is herein proposed to have an orientation sensor attached to a calibration board. A similar orientation sensor is typically already mounted on the vehicle. While images are taken by cameras arranged on the vehicle, readings from the orientation sensors attached to the calibration pattern and to the vehicle are recorded. By comparing the two orientation sensors' readings synchronously in time (i.e. one reading from the calibration board and one reading from the vehicle) the relative pose of the pattern with respect to the vehicle can be determined. This corresponds to the intermediate step of finding the pattern~vehicle pose mentioned above.
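As a minimal sketch of this comparison step (not taken from the patent), assume both orientation sensors report their absolute orientation as 3x3 rotation matrices in a shared world reference frame; the pattern's orientation relative to the vehicle is then the vehicle reading inverted and composed with the board reading. The function names and the planar-yaw example are illustrative assumptions:

```python
import numpy as np

def rotation_z(yaw):
    """Rotation about the z-axis by `yaw` radians."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def pattern_to_vehicle(R_board_world, R_vehicle_world):
    """Relative orientation of the calibration pattern in the vehicle frame,
    given two synchronized absolute-orientation readings expressed in the
    same world reference frame."""
    return R_vehicle_world.T @ R_board_world

# Example: vehicle yawed 90 degrees, board yawed 120 degrees in the world
# frame -> the board is yawed 30 degrees relative to the vehicle.
R_rel = pattern_to_vehicle(rotation_z(np.deg2rad(120)), rotation_z(np.deg2rad(90)))
```

Because only one matrix inverse and one product are needed per image, this comparison is cheap enough to run for every captured frame.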
The normal approach to the intrinsic calibration step of a camera is to take images of a calibration pattern (e.g. a checkerboard) and process them online or offline. If an orientation sensor is mounted on the calibration board which is used in the intrinsic calibration step, then it is possible to perform the extrinsic calibration during the intrinsic calibration. More specifically, since each camera is separately intrinsically calibrated, each camera can be extrinsically calibrated at the same time, without extra effort.

The intrinsic calibration is not dependent on the camera's pose and may therefore be performed even before mounting the sensors in the vehicle. However, in order to be able to combine the intrinsic and the extrinsic calibration it is of course required that the intrinsic calibration takes place after mounting the cameras at the vehicle.
Fig. 1 conceptually illustrates a vehicle 1, here a truck, where the proposed method for extrinsic calibration of a camera arranged at a vehicle may be implemented. Fig. 2 illustrates a side-view of the vehicle of Fig. 1. The vehicle 1 may comprise a means for transportation in a broad sense. More specifically, the proposed technique is applicable both to single body vehicles (e.g. buses or trucks) and to articulated vehicles (e.g. tractor-trailer combinations and articulated buses). The proposed technique is herein presented with reference to autonomous driving. However, it must be appreciated that the technique is applicable to any camera arranged on the vehicle 1.

In Fig. 1 the vehicle 1 is arranged for combined intrinsic and extrinsic camera calibration using a calibration board 2. The calibration board 2 is positioned in the field of view of at least one camera 11 arranged at the vehicle 1.
The calibration board 2 is typically a board-shaped, stand-alone device. The calibration board 2 comprises a calibration surface 21, a first orientation sensor 23 and a first control arrangement 22. The calibration board 2 may be handheld or fixed to the earth via a stand, a wall or similar.
The calibration surface 21 has a known pattern, also referred to as a calibration pattern, thereon. In other words, the pattern is known to a control arrangement 10 that shall perform the calibration. The calibration pattern is typically a two-dimensional pattern. The calibration pattern is detectable using a camera. In the illustrated example, the known pattern is a chessboard pattern. Alternatively, a ChArUco board can be used. A ChArUco board is a planar board where the markers are placed inside the white squares of a chessboard. The benefit of ChArUco boards is that they provide both versatility and chessboard corner precision, which is important for calibration and pose estimation. In other words, in some embodiments, the pattern is indicative of an orientation of the calibration board 2.
The first orientation sensor 23 is attached to the calibration board 2. The first orientation sensor 23 is configured to measure the orientation of the known pattern of the calibration surface 21 relative to a reference coordinate system. The first orientation sensor 23 comprises e.g. a magnetometer and an accelerometer. The first orientation sensor 23 is typically an absolute orientation sensor, which provides the orientation of the known pattern in relation to an absolute reference, such as an inertial frame. In some embodiments the first orientation sensor 23 is an Inertial Measurement Unit, IMU, configured to provide an absolute orientation. The first orientation sensor 23 is attached to the calibration board 2 such that the relation between the pattern and the first orientation sensor 23 is known or predictable. For example, the relation between the first orientation sensor 23 and the calibration surface 21 having a pattern thereon is fixed. The relation has for example been calibrated in advance.
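Absolute orientation sensors of this kind often deliver their reading as roll/pitch/yaw Euler angles. A sketch of converting such a reading into a rotation matrix usable for the pose comparison could look as follows; the z-y-x (yaw-pitch-roll) convention is an assumption for illustration, not something the disclosure specifies:

```python
import numpy as np

def euler_to_matrix(roll, pitch, yaw):
    """Build a rotation matrix from roll/pitch/yaw in radians, applied
    as R = Rz(yaw) @ Ry(pitch) @ Rx(roll) (z-y-x convention assumed)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

# Example: a board pitched 10 degrees forward, otherwise level.
R_board = euler_to_matrix(0.0, np.deg2rad(10), 0.0)
```

Whatever convention the actual sensor uses, the essential requirement stated above holds: both sensors must report orientation in the same (or a known) reference frame.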
The first control arrangement 22 is configured to provide orientation data indicative of a pose of the calibration surface 21 to a second control arrangement 10, which is e.g. arranged in the vehicle 1 (see Fig. 2). In some embodiments the first control arrangement 22 is also configured to provide orientation data to other control arrangements, such as to any off-board unit. More specifically, the first control arrangement 22 is configured to read orientation data from the first orientation sensor 23 and to transmit the orientation data using a communication interface 223 (see Fig. 4). If the relation between the calibration surface 21 and the first orientation sensor 23 is known, the pose of the calibration surface 21 corresponds to the pose of the first orientation sensor 23. Typically, both the first orientation sensor 23 and the calibration surface 21 are fixed to the calibration board 2 and their poses will then all be the same. In some embodiments, the first control arrangement 22 and the first orientation sensor 23 are implemented as one integrated unit.

In some embodiments, the calibration board comprises a user interface device 24. The user interface device 24 comprises for example a display and an input device, such as buttons. In some embodiments the user interface device 24 comprises a touch screen. The user interface device 24 is configured to communicate user data between a user and the second control arrangement 10, via the first control arrangement 22. In this way, instructions to a person performing the calibration may be provided via the calibration board, as will be further described below.
The vehicle 1 comprises equipment required for autonomous driving. Such a vehicle 1 typically comprises an autonomous driving system, a navigation system, sensors, meters etc. The autonomous driving system is configured to operate the vehicle autonomously. The sensors and meters are configured to provide vehicle parameters for use by the autonomous driving system. The navigation system is configured to determine a curvature of an upcoming road. For simplicity, only parts of the vehicle 1 affected by the proposed technique are illustrated in Figs. 1 and 2 and described herein.
The illustrated vehicle 1 comprises one or more cameras 11, a second orientation sensor 12 and a second control arrangement 10. The one or more cameras 11 are for example for use in autonomous driving. The cameras are for example image sensors or any cameras that can detect the calibration pattern.
The second orientation sensor 12 is a sensor configured to measure an orientation of the vehicle 1 relative to a coordinate system. Typically, the first and second orientation sensors 12, 23 provide the orientation in the same coordinate system. The second orientation sensor 12 is for example an IMU, which is a component that is typically already present in many vehicles.
The second control arrangement 10 is a device configured to perform the proposed method for extrinsic calibration using the calibration board. The second control arrangement 10 will be described in detail in connection to Fig. 5.
The proposed method for extrinsic calibration of a camera 11 arranged at a vehicle 1 will now be described with reference to the flow chart of Fig. 3 and the vehicle of Figs. 1-2. The method is performed in a second control arrangement 10. In this example, the second control arrangement 10 is arranged in the vehicle 1. However, it must be appreciated that the second control arrangement 10 may alternatively be at least partly arranged outside the vehicle as long as it can communicate with the camera 11 and the orientation sensors 12, 23.
The method may be implemented as a computer program comprising instructions which, when the program is executed by a computer (e.g. a processor in the second control arrangement 10 (Fig. 5)), cause the computer to carry out the method. According to some embodiments the computer program is stored in a computer-readable medium (e.g. a memory or a compact disc) that comprises instructions which, when executed by a computer, cause the computer to carry out the method.
The method is for example performed during manufacturing, after most components have been assembled. Alternatively, or in addition, the method may be performed in the aftermarket for re-calibrating cameras that may have been tilted during use. The method is performed for one camera 11, but it must be appreciated that the method is typically repeated for all of the cameras.

Before performing the calibration, the vehicle 1 is typically parked in a suitable place, such as in a garage. The calibration board 2 is then positioned in the field of view of the camera 11. For example, instructions regarding how to position the calibration board 2 are provided via the user interface device 24 of the calibration board 2. An example of such an instruction is "Move the board to the right x cm or tilt the board forward from the top side 5 degrees". These instructions will primarily be provided during the calibration, but they could of course also be given before starting. The calibration is then started, for example when the user (such as a technician) gives a command to the second control arrangement 10. In some embodiments the command is also provided via the user interface device 24 of the calibration board.
The proposed method for extrinsic calibration now starts. The proposed method comprises capturing S1, using the camera 11, one or more images of the known pattern at the calibration surface 21 of the calibration board 2. In other words, one or more images picturing the known pattern are taken using the camera 11. Hence, in some embodiments, the capturing S1 comprises capturing a plurality of images from similar and/or different directions.
To perform the extrinsic calibration, the orientation of the pattern in the vehicle frame has to be determined. This is done using the orientation sensors 12, 23 arranged at the vehicle 1 and at the calibration board 2. In other words, the method further comprises obtaining S2, from a first orientation sensor 23 attached to the calibration board 2, orientation data indicative of a pose of the calibration surface 21 when capturing the respective one or more images, and obtaining S3, from a second orientation sensor 12 arranged at the vehicle 1, orientation data indicative of a pose of the vehicle 1. The order of these sensor readings is typically unimportant. The orientation data is an indication of an orientation in relation to a reference frame. Further data may be provided during these readings, such as position data, acceleration data, rotation data etc., which may further enhance the calibration. For example, the distance between the cameras is typically also required for the translation, but it is typically not changing as much as the angles, as the cameras are typically attached to the vehicle at pre-defined positions. Therefore, only obtaining the orientation of the sensors may be enough for determining the pose of the camera. The positions then have to be acquired in a different way (potentially using the CAD designs of the vehicle itself). However, further data that can be provided by the orientation sensor (e.g. an IMU) may also be used for determining the pose when available. For example, an IMU may provide information about acceleration, position etc.
A pose of the camera 11 in relation to the vehicle 1 is then determined S4 based on at least one of the one or more captured images, the pose of the calibration surface and the pose of the vehicle. The pose of the calibration surface and the pose of the vehicle are available from the orientation data obtained in the previous steps S2, S3. In some embodiments, a relation between orientation data obtained from the first and second orientation sensors 12, 23 is known. For example, the same coordinate system or another known relation is used.
One way of determining the pose will now be described, but it must be appreciated that this may be performed in different ways. From each captured image of the calibration pattern, it is possible to determine a camera~pattern pose. A camera~pattern pose refers to the camera's pose in a coordinate system of the pattern 8 (Fig. 1). In other words, the camera~pattern pose defines a relation between the pose of the calibration pattern 21 and the pose of the camera 11.
For each image, the first and second orientation sensor readings are then compared to obtain a pattern~vehicle pose for each image. A pattern~vehicle pose, or relation, refers to the pattern's pose in the coordinate system 9 of the vehicle 1. In other words, the pattern~vehicle pose defines a relation between the coordinate system of the pattern 8 and the coordinate system 9 of the vehicle 1.
By combining the camera~pattern and pattern~vehicle poses, it is then possible to obtain a camera~vehicle pose for each image. Hence, a camera~vehicle pose refers to the camera's pose in the coordinate system 9 of the vehicle 1. The camera~vehicle pose is the information desired in the extrinsic calibration. In other words, the camera~vehicle pose defines a relation between the pose of the camera and the pose of the vehicle, which can be used e.g. for object detection in autonomous driving.
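Under the common convention of representing each pose as a 4x4 homogeneous transform, the chaining described above can be sketched as follows; the helper and argument names are illustrative, not taken from the disclosure:

```python
import numpy as np

def make_pose(R, t):
    """4x4 homogeneous transform from a 3x3 rotation matrix R and a
    3-vector translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def camera_to_vehicle(T_pattern_vehicle, T_camera_pattern):
    """Chain the two relations: camera~vehicle is the pattern~vehicle
    transform composed with the camera~pattern transform."""
    return T_pattern_vehicle @ T_camera_pattern
```

Matrix multiplication handles both the rotational and the translational part of the composition in one step, which is why homogeneous transforms are a convenient representation here.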
An average of the camera~vehicle poses calculated for all the images then gives the final camera~vehicle pose. In other words, in some embodiments, the determining S4 comprises identifying a pose of the known pattern in a coordinate system of the vehicle 1 and calibrating the pose of the camera 11 in the coordinate system of the vehicle 1, based on the pose of the known pattern in the coordinate system of the vehicle 1.
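One simple way to average the per-image orientation estimates, assuming they are expressed as unit quaternions (a representation choice not mandated by the disclosure), is a sign-aligned component-wise mean:

```python
import numpy as np

def average_quaternions(quats):
    """Approximate mean of unit quaternions (w, x, y, z): align signs
    to the first sample (q and -q encode the same rotation), average
    component-wise and renormalize."""
    quats = np.asarray(quats, dtype=float)
    ref = quats[0]
    # Flip any quaternion pointing away from the reference.
    aligned = np.where((quats @ ref)[:, None] < 0, -quats, quats)
    mean = aligned.mean(axis=0)
    return mean / np.linalg.norm(mean)
```

This approximation is adequate when the per-image estimates are tightly clustered, which is the expected case here; the translational part of the poses can be averaged component-wise.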
As mentioned above, the same images may be used for both intrinsic and extrinsic calibration. In other words, in some embodiments, the method comprises performing S5 intrinsic calibration of the camera based on at least one of the one or more images. The intrinsic and extrinsic calibration may be performed in any order or even in parallel. Hence, this step may be performed before, after or at the same time as steps S2-S4.

If the calibration is not performed by the control arrangement that controls autonomous driving, then the calibrated pose should be provided to the autonomous driving system. In other words, in some embodiments, the method comprises providing S6 the determined pose to a second control arrangement 10 of the vehicle for use in autonomous driving.
Now turning to Fig. 4, which illustrates an example first control arrangement 22 arranged at a calibration board 2 in more detail. The first control arrangement 22 comprises hardware and software. The hardware basically comprises various electronic components on a Printed Circuit Board, PCB. The most important of those components is typically one or more processors 221, e.g. a microprocessor, along with memory 222, e.g. an EPROM or a Flash memory chip.
The first control arrangement 22 also comprises a communication interface 223 for communicating with an orientation sensor and a second control arrangement 10 on the vehicle configured to perform extrinsic calibration. The communication interface may be wireless or wired. Typically, a wireless interface is more convenient. The communication interface 223 may be implemented using any suitable communication protocol such as Ethernet, Bluetooth, WiFi, a 3GPP protocol or a proprietary protocol. In other words, the first control arrangement 22 is configured to read orientation data from the first orientation sensor 23 and to provide orientation data indicative of a pose of a known pattern at a calibration surface 21 of a calibration board 2 to a second control arrangement 10, using the communication interface 223.
Now turning to Fig. 5, which illustrates the second control arrangement 10 configured for extrinsic calibration of a camera 11 arranged at a vehicle 1 in more detail. In some embodiments, the second control arrangement 10 is a "unit" in a functional sense. Hence, in some embodiments the second control arrangement 10 is a control arrangement comprising several physical control devices that operate in cooperation.
The second control arrangement 10 comprises hardware and software. The hardware basically comprises various electronic components on a Printed Circuit Board, PCB. The most important of those components is typically one or more processors 101, e.g. a microprocessor, along with memory 102, e.g. an EPROM or a Flash memory chip. For simplicity only one processor 101 and one memory 102 are illustrated in the second control arrangement 10, but in a real implementation there could of course be more.
The second control arrangement 10 may comprise one or more ECUs. An ECU is basically a digital computer that controls one or more electrical systems (or electrical sub systems) of the vehicle 1 based on e.g. information read from sensors and meters placed at various parts and in different components of the vehicle 1. ECU is a generic term that is used in automotive electronics for any embedded system that controls one or more functions of the electrical system or sub systems in a transport vehicle. The second control arrangement 10 comprises for example an Automated-Driving Control Unit, ADAS, or any other suitable ECU. The control arrangement may also or alternatively comprise hardware and software located off-board.
The second control arrangement 10, or more specifically the processor 101 of the second control arrangement 10, is configured to cause the second control arrangement 10 to perform all aspects of the method described above. This is typically done by running computer program code stored in the data storage or memory 102 in the processor 101 of the second control arrangement 10. The data storage 102 may also be configured to store semi-static vehicle parameters such as vehicle dimensions.
The second control arrangement 10 also comprises a communication interface 103 for communicating with for example a first control arrangement 22 of a calibration board. The communication interface may be wireless or wired. Typically, a wireless interface would be more convenient. The communication interface 103 may be implemented using any suitable communication protocol such as Bluetooth, WiFi, a 3GPP protocol or a proprietary protocol.
More specifically, the second control arrangement 10 is configured to capture, using the camera 11, one or more images of a known pattern at a calibration surface 21 of a calibration board 2 and to obtain, from a first Inertial Measurement Unit, IMU, 23 attached to the calibration board 2, data indicative of a pose of the calibration board 2 when capturing the respective one or more images. The second control arrangement 10 is configured to obtain, from a second orientation sensor 12 arranged at the vehicle 1, sensor data indicative of a pose of the vehicle 1, and to determine a pose of the camera 11 in relation to the vehicle 1, based on at least one of the one or more captured images, the pose of the calibration board and the pose of the vehicle.

In some embodiments, the second control arrangement 10 is configured to perform intrinsic calibration of the camera based on at least one of the one or more images.

In some embodiments, the second control arrangement 10 is configured to provide the determined pose to a control arrangement of the vehicle for use in autonomous driving.
The terminology used in the description of the embodiments as illustrated in the accompanying drawings is not intended to be limiting of the described method, control arrangement or computer program. Various changes, substitutions and/or alterations may be made without departing from the embodiments of the disclosure as defined by the appended claims.
The term “or” as used herein is to be interpreted as a mathematical OR, i.e., as an inclusive disjunction, not as a mathematical exclusive OR (XOR), unless expressly stated otherwise. In addition, the singular forms "a", "an" and "the" are to be interpreted as "at least one", thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise. It will be further understood that the terms "includes", "comprises", "including" and/or "comprising" specify the presence of stated features, actions, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, actions, integers, steps, operations, elements, components and/or groups thereof. A single unit such as e.g. a processor may fulfil the functions of several items recited in the claims.
Claims (11)
1. A method for extrinsic calibration of a camera (11) arranged at a vehicle (1), the method comprising:
- capturing (S1), using the camera (11), one or more images of a known pattern at a calibration surface (21) of a calibration board (2),
- obtaining (S2), from a first orientation sensor (23) attached to the calibration board (2), data indicative of a pose of the calibration surface (21) when capturing the respective one or more images,
- obtaining (S3), from a second orientation sensor (12) arranged at the vehicle (1), orientation data indicative of a pose of the vehicle (1), and
- determining (S4) a pose of the camera (11) in relation to the vehicle (1), based on at least one of the one or more captured images, the pose of the calibration surface and the pose of the vehicle.
2. The method according to claim 1, wherein the method comprises:
- performing (S5) intrinsic calibration of the camera (11) based on at least one of the one or more images.
3. The method according to claim 1 or 2, wherein the method comprises:
- providing (S6) the determined pose to a control arrangement (249) of the vehicle (1) for use in autonomous driving.
4. The method according to any of the preceding claims, wherein the data comprises orientation data and/or position data.
5. The method according to any of the preceding claims, wherein a relation between orientation data obtained from the first and second orientation sensors (12, 23) is known.
6. The method according to any of the preceding claims, wherein the capturing (S1) comprises capturing a plurality of images from similar and/or different directions.
7. The method according to any of the preceding claims, wherein the determining (S4) comprises identifying a pose of the known pattern in a coordinate system of the vehicle (1) and calibrating the pose of the camera (11) in the coordinate system of the vehicle (1).
8. A computer program comprising instructions which, when the computer program is executed by a computer, cause the computer to carry out the method according to any one of the preceding claims.
9. A computer-readable medium comprising instructions which, when executed by a computer, cause the computer to carry out the method according to any one of claims 1 to 7.
10. A second control arrangement (10) configured for extrinsic calibration of a camera (11) arranged at a vehicle (1), the second control arrangement (10) being configured to perform the method according to any one of claims 1 to 7.
11. A vehicle (1) comprising at least a part of the second control arrangement (10) according to claim 10.
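When a plurality of images is captured as in the claims, each image yields its own extrinsic estimate, and these per-image estimates must be fused into a single pose. One common technique, used here only as an illustrative sketch and not specified by the patent, is a chordal mean: average the translations, average the rotation matrices, and project the averaged rotation back onto SO(3) with an SVD. The function name and the 4x4 homogeneous representation are assumptions.

```python
import numpy as np

def average_poses(Ts):
    """Fuse a list of 4x4 rigid transforms into one (chordal L2 mean).

    Translations are averaged directly; the element-wise mean of the
    rotation blocks is projected back onto SO(3) via SVD.
    """
    t_mean = np.mean([T[:3, 3] for T in Ts], axis=0)
    R_mean = np.mean([T[:3, :3] for T in Ts], axis=0)
    U, _, Vt = np.linalg.svd(R_mean)
    R = U @ Vt
    if np.linalg.det(R) < 0:  # enforce a proper rotation (det = +1)
        U[:, -1] *= -1
        R = U @ Vt
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t_mean
    return T

# Two hypothetical per-image estimates differing only in translation.
A = np.eye(4)
B = np.eye(4)
B[:3, 3] = [2.0, 0.0, 0.0]
T_avg = average_poses([A, B])
```

For the two sample estimates above, the fused pose keeps the identity rotation and lands midway between the translations, at (1, 0, 0).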
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SE2051140A SE544405C2 (en) | 2020-09-30 | 2020-09-30 | Method and control arrangement for extrinsic calibration of a camera arranged at a vehicle |
DE102021123201.5A DE102021123201A1 (en) | 2020-09-30 | 2021-09-08 | Method and control arrangement for the extrinsic calibration of a camera arranged on a vehicle |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SE2051140A SE544405C2 (en) | 2020-09-30 | 2020-09-30 | Method and control arrangement for extrinsic calibration of a camera arranged at a vehicle |
Publications (2)
Publication Number | Publication Date |
---|---|
SE2051140A1 SE2051140A1 (en) | 2022-03-31 |
SE544405C2 true SE544405C2 (en) | 2022-05-10 |
Family
ID=80624194
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
SE2051140A SE544405C2 (en) | 2020-09-30 | 2020-09-30 | Method and control arrangement for extrinsic calibration of a camera arranged at a vehicle |
Country Status (2)
Country | Link |
---|---|
DE (1) | DE102021123201A1 (en) |
SE (1) | SE544405C2 (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2237223A1 (en) * | 2009-03-31 | 2010-10-06 | Aisin Seiki Kabushiki Kaisha | Calibrating apparatus for on-board camera of vehicle |
US20180040141A1 (en) * | 2015-04-23 | 2018-02-08 | Applications Solutions (Electronic and Vision) Ltd | Camera extrinsic parameters estimation from image lines |
WO2019103721A2 (en) * | 2017-11-21 | 2019-05-31 | Ford Global Technologies, Llc | Object location coordinate determination |
US20200264625A1 (en) * | 2019-02-19 | 2020-08-20 | Crown Equipment Corporation | Systems and methods for calibration of a pose of a sensor relative to a materials handling vehicle |
2020
- 2020-09-30 SE SE2051140A patent/SE544405C2/en unknown
2021
- 2021-09-08 DE DE102021123201.5A patent/DE102021123201A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
SE2051140A1 (en) | 2022-03-31 |
DE102021123201A1 (en) | 2022-03-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Pandey et al. | Extrinsic calibration of a 3d laser scanner and an omnidirectional camera | |
CN101582165B (en) | Camera array calibration algorithm based on gray level image and spatial depth data | |
US20160063704A1 (en) | Image processing device, image processing method, and program therefor | |
US20180039278A1 (en) | Method for supporting a vehicle docking operation and a support system | |
CN111476106B (en) | Monocular camera-based straight road relative gradient real-time prediction method, system and device | |
CN107449459A (en) | Automatic debugging system and method | |
KR101880185B1 (en) | Electronic apparatus for estimating pose of moving object and method thereof | |
JP2005300230A (en) | Measuring instrument | |
KR102006291B1 (en) | Method for estimating pose of moving object of electronic apparatus | |
JP2019003613A (en) | Method and device for calibrating vehicle camera of vehicle | |
KR101503046B1 (en) | inertial measurement unit and method for calibrating the same | |
EP3227634B1 (en) | Method and system for estimating relative angle between headings | |
US10914572B2 (en) | Displacement measuring apparatus and displacement measuring method | |
KR101735325B1 (en) | Apparatus for registration of cloud points | |
CN111025330A (en) | Target inclination angle detection method and device based on depth map | |
SE544405C2 (en) | Method and control arrangement for extrinsic calibration of a camera arranged at a vehicle | |
KR101575934B1 (en) | Apparatus and method for motion capture using inertial sensor and optical sensor | |
JP7400076B2 (en) | Vision-based blade positioning | |
CN114789439B (en) | Slope positioning correction method, device, robot and readable storage medium | |
CN113392909A (en) | Data processing method, data processing device, terminal and readable storage medium | |
CN112272757A (en) | External parameter calibration method and device for detection device and movable platform | |
Li et al. | Indoor Localization for an Autonomous Model Car: A Marker-Based Multi-Sensor Fusion Framework | |
US20240337747A1 (en) | Sensor apparatus with multiple sensors for moving agent | |
WO2023162017A1 (en) | Position and posture estimation device, position and posture estimation system, and sensor installation method | |
CN112020730A (en) | Method for detecting the arrangement of cameras of a moving carrier platform relative to each other and for detecting the arrangement of cameras relative to an object outside the moving carrier platform |