SE2051312A1 - Method and control arrangement for determining displacement of a sensor in relation to a reference pose - Google Patents
- Publication number
- SE2051312A1
- Authority
- SE
- Sweden
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/10—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to vehicle motion
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/40—Control within particular dimensions
- G05D1/43—Control of position or course in two dimensions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D53/00—Tractor-trailer combinations; Road trains
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D2109/00—Types of controlled vehicles
- G05D2109/10—Land vehicles
Abstract
The present disclosure relates to a method and a control arrangement for determining displacement of a sensor in relation to a reference pose. A first aspect of the disclosure relates to a method for determining displacement of a first sensor in relation to a reference pose of the first sensor, wherein the first sensor is included in a set of sensors arranged at a vehicle, wherein the set of sensors is configured for use in autonomous operation of the vehicle. The method comprises obtaining S1, using one or more second sensors from the set of sensors, sensor data indicative of a pose of the first sensor; and determining S2 the displacement of the first sensor, based on the sensor data obtained by the one or more second sensors. The disclosure also relates to a computer program, to a computer-readable medium, to a control arrangement and to a vehicle.
Description
Method and control arrangement for determining displacement of a sensor in relation to a reference pose

Technical field

The present disclosure relates to a method and a control arrangement for determining displacement of a sensor in relation to a reference pose. In particular, the disclosure relates to determining a pose of a first sensor included in a set of sensors arranged at a vehicle, wherein the set of sensors is configured for use in autonomous operation of the vehicle. The disclosure also relates to a computer program, to a computer-readable medium and to a vehicle.
Background

Autonomous operation of vehicles within reasonably unrestricted operational design domains requires utilization of sensors mounted at the vehicle to perceive the environment. For accurate use of the sensor data, the poses of these sensors have to be known. In addition, the relations between the sensors and the environment, and in between the sensors, also need to be known.
These relations can be measured and compensated for by means of calibration and adjustments of the sensors. It is also possible to measure the pose of the vehicle by means of, for example, inertial measurement units (IMUs). The IMU measurements are taken in relation to the magnetic field of the earth. However, sometimes the parts of the vehicle at which the sensors are mounted are not rigid but flex during operation. The parts may for example move relative to each other, and they can also change in shape. These effects are often caused by the movement of the vehicle over uneven terrain or by changes in longitudinal and lateral position. Movement between parts of the vehicle can be estimated using displacement sensors, for example IMUs, arranged at different parts of the vehicle. For example, patent application US2019163201 A1 proposes that data indicative of a displacement of the cab relative to the chassis can be obtained from a displacement sensor configured to obtain measurements of a displacement.
However, to accurately measure the movement and shape change of several different parts of a vehicle, multiple displacement sensors have to be deployed in order to detect how the different parts move in relation to each other. This makes the sensor pose determination ineffective, because of long tolerance chains and a large number of expensive IMUs. Thus, there is a need for other ways of positioning sensors arranged at a vehicle.
Summary

It is an object of the disclosure to alleviate at least some of the drawbacks of the prior art. Thus, it is an object of this disclosure to provide less complex ways of determining poses of sensors in relation to a vehicle without adding extra hardware. It is a further objective to provide a way of determining the poses that can be used during operation of the vehicle.
To meet these objectives, this disclosure proposes techniques where sensors used for autonomous operation perception are also utilized to perceive the poses of other sensors.
According to a first aspect, the disclosure relates to a method for determining displacement of a first sensor in relation to a reference pose of the first sensor. The first sensor is included in a set of sensors arranged at a vehicle. The set of sensors is configured for use in autonomous operation of the vehicle. The method comprises obtaining, using one or more second sensors from the set of sensors, sensor data indicative of a pose of the first sensor. The method further comprises determining the displacement of the first sensor based on the sensor data obtained by the one or more second sensors. Thereby, sensors that are anyway required for autonomous operation are used for determining sensor poses, which reduces the cost, complexity and uncertainties of the system. In this way, fewer sensors are required, as the need for a plurality of IMUs is eliminated, and a more direct measuring method is achieved.

In some embodiments, there is a Line-of-Sight between the at least one second sensor and the first sensor, or between the at least one second sensor and a part of the vehicle on which the first sensor is attached, when obtaining the sensor data indicative of the pose. Thereby, a movement or rotation of the first sensor may be directly (or indirectly, when sensing a part of the vehicle) detected.

In some embodiments, the determining comprises determining a pose of a part of the vehicle on which the first sensor is attached. This is beneficial, as it is often easier to detect a feature of a part of the vehicle, such as a corner, than of the sensor itself.

In some embodiments, the determining is based on reference poses of the individual sensors of the set of sensors. By using information about reference poses, the determination of a deviation is facilitated.

In some embodiments, the determining comprises using a vehicle model defining how the individual sensors, or the parts of the vehicle on which the sensors are attached, can move in relation to each other. Thereby, determination of sensor poses is facilitated, as possible sensor movement is limited by the model.

In some embodiments, the determining comprises detecting at least one feature of the first sensor and/or at least one feature of a part of the vehicle on which the first sensor is arranged, and comparing the pose of the at least one feature to a reference pose of the at least one feature. Feature detection is one possible technique that is commonly available and that may be used to implement the proposed method.

In some embodiments, the at least one feature comprises one or more of a surface, an edge, a corner, a shape and a colour. These features are typically easy to detect using a feature detection algorithm.

In some embodiments, determining the displacement comprises determining the pose in six degrees of freedom. Thus, the pose of the first sensor may be fully determined using the method.

In some embodiments, the method comprises determining a displacement of the first sensor based on fused sensor data obtained by a plurality of second sensors. Thereby, better accuracy may be achieved.

In some embodiments, one of the plurality of second sensors is configured to measure distance with a resolution that meets a first resolution criterion, and another sensor of the plurality of second sensors is configured to measure angles with a resolution that meets a second resolution criterion. In this way, good accuracy is achieved in terms of both position and rotation of the first sensor.

In some embodiments, the at least one second sensor comprises a distance sensor and/or an image sensor. Distance sensors are typically good at measuring position, while image sensors are good at measuring rotation.

In some embodiments, the method comprises compensating for the determined displacement of the first sensor during autonomous operation of the vehicle. Thereby, the vehicle can be autonomously operated in a secure way.
According to a second aspect, the disclosure relates to a control arrangement controlling a vehicle, the control arrangement being configured to perform the method according to the first aspect.
According to a third aspect, the disclosure relates to a vehicle comprising the control arrangement of the second aspect.
According to a fourth aspect, the disclosure relates to a computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method according to the first aspect.
According to a fifth aspect, the disclosure relates to a computer-readable medium comprising instructions which, when executed by a computer, cause the computer to carry out the method according to the first aspect.
Brief description of the drawings

Fig. 1 illustrates a vehicle 1, where the proposed method for determining displacement of a first sensor may be implemented.
Figs. 2a and 2b illustrate cabin movement of the vehicle in Fig. 1.
Fig. 3 illustrates a second sensor obtaining sensor data indicative of a pose of a first sensor.
Fig. 4 illustrates determination of displacement of sensors 11 caused by skewing of a chassis of the vehicle of Fig. 1.
Fig. 5 illustrates a computer-implemented method for determining displacement of a first sensor in relation to a reference pose of the first sensor.
Fig. 6 illustrates a control arrangement of the vehicle of Fig. 1 according to an example embodiment.
Detailed description

This disclosure proposes utilizing a set of sensors intended for autonomous operation perception to also position (i.e. perceive the poses of) each other. In other words, this disclosure proposes to let the sensors in an autonomy sensor suite determine each other's poses. The disclosure also proposes using the set of sensors to perceive the shape of bodies or parts of the vehicle and the environment. Information about the shape of bodies or parts of the vehicle and the environment can be used for perceiving the poses. In the following disclosure, embodiments of a method for determining displacement of a first sensor in relation to a reference pose of the first sensor will be explained with reference to figures 1 to 6.
Fig. 1 conceptually illustrates a vehicle 1, here a truck, where the proposed method for determining displacement of a first sensor in relation to a reference pose may be implemented. The proposed technique is applicable to any vehicle implementing any type of perception for autonomous operation. Note that autonomous operation is herein not limited to fully autonomous operation, but also includes autonomous functions such as lane keeping and parking assistance.
The vehicle 1 comprises equipment required for autonomous driving, such as an autonomous driving system, a navigation system, sensors, meters etc. The autonomous driving system is configured to operate the vehicle autonomously. The sensors and meters are configured to provide vehicle parameters for use by the autonomous driving system. The navigation system is configured to determine a curvature of an upcoming road. For simplicity, only the parts of the vehicle 1 that are related to the proposed technique are illustrated in Fig. 1 and described herein.
The illustrated vehicle 1 comprises a set of sensors 11 and a control arrangement 10. The set of sensors 11 is configured for use in autonomous operation of the vehicle 1 and is sometimes referred to as an autonomy sensor suite. The autonomous sensors play an essential role in automated driving. For example, the autonomy sensor suite allows the vehicle to monitor its surroundings, detect oncoming obstacles, and safely plan its paths. In other words, the set of sensors 11 is configured for use in autonomous driving, for example in object detection, ego-vehicle localisation or odometry during autonomous operation. In other words, the set of sensors 11 is for use in perceiving the vehicle's own position and movement, as well as its surroundings, during autonomous driving. The sensors 11 may comprise, but are not limited to, lidars or radars for distance measurements and sensors for angular measurements. As mentioned above, for accurate use of the sensors 11, the poses (orientation and position) of the sensors 11 in the vehicle coordinate frame need to be known. The vehicle coordinate frame is typically defined in relation to the centre of the rear wheel axle.
The set of sensors 11 is arranged at different parts 1a, 1b of the vehicle 1. In the illustrated example, one sensor (denoted first sensor 11a) is mounted at a first part 1a, here the cabin, and two other sensors (denoted second sensors 11b) are mounted at a second part 1b, here the chassis. In many situations there is a Line-of-Sight between individual sensors within the set of sensors 11. Alternatively, there is a Line-of-Sight between one sensor and a part of the vehicle on which another sensor is attached. For example, in this example there is a line of sight between the second sensors 11b (mounted at the chassis) and a part 1a of the vehicle 1 (here the cabin) on which the first sensor 11a is attached. Consequently, the individual sensors of the set of sensors may be used to detect or sense each other. For example, the second sensors 11b may capture images picturing the first sensor 11a or picturing the part on which the first sensor is attached. It is for example assumed that any displacement of the part 1a on which the first sensor 11a is attached corresponds to the displacement of the first sensor 11a.
To improve driver comfort and stability, the cabin is typically flexibly suspended from the chassis. Hence, when the vehicle 1 is driving on an uneven road, the relation between the cabin and the chassis will typically vary, as illustrated in Figs. 2a and 2b. Consequently, the pose of the first sensor 11a in the vehicle coordinate system may also vary while driving the vehicle 1.
This disclosure is based on the insight that the other sensors in the set of sensors 11 can be used to determine, for example, such a deviation. For example, a second sensor 11b mounted on the chassis can be used to obtain sensor data indicative of a pose of a first sensor 11a mounted at the cabin, as illustrated in the example embodiment of Fig. 3. For example, the second sensor can be used to detect the movement of the part of the vehicle on which the first sensor is attached (here referred to as a first part 1a). In the example of Fig. 3, the first part 1a is the cabin. The pose is determined by detecting movement of the upper left corner of the cabin using feature detection in one or more images. Hence, a deviation d of the cabin 1a from a reference pose (e.g. pre-configured as straight up) may then be determined based on the deviation of the corner. In the example of Fig. 3, the second sensor is assumed to be fixed in the vehicle coordinate frame. Thus, the deviation d may be determined by detecting a movement of a position p2 of the upper left corner of the cabin and comparing it with its position p1 in the reference position in the vehicle coordinate frame.
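The corner-comparison step above can be sketched as follows. This is a minimal illustrative Python sketch, not part of the patent disclosure; the function name and the numeric positions are assumptions chosen for the example. The second sensor is taken to be fixed in the vehicle coordinate frame, so the deviation d is simply the difference between the detected feature position p2 and the stored reference position p1.

```python
import math

def feature_displacement(p_ref, p_meas):
    """Displacement vector and its magnitude between a reference feature
    position p_ref and a measured position p_meas, both expressed in the
    vehicle coordinate frame (x, y, z in metres)."""
    d = tuple(m - r for m, r in zip(p_meas, p_ref))
    return d, math.sqrt(sum(c * c for c in d))

# Illustrative values: the cabin corner has moved 5 cm laterally
# and 2 cm vertically from its pre-configured reference position.
p1 = (2.10, 0.00, 3.20)   # reference position of the corner
p2 = (2.10, 0.05, 3.22)   # position detected while driving
d, magnitude = feature_displacement(p1, p2)
```

A displacement magnitude above some tolerance could then trigger the compensation step, while the vector d itself gives the direction of the deviation.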
The proposed method may be used in even more complex scenarios, where it is possible that several or all of the sensors move in relation to each other. For example, it is possible that the chassis of the vehicle is not 100 % solid but can skew if forces are applied. Fig. 4 illustrates determination of displacement of sensors 11 caused by skewing of a chassis of the vehicle of Fig. 1. In the example of Fig. 4, four sensors 11 are positioned in different corners of the chassis, such that each sensor 11 is in line-of-sight with at least two other sensors 11. Skewing of the chassis may cause the sensors to move in relation to each other. However, with the proposed method that will be described below, the positions of the sensors 11 may be calculated based on combined sensor data obtained by all the sensors. In general, three measurements are required for accurate positioning of an object. However, provided that the dynamics of the chassis is known to some extent, fewer measurements may be required, as the dynamics may serve as a restriction on possible movement. For example, a mathematical model may be used to describe the shape of the chassis. Sensor data from one or more of the sensors viewing each other (i.e. sensors in line-of-sight of each other) may then be used to determine a present state of the model and thereby also the present positions of the individual sensors.
The proposed method for determining displacement of a first sensor 11a in relation to a reference pose will now be described with reference to the flow chart of Fig. 5 and the vehicle of Figs. 1-4. The method is performed in a control arrangement 10. In this example, the control arrangement 10 is arranged in the vehicle 1. However, it must be appreciated that the control arrangement 10 may alternatively be at least partly arranged off-board, as long as it can communicate with the sensors 11.
The method may be implemented as a computer program comprising instructions which, when the program is executed by a computer (e.g. a processor in the control arrangement 10, Fig. 6), cause the computer to carry out the method. According to some embodiments, the computer program is stored in a computer-readable medium (e.g. a memory or a compact disc) that comprises instructions which, when executed by a computer, cause the computer to carry out the method.
A reference pose of a sensor 11 herein refers to a pre-configured reference pose which, for example, corresponds to a pose (position and/or rotation) of the sensor 11 when the vehicle is standing on a flat and horizontal surface and there are no external forces acting on the vehicle 1. Thus, the pre-configured reference pose may correspond to a predefined mounting pose. The pre-configured reference pose is typically known, i.e. it is stored in the control arrangement 10.
The method is for example performed during operation of the vehicle 1 while the parts of the vehicle 1 are deformed and/or move in relation to each other, which might cause the sensors 11 to deviate from their reference poses as explained above. A deviation is herein referred to as a deviation in any degree of freedom (in three-dimensional space). Hence, it might be a rotation around any axis and/or a movement in any direction.
As discussed above, the set of sensors 11 is typically mounted at certain positions at the vehicle 1. The positions are suitable for use in autonomous operation but will also allow the sensors to perceive features of relevant bodies of the vehicle. The relevant bodies are for example other sensors, or parts on which other sensors are attached. The features may be natural features, such as corners, edges, surfaces, shapes and colours, and/or augmented features, such as markers placed for the purpose of positioning a sensor. Many of these features are indicative of a pose of the sensor. The sensors may also be configured to perceive suitable features of themselves and of the environment.
The proposed method utilises the fact that the sensors 11 can determine each other's poses. More specifically, the proposed method comprises obtaining S1, using one or more second sensors 11b from the set of sensors, sensor data indicative of a pose of a first sensor 11a from the set of sensors 11. In other words, sensor data indicative of a pose of a first sensor 11a in the set of sensors is recorded using one or more of the other sensors in the set. The sensor data is for example images, distance measurements, or position measurements etc. The measurements may require Line-of-Sight. Hence, in some embodiments there is a Line-of-Sight between the at least one second sensor 11b and the first sensor 11a, or between the at least one second sensor 11b and a part 1a of the vehicle 1 on which the first sensor 11a is attached, when obtaining S1 the sensor data indicative of the pose.
The pose of the first sensor 11a is then determined based on the obtained sensor data. The pose of the first sensor is defined in relation to a reference pose of the sensor. The reference pose is, as mentioned above, typically pre-programmed. In other words, the method further comprises determining S2 a displacement d of the first sensor, based on the sensor data obtained by the one or more second sensors 11b.
The determining may be performed in different ways. In a very simple example, the pose of the second sensor 11b in the vehicle coordinate system is known. The pose of the first sensor 11a in the vehicle coordinate system may then be directly derived from sensor data recorded by the second sensor 11b. One example would be using feature recognition in an image to detect lateral deviation of the cabin in the example of Fig. 2. In other words, the determining comprises determining S2 a pose of a part 1a of the vehicle 1 on which the first sensor 11a is attached. Longitudinal movement of a cabin may in the same way be detected using a distance measurement, using e.g. lidar or radar. In other words, in some embodiments, the determining S2 comprises detecting at least one feature of a part 1a of the vehicle 1 on which the first sensor 11a is arranged and comparing a pose of the feature to a reference pose p1 of the feature. "Attached" herein refers, for example, to the sensor 11a being rigidly mounted, or flexibly mounted to the part with an at least partly known relation.
Alternatively, the first sensor 11a itself may be detected by the second sensor 11b. This may be done by detecting the shape of the first sensor 11a or by detecting a marker or pattern (e.g. a cross in a certain colour) arranged on the first sensor 11a. In other words, in some embodiments, the determining S2 comprises detecting at least one feature of the first sensor 11a and comparing the pose of the feature to a reference pose p1 of the feature.

In some embodiments, the sensor data is evaluated based on the pose of the second sensor 11b. The pose of the second sensor 11b may be known, e.g. fixed in the vehicle coordinate system. Alternatively, only a reference pose of the second sensor 11b is known. In such a scenario, the pose of the second sensor 11b might also need to be determined, using for example one or more sensors from the set of sensors 11, to determine a deviation of the second sensor 11b from its reference pose. Alternatively, it may be assumed that the second sensor 11b is positioned at its reference pose. In other words, in some embodiments, the determining S2 is based on reference poses of the individual sensors of the set of sensors. For example, the determining is based on the reference poses of the first and second sensors 11a, 11b.

In cases where several or all of the sensors are expected to deviate from their reference poses, knowledge about the vehicle's dynamics can be used to determine the deviations of the sensors 11. Typically, the sensors can only deviate in certain directions. There might for example be known relations between deviations of two or more individual sensors, such as mechanical constraints or common rigid fixation points. For example, two sensors that are attached to the same part will always deviate (move) in the same direction.
In some embodiments, determining S2 a displacement comprises using a vehicle model defining how the individual sensors, or the parts of the vehicle 1 on which the sensors are attached, can move in relation to each other. The vehicle model may describe how the sensors 11 may move in relation to each other. The movement may include both rotation and position.
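One way such a vehicle model can restrict the estimation is illustrated by the sketch below. It is a hypothetical Python example, not taken from the patent: the cabin is assumed to tilt as a rigid body about a known pivot, so a single roll angle describes the motion of every sensor mounted on it, and one detected feature suffices to recover that angle.

```python
import math

def cabin_roll_from_feature(p_ref, p_meas, pivot):
    """Estimate the cabin roll angle (radians) from one feature observed
    in the (y, z) plane, assuming pure rigid-body rotation about `pivot`."""
    ry, rz = p_ref[0] - pivot[0], p_ref[1] - pivot[1]
    my, mz = p_meas[0] - pivot[0], p_meas[1] - pivot[1]
    return math.atan2(my, mz) - math.atan2(ry, rz)

def displaced(p_ref, pivot, roll):
    """Predict the displaced (y, z) position of any point on the cabin
    for a given roll angle, under the same rigid-body assumption."""
    y, z = p_ref[0] - pivot[0], p_ref[1] - pivot[1]
    r, phi = math.hypot(y, z), math.atan2(y, z)
    return (pivot[0] + r * math.sin(phi + roll),
            pivot[1] + r * math.cos(phi + roll))

# Illustrative geometry: a cabin corner 2 m above the pivot, observed
# after the cabin has rolled 0.05 rad.
pivot = (0.0, 0.5)
corner_ref = (0.0, 2.5)
corner_meas = displaced(corner_ref, pivot, 0.05)   # simulated observation
roll = cabin_roll_from_feature(corner_ref, corner_meas, pivot)

# The same angle then predicts the displacement of every other sensor
# attached to the cabin, e.g. one mounted at (0.8, 2.3).
other_sensor = displaced((0.8, 2.3), pivot, roll)
```

Because the model reduces the cabin's motion to a single parameter, one measurement constrains the poses of all sensors on that part, which is the sense in which the model limits possible sensor movement.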
As mentioned above, three measurements are typically required for positioning an object. In some embodiments, one of these measurements may be in relation to an external reference. In the example of Fig. 4, it is possible that each sensor sees a common reference, e.g. a feature on the ground, and that it also sees two other sensors from the set. This information will typically be enough to position all sensors 11 in relation to each other, as there are then three measurements related to each sensor.
The determined displacement may involve position and/or rotation of the first sensor. In some embodiments, a displacement in any direction is determined. In other words, in some embodiments, determining S2 the displacement comprises determining the pose in six degrees of freedom. In other words, a system can be developed that can accurately calculate and compensate for the pose and shape changes. This is done by analysing pose in three directions x, y, z and, in addition, rotation described by roll, pitch and yaw. Alternatively, in some embodiments, only deviations in some selected directions are determined.

In some embodiments, the determining S2 comprises determining a displacement d of the first sensor based on fused sensor data obtained by a plurality of second sensors 11b. This in practice means that more information is used to determine the deviation. Sensor fusion is a well-known technique to bring together inputs from multiple sensors, such as radars, lidars and cameras, to form a single model or image of an object. The resulting model is more accurate because it balances the strengths of the different types of sensors. In general, some types of sensors are more suitable for measuring distance, and others for measuring rotation. For example, a lidar can be used to determine a position with high accuracy, e.g. with a resolution above a certain level or threshold. On the other hand, an image sensor is typically better at estimating an angle or rotation. Stated differently, an image sensor can measure angle or rotation with high resolution, e.g. a resolution above a certain level or threshold. In other words, in some embodiments, one of the plurality of second sensors 11b is configured to measure distance with a resolution that meets a first resolution criterion, and another sensor of the plurality of second sensors 11b is configured to measure angles with a resolution that meets a second resolution criterion.
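A minimal sketch of the fusion idea, under the common assumption that each second sensor delivers an independent estimate of the same quantity with a known measurement variance (inverse-variance weighting). The function name and the numbers are illustrative, not from the patent; here a lidar-derived distance with centimetre noise dominates a camera-derived distance with decimetre noise.

```python
def fuse(estimates):
    """Inverse-variance fusion of scalar estimates.
    estimates: list of (value, variance) pairs from different sensors.
    Returns the fused value and its (reduced) variance."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(v * w for (v, _), w in zip(estimates, weights)) / total
    return value, 1.0 / total

# Lidar: 10.00 m with 0.01 m std-dev; camera-derived: 10.10 m with 0.10 m.
fused, fused_var = fuse([(10.00, 0.01 ** 2), (10.10, 0.10 ** 2)])
```

The fused variance is smaller than either input variance, which is the accuracy gain the passage refers to; the more precise lidar estimate dominates the result.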
The determined deviation of the first sensor can then be used to calibrate sensor data recorded by the first sensor, for example when autonomously operating the vehicle. Hence, in some embodiments, the method comprises compensating S3 for the determined displacement of the first sensor 11a during autonomous operation of the vehicle. This typically involves calibrating all sensors to the same coordinate frame.

In some embodiments, one sensor 11 is not a single sensor but a sensor module comprising several sensors that are mounted such that their internal relations are fixed and known. For example, the sensors are rigidly mounted to a rigid structure of the sensor module. In these embodiments, the displacement of the entire sensor module (from a reference pose of the sensor module) may be determined using the technique described above. For example, the relations between the sensors of the sensor module are pre-calibrated (before installation), or the sensors are mounted to a structure or frame of the sensor module with a certain accuracy. In some embodiments, one sensor is a sensor module comprising several sensors of different types. For example, one sensor module may comprise one radar, one lidar and one image sensor.
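The compensation step S3 can be sketched as a frame transformation, shown here in 2D for brevity. This hedged Python example is not from the patent; the names and values are illustrative. A point measured in the first sensor's own frame is mapped into the vehicle frame through the sensor's corrected pose, i.e. its reference mounting pose adjusted by the determined displacement.

```python
import math

def to_vehicle_frame(point, sensor_pos, sensor_yaw):
    """Transform a point (x, y) from the sensor frame into the vehicle
    frame, given the sensor's position and yaw in the vehicle frame."""
    c, s = math.cos(sensor_yaw), math.sin(sensor_yaw)
    x, y = point
    return (sensor_pos[0] + c * x - s * y,
            sensor_pos[1] + s * x + c * y)

# Reference mounting pose, adjusted by a determined displacement
# (5 cm forward, 2 cm sideways, 1 degree of yaw; illustrative values).
ref_pos, ref_yaw = (2.0, 0.0), 0.0
disp_pos, disp_yaw = (0.05, 0.02), math.radians(1.0)
pos = (ref_pos[0] + disp_pos[0], ref_pos[1] + disp_pos[1])
yaw = ref_yaw + disp_yaw

# An obstacle detected 10 m straight ahead of the (displaced) sensor.
obstacle = to_vehicle_frame((10.0, 0.0), pos, yaw)
```

Without the compensation, the obstacle would be placed at (12.0, 0.0) in the vehicle frame; with it, the lateral offset of roughly 19 cm (mostly caused by the 1-degree yaw displacement) is accounted for.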
Now turning to Fig. 6, which illustrates the control arrangement 10 configured to implement the proposed method in more detail. In some embodiments, the control arrangement 10 is a "unit" in a functional sense. Hence, in some embodiments, the control arrangement 10 comprises several physical control devices that operate in cooperation.
The control arrangement 10 comprises one or more ECUs. An ECU is basically a digital computer that controls one or more electrical systems (or electrical subsystems) of the vehicle 1 based on, e.g., information read from sensors and meters placed at various parts and in different components of the vehicle 1. ECU is a generic term used in automotive electronics for any embedded system that controls one or more functions of the electrical system or subsystems in a transport vehicle. The control arrangement 10 comprises, for example, an Automated-Driving Control Unit, an ADAS control unit or any other suitable ECU.
The control arrangement 10, or more specifically the processor 101 of the control arrangement 10, is configured to cause the control arrangement 10 to perform all aspects of the method described above and below. This is typically done by running computer program code stored in the data storage or memory 102 in the processor 101 of the control arrangement 10. The data storage 102 may also be configured to store semi-static vehicle parameters such as vehicle dimensions.
The control arrangement 10 may also comprise a communication interface (not shown) for communicating with other control units of the vehicle and/or with off-board systems.
More specifically, the control arrangement 10 is configured to obtain S1, using one or more second sensors 11b from the set of sensors, sensor data indicative of a pose of the first sensor 11a, and to determine S2 the displacement d of the first sensor based on the sensor data obtained by the one or more second sensors 11b. In some embodiments, the control arrangement 10 is configured to compensate S3 for the determined displacement of the first sensor 11a during autonomous operation of the vehicle.
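The S1-S2 flow of the control arrangement might be organised along the following lines. All class, method and sensor names here are hypothetical stand-ins for the example; real second sensors would supply actual pose observations.

```python
class ControlArrangement:
    """Sketch of the control arrangement's method steps S1 and S2.

    `second_sensors` are hypothetical objects whose observe_first_sensor()
    method returns a six-tuple pose estimate (x, y, z, roll, pitch, yaw)
    of the first sensor.
    """

    def __init__(self, second_sensors, reference_pose):
        self.second_sensors = second_sensors
        self.reference_pose = reference_pose

    def obtain_pose(self):
        # S1: obtain sensor data indicative of the first sensor's pose
        readings = [s.observe_first_sensor() for s in self.second_sensors]
        n = len(readings)
        # Placeholder fusion: component-wise average of the estimates.
        return tuple(sum(r[i] for r in readings) / n for i in range(6))

    def determine_displacement(self):
        # S2: displacement of the obtained pose relative to the reference pose
        pose = self.obtain_pose()
        return tuple(p - r for p, r in zip(pose, self.reference_pose))
```

A compensation step S3 would then feed the determined displacement into the sensor-data calibration described above.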
The proposed technique has herein been explained with reference to a tractor-trailer vehicle. However, it may also be applied to more complex vehicles, such as vehicles with multiple actuated steering axles and articulated vehicles composed of a tractor, a dolly and a trailer. The proposed technique can be readily extended to consider alternative desired driving behaviours besides centring the vehicle on the road. Based on the current traffic situation, it might be beneficial to plan paths that maximize the distance between the vehicle's swept area and oncoming traffic. To further validate the approach, we plan to implement the proposed methods in real-world tests using autonomous heavy-duty vehicles.
The terminology used in the description of the embodiments as illustrated in the accompanying drawings is not intended to be limiting of the described method, control arrangement or computer program. Various changes, substitutions and/or alterations may be made without departing from the disclosed embodiments as defined by the appended claims.
The term "or" as used herein, is to be interpreted as a mathematical OR, i.e., as an inclusive disjunction; not as a mathematical exclusive OR (XOR), unless expressly stated otherwise. ln addition, the singular forms "a", "an" and "the" are to be interpreted as "at least one", thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise. lt will be further understood that the terms "includes", "comprises", "including" and/ or "comprising", specifies the presence of stated features, actions, integers, steps, operations, elements, and/ or components, but do not preclude the presence or addition of one or more other features, actions, integers, steps, operations, elements, components, and/ or groups thereof. A single unit such as e.g. a processor may fulfil the functions of several items recited in the claims.
Claims (15)
1. A method for determining displacement of a first sensor (11a) in relation to a reference pose (p1) of the first sensor (11a), wherein the first sensor (11a) is included in a set of sensors (11) arranged at a vehicle, wherein the set of sensors (11) are configured for use in autonomous operation of the vehicle (1), the method comprising:
- obtaining (S1), using one or more second sensors (11b) from the set of sensors, sensor data indicative of a pose of the first sensor (11a); and
- determining (S2) the displacement (d) of the first sensor, based on the sensor data obtained by the one or more second sensors (11b).
2. The method according to claim 1, wherein there is a Line-of-Sight between the at least one second sensor (11b) and the first sensor (11a), or between the at least one second sensor (11b) and a part (1a) of the vehicle (1) on which the first sensor (11a) is attached, when obtaining (S1) the sensor data indicative of the pose.
3. The method according to claim 1 or 2, wherein the determining comprises determining (S2) a pose of a part (1a) of the vehicle (1) on which the first sensor (11a) is attached.
4. The method according to any of the preceding claims, wherein the determining (S2) is based on reference poses of the individual sensors of the set of sensors.
5. The method according to any of the preceding claims, wherein the determining (S2) a displacement comprises using a vehicle model defining how the individual sensors, or parts of the vehicle (1) on which the sensors are attached, can move in relation to each other.
6. The method according to any of the preceding claims, wherein the determining (S2) comprises detecting at least one feature of the first sensor (11a) and/or at least one feature of a part (1a) of the vehicle (1) on which the first sensor (11a) is arranged and comparing the pose of the feature to a reference pose (p1) of the feature.
7. The method according to claim 6, wherein the at least one feature comprises one or more of a surface, an edge, a corner, a shape and a colour.
8. The method according to claim 7, wherein the determining (S2) the displacement comprises determining the pose in six degrees of freedom.
9. The method according to any of the preceding claims, comprising determining (S2) a displacement (d) of the first sensor based on fused sensor data obtained by a plurality of second sensors (11b).
10. The method according to claim 9, wherein one of the plurality of second sensors (11b) is configured to measure distance with a resolution that meets a first resolution criteria and wherein another sensor of the plurality of second sensors (11b) is configured to measure angles with a resolution that meets a second resolution criteria.
11. The method according to any of the preceding claims, wherein the method comprises:
- compensating (S3) for the determined displacement of the first sensor (11a) during autonomous operation of the vehicle.
12. A method according to any of the preceding claims, wherein the set of sensors (11) are configured for use in at least one of object detection, ego vehicle localisation and odometry during autonomous operation of the vehicle.
13. A computer program comprising instructions which, when the computer program is executed by a computer, cause the computer to carry out the method according to any one of the preceding claims.
14. A control arrangement (10) configured for controlling a vehicle (1), the control arrangement (10) being configured to perform the method according to any one of claims 1 to
15. A vehicle (1) comprising a control arrangement (10) according to claim 14.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SE2051312A SE545062C2 (en) | 2020-11-10 | 2020-11-10 | Method and control arrangement for determining displacement of a vehicle sensor |
Publications (2)
Publication Number | Publication Date |
---|---|
SE2051312A1 true SE2051312A1 (en) | 2022-05-11 |
SE545062C2 SE545062C2 (en) | 2023-03-21 |
Family
ID=81851771
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180284243A1 (en) * | 2017-03-31 | 2018-10-04 | Uber Technologies, Inc. | Autonomous Vehicle Sensor Calibration System |
US20190066323A1 (en) * | 2017-08-24 | 2019-02-28 | Trimble Inc. | Excavator Bucket Positioning Via Mobile Device |
US20190163201A1 (en) * | 2017-11-30 | 2019-05-30 | Uber Technologies, Inc. | Autonomous Vehicle Sensor Compensation Using Displacement Sensor |
US20190163189A1 (en) * | 2017-11-30 | 2019-05-30 | Uber Technologies, Inc. | Autonomous Vehicle Sensor Compensation By Monitoring Acceleration |
EP3699630A1 (en) * | 2019-02-25 | 2020-08-26 | KNORR-BREMSE Systeme für Nutzfahrzeuge GmbH | System and method for compensating a motion of a vehicle component |
US10798303B1 (en) * | 2019-09-09 | 2020-10-06 | Tusimple, Inc. | Techniques to compensate for movement of sensors in a vehicle |
DE102019205504A1 (en) * | 2019-04-16 | 2020-10-22 | Zf Friedrichshafen Ag | Control device and method as well as computer program product |
Non-Patent Citations (1)
Title |
---|
'Modularised data driven modelling and real time estimation of cabin dynamics in heavy-duty vehicles'; In: ip.com Prior Art Database Technical Disclosure; 2019-09-02. * |