WO2023276025A1 - Information integration device, information integration method, and information integration program
- Publication number
- WO2023276025A1 (PCT/JP2021/024695)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- feature
- features
- feature information
- correction
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16Y—INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
- G16Y10/00—Economic sectors
- G16Y10/40—Transportation
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16Y—INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
- G16Y40/00—IoT characterised by the purpose of the information processing
- G16Y40/10—Detection; Monitoring
Definitions
- the present disclosure relates to an information integration device, an information integration method, and an information integration program.
- Driving support systems such as lane departure warning (LDW) systems, pedestrian detection (PD) systems, and adaptive cruise control (ACC) systems have been developed for the purpose of driving support and preventive safety for vehicle drivers.
- an automatic driving system has been developed that performs part or all of the driving to the destination on behalf of the driver.
- For driving support and automatic driving, it is necessary to accurately recognize, over a wide range, the position, type, and state (e.g., moving speed and moving direction) of moving features (e.g., people, objects, and vehicles) around the moving body, as well as the position, type, and state (e.g., the lighting state of a traffic light) of stationary features (e.g., traffic lights and lanes) related to the moving body.
- Patent Literature 1 proposes generating wide-range and highly accurate feature information based on feature information acquired by recognition devices mounted on a plurality of vehicles.
- A recognition error is the difference between the actual position and size of a feature and the position and size of the feature indicated by the feature information obtained as the recognition result. Because of such errors, when integrating feature information representing features recognized by a plurality of recognition devices, one feature may be processed as a plurality of features, which makes accurate integration difficult.
- An object of the present disclosure is to provide an information integration device, an information integration method, and an information integration program that make it possible to generate feature information of a wide range of features with high accuracy.
- The information integration device of the present disclosure includes: a recognition unit that acquires, from a sensor provided in a moving body, sensor information acquired by the sensor, recognizes, based on the acquired sensor information, one or more first features that are one or more features around the moving body, and generates first feature information indicating the one or more first features; an acquisition unit that acquires, from an external device, from acquirable map information, or from both the external device and the map information, second feature information indicating one or more second features that are one or more features recognized to exist around the moving body; a same feature identification unit that identifies the same feature from among the one or more first features indicated by the first feature information and the one or more second features indicated by the second feature information; a correction information generation unit that generates correction information used to correct at least one of the first feature information and the second feature information; and an information integration unit that generates integrated feature information by integrating the first feature information and the second feature information after correcting at least one of them using the correction information.
- The information integration method of the present disclosure is a method implemented by an information integration device, and includes the steps of: acquiring, from a sensor provided in a moving body, sensor information acquired by the sensor; recognizing, based on the acquired sensor information, one or more first features that are one or more features around the moving body, and generating first feature information indicating the one or more first features; acquiring, from an external device, from acquirable map information, or from both the external device and the map information, second feature information indicating one or more second features that are one or more features recognized to exist around the moving body; identifying the same feature from among the one or more first features indicated by the first feature information and the one or more second features indicated by the second feature information; generating correction information used to correct at least one of the first feature information and the second feature information; and generating integrated feature information by integrating the first feature information and the second feature information after correcting at least one of them using the correction information.
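The sequence of steps above can be sketched in Python as a minimal, illustrative pipeline. The function names, the dictionary-based feature representation, and the simple Y-proximity matching rule are assumptions for illustration only, not part of the disclosed method:

```python
def recognize(sensor_detections):
    # Step (a): generate first feature information from raw sensor output.
    return [{"x": d["x"], "y": d["y"], "kind": d["kind"]} for d in sensor_detections]

def identify_same(first, second, tol=5.0):
    # Step (c): pair first and second features of the same kind that are
    # close in the Y direction (a deliberately crude matching rule).
    return [(f, s) for f in first for s in second
            if f["kind"] == s["kind"] and abs(f["y"] - s["y"]) <= tol]

def generate_correction(pairs):
    # Step (d): mean X offset between paired features, with the second
    # (e.g. roadside) information taken as the reference.
    if not pairs:
        return 0.0
    return sum(s["x"] - f["x"] for f, s in pairs) / len(pairs)

def integrate(first, second, tol=5.0):
    # Steps (e)-(f): correct the first feature information, then merge,
    # keeping a single entry for features identified as the same.
    dx = generate_correction(identify_same(first, second))
    corrected = [dict(f, x=f["x"] + dx) for f in first]
    merged = list(second)
    for f in corrected:
        if not any(f["kind"] == s["kind"]
                   and abs(f["x"] - s["x"]) <= tol
                   and abs(f["y"] - s["y"]) <= tol for s in merged):
            merged.append(f)
    return merged
```

With a single vehicle seen by both sources at X positions 0 and 50, `generate_correction` yields an offset of 50, and `integrate` returns one merged entry for that vehicle plus any features seen by only one source.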
- FIG. 4(A) is a diagram showing first feature information indicating a plurality of first features recognized by the recognition unit of the information integration device of a vehicle, and FIG. 4(B) is a diagram showing second feature information indicating a plurality of second features recognized by the recognition device of a roadside unit;
- FIG. 5(A) is a diagram showing first feature information indicating a plurality of first features recognized by the recognition unit of the information integration device of a vehicle, and FIG. 5(B) is a diagram showing corrected first feature information obtained by correcting the first feature information in FIG. 5(A);
- FIG. 6(A) is a diagram showing first feature information recognized by the recognition unit of the information integration device of a vehicle, FIG. 6(B) is a diagram showing second feature information recognized by the recognition device of a roadside unit, and FIG. 6(C) is a diagram showing a process of correcting the first feature information in FIG. 6(A) based on information on the same vehicle included in the first and second feature information;
- FIG. 7(A) is a diagram showing first feature information (including signs) recognized by the recognition unit of the information integration device, FIG. 7(B) is a diagram showing second feature information (including signs) recognized by the recognition device of a roadside unit or included in the map information, and FIG. 7(C) is a diagram showing a process of correcting the first feature information in FIG. 7(A) based on the sign information included in the feature information and the map information;
- FIG. 8(A) is a diagram showing actual road conditions at a certain time, FIG. 8(B) is a diagram showing first feature information recognized by the recognition unit of the information integration device of a vehicle, and FIG. 8(C) is a diagram showing second feature information (before correction) recognized by the recognition device of a roadside unit;
- FIG. 9(A) is a diagram showing corrected first feature information obtained by correcting the first feature information in FIG. 8(B), and FIG. 9(B) is a diagram showing corrected second feature information obtained by correcting the second feature information in FIG. 8(C);
- FIG. 10 is a diagram showing integrated feature information obtained by integrating the first feature information in FIG. 9(A) and the second feature information in FIG. 9(B); and
- FIG. 11 is a flowchart showing the operation of the information integration device according to the embodiment.
- FIG. 1 is a functional block diagram showing the configuration of an information integration device 10 according to this embodiment.
- the information integration device 10 is a device capable of implementing the information integration method according to the embodiment.
- the information integration device 10 includes a recognition unit 11, an information transmission/reception unit 12, a same feature identification unit 13, a correction information generation unit 16, an information integration unit 17, a map information storage unit 14, and a correction information storage unit 15. Details of these configurations will be described later.
- FIG. 1 shows an information integration device 10 mounted on a vehicle 100 as a moving body that is an object with a moving function (that is, an object that can travel).
- the information integration device 10 can also be mounted on a moving body other than the vehicle 100 (eg, ship, robot, personal mobility, etc.).
- the information integration device 10 can be mounted on a fixed body (for example, infrastructure equipment, etc.) that does not have a mobile function.
- the infrastructure equipment is, for example, a traffic light or a roadside device.
- the information integration device 10 may be installed in a server that can communicate with a mobile object, a fixed object, or the like via a network.
- a vehicle 100 is equipped with an information integration device 10 and an in-vehicle sensor 20 .
- the in-vehicle sensor 20 is composed of, for example, a sensor such as a camera, millimeter wave radar, Lidar (Light Detection and Ranging), sonar, or a combination of two or more of these sensors.
- the in-vehicle sensor 20 and the recognition unit 11 constitute a recognition device (also referred to as “recognition means”) that recognizes surrounding features.
- the vehicle 100 may be equipped with an advanced driving support device that performs emergency automatic braking, lane keeping, and the like, or with a device such as an automatic driving device that performs part or all of the driving on behalf of the driver.
- the information integration device 10 may be part of a device that constitutes an advanced driving assistance device, an automatic driving device, or the like.
- the information integration device 10 includes a recognition unit 11 that acquires, from the vehicle-mounted sensor 20 provided in the vehicle 100 as a moving body, sensor information acquired by the vehicle-mounted sensor 20, recognizes, based on the acquired sensor information, one or more first features around the vehicle 100 (for example, on its moving route), and generates first feature information indicating the one or more first features.
- the information integration device 10 also includes a same feature identification unit 13 that acquires, from an external device (e.g., the roadside unit 200), from acquirable map information (e.g., the map information 14a), or from both, second feature information indicating one or more second features recognized to exist in the vicinity of the vehicle 100, and identifies the same feature from among the one or more first features indicated by the first feature information and the one or more second features indicated by the second feature information.
- the information integration device 10 further includes a correction information generation unit 16 that generates, based on the positional difference between the first feature and the second feature identified as the same feature, correction information used to correct at least one of the first feature information and the second feature information, and an information integration unit 17 that generates integrated feature information by integrating the first feature information and the second feature information.
- FIG. 2 is a diagram showing the hardware configuration of the information integration device 10 according to this embodiment.
- the information integration device 10 is, for example, a computer, a dedicated computing device, or a device combining a computer and a dedicated computing device.
- the information integration device 10 includes a processor 31 as an information processing unit, a memory 32 as a storage unit, a storage device 33 as a non-volatile storage unit, an interface 34, and a communication unit 35.
- the processor 31 is connected to other hardware via a system bus and controls these other hardware.
- the processor 31 is an IC (Integrated Circuit) that performs processing. Specific examples of the processor 31 are CPU (Central Processing Unit), DSP (Digital Signal Processor), GPU (Graphics Processing Unit), or FPGA (Field Programmable Gate Array).
- the processor 31 executes programs stored in the memory 32 .
- the program includes an information integration program according to this embodiment.
- the functions of the information integration device 10 are implemented by the processor 31 that executes programs.
- Processor 31 is an example of processing circuitry.
- the recognition unit 11, the information transmission/reception unit 12, the same feature identification unit 13, the correction information generation unit 16, and the information integration unit 17 are realized by the processor 31, and the map information storage unit 14 and the correction information storage unit 15 are realized by the memory 32.
- the CPU performs processes such as program execution and data calculation.
- DSPs perform digital signal processing such as arithmetic operations and data movement. For example, processing such as sensing of sensor data obtained from a millimeter-wave radar is desirably processed at high speed by a DSP rather than by a CPU.
- a GPU is a processor that specializes in image processing.
- a GPU is capable of high-speed image processing by processing a plurality of pieces of pixel data in parallel.
- a GPU can perform template matching processing, which is frequently used in image processing, at high speed. For example, sensing of sensor data obtained from a camera is desirably processed by a GPU. If the CPU processes the sensing of sensor data obtained from the camera, the processing time becomes enormous.
- the GPU is not simply used as a processor for image processing, but is used to perform general-purpose calculations (GPGPU: General Purpose Computing on Graphics Processing Units) using the computational resources of the GPU.
- An FPGA is a processor that can program the configuration of logic circuits.
- FPGAs are both dedicated hardware arithmetic circuits and programmable software. Complex calculations and parallel processing can be executed at high speed by FPGA.
- the memory 32 is, for example, a volatile memory. Volatile memory can move data at high speed when the information integration device 10 operates. Specific examples of volatile memory are DDR2-SDRAM (Double-Data-Rate2 Synchronous Dynamic Random Access Memory) and DDR3-SDRAM (Double-Data-Rate3 Synchronous Dynamic Random Access Memory).
- the storage device 33 is a non-volatile memory, and can continue to hold execution programs and data even while the power of the information integration device 10 is off.
- nonvolatile memory are HDD (Hard Disk Drive), SSD (Solid State Drive), and flash memory.
- the non-volatile memory may be a portable storage medium such as an SD (Secure Digital) memory card, CF (CompactFlash (registered trademark)), NAND flash, flexible disk, optical disk, compact disc, Blu-ray (registered trademark) disc, or DVD.
- the memory 32 is connected to the processor 31 via a memory interface (not shown).
- the memory interface centrally manages memory accesses from processors and performs efficient memory access control.
- the memory interface is used for data transfer in the information integration device 10 .
- the functions of the recognition unit 11, the information transmission/reception unit 12, the same feature identification unit 13, the correction information generation unit 16, and the information integration unit 17 shown in FIG. 1 are realized by the processor 31 that executes the program.
- the memory 32 stores programs for realizing the functions of the recognition unit 11, the information transmission/reception unit 12, the same feature identification unit 13, the correction information generation unit 16, and the information integration unit 17.
- The processor 31 reads these programs from the memory 32 and executes them.
- the memory 32 is also used to temporarily store intermediate information for each program.
- the functions of the recognition unit 11, the information transmission/reception unit 12, the identical feature identification unit 13, the correction information generation unit 16, and the information integration unit 17 may be realized by logic circuits that are hardware.
- the memory 32 stores logic circuit information. Logic circuit information is read and executed by the processor 31 .
- the processor 31 may also be configured with a plurality of processors. In this case, the plurality of processors may cooperate to execute the programs that implement the functions of the recognition unit 11, the information transmission/reception unit 12, the same feature identification unit 13, the correction information generation unit 16, and the information integration unit 17.
- the memory 32 stores map information 14a and correction information 15a.
- the map information 14a is highly accurate road environment information including types, positions, widths, sizes, etc. of road markings, signs, traffic lights, and the like.
- the correction information 15a is information used to correct erroneous recognition and recognition error of feature information.
- the map information 14a and the correction information 15a may be stored in a storage device external to the information integration device 10.
- One or more in-vehicle sensors 20 are attached to the vehicle 100 .
- the in-vehicle sensor 20 outputs sensor information for the information integration device 10 to recognize the surrounding situation of the vehicle 100 to the information integration device 10 .
- the programs that realize the functions of the recognition unit 11, the information transmission/reception unit 12, the same feature identification unit 13, the correction information generation unit 16, and the information integration unit 17 may be stored in a portable recording medium such as a magnetic disk, flexible disk, optical disk, compact disc, Blu-ray (registered trademark) disc, or DVD. A portable recording medium storing these programs may also be commercially distributed.
- the “units” of the recognition unit 11, the information transmission/reception unit 12, the same feature identification unit 13, the correction information generation unit 16, and the information integration unit 17 may be read as “circuit”, “process”, “procedure”, or “processing”.
- the recognition unit 11 recognizes one or more objects to be recognized that exist around the vehicle 100 (for example, in the lane in which the vehicle 100 is traveling and in other lanes alongside it) at a plurality of positions on the movement route as the vehicle 100 moves along the route.
- the recognition unit 11 may recognize the positions and attributes of one or more recognition objects existing around the movement route.
- the recognition target may be either a feature existing near the movement route or a feature existing on the movement route.
- a recognition target may be either a three-dimensional feature or a two-dimensional feature.
- Features that are objects to be recognized are, for example, road markings, signs, traffic lights, other vehicles, and pedestrians.
- the recognition unit 11 recognizes the position and attribute of the feature as the feature information of the recognition target.
- the attribute of the recognition target includes at least one of the type of the recognition target and the state of the recognition target.
- the type of the recognition target is, for example, information indicating whether the recognized object is a person, another vehicle, a sign, or the like. If the recognition target is a sign, the type may include detailed information on the kind of sign. If the recognition target is a vehicle, the type may include information that finely classifies vehicle types such as ordinary automobiles, trucks, and light automobiles. If the recognition target is a traffic light, the state may include information indicating whether the light is red, green, or yellow.
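The feature information described above (a position plus attributes, where an attribute is a type and, optionally, a state) can be modeled as a small record type. This is a minimal sketch; the field names are illustrative assumptions, not terms from the disclosure:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class FeatureInfo:
    position: Tuple[float, float]   # e.g. (x, y) on the road plane
    kind: str                       # type: "person", "vehicle", "sign", ...
    subtype: Optional[str] = None   # e.g. sign kind or vehicle class ("truck")
    state: Optional[str] = None     # e.g. lighting state of a traffic light

# A traffic light showing red, with no finer subtype recorded.
light = FeatureInfo(position=(12.0, 3.5), kind="traffic_light", state="red")
```

Keeping `subtype` and `state` optional mirrors the text: every recognition target has a position and a type, while detailed classification and state apply only to some targets.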
- the processing performed by the recognition unit 11 corresponds to recognition processing in the information integration method.
- the map information storage unit 14 is a storage unit that stores high-precision road environment map information 14a including the types, positions, widths, sizes, etc. of road markings, signs, and traffic lights.
- the map information storage unit 14 may be a storage device managed by a server that can communicate via a network.
- the map information includes map information of an area in which vehicle 100 is scheduled to travel.
- the same feature identification unit 13 compares the feature information recognized by the recognition unit 11 (also referred to as “first feature information”) with the feature information recognized by a recognition device mounted on a device other than the vehicle 100 (for example, a roadside device) (also referred to as “second feature information”), and identifies the same feature from among the one or more features included in the first feature information (also referred to as “first features”) and the one or more features included in the second feature information (also referred to as “second features”).
- FIG. 3 is a diagram showing the actual road conditions at a certain time T.
- FIG. 3 shows, at time T, a plurality of features that actually exist on the road.
- a vehicle 100 has an information integration device 10
- a roadside device 200 has a recognition device for recognizing features.
- as the recognition device of the roadside device 200, various recognition devices can be used; the same type of recognition device as the one mounted on the vehicle 100 may be used.
- FIG. 4(A) shows a plurality of first features recognized by the recognition unit 11 of the information integration device 10 of the vehicle 100 at time T with solid rectangular frames.
- FIG. 4(B) shows, with broken-line rectangular frames, a plurality of second features recognized by the recognition device of the roadside device 200 at time T. Since the recognition area of the recognition unit 11 of the information integration device 10 of the vehicle 100 and the recognition area of the recognition device of the roadside unit 200 partially overlap, the vehicle Va recognized as a first feature by the recognition unit 11 and the vehicle Va recognized as a second feature by the recognition device of the roadside unit 200 are the same feature. However, the position of the vehicle Va recognized by the recognition unit 11 in FIG. 4(A) differs from the position of the vehicle Va recognized by the recognition device of the roadside unit 200 in FIG. 4(B).
- to identify the same feature, the same feature identification unit 13 can use landmark information included in the map information, such as signs, stop lines, billboards, and tunnel entrances, as well as vehicle type, speed, and information on the relative positional relationship with other features.
- for example, when relative position information is used, both the vehicle Va recognized by the recognition unit 11 of the information integration device 10 of the vehicle 100 and the vehicle Va recognized by the recognition device of the roadside unit 200 have the same red vehicle Vb at the same distance in the right lane. In this way, even if there is a detection error in the position information of the feature information indicating the feature recognized by the recognition unit 11, the same feature identification unit 13 can identify the same feature based on the relative positions.
- the same feature identification unit 13 may identify the same feature by combining any one or more, or all, of these kinds of information.
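Identification by relative position, as in the vehicle Va / red vehicle Vb example, can be sketched as comparing each candidate's displacement from a shared landmark rather than its absolute coordinates, so that a constant position error of either sensor cancels out. The function names and tolerance are assumptions for illustration:

```python
def relative_offset(feature, landmark):
    # Displacement of a feature from a shared landmark; a constant
    # absolute position error of one sensor cancels out in this vector.
    return (feature["x"] - landmark["x"], feature["y"] - landmark["y"])

def is_same_feature(f1, lm1, f2, lm2, tol=2.0):
    # f1/lm1: seen by the vehicle's recognition unit; f2/lm2: seen by
    # the roadside unit; lm1 and lm2 are the same physical landmark
    # (e.g. the red vehicle Vb).
    dx1, dy1 = relative_offset(f1, lm1)
    dx2, dy2 = relative_offset(f2, lm2)
    return abs(dx1 - dx2) <= tol and abs(dy1 - dy2) <= tol
```

For instance, if the vehicle sees Va at X=0 and Vb at X=30 while the roadside unit sees them at X=50 and X=80, both offsets are (-30, 0) and the two Va observations are identified as the same feature despite the 50-unit absolute disagreement.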
- the processing performed by the same feature identification unit 13 corresponds to the same feature identification processing in the information integration device 10 .
- FIG. 5A is a diagram showing first feature information indicating a plurality of first features recognized by the recognition unit 11 of the information integration device 10 of the vehicle 100.
- FIG. 5(B) is a diagram showing the corrected first feature information obtained by correcting the first feature information in FIG. 5(A).
- FIG. 3 shows the actual features at time T (also referred to as the “reference time”).
- the recognition time of the first feature information indicating a plurality of first features recognized by the recognition unit 11 of the information integration device 10 of the vehicle 100 is earlier than the time T, which is the reference time.
- the same feature identification unit 13 executes processing for synchronizing the first feature information to time T, for example, based on the speed information of the vehicle 100 and the error time ΔTa. Synchronizing the times before determining the same feature increases the identification rate of the same feature.
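The synchronization step can be sketched as simple dead reckoning: advance each first feature's position by the moving body's speed over the error time ΔTa before matching. This is a minimal sketch that assumes constant speed along the X axis; the function name is an assumption:

```python
def synchronize(features, speed, delta_ta):
    # Project feature positions recognized delta_ta seconds before the
    # reference time T forward to T, assuming the relative positions
    # shift by speed * delta_ta along the X axis (a simplification).
    shift = speed * delta_ta
    return [dict(f, x=f["x"] + shift) for f in features]
```

A feature recognized at X=10 with the vehicle moving at 20 m/s and an error time of 0.5 s would be projected to X=20 before the same-feature determination.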
- FIG. 6(A) is a diagram showing first feature information indicating a plurality of first features recognized by the recognition unit 11 of the information integration device 10 of the vehicle 100, and FIG. 6(B) is a diagram showing second feature information indicating a plurality of second features recognized by the recognition device of the roadside device 200.
- FIG. 6(C) is a diagram showing a process of correcting the first feature information in FIG. 6(A) based on information on the same vehicle included in the first feature information in FIG. 6(A) and the second feature information in FIG. 6(B).
- based on the same feature specified by the same feature identification unit 13, the correction information generation unit 16 generates correction information used to correct erroneous recognition and recognition errors in the first feature information indicating the plurality of first features recognized by the recognition unit 11 of the information integration device 10 and in the second feature information indicating the plurality of second features recognized by the recognition device of the roadside unit 200.
- FIGS. 6(A) to 6(C) illustrate the case where the recognition accuracy of the second features by the recognition device of the roadside unit 200 is higher than the recognition accuracy of the first features by the recognition unit 11 of the information integration device 10.
- in this case, the correction information generation unit 16 generates correction information indicating that the distance “50” needs to be added to all positions in the X-axis direction of the feature information obtained by the recognition unit 11 of the information integration device 10.
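In the FIG. 6 example, where the roadside unit's recognition is taken as the more accurate reference, the correction information reduces to a constant X offset estimated from the same-feature pairs. A minimal sketch follows; averaging over pairs and the function names are assumptions:

```python
def generate_x_correction(pairs):
    # pairs: (first_feature, second_feature) tuples identified as the
    # same feature; the second (roadside) position is the reference.
    if not pairs:
        return 0.0
    return sum(s["x"] - f["x"] for f, s in pairs) / len(pairs)

def apply_x_correction(features, dx):
    # Shift every feature position by the generated X correction.
    return [dict(f, x=f["x"] + dx) for f in features]
```

With pairs observed at X=(0, 50) and X=(10, 60), the generated correction is the distance 50, which is then added to every first feature position, matching the correction described above.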
- FIG. 7A is a diagram showing first feature information indicating a plurality of first features (including signs) recognized by the recognition unit 11 of the information integration device 10.
- FIG. 7B is a diagram showing second feature information indicating a plurality of second features (including signs) recognized by the recognition device of the roadside unit 200 or included in the map information.
- FIG. 7(C) is a diagram showing a process of correcting the first feature information in FIG. 7(A) based on the sign information included in the feature information in FIG. 7(A) and the feature information or map information in FIG. 7(B).
- the correction information generation unit 16 first determines whether correction is necessary for one, two or more, or all of the recognition unit 11 of the information integration device 10 and the recognition device of the roadside unit 200. The absolute position information of the sign information included in the map information is treated as accurate position information.
- the correction information generation unit 16 compares the position of the sign recognized by the recognition unit 11 of the information integration device 10 and the position of the sign recognized by the recognition device of the roadside unit 200 with the position of the sign included in the map information. It determines that the position information of the sign recognized by the recognition unit 11 differs from the position information of the sign included in the map information, while the position information of the sign recognized by the recognition device of the roadside unit 200 matches the position information of the sign included in the map information and therefore needs no correction.
- the correction information generation unit 16 determines that the position of the sign recognized by the recognition unit 11 of the information integration device 10 is shifted in the -X-axis direction by a distance of “50” from the position of the sign included in the map information. Therefore, the correction information generation unit 16 generates correction information to move all the positions of the first features recognized by the recognition unit 11 by a distance of “50” in the +X-axis direction.
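The sign-based decision in FIG. 7 can be sketched as: compare each source's recognized sign position with the accurate map position, and generate a correction only for sources that deviate. A minimal sketch; the exact-threshold comparison and the function name are assumptions:

```python
def correction_from_sign(recognized_sign_x, map_sign_x, tol=1.0):
    # X correction a recognition source needs so that its recognized
    # sign position coincides with the accurate map position; 0.0 if
    # the source already agrees with the map within the tolerance.
    diff = map_sign_x - recognized_sign_x
    return diff if abs(diff) > tol else 0.0
```

In the scenario above, the vehicle's recognition unit, whose sign is shifted by -50 relative to the map, receives a +50 correction, while the roadside unit, whose sign matches the map, receives none.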
- the processing performed by the correction information generation unit 16 corresponds to the correction information generation processing of the information integration device 10 .
- the correction information storage unit 15 is a storage unit that stores the correction information 15a generated by the correction information generation unit 16.
- The information transmission/reception unit 12 transmits one or both of the first feature information indicating the first features recognized by the recognition unit 11 and the correction information generated by the correction information generation unit 16 to a recognition device mounted on another moving body or fixed body.
- The information transmission/reception unit 12 is also an acquisition unit that receives one or both of feature information and correction information from a recognition device mounted on another moving body or fixed body.
- The information integration unit 17 generates wide-range feature information by integrating the first feature information indicating the first features recognized by the recognition unit 11 with the second feature information, received by the information transmission/reception unit 12, indicating the second features recognized by a recognition device mounted on another moving body or fixed body.
- Based on the correction information generated by the correction information generation unit 16 and the correction information generated by a recognition device mounted on another moving body or fixed body, the information integration unit 17 corrects erroneous recognition and recognition errors in one or both of the first feature information indicating the first features recognized by the recognition unit 11 and the second feature information indicating the second features received by the information transmission/reception unit 12, and can thereby generate accurate, wide-range integrated feature information.
- FIG. 8(A) is a diagram showing the actual road conditions at a certain time T.
- FIG. 8B is a diagram showing first feature information indicating a plurality of first features recognized by the recognition unit 11 of the information integration device 10.
- FIG. 8C is a diagram showing second feature information indicating a plurality of second features recognized by the recognition device of the roadside device 200.
- FIG. 9(A) is a diagram showing the corrected first feature information obtained by correcting the first feature information in FIG. 8(B).
- FIG. 9(B) is a diagram showing the corrected second feature information obtained by correcting the second feature information of FIG. 8(C).
- FIG. 9(C) is a diagram showing integrated feature information obtained by integrating the feature information of FIGS. 9(A) and (B).
- The feature information (before correction) is corrected using the correction information. The correction information storage unit 15 stores correction information for correcting the positions of the plurality of first features by moving them a distance of "50" in the +X-axis direction, that is, by adding "50" to each X coordinate. FIG. 9(A) shows the first feature information corrected based on this correction information.
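A minimal sketch (not part of the patent text; the field names are hypothetical) of applying such stored correction information to the recognized features is:

```python
def apply_correction(features, correction):
    """Shift every feature position by the stored correction (dx, dy)."""
    dx, dy = correction
    return [{**f, "x": f["x"] + dx, "y": f["y"] + dy} for f in features]

# Correcting the first feature information by adding "50" to each X coordinate.
first_features = [{"id": "car1", "x": 10, "y": 40}, {"id": "sign", "x": 50, "y": 200}]
corrected = apply_correction(first_features, (50, 0))
```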
- FIG. 10 is a flowchart showing the operation of the information integration device 10.
- In step S10, the recognition unit 11 performs recognition processing of one or more first features existing around the vehicle 100 based on the output of the in-vehicle sensor 20 and acquires first feature information. The information transmission/reception unit 12 then acquires second feature information about one or more second features.
- In step S11, the same feature identification unit 13 determines whether correction information has already been generated (that is, whether correction information exists) for the first feature information generated by the recognition unit 11 and for the second feature information received by the information transmission/reception unit 12 and generated by a recognition device mounted on another moving body or fixed body. If the correction information has already been generated (that is, if correction information exists), the process proceeds to step S12; if the correction information has not been generated (that is, if no correction information exists), the process proceeds to step S14.
- In step S12, the correction information generation unit 16 determines whether or not there is a change (that is, an environment difference) between the environment when the correction information was generated and the current environment.
- Correction information is highly environment-dependent. For example, the recognition error when the vehicle is traveling at a speed of 60 km/h differs greatly from the recognition error when it is traveling at a speed of 10 km/h. For this reason, the generated correction information can be used to correct erroneous detection and recognition errors of feature information only in the same environment as the one in which the correction information was generated, or in an equivalent environment (that is, when the environment difference is within a predetermined range).
- Here, the "environment" indicates the traveling state of the moving body (speed, acceleration/deceleration, steering angle, etc.), the road environment (straight section, curve, near an intersection, inside a tunnel, etc.), the weather, and the like. After this determination, the process proceeds to step S13.
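The environment-difference check described above could be sketched as follows (illustrative only, not part of the patent text; the fields and the tolerance are assumptions, since the patent only requires the difference to be within a predetermined range):

```python
def correction_is_valid(env_at_generation, env_now, speed_tolerance_kmh=10.0):
    """Reuse correction info only when the environment difference is small.

    Compares the traveling state (speed), road environment, and weather,
    following the notion of "environment" described above.
    """
    same_road = env_at_generation["road"] == env_now["road"]
    same_weather = env_at_generation["weather"] == env_now["weather"]
    speed_close = abs(env_at_generation["speed_kmh"] - env_now["speed_kmh"]) <= speed_tolerance_kmh
    return same_road and same_weather and speed_close
```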
- In step S13, the correction information generation unit 16 determines whether correction information needs to be generated, depending on whether or not there is an environment change in step S12. If there is no environment change (NO in step S13), the correction information that has already been generated is valid, so the process proceeds to step S17. If there is an environment change (YES in step S13), correction information needs to be generated, so the process proceeds to step S14.
- In step S14, the same feature identification unit 13 performs time correction, which is time synchronization processing, on the recognition time included in the first feature information indicating the one or more first features recognized by the recognition unit 11 and the recognition time of the second feature information, received by the information transmission/reception unit 12, indicating the one or more second features. That is, the same feature identification unit 13 performs time correction for aligning the time information of the first feature information and the second feature information based on their respective recognition times.
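One conceivable sketch of this time correction (illustrative only; the patent does not prescribe a specific method, and the fields are hypothetical) is to extrapolate each feature's position from its recognition time to a common reference time using its observed velocity:

```python
def sync_to_time(feature, t_target):
    """Extrapolate a feature's position from its recognition time to t_target."""
    dt = t_target - feature["t"]
    return {**feature,
            "x": feature["x"] + feature["vx"] * dt,
            "y": feature["y"] + feature["vy"] * dt,
            "t": t_target}
```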
- In step S15, the same feature identification unit 13 identifies the same feature from the time-corrected first feature information and second feature information.
- Methods of identifying identical features include, for example, a method of identifying, using feature points on a high-precision map, features that have the same relative distance from a feature point, among other methods.
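The relative-distance method mentioned above might be sketched as follows (illustrative only, not part of the patent text; the pairing rule and tolerance are assumptions): two features from different sources are treated as the same feature when their distances from a shared map feature point agree within a tolerance.

```python
import math

def identify_same_features(first, second, anchor_xy, tol=1.0):
    """Pair features whose relative distances to a map feature point agree."""
    def rel_dist(f):
        return math.hypot(f["x"] - anchor_xy[0], f["y"] - anchor_xy[1])
    pairs = []
    for a in first:
        for b in second:
            if abs(rel_dist(a) - rel_dist(b)) <= tol:
                pairs.append((a["id"], b["id"]))
    return pairs
```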
- In step S16, the correction information generation unit 16 generates, based on the information on the same feature identified by the same feature identification unit 13 and the feature information obtained by the plurality of recognition devices, correction information for correcting erroneous recognition and recognition errors in the feature information indicating the features recognized by the recognition unit 11 of the information integration device 10 and by the recognition device of the roadside unit 200. As a means of generating the correction information, if the positions of the features identified as identical do not match, the feature information obtained by the recognition device with low reliability is adjusted to the position indicated by the feature information with high reliability. The generated correction information is stored in the correction information storage unit 15.
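The reliability-based adjustment in this step can be pictured as follows (an illustrative sketch, not part of the patent text; the reliability field is hypothetical):

```python
def reconcile(feat_a, feat_b):
    """Adopt the position reported by the more reliable recognition device."""
    high, low = ((feat_a, feat_b) if feat_a["reliability"] >= feat_b["reliability"]
                 else (feat_b, feat_a))
    # The low-reliability detection is adjusted to the high-reliability position.
    return {**low, "x": high["x"], "y": high["y"]}
```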
- In step S17, the information integration unit 17 uses the correction information to correct erroneous recognition and recognition errors appearing in the first feature information generated by the recognition unit 11 of the information integration device 10 and in the second feature information generated by the recognition device of the roadside unit 200.
- In step S18, the information integration unit 17 integrates the corrected first feature information and the corrected second feature information to generate wide-range and accurate integrated feature information.
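The integration in this final step can be sketched as a union of the two corrected feature lists in which each pair identified as the same feature is counted once (illustrative only, not part of the patent text; field names are hypothetical):

```python
def integrate(first_features, second_features, same_pairs):
    """Merge two corrected feature lists, deduplicating identified-same pairs."""
    dup_second_ids = {b for (_, b) in same_pairs}
    return first_features + [f for f in second_features if f["id"] not in dup_second_ids]
```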
- As described above, the information integration device 10 compares the first feature information indicating the one or more first features recognized by the recognition unit 11 of the information integration device 10 of the vehicle 100 with the second feature information indicating the one or more second features recognized by the recognition device of the roadside unit 200, corrects at least one of the first feature information and the second feature information based on the comparison result, and then integrates the first feature information and the second feature information to generate integrated feature information. Therefore, according to the information integration device 10, feature information over a wide range can be grasped accurately.
- A moving body using the information integration device 10 can grasp information on the features around it and, based on the grasped feature information, provide appropriate feature information to a driving support device that generates driving support information related to the driving of the moving body.
- Similarly, a moving body using the information integration device 10 can grasp the feature information in its surroundings and, based on the grasped feature information, provide appropriate feature information to an autonomous mobile device that generates autonomous traveling information related to the autonomous traveling of the moving body.
- 10 Information integration device, 11 Recognition unit, 12 Information transmission/reception unit (communication unit), 13 Same feature identification unit, 14 Map information storage unit, 14a Map information, 15 Correction information storage unit, 15a Correction information, 16 Correction information generation unit, 17 Information integration unit, 20 In-vehicle sensor, 100 Vehicle, 200 Roadside unit, 300 Sign.
Abstract
An information integration device (10) is provided with: a recognition unit (11) which acquires sensor information from a sensor (20) provided to a moving body (100), recognizes one or more first ground features on the basis of the sensor information, and generates first ground feature information indicating the one or more first ground features; a ground feature identification unit (13) which acquires second ground feature information indicating one or more second ground features from an external device (200) or available map information or both, and identifies first and second ground features that are the same ground feature from among the one or more first ground features and the one or more second ground features; a correction information generation unit (16) which generates correction information used to correct at least one of the first ground feature information and the second ground feature information, on the basis of the difference in position between a first ground feature and a second ground feature that are identified as the same ground feature; and an information integration unit (17) which generates integrated ground feature information by integrating the first ground feature information and the second ground feature information after correcting at least one of the first ground feature information and the second ground feature information using the correction information.
Description
The present disclosure relates to an information integration device, an information integration method, and an information integration program.
Driving support systems such as lane departure warning (LDW) systems, pedestrian detection (PD) systems, and adaptive cruise control (ACC) systems have been developed for the purpose of driving support and preventive safety for vehicle drivers. In addition, an automatic driving system has been developed that performs part or all of the driving to the destination on behalf of the driver.
In these systems, it is necessary to recognize, accurately and over a wide range, the position, type, and state (e.g., moving speed, moving direction, etc.) of features (e.g., people, objects, vehicles, etc.) existing around a moving body (e.g., a vehicle), as well as the position, type, and state (e.g., the lighting state of a traffic light) of features related to the moving body (e.g., traffic lights, lanes, etc.).
Features are generally recognized by acquiring and analyzing information from sensors mounted on moving objects. However, a sensor mounted on a moving body does not have a wide recognition range and may have low recognition accuracy. On the other hand, Patent Literature 1 proposes generating wide-range and highly accurate feature information based on feature information acquired by recognition devices mounted on a plurality of vehicles.
However, erroneous recognition or recognition errors of features occur in each recognition device. Here, a recognition error is the difference between the actual position and size of a feature and the position and size of the feature indicated by the feature information obtained as the recognition result. Therefore, when integrating feature information indicating features recognized by a plurality of recognition devices, one feature may be processed as a plurality of features, and it is difficult to generate wide-range, highly accurate feature information.
An object of the present disclosure is to provide an information integration device, an information integration method, and an information integration program that make it possible to generate feature information of a wide range of features with high accuracy.
The information integration device of the present disclosure includes: a recognition unit that acquires, from a sensor provided in a moving body, sensor information acquired by the sensor, recognizes one or more first features, which are one or more features around the moving body, based on the acquired sensor information, and generates first feature information indicating the one or more first features; a same feature identification unit that acquires, from an external device, from acquirable map information, or from both the external device and the map information, second feature information indicating one or more second features, which are one or more features recognized to exist around the moving body, and identifies the same feature from among the one or more first features indicated by the first feature information and the one or more second features indicated by the second feature information; a correction information generation unit that generates correction information used to correct at least one of the first feature information and the second feature information based on the positional difference between a first feature and a second feature identified as the same feature; and an information integration unit that generates integrated feature information by integrating the first feature information and the second feature information after correcting at least one of them using the correction information.
The information integration method of the present disclosure is a method implemented by an information integration device, and includes: a step of acquiring, from a sensor provided in a moving body, sensor information acquired by the sensor, recognizing one or more first features, which are one or more features around the moving body, based on the acquired sensor information, and generating first feature information indicating the one or more first features; a step of acquiring, from an external device, from acquirable map information, or from both the external device and the map information, second feature information indicating one or more second features, which are one or more features recognized to exist around the moving body, and identifying the same feature from among the one or more first features indicated by the first feature information and the one or more second features indicated by the second feature information; a step of generating correction information used to correct at least one of the first feature information and the second feature information based on the positional difference between a first feature and a second feature identified as the same feature; and a step of generating integrated feature information by integrating the first feature information and the second feature information after correcting at least one of them using the correction information.
According to the present disclosure, it is possible to generate feature information of features in a wide range with high accuracy.
An information integration device, an information integration method, and an information integration program according to embodiments will be described below with reference to the drawings. The following embodiments are merely examples, and the embodiments can be combined as appropriate and each embodiment can be modified as appropriate.
<Configuration of Embodiment>
FIG. 1 is a functional block diagram showing the configuration of an information integration device 10 according to this embodiment. The information integration device 10 is a device capable of implementing the information integration method according to the embodiment. In the example of FIG. 1, the information integration device 10 includes a recognition unit 11, an information transmission/reception unit 12, a same feature identification unit 13, a correction information generation unit 16, an information integration unit 17, a map information storage unit 14, and a correction information storage unit 15. Details of these configurations will be described later.
FIG. 1 shows the information integration device 10 mounted on a vehicle 100 as a moving body, that is, an object having a moving function (an object that can travel). However, the information integration device 10 can also be mounted on a moving body other than the vehicle 100 (e.g., a ship, a robot, personal mobility, etc.). The information integration device 10 can also be mounted on a fixed body that does not have a moving function (e.g., infrastructure equipment). Here, the infrastructure equipment is, for example, a traffic light or a roadside device. The information integration device 10 may also be installed in a server that can communicate with moving bodies, fixed bodies, and the like via a network.
In the following description, an example in which the information integration device 10 is mounted on the vehicle 100 will be described. The vehicle 100 is equipped with the information integration device 10 and an in-vehicle sensor 20. The in-vehicle sensor 20 is composed of, for example, a sensor such as a camera, a millimeter-wave radar, Lidar (Light Detection And Ranging), or sonar, or a combination of two or more of these sensors. The in-vehicle sensor 20 and the recognition unit 11 constitute a recognition device (also referred to as "recognition means") that recognizes surrounding features.
In FIG. 1, the vehicle 100 is equipped with the information integration device 10 and the in-vehicle sensor 20, but the vehicle 100 may also be equipped with devices such as an advanced driving support device that performs emergency automatic braking, lane keeping, and the like, or an automatic driving device that performs part or all of the driving to a destination on behalf of the driver. Alternatively, the information integration device 10 may be part of a device constituting an advanced driving support device, an automatic driving device, or the like.
The information integration device 10 includes a recognition unit 11 that acquires, from the in-vehicle sensor 20 provided in the vehicle 100 as a moving body, sensor information acquired by the in-vehicle sensor 20, recognizes one or more first features, which are one or more features around the vehicle 100 (for example, on its travel route), based on the acquired sensor information, and generates first feature information indicating the one or more first features; and a same feature identification unit 13 that acquires second feature information indicating one or more second features, which are one or more features recognized to exist around the vehicle 100, from an external device (for example, the roadside unit 200), from acquirable map information (for example, the map information 14a), or from both the external device and the map information, and identifies the same feature from among the one or more first features indicated by the first feature information and the one or more second features indicated by the second feature information. The information integration device 10 further includes a correction information generation unit 16 that generates correction information used to correct at least one of the first feature information and the second feature information based on the positional difference between a first feature and a second feature identified as the same feature, and an information integration unit 17 that generates integrated feature information by integrating the first feature information and the second feature information after correcting at least one of them using the generated correction information.
FIG. 2 is a diagram showing the hardware configuration of the information integration device 10 according to this embodiment. The information integration device 10 is, for example, a computer, a dedicated computing device, or a device combining a computer and a dedicated computing device. The information integration device 10 includes a processor 31 as an information processing unit, a memory 32 as a storage unit, a storage device 33 as a nonvolatile storage unit, an interface 34, and a communication unit 35.
The processor 31 is connected to other hardware via a system bus and controls these other hardware. The processor 31 is an IC (Integrated Circuit) that performs processing. Specific examples of the processor 31 are CPU (Central Processing Unit), DSP (Digital Signal Processor), GPU (Graphics Processing Unit), or FPGA (Field Programmable Gate Array).
The processor 31 executes programs stored in the memory 32. The programs include the information integration program according to this embodiment. The functions of the information integration device 10 are implemented by the processor 31 executing the programs. The processor 31 is an example of processing circuitry.
For example, the recognition unit 11, the information transmission/reception unit 12, the same feature identification unit 13, the correction information generation unit 16, and the information integration unit 17 are implemented by the processor 31, and the map information storage unit 14 and the correction information storage unit 15 are implemented by the memory 32.
The CPU performs processes such as program execution and data calculation. DSPs perform digital signal processing such as arithmetic operations and data movement. For example, processing such as sensing of sensor data obtained from a millimeter-wave radar is desirably processed at high speed by a DSP rather than by a CPU.
A GPU is a processor specialized for image processing. A GPU is capable of high-speed image processing by processing a plurality of pieces of pixel data in parallel. A GPU can execute template matching processing, which is frequently used in image processing, at high speed. For example, sensing of sensor data obtained from a camera is desirably processed by a GPU; if it is processed by the CPU, the processing time becomes enormous. A GPU can also be used not merely as a processor for image processing but for general-purpose computing using its computational resources (GPGPU: General Purpose Computing on Graphics Processing Units). With conventional image processing techniques, there was a limit to the accuracy of detecting vehicles appearing in images, but by performing image processing with deep learning using GPGPU, vehicles can be detected with higher accuracy.
An FPGA is a processor whose logic circuit configuration is programmable. An FPGA has the properties of both a dedicated hardware arithmetic circuit and programmable software. Complex calculations and parallel processing can be executed at high speed by an FPGA.
The memory 32 is, for example, a volatile memory. Volatile memory can move data at high speed when the information integration device 10 operates. Specific examples of volatile memory are DDR2-SDRAM (Double-Data-Rate2 Synchronous Dynamic Random Access Memory) and DDR3-SDRAM (Double-Data-Rate3 Synchronous Dynamic Random Access Memory).
The storage device 33 is a nonvolatile memory and can continue to hold execution programs and data even while the power of the information integration device 10 is off. Specific examples of the nonvolatile memory are an HDD (Hard Disk Drive), an SSD (Solid State Drive), and a flash memory. The nonvolatile memory may be a portable storage medium such as an SD (Secure Digital) memory card, CF (CompactFlash (registered trademark)), NAND flash, flexible disk, optical disk, compact disc, Blu-ray (registered trademark) disc, or DVD. The memory 32 is connected to the processor 31 via a memory interface (not shown). The memory interface centrally manages memory accesses from the processor and performs efficient memory access control. The memory interface is used for data transfer in the information integration device 10.
The functions of the recognition unit 11, the information transmission/reception unit 12, the same feature identification unit 13, the correction information generation unit 16, and the information integration unit 17 shown in FIG. 1 are realized by the processor 31 executing programs. The memory 32 stores programs for realizing the functions of the recognition unit 11, the information transmission/reception unit 12, the same feature identification unit 13, the correction information generation unit 16, and the information integration unit 17. The processor 31 reads these programs from the memory 32 and executes them. The memory 32 is also used to temporarily store intermediate information of each program.
The functions of the recognition unit 11, the information transmission/reception unit 12, the identical feature identification unit 13, the correction information generation unit 16, and the information integration unit 17 may also be realized by logic circuits, which are hardware. In this case, the memory 32 stores logic circuit information, which is read and executed by the processor 31.
The processor 31 may be composed of a plurality of processors. In this case, the plurality of processors may cooperate in executing the programs that realize the functions of the recognition unit 11, the information transmission/reception unit 12, the identical feature identification unit 13, the correction information generation unit 16, and the information integration unit 17.
The memory 32 stores the map information 14a and the correction information 15a. The map information 14a is highly accurate road environment information including the types, positions, widths, sizes, and the like of road markings, signs, traffic lights, and so on. The correction information 15a is information used to correct erroneous recognition and recognition errors in feature information. However, the map information 14a and the correction information 15a may be stored in a storage device external to the information integration device 10.
One or more in-vehicle sensors 20 are attached to the vehicle 100. Each in-vehicle sensor 20 outputs, to the information integration device 10, sensor information with which the information integration device 10 recognizes the situation around the vehicle 100.
At least one of the information, data, signal values, and variable values indicating the results of the processing by the recognition unit 11, the information transmission/reception unit 12, the identical feature identification unit 13, the correction information generation unit 16, and the information integration unit 17 is stored in at least one of the memory 32 and the registers and cache memory of the processor 31.
The programs that realize the functions of the recognition unit 11, the information transmission/reception unit 12, the identical feature identification unit 13, the correction information generation unit 16, and the information integration unit 17 may be stored in a portable recording medium such as a magnetic disk, a flexible disk, an optical disc, a compact disc, a Blu-ray (registered trademark) disc, or a DVD. A portable recording medium storing these programs may be distributed commercially.
The "units" of the recognition unit 11, the information transmission/reception unit 12, the identical feature identification unit 13, the correction information generation unit 16, and the information integration unit 17 may be read as "circuits", "steps", "procedures", or "processes".
Next, details of the recognition unit 11, the information transmission/reception unit 12, the identical feature identification unit 13, the map information 14a, the correction information 15a, the correction information generation unit 16, and the information integration unit 17, which constitute the information integration device 10, will be described.
The recognition unit 11 recognizes the positions and attributes of one or more features, which are one or more recognition targets existing around the vehicle 100, a moving body (for example, in the lane in which the vehicle 100 is traveling and in other lanes alongside that lane), at a plurality of positions on the movement route as the vehicle 100 moves along that route. The recognition unit 11 may recognize the positions and attributes of one or more recognition targets existing around the movement route. A recognition target may be a feature existing near the movement route or a feature existing on the movement route, and may be a three-dimensional feature or a planar feature. Features that are recognition targets are, for example, road markings, signs, traffic lights, other vehicles, and pedestrians.
The recognition unit 11 recognizes the position and attributes of a feature as the feature information of the recognition target. The attributes of a recognition target include at least one of the type of the recognition target and the state of the recognition target. The type of the recognition target is, for example, information indicating whether the recognized object is a person, another vehicle, a sign, or the like. If the recognition target is a sign, the type may include information classifying it down to the kind of sign. If the recognition target is a vehicle, the type may include information finely classifying the vehicle type, such as an ordinary automobile, a truck, or a light automobile. As for the state of a recognition target, for example, if the recognition target is a traffic light, the attributes may include information indicating which of the red, green, and yellow lights is lit. The processing performed by the recognition unit 11 corresponds to the recognition processing in the information integration method.
The map information storage unit 14 is a storage unit that stores the highly accurate road environment map information 14a including the types, positions, widths, sizes, and the like of road markings, signs, traffic lights, and so on. The map information storage unit 14 may be a storage device managed by a server that can communicate via a network. The map information includes map information of the area in which the vehicle 100 is scheduled to travel.
The identical feature identification unit 13 compares the feature information recognized by the recognition unit 11 (also referred to as the "first feature information") with feature information recognized by a recognition device mounted on equipment other than the vehicle 100, for example a roadside unit (also referred to as the "second feature information"), and identifies identical features by comparing the one or more features included in the first feature information (also referred to as "first features") with the one or more features included in the second feature information (also referred to as "second features").
FIG. 3 is a diagram showing the actual road situation at a certain time T, that is, a plurality of features that actually exist on the road at time T. In FIG. 3, the vehicle 100 has the information integration device 10, and the roadside unit 200 has a recognition device that recognizes features. Various recognition devices can be used as the recognition device of the roadside unit 200; it may be of the same kind as the recognition device mounted on the vehicle 100.
FIG. 4(A) shows, with solid rectangular frames, a plurality of first features recognized at time T by the recognition unit 11 of the information integration device 10 of the vehicle 100. FIG. 4(B) shows, with dashed rectangular frames, a plurality of second features recognized at time T by the recognition device of the roadside unit 200. Since the recognition area of the recognition unit 11 of the information integration device 10 of the vehicle 100 and the recognition area of the recognition device of the roadside unit 200 partially overlap, the vehicle Va, a first feature recognized by the recognition unit 11 of the information integration device 10 of the vehicle 100, and the vehicle Va, a second feature recognized by the recognition device of the roadside unit 200, are the same feature. However, the position of the vehicle Va recognized by the recognition unit 11 of the information integration device 10 in FIG. 4(A) differs from the position of the vehicle Va recognized by the recognition device of the roadside unit 200 in FIG. 4(B).
Comparing the positions of the vehicle Va in FIGS. 4(A) and 4(B) with the actual position of the vehicle Va shown in FIG. 3 reveals that the position recognized by the information integration device 10 of the vehicle 100 contains an error. For this reason, it cannot be determined from the absolute position information alone that the vehicle Va recognized by the recognition unit 11 of the information integration device 10 of the vehicle 100 and the vehicle Va recognized by the recognition device of the roadside unit 200 are the same feature. Therefore, the identical feature identification unit 13 of the information integration device 10 determines from other characteristics that the vehicle Va recognized by the recognition unit 11 of the information integration device 10 and the vehicle Va recognized by the recognition device of the roadside unit 200 are the same feature.
Examples of such other characteristics include landmark information contained in the map information, such as signs, stop lines, billboards, and tunnel entrances, as well as the vehicle type, the speed, and information on relative positional relationships with other features. For example, when relative position information is used, for both the vehicle Va recognized by the recognition unit 11 of the information integration device 10 of the vehicle 100 and the vehicle Va recognized by the recognition device of the roadside unit 200, the same red vehicle Vb exists at a fixed distance in the right lane. In this way, even if there is a detection error in the position information of the feature information indicating a feature recognized by the recognition unit 11 of the information integration device 10 of the vehicle 100, the identical feature identification unit 13 can determine, based on relative positional relationships and the like, that the vehicle Va recognized by the recognition unit 11 of the information integration device 10 and the vehicle Va recognized by the recognition device of the roadside unit 200 are the same feature. The identical feature identification unit 13 identifies identical features using any one of these characteristics, a combination of two or more of them, or all of them. The processing performed by the identical feature identification unit 13 corresponds to the identical feature identification processing in the information integration device 10.
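As an illustrative sketch (not part of the patent text), matching by relative positional relationship to a shared anchor feature such as the red vehicle Vb could look like the following. The record fields (`id`, `x`, `y`, `attr`), the anchor attribute, and the distance tolerance are assumptions introduced for illustration only.

```python
import math

def match_by_relative_position(first_feats, second_feats, tol=5.0):
    """Pair features from two sources whose offsets relative to a shared
    anchor (e.g. the red vehicle Vb) agree within a tolerance, even when
    their absolute coordinates disagree between the sources."""
    # Hypothetical record format: {"id": ..., "x": ..., "y": ..., "attr": ...}
    def anchor(feats):
        return next(f for f in feats if f["attr"] == "red_vehicle")

    a1, a2 = anchor(first_feats), anchor(second_feats)
    matches = []
    for f in first_feats:
        # Offset of this feature relative to the anchor in source 1.
        r1 = (f["x"] - a1["x"], f["y"] - a1["y"])
        for g in second_feats:
            # Offset of the candidate relative to the anchor in source 2.
            r2 = (g["x"] - a2["x"], g["y"] - a2["y"])
            # Same attribute and (nearly) the same relative offset
            # -> treat as the same feature.
            if f["attr"] == g["attr"] and math.dist(r1, r2) <= tol:
                matches.append((f["id"], g["id"]))
    return matches
```

In this sketch, the two recognized copies of the vehicle Va match even though the second source reports every position shifted along the X axis, because their offsets from the red vehicle Vb are identical.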
FIG. 5(A) is a diagram showing first feature information indicating a plurality of first features recognized by the recognition unit 11 of the information integration device 10 of the vehicle 100. FIG. 5(B) is a diagram showing the corrected first feature information obtained by correcting the first feature information of FIG. 5(A). FIG. 3 shows the actual features at time T (also referred to as the "reference time"). However, since the recognition unit 11 of the information integration device 10 and the recognition device of the roadside unit 200 perform recognition at regular intervals, neither the first feature information acquired by the recognition unit 11 of the information integration device 10 nor the second feature information acquired by the roadside unit 200 is recognized exactly at the instant of time T.
In FIG. 5(A), the recognition time of the first feature information indicating the plurality of first features recognized by the recognition unit 11 of the information integration device 10 of the vehicle 100 is Ta (= T + ΔTa), which is later than the reference time T by ΔTa. The identical feature identification unit 13 performs processing that synchronizes the feature information to time T, for example based on the speed information of the vehicle 100 and the error time ΔTa. By performing this time synchronization before determining identical features, the identification rate of identical features can be increased.
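A minimal sketch of this synchronization step, under the assumption that each feature record carries per-axis velocity components (the patent text only states that speed information and the error time ΔTa are used; the field names are illustrative):

```python
def synchronize_to_reference(feats, delta_t):
    """Shift feature positions back from the recognition time
    Ta = T + delta_t to the reference time T, assuming each feature
    moved at constant velocity during delta_t."""
    synced = []
    for f in feats:
        synced.append({
            **f,
            # Position at T = position at Ta minus movement during delta_t.
            "x": f["x"] - f["vx"] * delta_t,
            "y": f["y"] - f["vy"] * delta_t,
        })
    return synced
```

The original records are left untouched; a new, time-corrected list is returned so the raw recognition result remains available.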
FIG. 6(A) is a diagram showing first feature information indicating a plurality of first features recognized by the recognition unit 11 of the information integration device 10 of the vehicle 100. FIG. 6(B) is a diagram showing second feature information indicating a plurality of second features recognized by the recognition device of the roadside unit 200. FIG. 6(C) is a diagram showing the processing of correcting the first feature information of FIG. 6(A) based on the information on the same vehicle included in the first feature information of FIG. 6(A) and the second feature information of FIG. 6(B).
Based on the identical feature identified by the identical feature identification unit 13, the correction information generation unit 16 generates correction information used to correct erroneous recognition and recognition errors in the first feature information indicating the plurality of first features recognized by the recognition unit 11 of the information integration device 10 and in the second feature information indicating the plurality of second features recognized by the recognition device of the roadside unit 200. FIGS. 6(A) to 6(C) show a case in which the recognition accuracy of the second features by the recognition device of the roadside unit 200 is higher than the recognition accuracy of the first features by the recognition unit 11 of the information integration device 10. For the vehicle Va determined to be identical by the identical feature identification unit 13, comparing the position information of the vehicle Va obtained by the vehicle 100 with the position information of the vehicle Va obtained by the roadside unit 200 reveals a difference of a distance of "50" in the X-axis direction. The correction information generation unit 16 therefore generates correction information indicating that the distance "50" must be added to the X-axis positions of all the feature information obtained by the recognition unit 11 of the information integration device 10.
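An illustrative sketch of deriving such a correction from one matched pair (the field names are assumptions; the text describes only the idea of taking the positional difference of the feature identified as identical):

```python
def derive_correction(low_acc_feat, high_acc_feat):
    """Compute the translation that maps the lower-accuracy source onto
    the higher-accuracy one, using a pair identified as the same feature.
    The returned offsets are later added to every feature of the
    lower-accuracy source."""
    return {
        "dx": high_acc_feat["x"] - low_acc_feat["x"],
        "dy": high_acc_feat["y"] - low_acc_feat["y"],
    }
```

For the example in FIGS. 6(A) to 6(C), the vehicle-side position of Va differs from the roadside-unit position by 50 along X, so this sketch would yield an X offset of 50 and a Y offset of 0.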
FIG. 7(A) is a diagram showing first feature information indicating a plurality of first features (including a sign) recognized by the recognition unit 11 of the information integration device 10. FIG. 7(B) is a diagram showing second feature information indicating a plurality of second features (including a sign) recognized by the recognition device of the roadside unit 200 or included in the map information. FIG. 7(C) is a diagram showing the processing of correcting the first feature information of FIG. 7(A) based on the sign information included in the feature information of FIG. 7(A) and in the feature information or map information of FIG. 7(B).
To generate correction information, the correction information generation unit 16 first determines which of the recognition sources — the recognition unit 11 of the information integration device 10 and the recognition device of the roadside unit 200 — require correction: any one of them, two or more of them, or all of them. The absolute position information of the sign information included in the map information is accurate position information. By comparing the position of the sign recognized by the recognition unit 11 of the information integration device 10 and the position of the sign recognized by the recognition device of the roadside unit 200 with the position of the sign included in the map information, the correction information generation unit 16 determines that the position information of the sign recognized by the recognition unit 11 of the information integration device 10 requires correction because it differs from the position information of the sign included in the map information, and that the position information of the sign recognized by the recognition device of the roadside unit 200 requires no correction because it matches the position information of the sign included in the map information. Furthermore, the correction information generation unit 16 determines that the position of the sign recognized by the recognition unit 11 of the information integration device 10 is shifted from the position of the sign included in the map information by a distance of "50" in the −X-axis direction. The correction information generation unit 16 therefore generates correction information for moving all the positions of the first features recognized by the recognition unit 11 of the information integration device 10 by a distance of "50" in the +X-axis direction. The processing performed by the correction information generation unit 16 corresponds to the correction information generation processing of the information integration device 10.
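As an illustrative sketch of this decision, reduced to one axis for brevity (the tolerance value and one-dimensional form are assumptions; the patent text simply compares the recognized sign position with the accurate map position):

```python
def needs_correction(recognized_sign_x, map_sign_x, tol=1.0):
    """Decide whether a recognition source needs correction by comparing
    its recognized landmark position with the accurate map position.
    Returns (needs_correction, offset_to_apply)."""
    # Offset that would move the recognized position onto the map position
    # (e.g. +50 when the recognition is shifted by -50 along X).
    offset = map_sign_x - recognized_sign_x
    return (abs(offset) > tol, offset)
```

In the example of FIG. 7, the vehicle-side source would return an offset of +50 (correction needed), while the roadside-unit source would return an offset of 0 (no correction needed).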
The correction information storage unit 15 is a storage unit that stores the correction information 15a generated by the correction information generation unit 16.
The information transmission/reception unit 12 transmits one or both of the first feature information indicating the first features recognized by the recognition unit 11 and the correction information generated by the correction information generation unit 16 to a recognition device mounted on another moving body or a fixed body. The information transmission/reception unit 12 is also an acquisition unit that receives one or both of feature information and correction information from a recognition device mounted on another moving body or a fixed body.
The information integration unit 17 integrates the first feature information indicating the first features recognized by the recognition unit 11 with the second feature information indicating the second features recognized by a recognition device mounted on another moving body or a fixed body and received by the information transmission/reception unit 12, thereby generating feature information covering a wide area. In particular, based on the correction information generated by the correction information generation unit 16 and the correction information generated by the recognition device mounted on the other moving body or fixed body, the information integration unit 17 corrects erroneous recognition and recognition errors in one or both of the first feature information recognized by the recognition unit 11 and the second feature information received by the information transmission/reception unit 12, and by integrating the corrected information can generate accurate, wide-area integrated feature information.
FIG. 8(A) is a diagram showing the actual road situation at a certain time T. FIG. 8(B) is a diagram showing first feature information indicating a plurality of first features recognized by the recognition unit 11 of the information integration device 10. FIG. 8(C) is a diagram showing second feature information indicating a plurality of second features recognized by the recognition device of the roadside unit 200. FIG. 9(A) is a diagram showing the corrected first feature information obtained by correcting the first feature information of FIG. 8(B). FIG. 9(B) is a diagram showing the corrected second feature information obtained by correcting the second feature information of FIG. 8(C). FIG. 9(C) is a diagram showing the integrated feature information obtained by integrating the feature information of FIGS. 9(A) and 9(B).
First, the feature information (before correction) is corrected using the correction information. In this example, for the first feature information indicating the plurality of first features recognized by the recognition unit 11 of the information integration device 10 of the vehicle 100, correction information is stored specifying that the positions of the plurality of first features are to be moved by a distance of "50" in the +X-axis direction, that is, corrected by adding "50" to their X coordinates. FIG. 9(A) shows the first feature information corrected based on this correction information. By integrating the corrected first feature information shown in FIG. 9(A) with the corrected second feature information shown in FIG. 9(B), it is possible to generate integrated feature information (FIG. 9(C)) indicating features that accurately reflect the actual road situation. The processing performed by the information integration unit 17 corresponds to the information integration processing.
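The correct-then-integrate step above can be sketched as follows; this is an illustrative reduction (record fields and the keep-first merge policy for features seen by both sources are assumptions, not the patent's prescribed method):

```python
def apply_correction(feats, dx):
    """Shift every feature's X coordinate by the stored correction offset,
    e.g. dx = 50 for the example of FIG. 9(A)."""
    return [{**f, "x": f["x"] + dx} for f in feats]

def integrate(first_feats, second_feats):
    """Merge the two corrected sets into wide-area feature information,
    keeping one record per feature id so a feature seen by both sources
    appears only once in the integrated result."""
    merged = {f["id"]: f for f in first_feats}
    for g in second_feats:
        merged.setdefault(g["id"], g)
    return list(merged.values())
```

A feature observed only by the roadside unit survives integration, which is what extends the result beyond the vehicle's own recognition area.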
<Operation of Embodiment>
Next, the operation of the information integration device 10 according to this embodiment will be described. FIG. 10 is a flowchart showing the operation of the information integration device 10.
First, in step S10, the recognition unit 11 performs recognition processing of one or more first features existing around the vehicle 100 based on the output of the in-vehicle sensors 20 to acquire the first feature information, and the information transmission/reception unit 12 acquires the second feature information about one or more second features.
In step S11, the identical feature identification unit 13 determines whether correction information has already been generated (that is, whether correction information exists) for the first feature information generated by the recognition unit 11 and the second feature information generated by a recognition device mounted on another moving body or a fixed body and received by the information transmission/reception unit 12. If correction information has already been generated (that is, correction information exists), the processing proceeds to step S12; if not (that is, no correction information exists), the processing proceeds to step S14.
In step S12, the correction information generation unit 16 determines whether there is a change (that is, a difference) between the environment at the time the correction information was generated and the current environment. Correction information depends heavily on the environment. For example, the recognition error when the vehicle is traveling at 60 km/h differs greatly from the recognition error when the vehicle is traveling at 10 km/h. For this reason, already-generated correction information can be used to correctly correct erroneous detection and recognition errors in feature information only when the current environment is the same as, or equivalent to, the environment in which that correction information was generated (that is, when the difference between the environments is within a predetermined range). Here, the "environment" refers to the traveling state of the moving body (speed, acceleration/deceleration, steering angle, and so on), the road environment (straight section, curve, near an intersection, inside a tunnel, and so on), the weather, and the like. After this determination, the processing proceeds to step S13.
In step S13, the correction information generation unit 16 determines whether correction information needs to be generated, according to the presence or absence of the environmental change determined in step S12. If there is no environmental change (NO in step S13), the already-generated correction information is valid, and the processing proceeds to step S17. If there is an environmental change (YES in step S13), correction information must be generated, and the processing proceeds to step S14.
In step S14, the identical feature identification unit 13 corrects the recognition time included in the first feature information indicating the one or more first features recognized by the recognition unit 11 and the recognition time of the second feature information indicating the one or more second features received by the information transmission/reception unit 12, performing time correction, which is time synchronization processing. That is, based on the recognition time of the first feature information and the recognition time of the second feature information, the identical feature identification unit 13 performs time correction that brings the first feature information and the second feature information to the same point in time.
In step S15, the identical feature identification unit 13 identifies identical features in the time-corrected first feature information and second feature information. Examples of methods for identifying identical features include using feature points of a high-precision map to identify, as identical, features whose relative distances from a feature point match, and identifying identical features from the relative relationships among a plurality of features.
In step S16, the correction information generation unit 16 generates, on the basis of the information on the identical feature identified by the identical feature identification unit 13 and the feature information obtained by the plurality of recognition devices, correction information for correcting erroneous recognition and recognition error in the feature information indicating the features recognized by the recognition unit 11 of the information integration device 10 and by the recognition device of the roadside unit 200. As one means of generating correction information, if the positions of the features identified as identical do not match, the correction information aligns the feature information from the recognition device with lower reliability to the position given by the feature information with higher reliability. The generated correction information is stored in the correction information storage unit 15.
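As a non-limiting illustration (not part of the original disclosure), the correction information of step S16 could be sketched as a translation offset that moves the less reliable source onto the more reliable one, averaged over the matched feature pairs; the averaged-offset model and the reliability scores are illustrative assumptions:

```python
def make_correction(pairs, reliability_a, reliability_b):
    """Derive an (dx, dy) offset that aligns the lower-reliability source
    with the higher-reliability one, averaged over matched feature pairs.
    pairs: list of ((xa, ya), (xb, yb)) features identified as identical."""
    if not pairs:
        return (0.0, 0.0)
    if reliability_a >= reliability_b:
        # move source B onto source A
        dx = sum(a[0] - b[0] for a, b in pairs) / len(pairs)
        dy = sum(a[1] - b[1] for a, b in pairs) / len(pairs)
    else:
        # move source A onto source B
        dx = sum(b[0] - a[0] for a, b in pairs) / len(pairs)
        dy = sum(b[1] - a[1] for a, b in pairs) / len(pairs)
    return (dx, dy)
```

The resulting offset would then be stored (here, conceptually, in the correction information storage unit 15) and reused until the environment changes.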
In step S17, the information integration unit 17 uses the correction information to correct erroneous recognition and recognition error appearing in the first feature information generated by the recognition unit 11 of the information integration device 10 and in the second feature information generated by the recognition device of the roadside unit 200.
In step S18, the information integration unit 17 integrates the corrected first feature information and the corrected second feature information, thereby generating integrated feature information that is accurate over a wide area.
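As a non-limiting illustration (not part of the original disclosure), steps S17 and S18 together could be sketched as applying the correction offset to one source and then merging the two feature lists, keeping a single copy of features already identified as identical; every name and structure here is hypothetical:

```python
def integrate(features_a, features_b, offset_b, matched_b):
    """Apply the correction offset to source B (step S17), then merge the
    two feature lists into integrated feature information (step S18).
    features_*: lists of (x, y) tuples; matched_b: set of indices in B
    already identified as identical to a feature in A."""
    corrected_b = [(x + offset_b[0], y + offset_b[1]) for x, y in features_b]
    merged = list(features_a)
    for i, f in enumerate(corrected_b):
        if i not in matched_b:  # matched features are already represented in A
            merged.append(f)
    return merged
```

Features seen only by the roadside unit thus extend the vehicle's view, which is how the integrated feature information covers a wider area than either source alone.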
<Effect of Embodiment>
As described above, according to the information integration device 10 of the present embodiment, the first feature information indicating one or more first features recognized by the recognition unit 11 of the information integration device 10 of the vehicle 100 is compared with the second feature information indicating one or more second features recognized by the recognition device of the roadside unit 200; at least one of the first feature information and the second feature information is corrected based on the result of the comparison; and the first feature information and the second feature information are then integrated to generate integrated feature information. Therefore, the information integration device 10 can accurately grasp feature information over a wide area.
A moving body using the information integration device 10 according to the present embodiment can grasp feature information around itself and provide appropriate feature information to a driving support device that generates, based on the grasped feature information, driving support information for driving the moving body.
Similarly, a moving body using the information integration device 10 according to the present embodiment can grasp feature information around itself and provide appropriate feature information to an autonomous traveling device that generates, based on the grasped feature information, autonomous traveling information for the autonomous traveling of the moving body.
10 information integration device, 11 recognition unit, 12 information transmission/reception unit (communication unit), 13 identical feature identification unit, 14 map information storage unit, 14a map information, 15 correction information storage unit, 15a correction information, 16 correction information generation unit, 17 information integration unit, 20 vehicle-mounted sensor, 100 vehicle, 200 roadside unit, 300 sign.
Claims (11)
- An information integration device comprising:
a recognition unit that acquires, from a sensor provided in a moving body, sensor information acquired by the sensor, recognizes one or more first features that are one or more features around the moving body based on the acquired sensor information, and generates first feature information indicating the one or more first features;
an identical feature identification unit that acquires, from an external device, from obtainable map information, or from both the external device and the map information, second feature information indicating one or more second features that are one or more features recognized as existing around the moving body, and identifies the same feature from among the one or more first features indicated by the first feature information and the one or more second features indicated by the second feature information;
a correction information generation unit that generates correction information used for correcting at least one of the first feature information and the second feature information, based on a positional difference between a first feature and a second feature identified as the same feature; and
an information integration unit that corrects at least one of the first feature information and the second feature information using the correction information, and then generates integrated feature information by integrating the first feature information and the second feature information.
- The information integration device according to claim 1, wherein the first feature information includes positions and attributes of the one or more first features, and the second feature information includes positions and attributes of the one or more second features.
- The information integration device according to claim 1 or 2, wherein the moving body is a travelable vehicle, and the external device is one or more fixed recognition devices, one or more recognition devices provided in another moving body, or a combination of one or more fixed recognition devices and one or more recognition devices provided in another moving body, and recognizes the one or more second features to generate the second feature information.
- The information integration device according to any one of claims 1 to 3, wherein the identical feature identification unit performs time correction that corrects the first feature information and the second feature information to information at the same time based on a recognition time of the first feature information and a recognition time of the second feature information, and, after the time correction has been performed, performs the process of identifying the same feature based on the first feature information and the second feature information.
- The information integration device according to any one of claims 1 to 4, wherein the map information includes map information of an area in which the moving body is scheduled to travel.
- The information integration device according to any one of claims 1 to 5, wherein the map information is information stored in a storage device provided in the moving body or in an information providing server on a network.
- The information integration device according to any one of claims 1 to 6, further comprising a communication unit that transmits at least one of the first feature information and the correction information to the external device.
- The information integration device according to any one of claims 1 to 7, wherein the identical feature identification unit determines whether the correction information has already been generated, and the correction information generation unit generates the correction information when the correction information has not yet been generated.
- The information integration device according to claim 8, wherein, when the correction information has already been generated, the information integration unit performs correction using the already generated correction information if a difference between the environment of the moving body at the time the correction information was generated and the current environment of the moving body is within a predetermined range.
- An information integration method implemented by an information integration device, the method comprising:
a step of acquiring, from a sensor provided in a moving body, sensor information acquired by the sensor, recognizing one or more first features that are one or more features around the moving body based on the acquired sensor information, and generating first feature information indicating the one or more first features;
a step of acquiring, from an external device, from obtainable map information, or from both the external device and the map information, second feature information indicating one or more second features that are one or more features recognized as existing around the moving body, and identifying the same feature from among the one or more first features indicated by the first feature information and the one or more second features indicated by the second feature information;
a step of generating correction information used for correcting at least one of the first feature information and the second feature information, based on a positional difference between a first feature and a second feature identified as the same feature; and
a step of correcting at least one of the first feature information and the second feature information using the correction information, and then generating integrated feature information by integrating the first feature information and the second feature information.
- An information integration program that causes a computer to execute:
a step of acquiring, from a sensor provided in a moving body, sensor information acquired by the sensor, recognizing one or more first features that are one or more features around the moving body based on the acquired sensor information, and generating first feature information indicating the one or more first features;
a step of acquiring, from an external device, from obtainable map information, or from both the external device and the map information, second feature information indicating one or more second features that are one or more features recognized as existing around the moving body, and identifying the same feature from among the one or more first features indicated by the first feature information and the one or more second features indicated by the second feature information;
a step of generating correction information used for correcting at least one of the first feature information and the second feature information, based on a positional difference between a first feature and a second feature identified as the same feature; and
a step of correcting at least one of the first feature information and the second feature information using the correction information, and then generating integrated feature information by integrating the first feature information and the second feature information.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021569909A JP7126629B1 (en) | 2021-06-30 | 2021-06-30 | Information integration device, information integration method, and information integration program |
PCT/JP2021/024695 WO2023276025A1 (en) | 2021-06-30 | 2021-06-30 | Information integration device, information integration method, and information integration program |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/024695 WO2023276025A1 (en) | 2021-06-30 | 2021-06-30 | Information integration device, information integration method, and information integration program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023276025A1 true WO2023276025A1 (en) | 2023-01-05 |
Family
ID=83047414
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/024695 WO2023276025A1 (en) | 2021-06-30 | 2021-06-30 | Information integration device, information integration method, and information integration program |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP7126629B1 (en) |
WO (1) | WO2023276025A1 (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020012207A1 (en) * | 2018-07-11 | 2020-01-16 | 日産自動車株式会社 | Driving environment information generation method, driving control method, driving environment information generation device |
WO2020058735A1 (en) * | 2018-07-02 | 2020-03-26 | 日産自動車株式会社 | Driving support method and driving support device |
- 2021
- 2021-06-30 JP JP2021569909A patent/JP7126629B1/en active Active
- 2021-06-30 WO PCT/JP2021/024695 patent/WO2023276025A1/en active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020058735A1 (en) * | 2018-07-02 | 2020-03-26 | 日産自動車株式会社 | Driving support method and driving support device |
WO2020012207A1 (en) * | 2018-07-11 | 2020-01-16 | 日産自動車株式会社 | Driving environment information generation method, driving control method, driving environment information generation device |
Also Published As
Publication number | Publication date |
---|---|
JPWO2023276025A1 (en) | 2023-01-05 |
JP7126629B1 (en) | 2022-08-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10816984B2 (en) | Automatic data labelling for autonomous driving vehicles | |
JP6602352B2 (en) | Decision improvement system based on planning feedback for autonomous vehicles | |
CN108688660B (en) | Operating range determining device | |
JP7040374B2 (en) | Object detection device, vehicle control system, object detection method and computer program for object detection | |
US20190317519A1 (en) | Method for transforming 2d bounding boxes of objects into 3d positions for autonomous driving vehicles (advs) | |
US20180136643A1 (en) | Visual communication system for autonomous driving vehicles (adv) | |
US20150153184A1 (en) | System and method for dynamically focusing vehicle sensors | |
JP6757442B2 (en) | Lane post-processing in self-driving cars | |
US11738747B2 (en) | Server device and vehicle | |
JP6522255B1 (en) | Behavior selection apparatus, behavior selection program and behavior selection method | |
CN115104138A (en) | Multi-modal, multi-technology vehicle signal detection | |
KR20200084446A (en) | Electronic apparatus for self driving of vehicles based on intersection node and thereof control method | |
JP2022139009A (en) | Drive support device, drive support method, and program | |
US20220388506A1 (en) | Control apparatus, movable object, control method, and computer-readable storage medium | |
US20210206392A1 (en) | Method and device for operating an automated vehicle | |
JP7192541B2 (en) | Information processing device, information processing method and program | |
JP7126629B1 (en) | Information integration device, information integration method, and information integration program | |
US20240190475A1 (en) | Travel area determination device and travel area determination method | |
JP2023145103A (en) | Information processing apparatus, moving body, system, information processing method, program, and server | |
JP6861911B2 (en) | Information processing equipment, information processing methods and information processing programs | |
KR20230018005A (en) | Device and Method for Generating Lane Information | |
Alrousan et al. | Multi-Sensor Fusion in Slow Lanes for Lane Keep Assist System | |
US20210191423A1 (en) | Self-Location Estimation Method and Self-Location Estimation Device | |
EP4064220A1 (en) | Method, system and device for detecting traffic light for vehicle | |
WO2020073272A1 (en) | Snapshot image to train an event detector |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2021569909 Country of ref document: JP Kind code of ref document: A |
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21948332 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 21948332 Country of ref document: EP Kind code of ref document: A1 |