
WO2022196352A1 - Information communication device, moving body, and straddle-type vehicle - Google Patents

Information communication device, moving body, and straddle-type vehicle

Info

Publication number
WO2022196352A1
WO2022196352A1 (PCT/JP2022/008697)
Authority
WO
WIPO (PCT)
Prior art keywords
information
image
unit
target
processing unit
Prior art date
Application number
PCT/JP2022/008697
Other languages
French (fr)
Japanese (ja)
Inventor
虎喜 岩丸
翔 田島
直輝 沖本
圭一 水村
Original Assignee
本田技研工業株式会社
Priority date
Filing date
Publication date
Application filed by 本田技研工業株式会社
Priority to JP2023506949A (JP7512513B2)
Publication of WO2022196352A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/01 Detecting movement of traffic to be counted or controlled
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/09 Arrangements for giving variable traffic instructions
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y 10/00 Economic sectors
    • G16Y 10/40 Transportation

Definitions

  • the present invention relates to information communication devices, mobile bodies, and straddle-type vehicles.
  • a control device that controls automatic driving of a vehicle receives change information about the road environment from an external server, restricts automatic driving in the changed section when the change to the road environment is temporary, and updates the road structure map to the latest version when the change is not temporary.
  • the change information related to the road environment consists of an enormous amount of road information. If update processing is executed every time, even for a section that has undergone only a temporary change due to minor road construction that does not alter the road shape, the frequency of downloading change information from the external server and updating the vehicle's road structure map increases. As a result, the communication cost between the external server and the vehicle's control device increases, and an excessive load is concentrated on the external server, the vehicle's communication device, and the vehicle's information processing device (CPU) that performs the update processing.
  • the present invention provides an information communication technology capable of reducing the communication load with the server.
  • An information communication device is an information communication device capable of communicating with a server, comprising: an image acquisition unit that acquires an image of the external environment of a mobile object; an image processing unit that extracts targets around the mobile object by analyzing the image acquired by the image acquisition unit; a storage unit that accumulates information on a plurality of targets extracted by the image processing unit each time the mobile object moves; an encoding unit that encodes change information between the target information accumulated in the storage unit and a new target extracted by analyzing a newly acquired image; and a communication unit that transmits the change information encoded by the encoding unit to the server.
  • FIG. 2 is a diagram showing an arrangement example of the imaging units
  • FIG. 3 is a diagram explaining an example of target extraction when a moving body moves
  • FIG. 4 is a diagram schematically showing a plurality of pieces of image detection information (image detection library)
  • FIG. 5 is a diagram explaining the processing flow of the information communication device
  • FIG. 1 is a diagram showing a configuration example of an information communication system SYM.
  • the information communication system SYM includes an information processing device 10 functioning as a server, and a mobile unit 30 having an information communication device 20 capable of communicating with the information processing device 10.
  • the mobile body 30 includes a vehicle, and in the following description, an example of a straddle-type vehicle will be described as the mobile body 30 .
  • the information processing apparatus 10 has, as a hardware configuration, a processing unit 11 (CPU: Central Processing Unit), a communication unit 12, and a storage unit 13. The processing unit 11 performs various processes for controlling the entire apparatus.
  • the communication unit 12 can transmit and receive data to and from the information communication device 20 of the mobile object 30 via the network NET.
  • the communication unit 12 can transmit various information including predetermined road information in response to a data request from the information communication device 20 .
  • the storage unit 13 includes a ROM (Read Only Memory) for storing programs to be executed by the processing unit 11, a RAM (Random Access Memory) that serves as a work area for storing various information when the processing unit 11 executes the programs, and a large-capacity storage section for storing various information including map information and predetermined road information.
  • the storage unit 13 can be configured by a memory card, flash memory, HDD (Hard Disk Drive), or the like.
  • the information processing device 10 can store information acquired through communication with the information communication device 20 via the network NET in the storage unit 13. The processing unit 11 of the information processing device 10 can also update the information (map information) stored in the storage unit 13 based on information acquired through communication with the information communication device 20. Based on the information acquired through communication with the information communication device 20, the processing unit 11 generates map information that reflects local changes such as locations where road construction is being performed and locations where the pavement condition of the road has deteriorated, for example where there are holes in the road.
  • the information communication device 20 has a processing unit 21 (CPU: Central Processing Unit), a communication unit 22, a storage unit 23, imaging units 24A and 24B, and a position information acquisition unit 25.
  • the position information acquisition unit 25 includes a gyro sensor and a GPS sensor, and the gyro sensor detects rotational motion of the moving body 30. Also, the GPS sensor detects the current location information (latitude and longitude) of the mobile object.
  • the processing unit 21 can determine the route of the mobile object 30 (the road along which the mobile object 30 moves (moving route)) and the positional information of the mobile object 30 on the movement route based on the detection results of the gyro sensor and the GPS sensor.
  • FIG. 2 is a diagram showing an arrangement example of the imaging units 24A and 24B.
  • a plurality of imaging units 24A are provided in the peripheral portion of the vehicle body so as to be able to capture an image showing the surroundings of the moving body 30.
  • the plurality of imaging units 24A are provided so that their imaging areas cover the entire area (360 degrees) around the moving body 30. That is, the plurality of imaging units 24A are provided so that the imaging regions of two adjacent imaging units 24A partially overlap each other.
  • the directivity direction of the imaging section 24A is schematically indicated by a dashed line, but the actual detection range of the imaging section 24A is assumed to be wider than that illustrated.
  • the imaging units 24B are provided in front of and behind the occupant's seat so that the occupant of the moving body 30 can be imaged from the front and rear respectively.
  • the pointing direction of the imaging section 24B is schematically indicated by a dashed line, but the actual detection range of the imaging section 24B is assumed to be wider than that illustrated.
  • the imaging units 24A and 24B acquire (image) an image with an angle of view including the entire area (360 degrees) around the moving body 30 with respect to the movement direction of the moving body 30.
  • the imaging units 24A and 24B are capable of capturing still images or moving images.
  • a digital camera equipped with an image sensor such as a CMOS (Complementary Metal-Oxide Semiconductor) sensor or a CCD (Charge Coupled Device) sensor is used for the imaging units 24A and 24B.
  • the storage unit 23 stores programs executed by the processing unit 21, various information received from the information processing apparatus 10, image detection library information used by the processing unit 21 (image processing unit 212), and the results of various analysis processes performed by the processing unit 21.
  • the processing unit 21 of the information communication device 20 has a control unit 211, an image processing unit 212, and an encoding unit 213 as functional configurations. These functional configurations are realized by executing a predetermined program read from the storage unit 23 by the processing unit 21 of the information communication device 20 .
  • the components of the functional configuration of the information communication device 20 may be configured by integrated circuits or the like as long as they perform similar functions.
  • the control unit 211 performs various processes for controlling travel of the mobile object 30 and controlling the information communication device 20 as a whole.
  • the image processing unit 212 performs image processing on the images captured by the imaging units 24A and 24B.
  • the image processing unit 212 extracts targets (objects) around the moving body 30 by analyzing the images captured by the imaging units 24A and 24B.
  • the image processing unit 212 can, by analyzing the image, extract as a target, for example, the state of the road (the state of the pavement, the state of lane markings such as white lines on the road, the presence or absence of potholes including their shape and size, and the presence or absence of scattered oil, sand, and the like).
  • the image processing unit 212 can also, by analyzing the image, extract as targets, for example, objects on the road (other moving bodies that are parked, fallen objects, broken-down vehicles, construction sites, etc.) and objects arranged at the side of and around the road (road signs, plantings, etc.).
  • the image processing unit 212 can extract, for example, road conditions based on weather phenomena (floods, snowfall, precipitation, puddles, frozen road surfaces, etc.) as targets by analyzing images.
  • the image processing unit 212 can also extract, for example, creatures that have entered the road (animals such as dogs, cats, horses, and cows that have entered the road) as targets by analyzing images.
  • FIG. 3 is a diagram explaining an example of target extraction when the moving body 30 moves.
  • ST31 schematically shows a state in which the moving body 30 is moving on a predetermined road (moving route), and the imaging sections 24A and 24B are performing imaging while the moving body 30 is moving.
  • ST32 shows an example of extracting a target obtained by analyzing an image captured while moving at position P1 on the movement route.
  • when extracting targets by image processing, the image processing unit 212 refers to image detection information (an image detection library) in which targets that can exist in an image corresponding to the imaging position have been acquired in advance, and extracts targets based on analysis of the image.
  • in the image detection information (image detection library), targets that can exist in the image corresponding to each position are acquired and stored in advance over the entire predetermined road (movement route).
  • the image detection information (image detection library) is classified into multiple types based on, for example, driving region (for example, by country, domestic driving area), driving time (for example, by season), and road (moving route) information.
  • Image detection information (image detection library) is stored in the storage unit 23 . It is also possible to acquire a plurality of pieces of image detection information (image detection library) through communication with the server and update the information stored in the storage unit 23 .
  • FIG. 4 is a diagram schematically showing a plurality of pieces of image detection information (image detection library) stored in the storage unit 23.
  • the image processing unit 212 can specify the driving area and the road (movement route) based on the position information (latitude, longitude) acquired by the position information acquisition unit 25, and can further specify the driving time based on an internal timer that holds date and time information.
  • the image processing unit 212 can acquire the corresponding image detection information (image detection library) from the storage unit 23 .
  • in the image detection information (image detection library), targets that can exist in the image corresponding to each position are acquired and stored in advance over each road from its start point to its end point (P0-PX in ST31 of FIG. 3).
  • the image processing unit 212 can specify the imaging position (position P1) based on the information acquired by the position information acquisition unit 25, and extracts targets based on analysis of the image while referring to the targets that can exist in the image corresponding to the imaging position (position P1) in the image detection information.
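As an illustration of how such a lookup might be organized, the following Python sketch keys the stored image detection information by driving region, season, and route and returns the targets expected at a given position. The data structure and every name in it are assumptions made for illustration; the patent text does not specify how the library is implemented.

```python
# Minimal sketch of an image detection library lookup, assuming the libraries
# are keyed by (driving region, season, route) and indexed by position.
# Class, method, and key names are illustrative, not taken from the patent.
from dataclasses import dataclass, field
from typing import Dict, FrozenSet, Tuple


@dataclass
class ImageDetectionLibrary:
    # (region, season, route) -> {position id -> targets that can exist there}
    entries: Dict[Tuple[str, str, str], Dict[str, FrozenSet[str]]] = field(default_factory=dict)

    def expected_targets(self, region: str, season: str, route: str,
                         position: str) -> FrozenSet[str]:
        """Targets that may exist in an image captured at `position`."""
        return self.entries.get((region, season, route), {}).get(position, frozenset())


# Usage: select the entry matching the current driving area, season, and route
# (determined from GPS and the internal timer), then look up imaging position P1.
library = ImageDetectionLibrary({
    ("JP", "winter", "route_P0_PX"): {
        "P1": frozenset({"target1", "target2", "target3", "target4", "target5"}),
    },
})
print(library.expected_targets("JP", "winter", "route_P0_PX", "P1"))
```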
  • in ST31 of FIG. 3, a predetermined road (movement route: start point P0 to end point PX) is illustrated.
  • in ST32 of FIG. 3, the targets (target 1, target 2, target 3, ..., target 8) extracted from images captured at position P1 during each of a predetermined plurality of movements of the moving body 30 along the movement route are illustrated.
  • in the example of ST32 in FIG. 3, five movements are shown as the predetermined plurality of movements, and the image processing unit 212 stores the targets extracted on each imaging occasion in the storage unit 23. Note that the number of movements is not limited to five, and any plural number can be set.
  • the image processing unit 212 determines the normal environment information at position P1 based on the target information stored in the storage unit 23. As shown in ST32, the extracted targets vary from one imaging occasion to another: a target that was only temporarily present moves away with the passage of time, is not captured at the next imaging timing, and is therefore not extracted as a target. The image processing unit 212 determines targets whose existence probability is equal to or greater than a predetermined threshold as the normal environment information at position P1.
  • in the extraction example of ST32, the image processing unit 212 determines targets 1 to 5, whose existence probability is equal to or greater than the threshold (80%), as the normal environment information at position P1.
  • the image processing unit 212 stores the normal environment information at the determined position P1 in the storage unit 23.
  • normal environment information at the imaging position (position P1) is also referred to as baseline information.
  • the threshold indicating the predetermined existence probability is not limited to 80%, and any threshold can be set.
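A minimal sketch of this baseline determination is shown below, assuming each pass over the imaging position yields a set of extracted target identifiers and that the storage unit simply keeps one such set per pass. The function name and the default 80% threshold mirror the example above; everything else is illustrative.

```python
# Sketch of determining the normal environment information (baseline) at one
# imaging position: keep a target only if it appears in at least `threshold`
# of the accumulated passes (0.8 corresponds to the 80% example in the text).
from collections import Counter
from typing import Iterable, Set


def determine_baseline(passes: Iterable[Set[str]], threshold: float = 0.8) -> Set[str]:
    passes = list(passes)
    counts = Counter(target for targets in passes for target in targets)
    return {t for t, n in counts.items() if n / len(passes) >= threshold}
```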
  • after the normal environment information (baseline information) at the imaging position (position P1) has been determined over a plurality of movements (N = 5), when the moving body 30 moves along the movement route again (the sixth and subsequent movements), the image processing unit 212 extracts targets (new targets) from the image captured at position P1. The image processing unit 212 then compares the extracted targets (new targets) with the normal environment information (baseline information) and acquires change information relative to the normal environment information.
  • for example, when the targets extracted in the sixth or a subsequent movement are targets 1 to 5 and target 9, the image processing unit 212 compares them with the normal environment information (targets 1 to 5) and acquires target 9 as the change information.
  • the image processing unit 212 associates the acquired change information (target 9) with the position information of the moving body 30 (P1: latitude and longitude) detected by the position information acquisition unit 25 and stores them in the storage unit 23.
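The comparison against the baseline can then be expressed as a simple set difference, tagged with the imaging position so it can be encoded and uploaded. The function and field names below are illustrative, not from the patent.

```python
# Sketch of change-information extraction: targets extracted from a newly
# captured image that are not in the baseline become the change information,
# associated with the latitude/longitude of the imaging position.
from typing import Dict, Set, Tuple


def extract_change_info(new_targets: Set[str], baseline: Set[str],
                        position: Tuple[float, float]) -> Dict[str, object]:
    return {"position": position, "changed_targets": new_targets - baseline}
```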
  • the encoding unit 213 performs encoding processing of information for transmission (uploading) to the information processing device 10 based on analysis of image processing by the image processing unit 212 .
  • the information to be transmitted (uploaded) to the information processing apparatus 10 is the change information acquired by the analysis of the image processing unit 212, and the encoding unit 213 encodes the change information and the position information associated with the change information.
  • the encoding unit 213 performs encoding that converts the change information (for example, target 9) into numerical information indicating the content of its type, according to the type of the change information.
  • the encoding unit 213 converts, for example, an event indicating the state of the road (state of pavement, presence or absence of depressions, etc.) at the imaging position into numerical information (“01”).
  • the encoding unit 213 also converts an event indicating an object on the road (another parked moving object, a falling object, a broken-down vehicle, a construction site, etc.) into numerical information (“02”).
  • the encoding unit 213 converts, as the type of change information, for example, an event indicating an arrangement (road sign, planting, etc.) on the side of the road and around the road into numerical information (“03”). Further, the encoding unit 213 converts, as the type of change information, for example, an event indicating the state of the road surface based on a weather phenomenon (flood, snowfall, rainfall, etc.) into numerical information (“04”). Then, the encoding unit 213 converts, for example, an event indicating a creature entering the road into numerical information (“05”) as the type of change information.
  • the encoding unit 213 also encodes the position information (P1) associated with the change information.
  • the position information (P1) includes latitude and longitude information, and the encoding unit 213 performs encoding to convert the latitude and longitude information into numerical information.
  • "1" at the beginning indicates north latitude
  • "2" indicates south latitude
  • angle information is indicated in decimal notation.
  • the encoding unit 213 converts 135° east longitude into “3135” and converts 135° west longitude into “4135”.
  • the encoding unit 213 encodes the change information and the position information associated with the change information, and generates encoded information combining the encoded change information and the encoded position information. do.
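The following sketch follows the numeric conventions described above: event types map to the two-digit codes "01" to "05", latitude is prefixed with "1" (north) or "2" (south), longitude with "3" (east) or "4" (west), and angles are written in decimal degrees. The exact digit formatting and the order in which the encoded change information and encoded position are combined are not specified in the text and are assumed here.

```python
# Sketch of the encoding step. Event-type codes and the latitude/longitude
# prefixes follow the text; field order and number formatting are assumptions.
EVENT_CODES = {
    "road_state": "01",            # pavement condition, depressions, etc.
    "object_on_road": "02",        # parked vehicle, fallen object, construction site
    "roadside_arrangement": "03",  # road sign, planting, etc.
    "weather_road_surface": "04",  # flood, snowfall, rainfall, ...
    "creature_on_road": "05",
}


def encode_position(lat: float, lon: float) -> str:
    lat_code = ("1" if lat >= 0 else "2") + f"{abs(lat):g}"
    lon_code = ("3" if lon >= 0 else "4") + f"{abs(lon):g}"
    return lat_code + lon_code


def encode_change_info(event_type: str, lat: float, lon: float) -> str:
    # Combine the encoded event information with the encoded position information.
    return EVENT_CODES[event_type] + encode_position(lat, lon)


print(encode_position(35.0, 135.0))                       # "1353135": 35 deg N, 135 deg E
print(encode_change_info("object_on_road", 35.0, 135.0))  # "021353135"
```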
  • the communication unit 22 of the information communication device 20 functions as an interface for connecting to the network NET, and is capable of transmitting and receiving various information to and from the information processing device 10 .
  • the communication unit 22 transmits the encoded information generated by the encoding unit 213 to the information processing device 10 .
  • the processing unit 11 of the information processing device 10 can decode the encoded information acquired through communication with the information communication device 20 and update the information (map information) stored in the storage unit 13 .
  • the processing unit 11 applies a decoding method in which the transform method in the encoding process is reversed to obtain decoded information from the encoded information.
  • based on the decoded information, the processing unit 11 can generate map information that reflects local changes, such as locations where road construction is being carried out and locations where the pavement condition of the road has deteriorated, for example where there are holes in the road.
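On the server side, the reverse transform could look like the sketch below. It assumes the same layout as the encoding sketch above (a two-digit event code followed by the encoded position); the patent only states that the encoding transform is reversed, so this layout, and the note that a fixed-width or delimited position format would be needed in practice, are assumptions.

```python
# Sketch of server-side decoding, assuming the layout used in the encoding
# sketch above: a 2-digit event code followed by the encoded position string.
EVENT_NAMES = {
    "01": "road_state",
    "02": "object_on_road",
    "03": "roadside_arrangement",
    "04": "weather_road_surface",
    "05": "creature_on_road",
}


def decode(encoded: str) -> dict:
    event = EVENT_NAMES[encoded[:2]]
    # Splitting the remainder into latitude and longitude would require a
    # fixed-width or delimited layout in practice; here it is returned as-is.
    return {"event": event, "encoded_position": encoded[2:]}


print(decode("021353135"))  # {'event': 'object_on_road', 'encoded_position': '1353135'}
```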
  • FIG. 5 is a diagram for explaining the processing flow of the information communication device 20.
  • in S501, the moving body 30 starts moving.
  • in S502, the imaging units 24A and 24B start imaging to obtain images of the external environment of the moving body 30.
  • in S503, the image processing unit 212 extracts targets around the moving body 30 by analyzing the image acquired at the imaging position (for example, position P1 in FIG. 3) by the image acquisition unit (imaging units 24A and 24B).
  • here, the image processing unit 212 refers to the image detection information (image detection library) in which targets that can exist in an image corresponding to the imaging position have been acquired in advance, and extracts targets based on analysis of the image.
  • by referring to the image detection information (image detection library), targets can be extracted with high accuracy. Further, by referring to the image detection information, it is possible to estimate the outline of a target and to extract the target from the image at higher speed.
  • the storage unit 23 stores various pieces of image detection information (image detection libraries), and the image processing unit 212 can extract targets based on image analysis while referring to different image detection information depending on the travel area or travel time of the moving body 30.
  • Various image detection information (image detection library) stored in the storage unit 23 can also be updated based on information acquired from the information processing apparatus 10 (server).
  • it is also possible for the communication unit 22 to acquire image detection information (an image detection library) through communication with the information processing device 10 (server) and update the information stored in the storage unit 23, and for the image processing unit 212 to extract targets based on image analysis while referring to the updated image detection information.
  • in S504, the image processing unit 212 determines whether or not the environment information (baseline information) at the imaging position has been generated. If the environment information has been generated (S504-Yes), the process proceeds to S505.
  • in S505, the image processing unit 212 determines whether or not a predetermined period has elapsed. If the predetermined period has elapsed (S505-Yes), the process proceeds to S508 and the environment information is discarded (S508). The image processing unit 212 then returns the processing to S501 and repeats the same processing. If no new target is acquired within the predetermined period, the environment at the imaging position (P1) may have changed; in such a case, discarding the environment information makes it possible to suppress erroneous detection of a change at the imaging position.
  • if the predetermined period has not elapsed (S505-No), the image processing unit 212 advances the process to S506.
  • the predetermined time period in S505 can be changed based on the traffic information at the imaging position.
  • for example, the communication unit 22 may acquire information indicating the traffic volume of other moving bodies at the imaging position (P1), and the image processing unit 212 may set the predetermined period shorter as the traffic volume exceeds a threshold traffic volume by a larger amount, and set the predetermined period longer as the traffic volume falls further below the threshold traffic volume.
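One possible reading of this traffic-dependent adjustment is sketched below: the retention period for the environment information shrinks as traffic at the imaging position rises above the threshold and grows as it falls below it. The inverse-proportional scaling and the one-hour base value are assumptions; the text only states the direction of the adjustment.

```python
# Sketch of a traffic-dependent retention period: more traffic than the
# threshold -> shorter period, less traffic -> longer period. The scaling
# rule and constants are illustrative assumptions.
def retention_period(traffic_volume: float, threshold_volume: float,
                     base_period_s: float = 3600.0) -> float:
    ratio = max(traffic_volume, 1.0) / threshold_volume
    return base_period_s / ratio


print(retention_period(traffic_volume=200, threshold_volume=100))  # 1800.0 (shorter)
print(retention_period(traffic_volume=50, threshold_volume=100))   # 7200.0 (longer)
```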
  • the event information indicating the type of the target includes an event indicating the state of the road at the imaging position (P1), an event indicating an object on the road, an event indicating objects arranged at the side of and around the road, and an event indicating the road surface condition based on a weather phenomenon.
  • the encoding unit 213 encodes the change information generated by the image processing unit 212.
  • the encoding unit 213 converts the latitude and longitude information included in the position information into numerical information, and converts the event information into numerical information indicating the type of the target. The encoding unit 213 then generates encoded information by combining the position information converted into numerical information and the event information converted into numerical information. Since the information transmitted from the information communication device 20 to the information processing device 10 (server) is only the position information of the imaging position where the change occurred and the event information indicating the type of the target acquired by the analysis of the image processing unit 212, the load on the server can be reduced compared with the case of transmitting image data.
  • if the environment information (baseline information) at the imaging position has not been generated (S504-No), the image processing unit 212 advances the process to S510.
  • in S510, the image processing unit 212 stores (accumulates) the targets extracted by the image processing of S503 in the storage unit 23.
  • the image processing unit 212 then advances the process to S512.
  • in S512, the image processing unit 212 determines, as the environment information, the targets that are present at the imaging position in more than the threshold ratio of the pieces of target information accumulated in the storage unit 23 (ST32 in FIG. 3). After determining the environment information, the image processing unit 212 returns the process to S502 and executes the processing from S502 onward.
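Tying the steps together, the following self-contained sketch mirrors the flow of FIG. 5 for a single imaging position: accumulate targets until a baseline can be determined, then report the difference of each new pass against the baseline. The mapping to the S-numbers is approximate, and the time-based discard of stale environment information (S505/S508), the encoding, and the transmission are omitted or reduced to comments.

```python
# Self-contained sketch of the per-position processing flow (approximate
# mapping to S503-S512). Camera/GPS access, encoding, and upload are omitted.
from collections import Counter
from typing import List, Optional, Set


class ImagingPositionState:
    def __init__(self, required_passes: int = 5, threshold: float = 0.8):
        self.history: List[Set[str]] = []         # accumulated targets (S510)
        self.baseline: Optional[Set[str]] = None  # environment / baseline info
        self.required_passes = required_passes
        self.threshold = threshold

    def update(self, new_targets: Set[str]) -> Optional[Set[str]]:
        """Process the targets of one image; return change information, if any."""
        if self.baseline is not None:              # S504: baseline already generated
            change = new_targets - self.baseline   # S506: compare with baseline
            return change or None                  # S507: would be encoded and sent
        self.history.append(new_targets)           # S510: accumulate targets
        if len(self.history) >= self.required_passes:   # S512: determine baseline
            counts = Counter(t for targets in self.history for t in targets)
            self.baseline = {t for t, n in counts.items()
                             if n / len(self.history) >= self.threshold}
        return None
```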
  • the above embodiment discloses at least the following information communication device, mobile object having an information communication device, and straddle-type vehicle having an information communication device.
  • the information communication device of the above embodiment is an information communication device (20) capable of communicating with a server, comprising: an image acquisition unit (24A, 24B) that acquires an image of the external environment of a mobile object; an image processing unit (212) that extracts targets around the mobile object by analyzing the image acquired by the image acquisition unit; a storage unit (23) that accumulates information on a plurality of targets extracted by the image processing unit each time the mobile object moves; an encoding unit (213) that encodes change information between the target information accumulated in the storage unit and a new target extracted by analyzing a newly acquired image; and a communication unit (22) that transmits the change information encoded by the encoding unit to the server.
  • according to this configuration, the communication load with the server can be reduced. That is, there is no need to communicate with the server for data reception, and the communication load with the server can be reduced.
  • in the information communication device of configuration 2, the image processing unit (212) determines, as environment information, targets existing in excess of a threshold ratio among the information on the plurality of targets accumulated in the storage unit, and acquires, as the change information, a difference from the environment information by comparing the environment information with the new target.
  • according to the information communication device of configuration 2, since the normal environment at the imaging position is determined based on the target information accumulated in advance in the storage unit, it is possible to accurately detect the occurrence of a change at the imaging position.
  • according to the information communication device of configuration 3, if a new target is not acquired within a predetermined period, the environment at the imaging position may have changed; in such a case, by discarding the environment information, it is possible to suppress erroneous detection of the occurrence of a change at the imaging position.
  • according to this configuration, the environment information can be discarded in accordance with the traffic volume, so the environment information can be discarded efficiently.
  • the change information includes position information of the imaging position of the image acquisition unit where the change occurred, and event information indicating the type of the target obtained by the analysis of the image processing unit.
  • the encoding unit (213) converts latitude and longitude information included in the position information into numerical information, converts the event information into numerical information indicating the content of the type of the target, and generates encoded information by combining the position information converted into numerical information and the event information converted into numerical information.
  • according to this configuration, since the information transmitted from the information communication device to the server is only the position information of the imaging position where the change occurred and the event information indicating the type of the target obtained by the analysis of the image processing unit, it is possible to reduce the load on the server compared with transmitting image data.
  • the communication section (22) transmits the encoded information generated by the encoding section (213) to the server.
  • among the change information, the event information indicating the type of the target includes at least one of an event indicating the state of the road at the imaging position, an event indicating an object on the road, an event indicating objects placed at the side of the road and around the road, and an event indicating the state of the road surface based on a weather phenomenon.
  • according to the information communication device of configuration 8, it is possible to acquire various events occurring at the imaging position as change information and transmit them to the server.
  • the image processing unit refers to image detection information in which targets that may exist in an image corresponding to the imaging position have been acquired in advance, and extracts a target based on analysis of the image.
  • according to this configuration, targets can be extracted with high accuracy by referring to image detection information (an image detection library) in which targets that can exist in an image corresponding to the imaging position have been acquired in advance. Further, by referring to the image detection information in which the targets have been acquired in advance, the processing of estimating the outline of a target and extracting the target from the image can be performed at higher speed.
  • the image processing unit refers to image detection information that varies depending on the travel area or travel time of the mobile object, and extracts a target based on the analysis of the image.
  • according to this configuration, it becomes possible to extract targets while reflecting, in the image analysis, traffic rules such as the color of road pavement and road signs that differ depending on the driving area, and differences in the weather phenomena (flooding, snowfall, precipitation, etc.) that may occur depending on the driving season.
  • the communication unit acquires the image detection information through communication with the server and updates the information stored in the storage unit, and the image processing unit refers to the updated image detection information and extracts a target based on the analysis of the image.
  • according to this configuration, it is possible to extract targets based on image analysis that responds to changes in traffic rules, changes in the occurrence tendency of weather phenomena, and changes in the driving environment.
  • the mobile object of the above embodiment is a mobile object comprising an information communication device capable of communicating with a server, wherein the information communication device (20) comprises: an image acquisition unit (24A, 24B) that acquires an image of the external environment of the mobile object; an image processing unit (212) that extracts targets around the mobile object by analyzing the image acquired by the image acquisition unit; a storage unit (23) that accumulates information on a plurality of targets extracted by the image processing unit each time the mobile object moves; an encoding unit (213) that encodes change information between the target information accumulated in the storage unit and a new target extracted by analyzing a newly acquired image; and a communication unit (22) that transmits the change information encoded by the encoding unit to the server.
  • according to the mobile object of configuration 12, it is possible to reduce the communication load with the server. That is, there is no need to communicate with the server for data reception, and the communication load with the server can be reduced.
  • the straddle-type vehicle of the above embodiment is a straddle-type vehicle provided with an information communication device capable of communicating with a server, wherein the information communication device (20) comprises: an image acquisition unit (24A, 24B) that acquires an image of the external environment of the straddle-type vehicle; an image processing unit (212) that extracts targets around the straddle-type vehicle by analyzing the image acquired by the image acquisition unit; a storage unit (23) that accumulates information on a plurality of targets extracted by the image processing unit each time the straddle-type vehicle moves; an encoding unit (213) that encodes change information between the target information accumulated in the storage unit and a new target extracted by analyzing a newly acquired image; and a communication unit (22) that transmits the change information encoded by the encoding unit to the server.
  • according to this configuration, the communication load with the server can be reduced. That is, there is no need to communicate with the server for data reception, and the communication load with the server can be reduced.
  • a program that implements the functions of the above-described embodiment may be supplied to the information communication device via a network or a storage medium, and one or more processors in a computer of the information communication device may read and execute the program to perform the processing of the information communication device.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)

Abstract

An information communication device capable of communicating with a server, comprising: an image acquisition unit that acquires an image of an exterior environment of a moving body; an image processing unit that extracts a target in a periphery of the moving body through analysis of the image acquired by the image acquisition unit; a storage unit that accumulates information of a plurality of targets extracted by the image processing unit as the moving body travels; an encoding unit that encodes variation information between the information of the targets accumulated in the storage unit and a new target extracted through analysis of a newly acquired image; and a communication unit that transmits the variation information encoded by the encoding unit to the server.

Description

Information communication device, mobile object, and straddle-type vehicle
The present invention relates to an information communication device, a mobile object, and a straddle-type vehicle.
Patent Document 1 discloses a technique in which a control device that controls automatic driving of a vehicle receives change information about the road environment from an external server, restricts automatic driving in the changed section when the change to the road environment is temporary, and updates the road structure map to the latest version when the change is not temporary.
JP 2018-206024 A
However, since the change information related to the road environment consists of an enormous amount of road information, if update processing is executed every time, even for a section that has undergone only a temporary change due to minor road construction that does not alter the road shape, the frequency of downloading change information from the external server and updating the vehicle's road structure map increases. As a result, the communication cost between the external server and the vehicle's control device increases, and an excessive load is concentrated on the external server, the vehicle's communication device, and the vehicle's information processing device (CPU) that performs the update processing.
In view of the above problem, the present invention provides an information communication technology capable of reducing the communication load with a server.
An information communication device according to one aspect of the present invention is an information communication device capable of communicating with a server, comprising:
an image acquisition unit that acquires an image of the external environment of a mobile object;
an image processing unit that extracts targets around the mobile object by analyzing the image acquired by the image acquisition unit;
a storage unit that accumulates information on a plurality of targets extracted by the image processing unit each time the mobile object moves;
an encoding unit that encodes change information between the target information accumulated in the storage unit and a new target extracted by analyzing a newly acquired image; and
a communication unit that transmits the change information encoded by the encoding unit to the server.
According to the present invention, it is possible to reduce the communication load with the server.
FIG. 1 is a diagram showing a configuration example of an information communication system; FIG. 2 is a diagram showing an arrangement example of the imaging units; FIG. 3 is a diagram explaining an example of target extraction when a moving body moves; FIG. 4 is a diagram schematically showing a plurality of pieces of image detection information (image detection library); FIG. 5 is a diagram explaining the processing flow of the information communication device.
Hereinafter, embodiments will be described in detail with reference to the accompanying drawings. Note that the following embodiments do not limit the invention according to the claims, and not all combinations of the features described in the embodiments are essential to the invention. Two or more of the features described in the embodiments may be combined arbitrarily. The same or similar configurations are denoted by the same reference numerals, and redundant description is omitted.
FIG. 1 is a diagram showing a configuration example of an information communication system SYM. The information communication system SYM includes an information processing device 10 that functions as a server and a moving body 30 having an information communication device 20 capable of communicating with the information processing device 10. In the embodiment, the moving body 30 includes a vehicle, and in the following description a straddle-type vehicle is described as an example of the moving body 30.
(Configuration of the information processing device 10 (server))
The information processing device 10 has, as a hardware configuration, a processing unit 11 (CPU: Central Processing Unit), a communication unit 12, and a storage unit 13. The processing unit 11 performs various processes for controlling the entire device.
The communication unit 12 can transmit and receive data to and from the information communication device 20 of the moving body 30 via the network NET. The communication unit 12 can transmit various information, including predetermined road information, in response to a data request from the information communication device 20.
The storage unit 13 includes a ROM (Read Only Memory) for storing programs to be executed by the processing unit 11, a RAM (Random Access Memory) that serves as a work area for storing various information when the processing unit 11 executes the programs, and a large-capacity storage section for storing various information including map information and predetermined road information. The storage unit 13 can be configured by a memory card, a flash memory, an HDD (Hard Disk Drive), or the like.
The information processing device 10 can store information acquired through communication with the information communication device 20 via the network NET in the storage unit 13. The processing unit 11 of the information processing device 10 can also update the information (map information) stored in the storage unit 13 based on information acquired through communication with the information communication device 20. Based on the information acquired through communication with the information communication device 20, the processing unit 11 generates map information that reflects local changes such as locations where road construction is being performed and locations where the pavement condition of the road has deteriorated, for example where there are holes in the road.
(Configuration of the information communication device 20)
The information communication device 20 has a processing unit 21 (CPU: Central Processing Unit), a communication unit 22, a storage unit 23, imaging units 24A and 24B, and a position information acquisition unit 25.
Here, the position information acquisition unit 25 includes a gyro sensor and a GPS sensor. The gyro sensor detects the rotational motion of the moving body 30, and the GPS sensor detects the current position information (latitude and longitude) of the moving body. Based on the detection results of the gyro sensor and the GPS sensor, the processing unit 21 can determine the course of the moving body 30 (the road along which the moving body 30 moves, that is, the movement route) and the position information of the moving body 30 on the movement route.
FIG. 2 is a diagram showing an arrangement example of the imaging units 24A and 24B. As illustrated in FIG. 2, a plurality of imaging units 24A are provided around the vehicle body so that images showing the surroundings of the moving body 30 can be captured, and the plurality of imaging units 24A are provided so that their imaging areas cover the entire area (360 degrees) around the moving body 30. That is, the plurality of imaging units 24A are provided so that the imaging regions of two adjacent imaging units 24A partially overlap each other. In the drawing, the directivity direction of each imaging unit 24A is schematically indicated by a dashed line, but the actual detection range of the imaging unit 24A is assumed to be wider than illustrated.
The imaging units 24B are provided in front of and behind the occupant's seat so that the occupant of the moving body 30 can be imaged from the front and from the rear, respectively. In the drawing, as with the imaging units 24A, the directivity direction of each imaging unit 24B is schematically indicated by a dashed line, but the actual detection range of the imaging unit 24B is assumed to be wider than illustrated.
The imaging units 24A and 24B acquire (capture) images with an angle of view covering the entire area (360 degrees) around the moving body 30 with respect to the movement direction of the moving body 30. The imaging units 24A and 24B can capture still images or moving images. For the imaging units 24A and 24B, for example, a digital camera equipped with an image sensor such as a CMOS (Complementary Metal-Oxide Semiconductor) sensor or a CCD (Charge Coupled Device) sensor is used.
The storage unit 23 stores programs executed by the processing unit 21, various information received from the information processing device 10, image detection library information used by the processing unit 21 (image processing unit 212), and the results of various analysis processes performed by the processing unit 21.
The processing unit 21 of the information communication device 20 has, as functional configurations, a control unit 211, an image processing unit 212, and an encoding unit 213. These functional configurations are realized by the processing unit 21 of the information communication device 20 executing a predetermined program read from the storage unit 23. The components of the functional configuration of the information communication device 20 may instead be configured by integrated circuits or the like as long as they perform the same functions.
The control unit 211 performs various processes for controlling travel of the moving body 30 and for controlling the information communication device 20 as a whole.
The image processing unit 212 performs image processing on the images captured by the imaging units 24A and 24B. The image processing unit 212 extracts targets (objects) around the moving body 30 by analyzing the images captured by the imaging units 24A and 24B. By analyzing an image, the image processing unit 212 can extract as a target, for example, the state of the road (the state of the pavement, the state of lane markings such as white lines on the road, the presence or absence of potholes including their shape and size, and the presence or absence of scattered oil, sand, and the like).
By analyzing an image, the image processing unit 212 can also extract as targets, for example, objects on the road (other moving bodies that are parked, fallen objects, broken-down vehicles, construction sites, etc.) and objects arranged at the side of and around the road (road signs, plantings, etc.).
By analyzing an image, the image processing unit 212 can also extract as targets, for example, road surface conditions caused by weather phenomena (floods, snowfall, precipitation, puddles, frozen road surfaces, etc.). In addition, the image processing unit 212 can extract as targets, for example, creatures that have entered the road (animals such as dogs, cats, horses, and cows).
 図3は移動体30が移動する際の物標の抽出例を説明する図である。ST31は移動体30が所定の道路(移動経路)を移動している状態を模式的に示しており、移動体30の移動中に撮像部24A、24Bにより撮像が行われている。ST32は、移動経路上の位置P1を移動時に撮像された画像の解析により取得された物標の抽出例を示している。 FIG. 3 is a diagram explaining an example of target extraction when the moving body 30 moves. ST31 schematically shows a state in which the moving body 30 is moving on a predetermined road (moving route), and the imaging sections 24A and 24B are performing imaging while the moving body 30 is moving. ST32 shows an example of extracting a target obtained by analyzing an image captured while moving at position P1 on the movement route.
 画像処理部212は画像処理により物標を抽出する際に、撮像位置に対応した画像中に存在し得る物標が予め取得された画像検出情報(画像検出ライブラリ)を参照して、画像の解析に基づいた物標の抽出を行う。 When extracting a target by image processing, the image processing unit 212 refers to image detection information (image detection library) in which a target that can exist in an image corresponding to an imaging position is acquired in advance, and analyzes the image. Targets are extracted based on
 画像検出情報(画像検出ライブラリ)には、所定の道路(移動経路)の全体にわたり、各位置に対応した画像中に存在し得る物標が予め取得され保持されている。画像検出情報(画像検出ライブラリ)は、例えば、走行地域(例えば、国ごと、国内の走行エリア)、走行時期(例えば、季節ごと)、道路(移動経路)の情報に基づいて分類された複数の画像検出情報(画像検出ライブラリ)が記憶部23に記憶されている。また、複数の画像検出情報(画像検出ライブラリ)をサーバとの通信により取得して、記憶部23に記憶されている情報を更新することも可能である。 In the image detection information (image detection library), targets that can exist in the image corresponding to each position are acquired and stored in advance over the entire predetermined road (travel route). The image detection information (image detection library) is classified into multiple types based on, for example, driving region (for example, by country, domestic driving area), driving time (for example, by season), and road (moving route) information. Image detection information (image detection library) is stored in the storage unit 23 . It is also possible to acquire a plurality of pieces of image detection information (image detection library) through communication with the server and update the information stored in the storage unit 23 .
 図4は記憶部23に記憶された複数の画像検出情報(画像検出ライブラリ)を模式的に示す図である。画像処理部212は、位置情報取得部25で取得された位置情報(緯度、経度)に基づいて走行地域及び道路(移動経路)を特定することができ、更に、年月日及び時刻の情報を有する内部の計時部(タイマー)の情報に基づいて走行時期を特定することができる。これにより、画像処理部212は、該当する画像検出情報(画像検出ライブラリ)を記憶部23から取得することができる。画像検出情報(画像検出ライブラリ)には、各道路の始点から終点(図3のST31のP0-PX)の全体にわたり、各位置に対応した画像中に存在し得る物標が予め取得され保持されている。 FIG. 4 is a diagram schematically showing a plurality of pieces of image detection information (image detection library) stored in the storage unit 23. FIG. The image processing unit 212 can specify the driving area and the road (moving route) based on the position information (latitude, longitude) acquired by the position information acquisition unit 25, and furthermore, obtains the date and time information. The running time can be specified based on the information of the internal timer (timer). Thereby, the image processing unit 212 can acquire the corresponding image detection information (image detection library) from the storage unit 23 . In the image detection information (image detection library), targets that can exist in the image corresponding to each position are obtained and stored in advance from the start point to the end point (P0-PX in ST31 of FIG. 3) of each road. ing.
 画像処理部212は、位置情報取得部25で取得された情報に基づいて撮像位置(位置P1)を特定することができ、画像検出情報において、撮像位置(位置P1)に対応した画像中に存在し得る物標を参照して、画像の解析に基づいた物標の抽出を行う。 The image processing unit 212 can specify the imaging position (position P1) based on the information acquired by the position information acquisition unit 25, and in the image detection information, the image corresponding to the imaging position (position P1) includes Targets are extracted based on image analysis with reference to possible targets.
 図3のST31では、所定の道路(移動経路:始点P0-終点PX)を例示している。また、図3のST32では、移動体30が移動経路を所定の複数回移動したときに各移動時において、位置P1で撮像された画像から抽出された物標(物標1、物標2、物標3、・・・・物標8)を例示している。図3のST32の例では、所定の複数回の移動として5回の例を示しており、画像処理部212は、各撮像回で抽出した物標を記憶部23に記憶する。なお、複数回の移動は5回に限られず、任意の複数回を設定することができる。 In ST31 of FIG. 3, a predetermined road (moving route: start point P0-end point PX) is illustrated. In ST32 of FIG. 3, targets (target 1, target 2, Targets 3, . . . target 8) are illustrated. In the example of ST32 in FIG. 3, an example of five times is shown as the predetermined plural times of movement, and the image processing section 212 stores the targets extracted in each imaging time in the storage section 23 . It should be noted that the number of times of movement is not limited to five, and any number of times can be set.
The image processing unit 212 then determines the normal environment information at position P1 from the target information stored in the storage unit 23. As shown in ST32, the extracted targets vary from pass to pass: a target that was present only temporarily moves away with the passage of time, is not captured at the next imaging timing, and is therefore not extracted. The image processing unit 212 determines the targets whose existence probability is equal to or higher than a predetermined threshold as the normal environment information at position P1.

For example, with an existence-probability threshold of 80%, targets 1 to 4 in ST32 are extracted in all five passes, giving an existence probability of 100%. Target 5 is extracted in four of the five passes, giving 80%. Targets 6 to 8 are extracted in three of the five passes, giving 60%. In the extraction example of ST32, the image processing unit 212 therefore determines targets 1 to 5, whose existence probability is at or above the threshold (80%), as the normal environment information at position P1.

The image processing unit 212 stores the normal environment information determined for position P1 in the storage unit 23. Hereinafter, the normal environment information at the imaging position (position P1) is also referred to as baseline information. The threshold for the existence probability is not limited to 80% and can be set to any value.
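The existence-probability rule can be expressed compactly. The following sketch, with hypothetical target labels t1-t8, reproduces the ST32 example: targets seen in at least 80% of the five passes form the baseline.

```python
from collections import Counter

def decide_baseline(detections_per_run, threshold=0.8):
    """Keep only targets whose existence probability over the N runs meets the threshold."""
    runs = len(detections_per_run)
    counts = Counter(t for run in detections_per_run for t in run)
    return {target for target, n in counts.items() if n / runs >= threshold}

# The five passes illustrated in ST32: t1-t4 seen 5/5 times (100%), t5 seen 4/5 (80%),
# t6-t8 seen 3/5 (60%), so the baseline at threshold 0.8 becomes targets 1-5.
runs = [
    {"t1", "t2", "t3", "t4", "t5", "t6", "t7"},
    {"t1", "t2", "t3", "t4", "t5", "t6", "t8"},
    {"t1", "t2", "t3", "t4", "t5", "t7", "t8"},
    {"t1", "t2", "t3", "t4", "t6", "t7", "t8"},
    {"t1", "t2", "t3", "t4", "t5"},
]
print(sorted(decide_baseline(runs)))  # ['t1', 't2', 't3', 't4', 't5']
```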
After the normal environment information (baseline information: targets 1 to 5) at the imaging position (position P1) has been determined from the plural passes (N = 5), when the moving body 30 travels the route (P0-PX) again (the N = 6th and subsequent passes), the image processing unit 212 extracts targets (new targets) from the image captured at position P1. The image processing unit 212 then compares the extracted new targets with the normal environment information (baseline information) and acquires change information with respect to the normal environment information.

For example, if the targets extracted on the sixth or a later pass are targets 1 to 5 and target 9, the image processing unit 212 acquires target 9 as the change information based on the comparison with the normal environment information (targets 1 to 5). The image processing unit 212 associates the acquired change information (target 9) with the position information of the moving body 30 detected by the position information acquisition unit 25 (P1: latitude, longitude) and stores them in the storage unit 23.
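A minimal sketch of this comparison step, assuming the baseline and the newly extracted targets are held as simple sets and the position is an illustrative latitude/longitude pair:

```python
def extract_change(baseline, new_targets):
    """Targets present in the newly analyzed image but absent from the baseline."""
    return set(new_targets) - set(baseline)

baseline = {"t1", "t2", "t3", "t4", "t5"}            # normal environment at P1
new_targets = {"t1", "t2", "t3", "t4", "t5", "t9"}   # extracted on the N=6th pass
change = extract_change(baseline, new_targets)

# Associate the change with the imaging position before storing / transmitting it.
record = {"position": {"lat": 35.0, "lon": 135.0}, "changed_targets": sorted(change)}
print(record)  # {'position': {'lat': 35.0, 'lon': 135.0}, 'changed_targets': ['t9']}
```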
Based on the analysis by the image processing unit 212, the encoding unit 213 encodes the information to be transmitted (uploaded) to the information processing device 10. The information to be transmitted is the change information acquired by the analysis of the image processing unit 212, and the encoding unit 213 encodes the change information together with the position information associated with it.

According to the type of the change information (for example, target 9), the encoding unit 213 converts it into numerical information indicating the content of that type. For example, the encoding unit 213 converts an event indicating the state of the road at the imaging position (pavement condition, presence of depressions, and the like) into numerical information ("01"), an event indicating an object on the road (another parked moving body, a fallen object, a broken-down vehicle, a construction site, and the like) into numerical information ("02"), an event indicating objects placed beside or around the road (road signs, planting, and the like) into numerical information ("03"), an event indicating the state of the road surface caused by a weather phenomenon (flooding, snowfall, precipitation, and the like) into numerical information ("04"), and an event indicating a creature that has entered the road into numerical information ("05").

The encoding unit 213 also encodes the position information (P1) associated with the change information. The position information (P1) contains latitude and longitude, and the encoding unit 213 converts them into numerical information. For example, 35° north latitude is converted into "1035" and 35° south latitude into "2035", where the leading "1" denotes north latitude, "2" denotes south latitude, and the angle is expressed in decimal notation. Similarly, 135° east longitude is converted into "3135" and 135° west longitude into "4135", where the leading "3" denotes east longitude and "4" denotes west longitude. The conversions described above are only examples, and the encoding unit 213 can encode the change information by various data compression processes.

Through the above processing, the encoding unit 213 encodes the change information and its associated position information, and generates encoded information that combines the encoded change information with the encoded position information.
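The sketch below illustrates the encoding scheme described above. The event-type keys and function names are assumptions for illustration, and the whole-degree truncation simply mirrors the 35°/135° examples; a practical implementation would carry finer positional resolution.

```python
# Hypothetical event-type keys; only the two-digit codes "01"-"05" come from the text.
EVENT_CODES = {
    "road_state": "01",       # pavement condition, depressions, etc.
    "object_on_road": "02",   # parked vehicle, fallen object, broken-down vehicle, construction site
    "roadside_object": "03",  # road signs, planting beside or around the road
    "weather_surface": "04",  # road surface state caused by flooding, snowfall, precipitation
    "animal_on_road": "05",   # creature that has entered the road
}

def encode_latitude(lat_deg: float) -> str:
    """'1' prefix for north latitude, '2' for south, degrees in decimal notation."""
    prefix = "1" if lat_deg >= 0 else "2"
    return f"{prefix}{abs(int(lat_deg)):03d}"   # whole degrees only, as in the 35° example

def encode_longitude(lon_deg: float) -> str:
    """'3' prefix for east longitude, '4' for west."""
    prefix = "3" if lon_deg >= 0 else "4"
    return f"{prefix}{abs(int(lon_deg)):03d}"

def encode_change(event_type: str, lat_deg: float, lon_deg: float) -> str:
    """Combine the event code and the encoded position into one compact payload."""
    return EVENT_CODES[event_type] + encode_latitude(lat_deg) + encode_longitude(lon_deg)

# An object on the road at 35° N / 135° E -> "02" + "1035" + "3135"
print(encode_change("object_on_road", 35.0, 135.0))  # 0210353135
```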
The communication unit 22 of the information communication device 20 functions as an interface for connecting to the network NET and can exchange various kinds of information with the information processing device 10. The communication unit 22 transmits the encoded information generated by the encoding unit 213 to the information processing device 10.

The processing unit 11 of the information processing device 10 can decode the encoded information received from the information communication device 20 and update the information (map information) stored in the storage unit 13. The processing unit 11 obtains the decoded information by applying a decoding method that reverses the conversion used in the encoding process. Based on the decoded information, the processing unit 11 can generate map information that reflects local changes such as locations under road construction or locations where the pavement has deteriorated, for example where a hole has opened in the road.
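On the server side, a decoder that simply reverses the layout used in the encoding sketch above might look like the following; this is an illustrative assumption, not the literal implementation of the processing unit 11.

```python
EVENT_NAMES = {"01": "road_state", "02": "object_on_road", "03": "roadside_object",
               "04": "weather_surface", "05": "animal_on_road"}

def decode_change(payload: str) -> dict:
    """Split the payload back into event type, latitude, and longitude."""
    event, lat_part, lon_part = payload[:2], payload[2:6], payload[6:10]
    lat = int(lat_part[1:]) * (1 if lat_part[0] == "1" else -1)   # "1" = north, "2" = south
    lon = int(lon_part[1:]) * (1 if lon_part[0] == "3" else -1)   # "3" = east, "4" = west
    return {"event": EVENT_NAMES[event], "lat_deg": lat, "lon_deg": lon}

print(decode_change("0210353135"))  # {'event': 'object_on_road', 'lat_deg': 35, 'lon_deg': 135}
```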
(Processing Flow)
Next, the processing flow executed by the information communication device 20 will be described. FIG. 5 is a diagram explaining the flow of processing in the information communication device 20.
In S501 of FIG. 5, the moving body 30 starts moving. In S502, the imaging units 24A and 24B start imaging and acquire images of the external environment of the moving body 30.

In S503, the image processing unit 212 extracts targets around the moving body 30 by analyzing the image acquired at the imaging position of the image acquisition unit (imaging units 24A, 24B), for example position P1 in FIG. 3. When extracting targets, the image processing unit 212 refers to the image detection information (image detection library) in which the targets that may appear in the image corresponding to the imaging position have been acquired in advance, and extracts targets based on the analysis of the image. Referring to the image detection library allows targets to be extracted with high accuracy, and also allows the processing of estimating target contours and extracting targets from the image to be performed faster.

As described with reference to FIG. 4, the storage unit 23 stores various image detection libraries, and the image processing unit 212 can extract targets based on the analysis of the image by referring to image detection information that differs depending on the travel region or travel period of the moving body 30. The image detection libraries stored in the storage unit 23 can also be updated based on information acquired from the information processing device 10 (server). For example, the communication unit 22 acquires image detection information through communication with the information processing device 10 (server) and updates the information stored in the storage unit 23, and the image processing unit 212 can then extract targets based on the analysis of the image by referring to the updated image detection information.
In S504, the image processing unit 212 determines whether the environment information (baseline information) for the imaging position has already been generated. If it has (S504-Yes), the processing proceeds to S505.

In S505, the image processing unit 212 determines whether a predetermined period has elapsed since the environment information was determined. If the period has elapsed (S505-Yes), the processing proceeds to S508 and the environment information is discarded (S508). The image processing unit 212 then returns the processing to S501 and repeats the same processing. If no new target has been acquired within the predetermined period, the environment at the imaging position (P1) may have changed; discarding the environment information in such a case suppresses erroneous detection of a change at the imaging position.

If the predetermined period has not elapsed (S505-No), the image processing unit 212 advances the processing to S506. The predetermined period in S505 can be changed based on traffic information at the imaging position. The communication unit 22 acquires information indicating the traffic volume of other moving bodies at the imaging position (P1), and the image processing unit 212 can set the period shorter as the traffic volume exceeds a threshold traffic volume and longer as it falls below the threshold. Setting the timing for discarding the environment information based on the traffic volume of other moving bodies at the imaging position makes it possible to discard the environment information efficiently in accordance with the traffic volume.
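A minimal sketch of this traffic-dependent setting; the threshold and the two period values are placeholders, since the embodiment does not specify concrete numbers.

```python
def baseline_expiry_hours(traffic_per_hour: float, threshold: float = 100.0,
                          short_period: float = 24.0, long_period: float = 72.0) -> float:
    """Shorter retention of the baseline where traffic is heavy, longer where it is light."""
    return short_period if traffic_per_hour > threshold else long_period

print(baseline_expiry_hours(250.0))  # 24.0 -> busy road, discard the baseline sooner
print(baseline_expiry_hours(30.0))   # 72.0 -> quiet road, keep it longer
```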
In S506, the image processing unit 212 compares the environment information with the targets (new targets) extracted by the image processing in S503 and acquires the difference from the environment information as change information. The change information includes the position information of the imaging position where the change occurred and event information indicating the type of target obtained by the analysis of the image processing unit 212.

The event information indicating the type of target includes at least one of an event indicating the state of the road at the imaging position (P1), an event indicating an object on the road, an event indicating objects placed beside or around the road, and an event indicating the state of the road surface caused by a weather phenomenon.

In S507, the encoding unit 213 encodes the change information generated by the image processing unit 212. The encoding unit 213 converts the latitude and longitude contained in the position information into numerical information, converts the event information into numerical information indicating the content of the target type, and generates encoded information combining the two. Because the information transmitted from the information communication device 20 to the information processing device 10 (server) consists only of the position information of the imaging position where the change occurred and the event information indicating the type of target obtained by the analysis of the image processing unit 212, the load on the server can be reduced compared with transmitting image data.
If, in S504, the image processing unit 212 determines that the environment information (baseline information) for the imaging position has not yet been generated (S504-No), the processing proceeds to S510.

In S510, the image processing unit 212 stores (accumulates) the targets extracted by the image processing in S503 in the storage unit 23.

In S511, the image processing unit 212 determines whether target information for the predetermined number of passes (N = 5 in the example of ST32 of FIG. 3) has been accumulated in the storage unit 23. If it has not (S511-No), the image processing unit 212 returns the processing to S502 and repeats the same processing. Until target information for the predetermined number of passes (N) has been accumulated, the targets extracted by the image processing in S503 are accumulated in order to generate the environment information in S512. On the (N+1)th and subsequent passes, the targets extracted in S503 are treated as new targets acquired after the environment information (baseline information) was generated, compared with the environment information, and used to generate the change information (S506).

If target information for the predetermined number of passes has been accumulated (S511-Yes), the image processing unit 212 advances the processing to S512.

In S512, the image processing unit 212 determines, as the environment information, the targets that are present at the imaging position in more than a threshold proportion of the target information accumulated in the storage unit 23 (ST32 of FIG. 3). After determining the environment information, the image processing unit 212 returns the processing to S502 and executes the processing from S502 onward.
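Putting the branches of FIG. 5 together, one pass of the loop for a single imaging position could be sketched as follows; the state dictionary, function name, and example target labels are illustrative assumptions.

```python
def process_image_cycle(state: dict, new_targets: set, n_runs_required: int = 5,
                        threshold: float = 0.8):
    """One pass of the S502-S512 loop for a single imaging position.

    `state` holds the detection history and, once available, the baseline
    ("environment information"). Returns the change set (S506) when a baseline
    exists, or None while the baseline is still being accumulated (S510-S512).
    """
    if state.get("baseline") is not None:                    # S504-Yes
        return new_targets - state["baseline"]               # S506: difference = change info
    history = state.setdefault("history", [])
    history.append(set(new_targets))                         # S510: accumulate detections
    if len(history) >= n_runs_required:                      # S511-Yes
        counts = {}
        for run in history:
            for target in run:
                counts[target] = counts.get(target, 0) + 1
        state["baseline"] = {t for t, n in counts.items()
                             if n / len(history) >= threshold}  # S512: decide baseline
    return None

# Example: five accumulation passes, then a sixth pass that yields target t9 as change.
state = {}
for run in [{"t1", "t2"}, {"t1", "t2"}, {"t1", "t2"}, {"t1", "t2", "t6"}, {"t1", "t2"}]:
    process_image_cycle(state, run)
print(process_image_cycle(state, {"t1", "t2", "t9"}))  # {'t9'}
```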
(Summary of the Embodiment)
The above embodiment discloses at least the following information communication device, moving body provided with the information communication device, and straddle-type vehicle provided with the information communication device.
Configuration 1. The information communication device of the above embodiment is an information communication device (20) capable of communicating with a server, comprising:
an image acquisition unit (24A, 24B) that acquires an image of the external environment of a moving body;
an image processing unit (212) that extracts targets around the moving body by analyzing the image acquired by the image acquisition unit;
a storage unit (23) that accumulates information on a plurality of targets extracted by the image processing unit each time the moving body moves;
an encoding unit (213) that encodes change information between the target information accumulated in the storage unit and a new target extracted by analyzing a newly acquired image; and
a communication unit (22) that transmits the change information encoded by the encoding unit to the server.

According to the information communication device of Configuration 1, the communication load with the server can be reduced: there is no need to communicate with the server to receive data, which keeps the communication load with the server low.
Configuration 2. In the information communication device of the above embodiment, the image processing unit (212)
determines, as environment information, the targets that are present in more than a threshold proportion of the information on the plurality of targets accumulated in the storage unit, and
acquires, as the change information, the difference from the environment information by comparing the environment information with the new target.

According to the information communication device of Configuration 2, the normal environment at the imaging position is determined from the target information accumulated in advance in the storage unit, so the occurrence of a change at the imaging position can be detected accurately.
Configuration 3. In the information communication device of the above embodiment, the image processing unit (212) discards the environment information if the new target is not acquired within a predetermined period after the environment information is determined.

According to the information communication device of Configuration 3, if no new target is acquired within the predetermined period, the environment at the imaging position may have changed; discarding the environment information in such a case suppresses erroneous detection of a change at the imaging position.
Configuration 4. In the information communication device of the above embodiment, the communication unit (22) acquires information indicating the traffic volume of other moving bodies at the imaging position of the image acquisition unit, and
the image processing unit (212) sets the period shorter as the traffic volume indicated by the information increases above a threshold traffic volume, and longer as it falls below the threshold traffic volume.

According to the information communication device of Configuration 4, setting the timing for discarding the environment information based on the traffic volume of other moving bodies at the imaging position makes it possible to discard the environment information efficiently in accordance with the traffic volume.
Configuration 5. In the information communication device of the above embodiment, the change information includes position information of the imaging position of the image acquisition unit where the change occurred and event information indicating the type of target obtained by the analysis of the image processing unit.

Configuration 6. In the information communication device of the above embodiment, the encoding unit (213)
converts the latitude and longitude contained in the position information into numerical information,
converts the event information into numerical information indicating the content of the target type, and
generates encoded information combining the position information converted into numerical information with the event information converted into numerical information.

According to the information communication devices of Configurations 5 and 6, the information transmitted from the information communication device to the server consists only of the position information of the imaging position where the change occurred and the event information indicating the type of target obtained by the analysis of the image processing unit, so the load on the server can be reduced compared with transmitting image data.
Configuration 7. In the information communication device of the above embodiment, the communication unit (22) transmits the encoded information generated by the encoding unit (213) to the server.

According to the information communication device of Configuration 7, the data is compressed and the information is sent as soon as a change occurs, so the information can be updated in real time.
Configuration 8. In the information communication device of the above embodiment, the event information indicating the type of target in the change information includes at least one of an event indicating the state of the road at the imaging position, an event indicating an object on the road, an event indicating objects placed beside or around the road, and an event indicating the state of the road surface caused by a weather phenomenon.

According to the information communication device of Configuration 8, the various events that occur at the imaging position can be acquired as change information and transmitted to the server.
Configuration 9. In the information communication device of the above embodiment, the image processing unit extracts targets based on the analysis of the image by referring to image detection information in which the targets that may appear in the image corresponding to the imaging position have been acquired in advance.

According to the information communication device of Configuration 9, referring to the image detection information (image detection library) in which the targets that may appear in the image corresponding to the imaging position have been acquired in advance enables targets to be extracted with high accuracy, and also enables the processing of estimating target contours and extracting targets from the image to be performed faster.
Configuration 10. In the information communication device of the above embodiment, the image processing unit extracts targets based on the analysis of the image by referring to image detection information that differs depending on the travel region or travel period of the moving body.

According to the information communication device of Configuration 10, differences such as traffic rules that vary by travel region (for example, pavement colors and road signs) and weather phenomena that may occur depending on the travel period (flooding, snowfall, precipitation, and the like) can be reflected in the image analysis when extracting targets.
Configuration 11. In the information communication device of the above embodiment, the communication unit acquires the image detection information through communication with the server and updates the information stored in the storage unit, and
the image processing unit extracts targets based on the analysis of the image by referring to the updated image detection information.

According to the information communication device of Configuration 11, updating the image detection information (image detection library) makes it possible to extract targets based on image analysis in response to changes in traffic rules, in the occurrence tendency of weather phenomena, and in the driving environment.
Configuration 12. The moving body of the above embodiment is a moving body provided with an information communication device capable of communicating with a server, wherein the information communication device (20) comprises:
an image acquisition unit (24A, 24B) that acquires an image of the external environment of the moving body;
an image processing unit (212) that extracts targets around the moving body by analyzing the image acquired by the image acquisition unit;
a storage unit (23) that accumulates information on a plurality of targets extracted by the image processing unit each time the moving body moves;
an encoding unit (213) that encodes change information between the target information accumulated in the storage unit and a new target extracted by analyzing a newly acquired image; and
a communication unit (22) that transmits the change information encoded by the encoding unit to the server.

According to the moving body of Configuration 12, the communication load with the server can be reduced, since there is no need to communicate with the server to receive data.
Configuration 13. The straddle-type vehicle of the above embodiment is a straddle-type vehicle provided with an information communication device capable of communicating with a server, wherein the information communication device (20) comprises:
an image acquisition unit (24A, 24B) that acquires an image of the external environment of the straddle-type vehicle;
an image processing unit (212) that extracts targets around the straddle-type vehicle by analyzing the image acquired by the image acquisition unit;
a storage unit (23) that accumulates information on a plurality of targets extracted by the image processing unit each time the straddle-type vehicle moves;
an encoding unit (213) that encodes change information between the target information accumulated in the storage unit and a new target extracted by analyzing a newly acquired image; and
a communication unit (22) that transmits the change information encoded by the encoding unit to the server.

According to the straddle-type vehicle of Configuration 13, the communication load with the server can be reduced, since there is no need to communicate with the server to receive data.
(Other Embodiments)
The present invention can also be carried out by supplying a program that implements the functions of the above embodiment to an information communication device via a network or a storage medium, and having one or more processors in the computer of the information communication device read and execute the program.

The invention is not limited to the above embodiment, and various modifications and changes are possible within the scope of the gist of the invention.

This application claims priority based on Japanese Patent Application No. 2021-045142 filed on March 18, 2021, the entire contents of which are incorporated herein by reference.
10: information processing device, 20: information communication device, 21: processing unit, 30: moving body, 211: control unit, 212: image processing unit, 213: encoding unit, 22: communication unit, 23: storage unit, 24A: imaging unit, 24B: imaging unit, 25: position information acquisition unit

Claims (13)

1. An information communication device capable of communicating with a server, comprising:
an image acquisition unit that acquires an image of an external environment of a moving body;
an image processing unit that extracts targets around the moving body by analyzing the image acquired by the image acquisition unit;
a storage unit that accumulates information on a plurality of targets extracted by the image processing unit each time the moving body moves;
an encoding unit that encodes change information between the target information accumulated in the storage unit and a new target extracted by analyzing a newly acquired image; and
a communication unit that transmits the change information encoded by the encoding unit to the server.
2. The information communication device according to claim 1, wherein the image processing unit
determines, as environment information, targets that are present in more than a threshold proportion of the information on the plurality of targets accumulated in the storage unit, and
acquires, as the change information, a difference from the environment information by comparing the environment information with the new target.

3. The information communication device according to claim 2, wherein the image processing unit discards the environment information if the new target is not acquired within a predetermined period after the environment information is determined.
4. The information communication device according to claim 3, wherein
the communication unit acquires information indicating a traffic volume of other moving bodies at an imaging position of the image acquisition unit, and
the image processing unit sets the period shorter as the traffic volume indicated by the information increases above a threshold traffic volume, and longer as the traffic volume falls below the threshold traffic volume.

5. The information communication device according to claim 1 or 2, wherein the change information includes position information of the imaging position of the image acquisition unit where a change occurred and event information indicating a type of target obtained by analysis of the image processing unit.
6. The information communication device according to claim 5, wherein the encoding unit
converts latitude and longitude information contained in the position information into numerical information,
converts the event information into numerical information indicating the content of the type of the target, and
generates encoded information combining the position information converted into numerical information with the event information converted into numerical information.

7. The information communication device according to claim 6, wherein the communication unit transmits the encoded information generated by the encoding unit to the server.
8. The information communication device according to claim 5 or 6, wherein the event information indicating the type of the target in the change information includes at least one of an event indicating a state of the road at the imaging position, an event indicating an object on the road, an event indicating objects placed beside or around the road, and an event indicating a state of the road surface based on a weather phenomenon.

9. The information communication device according to any one of claims 1 to 5, wherein the image processing unit extracts targets based on the analysis of the image by referring to image detection information in which targets that may appear in an image corresponding to an imaging position have been acquired in advance.

10. The information communication device according to claim 9, wherein the image processing unit extracts targets based on the analysis of the image by referring to image detection information that differs depending on a travel region or a travel period of the moving body.
11. The information communication device according to claim 9 or 10, wherein
the communication unit acquires the image detection information through communication with the server and updates the information stored in the storage unit, and
the image processing unit extracts targets based on the analysis of the image by referring to the updated image detection information.
12. A moving body comprising an information communication device capable of communicating with a server, wherein the information communication device comprises:
an image acquisition unit that acquires an image of an external environment of the moving body;
an image processing unit that extracts targets around the moving body by analyzing the image acquired by the image acquisition unit;
a storage unit that accumulates information on a plurality of targets extracted by the image processing unit each time the moving body moves;
an encoding unit that encodes change information between the target information accumulated in the storage unit and a new target extracted by analyzing a newly acquired image; and
a communication unit that transmits the change information encoded by the encoding unit to the server.
13. A straddle-type vehicle comprising an information communication device capable of communicating with a server, wherein the information communication device comprises:
an image acquisition unit that acquires an image of an external environment of the straddle-type vehicle;
an image processing unit that extracts targets around the straddle-type vehicle by analyzing the image acquired by the image acquisition unit;
a storage unit that accumulates information on a plurality of targets extracted by the image processing unit each time the straddle-type vehicle moves;
an encoding unit that encodes change information between the target information accumulated in the storage unit and a new target extracted by analyzing a newly acquired image; and
a communication unit that transmits the change information encoded by the encoding unit to the server.
PCT/JP2022/008697 2021-03-18 2022-03-01 Information communication device, moving body, and straddle-type vehicle WO2022196352A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023506949A JP7512513B2 (en) 2021-03-18 2022-03-01 Information and communication device, mobile body, and saddle-type vehicle

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021045142 2021-03-18
JP2021-045142 2021-03-18

Publications (1)

Publication Number Publication Date
WO2022196352A1 true WO2022196352A1 (en) 2022-09-22

Family

ID=83321502

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/008697 WO2022196352A1 (en) 2021-03-18 2022-03-01 Information communication device, moving body, and straddle-type vehicle

Country Status (2)

Country Link
JP (1) JP7512513B2 (en)
WO (1) WO2022196352A1 (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002202741A (en) * 2000-11-02 2002-07-19 Matsushita Electric Ind Co Ltd Information-providing device using led
WO2018181974A1 (en) * 2017-03-30 2018-10-04 パイオニア株式会社 Determination device, determination method, and program

Also Published As

Publication number Publication date
JP7512513B2 (en) 2024-07-08
JPWO2022196352A1 (en) 2022-09-22


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 22771106; Country of ref document: EP; Kind code of ref document: A1)
WWE Wipo information: entry into national phase (Ref document number: 2023506949; Country of ref document: JP)
WWE Wipo information: entry into national phase (Ref document number: 202347064120; Country of ref document: IN)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 22771106; Country of ref document: EP; Kind code of ref document: A1)