
WO2024047798A1 - Data analysis device, exploration system, data analysis method, and program - Google Patents


Info

Publication number
WO2024047798A1
Authority
WO
WIPO (PCT)
Prior art keywords
image data
data
exploration
predetermined surface
radar
Prior art date
Application number
PCT/JP2022/032777
Other languages
French (fr)
Japanese (ja)
Inventor
誠史 吉田
Original Assignee
日本電信電話株式会社
Priority date
Filing date
Publication date
Application filed by 日本電信電話株式会社
Priority to PCT/JP2022/032777
Priority to PCT/JP2023/024327
Publication of WO2024047798A1

Links

Images

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86: Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/88: Radar or analogous systems specially adapted for specific applications

Definitions

  • The present disclosure relates to technology for exploring buried objects or cavities in the ground under roads and the like.
  • Pipelines for gas, water, sewage, electricity, or communication cables are buried underground under roadways and sidewalks.
  • Information such as the location of these pipelines is stored in buried-property registers and the like that are managed individually by each business operator. Therefore, for example, during road excavation work, trial excavation is performed prior to construction to confirm the position and depth of the buried pipes in order to prevent damage to the pipes and cutting accidents. If the pipeline information recorded in a buried-property register or the like is inaccurate, buried pipelines may be damaged or wasteful trial excavation may occur. For this reason, in order to efficiently maintain, manage, and update underground infrastructure for new burial work and for projects that move overhead wires underground, efforts are being considered to create a database of accurate location information for underground conduits and to share it among business operators.
  • One method of non-destructively exploring the positions of pipelines buried underground is measurement by ground penetrating radar (GPR).
  • Ground-penetrating radar measurement equipment that implements this method emits electromagnetic waves (pulse waves) from the ground surface into the ground and receives the waves reflected at underground interfaces where the electromagnetic properties differ, thereby performing underground exploration. The characteristics of the reflected waves make it possible to non-destructively identify pipes at depths of up to several meters below the ground surface.
  • the depth of a buried pipe can be measured by measuring the travel time of the reflected wave (the time from when the electromagnetic wave is emitted to when the reflected wave is received). In this way, the position and depth of buried objects such as pipes or cavities from the ground surface can be specified without excavating the road.
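  • As an illustrative sketch (not part of the original publication), the depth estimate described above follows from the two-way travel time and an assumed wave speed in the soil; the permittivity value below is an assumption chosen only for the example.

```python
# Sketch: depth of a reflector from the two-way travel time of a GPR reflection,
# assuming a uniform soil with a known relative permittivity (eps_r).

C = 299_792_458.0  # speed of light in vacuum [m/s]

def gpr_depth(two_way_travel_time_s: float, eps_r: float) -> float:
    """Approximate reflector depth: wave speed in soil ~ c / sqrt(eps_r),
    one-way distance = speed * travel time / 2."""
    v = C / eps_r ** 0.5
    return v * two_way_travel_time_s / 2.0

# Example: a 20 ns round trip in soil with eps_r ~ 9 gives roughly 1 m depth.
print(f"{gpr_depth(20e-9, 9.0):.2f} m")
```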
  • Because the attenuation and reach of electromagnetic waves vary with soil type and water content, electromagnetic waves with different pulse widths or center frequencies are used depending on the target exploration depth and resolution. Electromagnetic waves with longer wavelengths reach greater depths but have lower directivity and resolution. Conversely, when the frequency of the electromagnetic waves is high, directivity improves, but the attenuation due to reflection near the ground surface is large and the exploration depth is limited.
  • Existing ground-penetrating radar measurement equipment uses signals with frequencies ranging from several tens of MHz to several GHz.
  • In one example, the surveyor draws measurement lines (hereinafter, "survey lines") at regular intervals in the direction across the road and takes measurements by moving a cart equipped with a ground-penetrating radar measurement device (hereinafter, the "measurement cart") along these lines. In this case, because the pipeline positions detected in the measurement data of each survey line are connected by estimation, accurate information about pipelines in the areas between survey lines cannot be obtained.
  • The position of a conduit is usually recorded as the distance along the ground surface from a landmark (reference object) such as the edge of a sidewalk or a utility pole, but as the road environment changes over time, the reference landmark may be lost.
  • Consideration is therefore being given to managing the positions of underground pipelines by their absolute position (latitude, longitude, altitude) and depth from the ground surface, and to recording them as information from which the position can be identified permanently even if the surrounding surface environment changes over time.
  • As a method for surveying pipelines buried within a road or sidewalk area as three-dimensional, planar data, the road surface can be scanned exhaustively at regular intervals with a ground-penetrating radar measurement device and the multiple scan passes stitched together.
  • This method has the advantage of being able to accurately identify the location of pipelines that cross the survey line by collecting underground exploration data in three dimensions.
  • In actual field work, measurement cannot always follow the survey line exactly; depending on traffic conditions, the measurement cart may be stopped temporarily or may detour from the survey line to avoid vehicles or pedestrians. Therefore, to stitch the measured data together accurately over an area, position information that traces the absolute position of the underground-exploration antenna mounted on the ground-penetrating radar measurement device to an accuracy of roughly 10 cm or better during scanning is required.
  • the present invention was made in view of the above-mentioned circumstances, and its purpose is to accurately grasp the condition and pinpoint the position of buried objects or cavities without any location restrictions.
  • The invention according to claim 1 is a data analysis device that analyzes data regarding the position of a buried object or cavity and its depth from a predetermined surface, the data analysis device comprising: an exploration-based image data matching unit that performs image data matching between first image data, which is obtained by photographing a first predetermined surface and is associated in terms of position with first radar exploration data obtained by scanning the first predetermined surface, and second image data, which is obtained by photographing a second predetermined surface adjacent to the first predetermined surface and is associated in terms of position with second radar exploration data obtained by scanning the second predetermined surface; a data synthesis unit that generates composite exploration data by synthesizing the first radar exploration data and the second radar exploration data on a vertical plane of the first predetermined surface and the second predetermined surface, based on the image data matching by the exploration-based image data matching unit; an absolute-position-based image data matching unit that performs image data matching between each of third image data, which is obtained by photographing a specific surface on at least one of the first predetermined surface and the second predetermined surface and is associated with an absolute position obtained by measuring a ground control point of the specific surface, and fourth image data, which is obtained by photographing another specific surface different from the specific surface on at least one of the first predetermined surface and the second predetermined surface and is associated with an absolute position obtained by measuring a ground control point of the other specific surface, and specific image data that is at least one of the first image data and the second image data and includes the specific surface and the other specific surface; and an absolute position calibration unit that performs absolute position calibration regarding the composite exploration data based on the image data matching by the absolute-position-based image data matching unit.
  • FIG. 1 is an overall configuration diagram of an exploration system according to an embodiment.
  • FIG. 2 is a configuration diagram of a radar exploration device.
  • FIG. 3 is a configuration diagram of a modification of the radar exploration device.
  • FIG. 4 is a configuration diagram of a ground control point measuring device.
  • FIG. 5 is an electrical hardware configuration diagram of a data analysis device.
  • FIG. 6 is a functional configuration diagram of the data analysis device.
  • FIG. 7 is a flowchart showing processing performed by the radar exploration device.
  • FIG. 8 is a flowchart showing processing performed by the ground control point measuring device.
  • FIG. 9 is a flowchart showing processing of the data analysis device.
  • FIG. 10 is a diagram showing the relationship between a measurement range and a survey line.
  • FIG. 11 is a diagram showing the relationship among a measurement range, a measurement area, and a photographing area.
  • FIG. 12 is a conceptual diagram showing processing for matching image data and combining radar exploration data.
  • FIG. 1 is an overall configuration diagram of an exploration system according to an embodiment.
  • An exploration system 1 is constructed from a radar exploration device 3 installed on a measurement cart 2, a ground control point measuring device 6 installed on a tripod 4, and a data analysis device 9.
  • the measurement cart 2 is a cart that runs manually or automatically.
  • the measurement cart 2 may be an AGV (Automated Guided Vehicle) or the like.
  • the radar exploration device 3 is, for example, a GPR measurement device for measuring GPR (Ground Penetration Radar).
  • the radar exploration device 3 transmits electromagnetic waves for exploration from the ground toward the underground, and acquires radar exploration data such as GPR reflected waves from underground.
  • the radar exploration device 3 also photographs the ground to obtain image data of the ground.
  • the radar exploration device 3 scans the ground to continuously acquire radar exploration data and image data, and stores them in association with each other.
  • the radar survey data and image data are then passed to the data analysis device 9.
  • The ground control point measuring device 6 is, for example, GCP measurement equipment that measures the absolute position (latitude, longitude, altitude) of a virtual ground control point (GCP) set at an arbitrary point on the ground.
  • the ground control point measuring device 6 photographs a specific surface of the ground including the ground control point to obtain image data of the specific surface.
  • the ground control point measuring device 6 stores absolute position information and image data of a specific plane in association with each other. Thereafter, the absolute position information and the image data of the specific plane are passed to the data analysis device 9.
  • The data analysis device 9 performs image data matching between the multiple image data obtained from the radar exploration device 3, calculates the relative displacement between the radar exploration data associated with those image data, and generates composite exploration data by accurately stitching the radar exploration data together over the area. Furthermore, the data analysis device 9 performs image data matching between the image data of a specific surface obtained from the ground control point measuring device 6 and the image data obtained from the radar exploration device 3, and performs absolute position calibration of the composite exploration data. In this way, it performs data analysis regarding the absolute position of a buried object or cavity.
  • the radar exploration device 3 may be provided in the measurement cart 2 integrally with the ground control point measurement device 6 or the data analysis device 9.
  • the data storage sections 35 and 65 or the photographing sections 36 and 66 which will be described later, may be shared.
  • FIG. 2 is a configuration diagram of the radar exploration device. Note that the radar exploration device 3a is an example of the radar exploration device 3 in FIG.
  • the radar exploration device 3a includes a GNSS antenna 30, a clock section 31, a transmitting section 33, a GPR antenna 32, a receiving section 34, a data storage section 35, and a photographing section 36.
  • Although the radar exploration device 3a is a device that performs underground exploration by GPR measurement, exploration means other than GPR measurement may be used as long as buried objects, cavities, and the like can be explored.
  • the GNSS antenna 30 is an antenna that receives GNSS signals emitted from navigation satellites.
  • GNSS signals include time information synchronized with UTC (Coordinated Universal Time).
  • The clock unit 31 has a built-in GNSS receiver, receives the GNSS signals received by the GNSS antenna 30, and manages time information synchronized with UTC for time-stamping the data. By synchronizing with GNSS, the clock unit 31 can keep its time aligned with UTC with high precision, and even when GNSS signals cannot be received temporarily, for example under an elevated railway, the holdover (free-running) operation of the built-in clock maintains the time accuracy for a certain period.
  • the clock section 31 may be an atomic clock incorporating a rubidium oscillator or the like.
  • the GPR antenna 32 is an antenna that transmits GPR electromagnetic waves and receives GPR reflected waves therefrom.
  • The GPR antenna 32 transmits the exploration GPR electromagnetic waves supplied from the transmitter 33 via a coaxial cable or the like toward the ground, receives the GPR waves reflected from underground, and passes the data of the reflected waves (hereinafter, "GPR data") to the receiving section 34 via a coaxial cable or the like.
  • the GPR antenna 32 may be an array type antenna in which a plurality of antenna elements are arranged and can sweep a certain width of the ground at once. Further, in order to cover a plurality of frequency regions, a plurality of transmitting sections 33 and GPR antennas 32 may be combined.
  • the transmitter 33 generates GPR electromagnetic waves for underground exploration and transmits them to the GPR antenna 32.
  • the transmitter 33 may change the frequency of the signal to be transmitted stepwise on the time axis.
  • the receiving unit 34 receives GPR data from the GPR antenna 32 and stores it in the data storage unit 35.
  • the data storage unit 35 stores image data obtained by photographing by the photographing unit 36 and GPR data obtained from the receiving unit 34 in association with each other in a certain time epoch.
  • a timestamp (an example of time information) of the data acquisition time acquired from the clock unit 31 is attached to each of these data.
  • a survey line identifier (an example of identification information) for identifying which survey line the measurement data belongs to is recorded in each data, and the times at which measurement is started and ended for each survey line are stored.
  • Time information indicating a time between the start time and end time may be discarded and not stored.
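  • As a minimal sketch (the field names below are assumptions, not taken from the publication), the association described above between a GPR trace, the corresponding ground photograph, the survey line identifier, and the GNSS-synchronized timestamp could be stored as one record per measurement epoch:

```python
# Sketch of a per-epoch record as it might be kept by the data storage unit 35:
# the GPR trace and the ground photograph share one timestamp and survey-line id.
from dataclasses import dataclass
import datetime

@dataclass
class MeasurementEpoch:
    survey_line_id: str               # identifier of the survey line (e.g. "b")
    timestamp_utc: datetime.datetime  # GNSS-synchronized acquisition time
    gpr_trace: list[float]            # received GPR reflection amplitudes
    image_path: str                   # ground photograph taken at the same epoch

record = MeasurementEpoch(
    survey_line_id="b",
    timestamp_utc=datetime.datetime(2022, 8, 31, 1, 23, 45, tzinfo=datetime.timezone.utc),
    gpr_trace=[0.0, 0.12, -0.08],
    image_path="line_b/frame_000123.png",
)
print(record.survey_line_id, record.timestamp_utc.isoformat())
```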
  • the photographing unit 36 is composed of a visible light camera, and photographs the ground in the vicinity scanned by the GPR antenna 32.
  • The number of cameras may be one, or a plurality of cameras may share the photographing range. Alternatively, an omnidirectional camera equipped with a fisheye lens may be used, and the camera may be a monocular camera or a stereo camera. In any case, the range of the ground photographed by the camera covers the area scanned by the GPR antenna 32 and also extends over a certain additional width (in the direction orthogonal to the scanning direction) on both sides of the GPR antenna 32. This additional width is, for example, 10 cm.
  • The position of the imaging unit (camera) 36 of the radar exploration device 3a and the position of the GPR antenna 32 are fixed relative to each other, and the relative distance (offset value) between their respective reference points is accurately measured in advance. It is assumed that the offset between any pixel in the image data of the ground photographed by the photographing unit 36 and the position of the GPR antenna 32 has been measured and calibrated in advance. At the time of measurement, the illumination needed to photograph the target range of the ground is ensured by lighting or the like as necessary. The photographing unit 36 may also be a far-infrared camera or the like that does not use visible light.
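  • A minimal sketch (all numbers below are assumptions) of the pre-measured offset calibration described above: a pixel in the nadir ground photograph is mapped to a ground position in the GPR antenna frame using a known ground sampling distance and the fixed camera-to-antenna offset.

```python
# Sketch: convert an image pixel to a ground position relative to the GPR antenna,
# given a pre-calibrated ground sampling distance and camera-to-antenna offset.

GSD_M_PER_PX = 0.002                 # assumed ground sampling distance: 2 mm per pixel
CAMERA_TO_ANTENNA_M = (0.35, 0.00)   # assumed fixed offset (along-track, cross-track) [m]
IMAGE_CENTER_PX = (2000, 1500)       # assumed principal point of the nadir image

def pixel_to_antenna_frame(px: float, py: float) -> tuple[float, float]:
    """Ground position of an image pixel, expressed in the GPR antenna frame [m]."""
    dx = (px - IMAGE_CENTER_PX[0]) * GSD_M_PER_PX + CAMERA_TO_ANTENNA_M[0]
    dy = (py - IMAGE_CENTER_PX[1]) * GSD_M_PER_PX + CAMERA_TO_ANTENNA_M[1]
    return dx, dy

print(pixel_to_antenna_frame(2100, 1500))  # 100 px right of center -> (0.55, 0.0)
```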
  • FIG. 3 is a configuration diagram of a modified example of the radar exploration device.
  • the radar exploration device 3b is an example of the radar exploration device 3 in FIG.
  • the same components as those in FIG. 2 are designated by the same reference numerals and the description thereof will be omitted.
  • the radar exploration device 3b is further equipped with a gyro sensor 37, an acceleration sensor 38, and an odometry 39 compared to the radar exploration device 3a.
  • the gyro sensor 37 is a sensor that detects the angular velocity of the measurement cart 2 (radar exploration device 3b).
  • the acceleration sensor 38 is a sensor that measures the acceleration of the measurement cart 2 (radar exploration device 3b).
  • the odometry 39 is a device that measures the number of rotations of the wheels of the measurement cart 2 using a rotary encoder, for example, and calculates the moving distance, speed, and turning angle of the measurement cart 2 from the number of rotations of the wheels.
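  • A minimal sketch (the wheel geometry values are assumptions) of the kind of wheel-encoder odometry described for the odometry 39: travel distance, speed, and turning angle derived from the left and right wheel rotation counts of the measurement cart 2.

```python
# Sketch: differential-drive odometry from wheel-encoder rotation counts.
import math

WHEEL_RADIUS_M = 0.10   # assumed wheel radius
WHEEL_BASE_M = 0.50     # assumed distance between the left and right wheels

def odometry_step(left_revs: float, right_revs: float, dt_s: float):
    """Distance travelled [m], speed [m/s], and turning angle [rad] over one step."""
    d_left = 2.0 * math.pi * WHEEL_RADIUS_M * left_revs
    d_right = 2.0 * math.pi * WHEEL_RADIUS_M * right_revs
    distance = (d_left + d_right) / 2.0
    turn = (d_right - d_left) / WHEEL_BASE_M
    return distance, distance / dt_s, turn

print(odometry_step(1.00, 1.05, 0.5))
```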
  • the odometry 39 is a device for further increasing the accuracy of self-position estimation, and is a device that complements the gyro sensor 37 and the acceleration sensor 38, so it does not necessarily need to be provided in the radar exploration device 3b.
  • The data from the gyro sensor 37, the acceleration sensor 38, and the odometry 39 are used to measure the relative displacement of the radar exploration device 3 and thereby complement the position estimation of the radar exploration device 3 that acquired the GPR data when the quality of the image data captured by the photographing section 36 is poor or when a fault occurs in the photographing section 36.
  • FIG. 4 is a configuration diagram of the ground control point measuring device.
  • the ground control point measuring device 6 includes a GNSS antenna 60, a GNSS receiving section 61, a correction data receiving section 64, a data storage section 65, and a photographing section 66.
  • the GNSS antenna 60 has basically the same configuration as the GNSS antenna 30 in FIG. 2, but may have a configuration of two antennas separated by a certain distance in a horizontal plane.
  • the GNSS receiving unit 61 receives navigation satellite signals from a plurality of satellites from the GNSS antenna 60 and identifies the position of the GNSS antenna 60. In this case, correction data for carrier phase positioning is acquired from the correction data receiving section 64, and positioning calculations using the RTK-GNSS method are performed. Thereby, the GNSS receiving section 61 stores information on the finally specified absolute position (coordinate data) in the data storage section 65.
  • When two antennas are used, the GNSS receiving unit 61 identifies the position of each antenna, calculates the absolute position (coordinate data) and the direction of the line connecting the two antenna positions, and stores them in the data storage unit 65.
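  • A minimal sketch of how the direction of the line connecting the two antenna positions could be derived once both antennas have been positioned; the coordinates in the example are assumptions.

```python
# Sketch: initial bearing of the line from antenna 1 to antenna 2, clockwise from true north.
import math

def bearing_deg(lat1, lon1, lat2, lon2) -> float:
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

print(f"{bearing_deg(35.68110, 139.76700, 35.68111, 139.76701):.1f} deg")
```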
  • the correction data receiving unit 64 receives correction data for carrier phase positioning via a communication network, etc., and transmits the correction data to the GNSS receiving unit 61. Note that the function of performing positioning calculations for the correction data receiving section 64 or the GNSS receiving section 61 may be provided on the cloud processing platform.
  • the data storage unit 65 stores absolute position information obtained as a result of positioning calculation by the GNSS receiving unit 61, image data obtained by the imaging unit 66, and information such as offset values in association with each other.
  • the configuration of the imaging unit 66 is basically the same as that of the imaging unit 36 in FIG. 2.
  • the relative positions of the GNSS antenna 60 and the imaging unit (camera) 66 are fixed and measured.
  • the ground control point measuring device 6 is installed on the ground using the support of the tripod 4, and the photographing unit 66 photographs an image of the ground surface including a virtual GCP.
  • The optical axis of the photographing section 66 does not necessarily have to coincide exactly with the vertical downward direction; it is assumed that the offset between the reference point (such as the GNSS antenna 60) and any pixel in the photographed image data has been measured and calibrated in advance. Any pixel in the image data obtained by photographing (for example, the center of the image data) serves as a virtual GCP. Alternatively, all or part of the image data may serve as the virtual GCP.
  • the photographing unit 66 may be a far-infrared camera that uses other than visible light.
  • In that case, the photographing unit 36 in FIG. 2 may likewise be a far-infrared camera that uses light other than visible light.
  • FIG. 5 is an electrical hardware configuration diagram of the data analysis device.
  • The data analysis device 9 includes, as a computer, a CPU 901, a ROM 902, a RAM 903, an SSD 904, an external device connection I/F (Interface) 905, a network I/F 906, a display 907, an input device 908, a media I/F 909, and a bus line 910.
  • the CPU 901 controls the operation of the data analysis device 9 as a whole.
  • the ROM 902 stores programs used to drive the CPU 901 such as IPL.
  • The RAM 903 is used as a work area for the CPU 901.
  • the SSD 904 reads or writes various data under the control of the CPU 901. Note that an HDD (Hard Disk Drive) may be used instead of the SSD 904.
  • the external device connection I/F 905 is an interface for connecting various external devices.
  • External devices in this case include a display, speaker, keyboard, mouse, USB memory, printer, and the like.
  • the network I/F 906 is an interface for data communication via a communication network such as the Internet.
  • the display 907 is a type of display means such as liquid crystal or organic EL (Electro Luminescence) that displays various images.
  • the input device 908 is a type of input means for selecting and executing various instructions, selecting a processing target, moving a cursor, and the like.
  • An example of the input device 908 is a pointing device.
  • the media I/F 909 controls reading or writing (storage) of data to a recording medium 909m such as a flash memory.
  • the recording media 909m includes DVDs, Blu-ray Discs (registered trademark), and the like.
  • the bus line 910 is an address bus, a data bus, etc. for electrically connecting each component such as the CPU 901 shown in FIG. 5.
  • FIG. 6 is a functional configuration diagram of the data analysis device.
  • The data analysis device 9 includes an acquisition section 90, an exploration-based image data matching section 91, a data synthesis section 93, an absolute-position-based image data matching section 95, an absolute position calibration section 97, and an output section 99.
  • Each of these units is a function realized by instructions from the CPU 901 according to a program stored in the RAM 903 or the like.
  • the acquisition unit 90 acquires image data, radar exploration data (GPR data), identification information, time information, etc. from the radar exploration device 3 via a communication network or the like. Further, the acquisition unit 90 acquires image data, absolute position information, etc. from the ground control point measuring device 6 via a communication network or the like.
  • The exploration-based image data matching unit 91 performs, for example, image data matching between first image data associated in terms of position with the first radar exploration data and second image data associated in terms of position with the second radar exploration data.
  • the first predetermined surface depicted by the first image data and the second predetermined surface depicted by the second image data are adjacent to each other on the ground.
  • Based on the image data matching by the exploration-based image data matching section 91, the data synthesis section 93 generates composite exploration data by, for example, calculating and correcting the relative displacement between the first radar exploration data and the second radar exploration data and then combining them on a vertical plane of the first predetermined surface and the second predetermined surface.
  • The absolute-position-based image data matching unit 95 performs image data matching between, for example, each of third image data, which is obtained by photographing a specific surface (for example, the specific surface 51 in FIG. 11) on the first or second predetermined surface and is associated with the absolute position obtained by measuring the ground control point of that specific surface, and fourth image data, which is obtained by photographing another specific surface and is associated with the absolute position obtained by measuring its ground control point, and specific image data that is at least one of the first image data and the second image data and includes the specific surface and the other specific surface.
  • the third and fourth image data and the information on each absolute position are data acquired from the ground control point measuring device 6.
  • The specific surface need not be included only in the measurement range A, as in the case of the specific surface 51 in FIG. 11; a specific surface (for example, the specific surface 52) may also exist in an adjacent measurement range.
  • When the GNSS antenna 60 of the ground control point measuring device 6 has the two-antenna configuration, not only the coordinate values but also the orientation of the image data of the specific surface are measured. In that case, as long as the orientation and coordinate values with respect to the first image data and the second image data can be determined, the ground control point may be set at only one location.
  • the specific image data may be composite image data in which first image data (measurement range A, etc.) and second image data (measurement range B, etc.) are combined.
  • the image of the composite image data includes at least two specific surfaces (specific surfaces 51, 52, etc.).
  • the data synthesis unit 93 synthesizes the first image data (measurement range A, etc.) and the second image data (measurement range B, etc.) to generate synthesized image data.
  • The absolute position calibration unit 97 performs absolute position calibration regarding the composite exploration data based on the image data matching performed by the absolute-position-based image data matching unit 95.
  • the output unit 99 outputs the results of the above analysis.
  • FIG. 7 is a flowchart showing the processing performed by the radar exploration device 3.
  • FIG. 10 is a diagram showing the relationship between the measurement range and the measuring line.
  • FIG. 11 is a diagram showing the relationship among the measurement range, the measurement area of radar exploration data, and the imaging area.
  • As shown in FIG. 10, a buried object 11 or a cavity 12 exists in the underground 10 beneath the ground surface (an example of a predetermined surface) of a region M that is the target of GPR measurement (hereinafter, the "measurement area M").
  • The surveyor divides the entire measurement area M into multiple areas (hereinafter, "measurement ranges") A to D.
  • the ground surface in the measurement range A is an example of the first predetermined surface
  • the ground surface in the measurement range B is an example of the second predetermined surface.
  • The surveyor then draws survey lines a to d along the respective divided measurement ranges and performs exploration by moving the measurement cart 2 along them.
  • The position of each survey line a to d may be the center line of the corresponding measurement range A to D in the direction of travel, or any line drawn parallel to that center line for scanning the measurement range with the radar exploration device 3. In that case, a marker is placed at the corresponding position on the radar exploration device 3 and used as a guide so that the radar exploration device 3 correctly sweeps the measurement range along the survey line.
  • The width of each measurement range (in FIG. 11, the width of the measurement range B) is set slightly smaller than the width swept by the radar exploration device 3. That is, although the areas scanned by the radar exploration device 3 to obtain radar exploration data may overlap somewhat, the measurement area M is scanned without gaps.
  • the width of the ground imaging area 42 is wider than the width of the measurement area 41 for measuring radar exploration data, and is set to cover not only the measurement area 41 but also the adjacent measurement range.
  • the relationship between the measurement range B in the measurement along the survey line b, the measurement area 41 of the radar exploration data, and the photographing area 42 of the ground is as shown in FIG. 11. That is, the width of the measurement area 41 for obtaining radar exploration data (the width in the direction orthogonal to the scanning direction) is narrower than the width of the imaging area 42 for obtaining the image data (the width in the direction orthogonal to the scanning direction).
  • the width of the predetermined plane such as the measurement range A is narrower than the width of the measurement area 41.
  • the measurement cart 2 does not necessarily need to move in the order of survey lines a, b, c, . . .
  • the radar exploration device 3 repeatedly performs exploration while moving the measurement cart 2 along each survey line.
  • the exploration of each survey line does not need to be carried out consecutively in time, and can be carried out at any arbitrary time.
  • the data storage unit 35 stores radar exploration data and image data, as well as survey line identification information and acquisition time information in association with each other.
  • A measurement area (the measurement area M in FIG. 10) may be selected on a map displayed on the screen of the surveyor's operation terminal or the like, and the measurement ranges and survey lines may then be set automatically. Furthermore, positioning results obtained by composite GNSS/INS (Inertial Navigation System) positioning may be used for machine guidance during manual operation of the measurement cart 2 along the survey lines or for automatic travel of the measurement cart 2.
  • FIG. 8 is a flowchart showing the processing performed by the ground control point measuring device 6.
  • The position of a GCP measurement point within the measurement area M may be determined in advance, as with the specific surface 51 in FIG. 11, or any position within the measurement area M may be selected as a GCP measurement point on site. Furthermore, RTK-GNSS positioning may be repeated in different time periods until a valid GNSS positioning solution is obtained at the same location. The timing of GNSS positioning by the GNSS receiving section 61 of the ground control point measuring device 6 and the timing of photographing by the photographing section 66 do not necessarily have to coincide, as long as the installation position of the tripod 4 on which the ground control point measuring device 6 is mounted is not moved. For example, RTK-GNSS positioning may be started after photographing the ground near a virtual GCP with the ground control point measuring device 6, and positioning may be continued until an effective convergence (FIX) solution is obtained.
  • the GNSS receiving unit 61 measures the absolute position of the specific surface 51 of the ground based on the GNSS signal and correction data on the specific surface 51.
  • the photographing unit 66 acquires image data of the specific surface 51.
  • the data storage unit 65 stores the control point data including the absolute position of the specific surface 51 and the image data of the specific surface in association with each other.
  • Since the ground control point (GCP) of this embodiment is a virtual one based on image data of the road surface, there is no need to install physical signs or markers on the ground, and any point within the measurement area M can be set as a GCP. Therefore, a point with as good a GNSS signal reception environment as possible can be selected as a GCP without being restricted by the position of a sign or marker. Furthermore, since the positions of the GNSS satellites in the sky change over time, it is also possible to perform GCP measurement by selecting a time period in which a good positioning solution is likely to be obtained in view of the positional relationship with surrounding structures.
  • The ground control point measuring device 6 shown in FIG. 1 may also be mounted on a flying vehicle such as a drone, so that RTK-GNSS positioning at the GCP measurement point is performed in the sky, where the surroundings of the GNSS antenna 60 are not obstructed by structures, while the ground near the GCP measurement point is photographed using a telephoto lens. In this case, signs or markers may be installed on the ground to make it easier to identify the location of the GCP from the sky.
  • optical ranging means such as a total station may be used in combination.
  • FIG. 9 is a flowchart showing the processing performed by the data analysis device. Note that the analysis processing by the data analysis device 9 may be performed in real time while the radar exploration device 3 is exploring, or may be performed (offline) after the radar exploration device 3 is exploring.
  • The acquisition unit 90 acquires the respective data from the radar exploration device 3 and the ground control point measuring device 6.
  • The exploration-based image data matching unit 91 performs image data matching of the data (the first and second image data, etc.) acquired from the radar exploration device 3. For example, as shown in FIG. 12, the exploration-based image data matching unit 91 matches image data A1, which is obtained from the radar exploration device 3 and includes the first predetermined surface, that is, the ground of the measurement range A, against image data B1, which includes the second predetermined surface, that is, the ground of the measurement range B adjacent to the measurement range A. As shown in FIG. 12, the ground area covered by the radar exploration data A2 is wider than the measurement range A, and the range of the image data A1 is wider than the ground area covered by the radar exploration data A2.
  • Similarly, the ground area covered by the radar exploration data B2 is wider than the measurement range B, and the range of the image data B1 is wider than the ground area covered by the radar exploration data B2. Therefore, the imaging areas of the image data A1 and the image data B1 partially overlap.
  • The image data A1 and B1 are two-dimensional planar images, whereas the radar exploration data A2 and B2 are three-dimensional rectangular-parallelepiped data, as shown in FIG. 12. Note that the image data A1 is an example of the first image data, and the image data B1 is an example of the second image data. Further, the radar exploration data A2 is an example of the first radar exploration data, and the radar exploration data B2 is an example of the second radar exploration data.
  • By matching the overlapping portions of adjacent image data, the exploration-based image data matching unit 91 can calculate the relative displacement between the radar exploration data A2 and the radar exploration data B2 of the adjacent survey lines.
  • the exploration-based image data matching unit 91 matches image data obtained by measurements of adjacent survey lines photographing the same ground area.
  • The exploration-based image data matching unit 91 extracts from the image data road-surface texture information derived from surface properties such as unevenness and cracks in the pavement surface layer or the painted condition of road markings, and performs matching based on this texture information.
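  • The publication does not prescribe a specific matching algorithm; the following is a minimal sketch, assuming OpenCV feature matching on the road-surface texture, of estimating the relative displacement between two overlapping ground photographs.

```python
# Sketch: match two overlapping ground photographs by road-surface texture and
# estimate the relative displacement (translation) between them.
import cv2
import numpy as np

def relative_displacement(img_a_path: str, img_b_path: str):
    """Estimate the 2D shift (in pixels) that maps image A onto image B."""
    img_a = cv2.imread(img_a_path, cv2.IMREAD_GRAYSCALE)
    img_b = cv2.imread(img_b_path, cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create(nfeatures=2000)
    kp_a, des_a = orb.detectAndCompute(img_a, None)  # cracks, aggregate texture, paint edges
    kp_b, des_b = orb.detectAndCompute(img_b, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)[:200]

    pts_a = np.float32([kp_a[m.queryIdx].pt for m in matches])
    pts_b = np.float32([kp_b[m.trainIdx].pt for m in matches])

    # Robustly fit a partial affine (rigid) transform; its translation part is the shift.
    M, _ = cv2.estimateAffinePartial2D(pts_a, pts_b, method=cv2.RANSAC)
    return M[0, 2], M[1, 2]
```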
  • For example, the exploration-based image data matching unit 91 matches the image data (the image data A1 in FIG. 12) that was photographed while scanning the survey line a in FIG. 10 and that covers the measurement range B located on the left side when facing the traveling direction (the x direction) of the measurement cart 2 (radar exploration device 3), against the image data (the image data B1 in FIG. 12) within the measurement range B photographed in the measurement of the survey line b.
  • Conversely, the exploration-based image data matching unit 91 matches the image data (the image data B1 in FIG. 12) that was photographed during the measurement of the survey line b and that covers the measurement range A located on the left side of the traveling direction (the -x direction) of the radar exploration device 3, against the image data (the image data A1 in FIG. 12) within the measurement range A photographed during the measurement of the survey line a.
  • The exploration-based image data matching unit 91 may narrow down the range of image data to be matched by referring to the time stamp data in addition to the survey line identifier of the measurement data of each survey line.
  • The exploration-based image data matching unit 91 may further refer to the data from the gyro sensor 37, the acceleration sensor 38, or the odometry 39 to narrow down the range of image data to be matched. Once a pair of image data has been matched, the exploration-based image data matching unit 91 may narrow the matching targets down to the corresponding image data at the times immediately before and after the match.
  • The data synthesis unit 93 synthesizes the data acquired from the radar exploration device 3 (the first and second radar exploration data, etc.) based on the image data matching to generate composite exploration data. For example, as shown in FIG. 12, the data synthesis unit 93 generates the composite exploration data AB2 by synthesizing the radar exploration data A2 and the radar exploration data B2 on a plane perpendicular to the first predetermined surface and the second predetermined surface.
  • Specifically, the data synthesis unit 93 calculates the amount of displacement in the traveling direction of the measurement cart 2 (radar exploration device 3) from the image data and time stamps (acquired time information) within the same survey line, calculates the amount of displacement in the direction orthogonal to it from the relative displacement of the radar exploration data of adjacent survey lines obtained by the exploration-based image data matching unit 91, and stitches the radar exploration data together on this basis. When combining radar exploration data at overlapping positions, one of the overlapping data is deleted as appropriate before combining.
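  • A minimal sketch (NumPy, with an assumed array layout) of the stitching step described above: two 3-D radar exploration volumes are pasted together using the estimated relative displacement, and the overlap is resolved by keeping one of the volumes.

```python
# Sketch: paste two 3-D radar volumes shaped (x, y, depth) into one composite volume.
import numpy as np

def synthesize(vol_a: np.ndarray, vol_b: np.ndarray, shift_xy: tuple[int, int]) -> np.ndarray:
    """Combine two volumes; shift_xy places vol_b relative to vol_a (non-negative shifts)."""
    sx, sy = shift_xy
    nx = max(vol_a.shape[0], sx + vol_b.shape[0])
    ny = max(vol_a.shape[1], sy + vol_b.shape[1])
    out = np.zeros((nx, ny, vol_a.shape[2]), dtype=vol_a.dtype)
    out[sx:sx + vol_b.shape[0], sy:sy + vol_b.shape[1], :] = vol_b
    out[:vol_a.shape[0], :vol_a.shape[1], :] = vol_a  # vol_a overwrites the overlap
    return out

ab = synthesize(np.ones((100, 40, 256)), np.ones((100, 40, 256)), (0, 35))
print(ab.shape)  # (100, 75, 256): 5 columns of overlap resolved in favour of vol_a
```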
  • the method of this embodiment is characterized in that, based on each image data of the ground surface, the relative positional relationship of radar exploration data linked to image data is specified. Since the reference points of the imaging unit 36 of the radar exploration device 3 and the GPR antenna 32 have a physically fixed relative positional relationship, radar exploration data measured on different survey lines can be synthesized with high accuracy.
  • The absolute-position-based image data matching unit 95 performs image data matching between the data acquired from the ground control point measuring device 6 (for example, each of the third and fourth image data) and the data acquired from the radar exploration device 3 (specific image data related to the first and second image data).
  • Here, the specific image data related to the first and/or second image data is image data that is at least one of the first image data and the second image data and that includes a specific surface (such as the specific surface 51) and another specific surface (such as the specific surface 52).
  • the composite exploration data generated by combining multiple pieces of radar exploration data based on the relative displacement amount in the steps up to S33 has its relative position determined based on the unique coordinate system of the measurement area M.
  • absolute coordinates are further linked to this coordinate system using information on the absolute positions of image data of a plurality of GCP measurement points measured by the ground control point measurement device 6.
  • In the matching between the image data of the GCP measurement points taken during ground control point measurement and the image data taken during GPR measurement, the range of image data to be matched may be narrowed down by referring to the approximate position of the GCP measurement point and to the survey line identifier (identification information) and time stamp (time information) of the GPR measurement data.
  • The absolute position calibration unit 97 performs absolute position calibration regarding the composite exploration data based on the image data matching.
  • Specifically, the coordinates of the entire radar exploration data set are calibrated to absolute coordinates using the coordinate values of a plurality of virtual GCPs. This is basically the same procedure as calibrating the coordinate data of an entire orthoimage generated from aerial photographs using the absolute coordinate information of multiple GCPs appearing in the photographs.
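  • The publication compares this step to georeferencing an aerial orthoimage with GCPs; as a minimal sketch (the least-squares affine fit below is an assumption, not the publication's prescribed method), the local coordinates of the composite exploration data can be calibrated to absolute coordinates from several virtual GCP correspondences.

```python
# Sketch: fit an affine transform from the local coordinate system of the
# measurement area M to absolute coordinates using virtual GCP correspondences.
import numpy as np

def fit_affine(local_pts: np.ndarray, absolute_pts: np.ndarray) -> np.ndarray:
    """Fit x_abs = A @ [x_local, y_local, 1] from >= 3 GCP correspondences (least squares)."""
    n = local_pts.shape[0]
    design = np.hstack([local_pts, np.ones((n, 1))])
    params, *_ = np.linalg.lstsq(design, absolute_pts, rcond=None)
    return params.T  # 2x3 affine matrix

local = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 20.0], [10.0, 20.0]])        # local GCP positions
absolute = np.array([[1000.0, 500.0], [1010.0, 500.2], [999.8, 520.0], [1009.8, 520.2]])  # surveyed
A = fit_affine(local, absolute)
print(A @ np.array([5.0, 10.0, 1.0]))  # absolute coordinates of an arbitrary local point
```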
  • At this time, the absolute position is calibrated by selecting and using appropriate GCP position data from among the data for which a valid positioning solution was obtained in the GCP measurements.
  • This embodiment can be applied not only to exploring the buried object 11 or cavity 12 in the underground 10 from the ground as shown in FIG. 10, but also to exploring the inside of a concrete wall or the like.
  • The method of this embodiment may also be applied to data collection other than underground exploration data, such as collecting 3D point cloud data of the ground using stereo cameras, LiDAR (Light Detection and Ranging), or imaging RADAR (Radio Detection and Ranging).
  • If the position of a manhole shaft can be identified in the radar exploration data image, the manhole position (center) on the ground could be regarded as a GCP; however, manholes are not installed within the measurement area M at a density sufficient for use as GCPs, and there is a high possibility that GCPs cannot be taken at ideal positions such as the four corners or the center of the measurement area M. Furthermore, a good GNSS signal reception environment cannot always be obtained at the location of a manhole.
  • Another possible method is to collect data on the surrounding environment using the imaging unit (camera) 36, LiDAR, RADAR, etc. mounted on the radar exploration device 3, and to identify the position of the radar exploration device 3 by performing scan matching on an environmental map.
  • However, such an approach has issues such as the need to prepare extremely high-precision environmental maps in advance.
  • In contrast, this embodiment can solve the above-mentioned problems because the measurement data are stitched together over the area using the ground texture characteristics of the predetermined surfaces and the absolute position is specified by virtual GCPs.
  • The data analysis device 9 can be realized by a computer and a program, and this program can also be recorded on a (non-transitory) recording medium or provided via a communication network such as the Internet.
  • the CPU 901 may not only be a single CPU but may also be a plurality of CPUs.
  • 1 Exploration system
  • 2 Measurement cart (an example of a moving object)
  • 3 Radar exploration device
  • 4 Tripod
  • 6 Ground control point measuring device
  • 9 Data analysis device
  • 90 Acquisition section
  • 91 Exploration-based image data matching section
  • 93 Data synthesis section
  • 95 Absolute-position-based image data matching section
  • 97 Absolute position calibration section
  • 99 Output section
  • A1 Image data (an example of first image data)
  • B1 Image data (an example of second image data)
  • A2 Radar exploration data (an example of first radar exploration data)
  • B2 Radar exploration data (an example of second radar exploration data)
  • AB2 Composite exploration data

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Geophysics And Detection Of Objects (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The purpose of the present disclosure is to accurately identify the state of an underground object or cavity and locate its position, without being restricted by location. Accordingly, a data analysis device 9: performs image data matching between first image data A1 of a first prescribed surface and second image data B1 of a second prescribed surface, said first image data A1 being associated with radar exploration data A2 for the first prescribed surface, said second image data B1 being associated with radar exploration data B2 for the second prescribed surface adjacent to the first prescribed surface; and combines both radar exploration data to generate composite exploration data AB2. The data analysis device 9: performs image data matching between each of third image data of a specific surface 51 and fourth image data of a specific surface 52 acquired from a ground control point measurement device 6 and specific image data that is at least one of the first and second image data and includes the specific surface 51 and the specific surface 52; and, on the basis of each absolute position associated with the third and fourth image data, performs calibration of the absolute position related to the composite exploration data AB2.

Description

Data analysis device, exploration system, data analysis method, and program
The present disclosure relates to technology for exploring buried objects or cavities in the ground under roads and the like.
Pipelines for gas, water, sewage, electric power, or communication cables are buried underground beneath roadways and sidewalks. Information such as the location of these pipelines is stored in buried-property registers and the like that are managed individually by each business operator. Therefore, for example, during road excavation work, trial excavation is performed prior to construction to confirm the position and depth of the buried pipes in order to prevent damage to the pipes and cutting accidents. If the pipeline information recorded in a buried-property register or the like is inaccurate, buried pipelines may be damaged or wasteful trial excavation may occur. For this reason, in order to efficiently maintain, manage, and update underground infrastructure for new burial work and for projects that move overhead wires underground, efforts are being considered to create a database of accurate location information for underground conduits and to share it among business operators.
One method of non-destructively exploring the positions of pipelines buried underground is measurement by ground penetrating radar (GPR). Ground-penetrating radar measurement equipment that implements this method emits electromagnetic waves (pulse waves) from the ground surface into the ground and receives the waves reflected at underground interfaces where the electromagnetic properties differ, thereby performing underground exploration. The characteristics of the reflected waves make it possible to non-destructively identify pipes at depths of up to several meters below the ground surface. Furthermore, the depth of a buried pipe can be measured by measuring the travel time of the reflected wave (the time from when the electromagnetic wave is emitted to when the reflected wave is received). In this way, the positions of buried objects such as pipes, or of cavities, and their depths from the ground surface can be specified without excavating the road.
Because the attenuation and reach of electromagnetic waves vary with soil type and water content, electromagnetic waves with different pulse widths or center frequencies are used depending on the target exploration depth and resolution. Electromagnetic waves with longer wavelengths reach greater depths but have lower directivity and resolution. Conversely, when the frequency of the electromagnetic waves is high, directivity improves, but the attenuation due to reflection near the ground surface is large and the exploration depth is limited. Existing ground-penetrating radar measurement equipment uses signals with frequencies ranging from several tens of MHz to several GHz.
In the exploration of pipelines buried under roads, in one example, the surveyor draws measurement lines (hereinafter, "survey lines") at regular intervals in the direction across the road and takes measurements by moving a cart equipped with a ground-penetrating radar measurement device (hereinafter, the "measurement cart") along these lines. In this case, because the pipeline positions detected in the measurement data of each survey line are connected by estimation, accurate information about pipelines in the areas between survey lines cannot be obtained. In addition, the position of a conduit is usually recorded as the distance along the ground surface from a landmark (reference object) such as the edge of a sidewalk or a utility pole, but as the road environment changes over time, the reference landmark may be lost.
Consideration is therefore being given to managing the positions of underground pipelines by their absolute position (latitude, longitude, altitude) and depth from the ground surface, and to recording them as information from which the position can be identified permanently even if the surrounding surface environment changes over time.
Furthermore, as a method for surveying pipelines buried within a road or sidewalk area as three-dimensional, planar data, the road surface can be scanned exhaustively at regular intervals with a ground-penetrating radar measurement device and the multiple scan passes stitched together. This method has the advantage that, by collecting underground exploration data three-dimensionally, the positions of pipelines crossing the survey lines can also be identified accurately. In actual field work, however, measurement cannot always follow the survey line exactly; depending on traffic conditions, the measurement cart may be stopped temporarily or may detour from the survey line to avoid vehicles or pedestrians. Therefore, to stitch the measured data together accurately over an area, position information that traces the absolute position of the underground-exploration antenna mounted on the ground-penetrating radar measurement device to an accuracy of roughly 10 cm or better during scanning is required.
Currently, therefore, a solution is used in which a GNSS (Global Navigation Satellite Systems) antenna is installed on the measurement cart and highly accurate position information is acquired simultaneously by the RTK (Real-Time Kinematic)-GNSS method during ground-penetrating radar measurement (see Non-Patent Document 1). With the RTK-GNSS method, a positioning solution with accuracy on the order of a few centimeters can be obtained in a good GNSS signal reception environment. By analyzing images of the reflection data obtained by slicing the three-dimensional underground exploration data, synthesized using such highly accurate positioning solutions, on a horizontal plane at an arbitrary depth or on a vertical plane in an arbitrary direction, the explorer can learn the three-dimensional state of underground pipes and cavities. Furthermore, the explorer can discover pipes and waste that are not in the database (register) within the measurement area. In addition, the positions of the pipelines can be recorded as absolute positions.
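As a minimal sketch (NumPy, with an assumed axis layout), slicing the synthesized 3-D exploration volume on a horizontal plane at an arbitrary depth or on a vertical plane in an arbitrary direction, as described above, amounts to simple array indexing:

```python
# Sketch: horizontal and vertical slices of a synthesized 3-D exploration volume.
import numpy as np

volume = np.random.rand(200, 150, 256)   # assumed layout: (x, y, depth samples)
DEPTH_RES_M = 0.01                       # assumed depth per sample: 1 cm

horizontal_slice = volume[:, :, int(0.80 / DEPTH_RES_M)]  # plan view at 0.8 m depth
vertical_slice = volume[:, 75, :]                          # profile along x at y index 75
print(horizontal_slice.shape, vertical_slice.shape)
```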
However, when the RTK-GNSS method is used to obtain high-precision positions for ground-penetrating radar measurement, satellite signals that would otherwise be received as direct waves are blocked by buildings and other structures around the GNSS antenna, and multipath signals generated by reflection or diffraction of the satellite signals at those structures are received instead, which can degrade the accuracy of the RTK-GNSS positioning solution. In that case, the measurement data of multiple survey lines cannot be stitched together (synthesized) accurately based on the positioning solutions obtained by RTK-GNSS, so it is difficult to obtain planar GPR measurement data (GPR reflection data images), and there is the problem that the condition and position of objects such as pipes buried underground cannot be accurately grasped or specified.
 The present invention has been made in view of the above circumstances, and an object thereof is to accurately grasp the state and identify the position of a buried object or a cavity without restrictions on location.
 In order to achieve the above object, the invention according to claim 1 is a data analysis device that analyzes data on the position of a buried object or a cavity and its depth from a predetermined surface, the data analysis device comprising: an exploration-system image data matching unit that performs image data matching between first image data, which is obtained by photographing a first predetermined surface and is associated in terms of position with first radar exploration data obtained by scanning the first predetermined surface, and second image data, which is obtained by photographing a second predetermined surface adjacent to the first predetermined surface and is associated in terms of position with second radar exploration data obtained by scanning the second predetermined surface; a data synthesis unit that generates synthetic exploration data by synthesizing the first radar exploration data and the second radar exploration data in planes perpendicular to the first predetermined surface and the second predetermined surface on the basis of the image data matching by the exploration-system image data matching unit; an absolute-position-system image data matching unit that performs image data matching between (i) third image data, which is obtained by photographing a specific surface on at least one of the first predetermined surface and the second predetermined surface and is associated with an absolute position obtained by measuring a ground control point of the specific surface, and fourth image data, which is obtained by photographing another specific surface, different from the specific surface, on at least one of the first predetermined surface and the second predetermined surface and is associated with an absolute position obtained by measuring a ground control point of the other specific surface, and (ii) specific image data that is at least one of the first image data and the second image data and that includes the specific surface and the other specific surface; and an absolute position calibration unit that calibrates the absolute position of the synthetic exploration data on the basis of the image data matching by the absolute-position-system image data matching unit.
 As described above, according to the present invention, there is an effect that the state and position of a buried object or a cavity can be grasped and identified accurately without restrictions on location.
FIG. 1 is an overall configuration diagram of an exploration system according to an embodiment.
FIG. 2 is a configuration diagram of a radar exploration device.
FIG. 3 is a configuration diagram of a modified example of the radar exploration device.
FIG. 4 is a configuration diagram of a ground control point measurement device.
FIG. 5 is an electrical hardware configuration diagram of a data analysis device.
FIG. 6 is a functional configuration diagram of the data analysis device.
FIG. 7 is a flowchart showing processing performed by the radar exploration device.
FIG. 8 is a flowchart showing processing performed by the ground control point measurement device.
FIG. 9 is a flowchart showing processing of the data analysis device.
FIG. 10 is a diagram showing the relationship between measurement ranges and survey lines.
FIG. 11 is a diagram showing the relationship between a measurement range, a measurement area, and an imaging area.
FIG. 12 is a conceptual diagram showing the processing of matching image data and synthesizing radar exploration data.
 Hereinafter, embodiments of the present invention will be described with reference to the drawings.
 [Outline of the system configuration of the embodiment]
 The outline of the configuration of the exploration system of the embodiment will be described with reference to FIG. 1. FIG. 1 is an overall configuration diagram of the exploration system according to the embodiment.
 As shown in FIG. 1, the exploration system 1 is constructed by a radar exploration device 3 installed on a measurement cart 2, a ground control point measurement device 6 installed on a tripod 4, and a data analysis device 9. The measurement cart 2 is a cart that is moved manually or runs automatically. For example, the measurement cart 2 may be an AGV (Automated Guided Vehicle) or the like.
 The radar exploration device 3 is, for example, a GPR measurement device for performing GPR (Ground Penetrating Radar) measurement. The radar exploration device 3 transmits electromagnetic waves for exploration from the ground surface toward the underground and acquires radar exploration data such as GPR reflected waves from the underground. The radar exploration device 3 also photographs the ground to obtain image data of the ground surface. As the measurement cart 2 moves, the radar exploration device 3 scans the ground, continuously acquiring radar exploration data and image data, which it associates with each other and stores. The radar exploration data and the image data are then passed to the data analysis device 9.
 The ground control point measurement device 6 is, for example, a GCP measurement device that measures the absolute position (latitude, longitude, altitude) of a ground control point by measuring a virtual ground control point (GCP), which is an arbitrary point on the ground. The ground control point measurement device 6 photographs a specific surface of the ground including the ground control point to obtain image data of the specific surface. The ground control point measurement device 6 associates the absolute position information with the image data of the specific surface and stores them. The absolute position information and the image data of the specific surface are then passed to the data analysis device 9.
 The data analysis device 9 performs image data matching between plural sets of image data obtained from the radar exploration device 3, calculates the relative displacement between the sets of radar exploration data associated with those image data, and accurately synthesizes the radar exploration data over the area to generate synthetic exploration data. The data analysis device 9 also performs image data matching between the image data of the specific surfaces obtained from the ground control point measurement device 6 and the image data obtained from the radar exploration device 3, and calibrates the absolute position of the synthetic exploration data. In this way, data analysis concerning the absolute position of a buried object or a cavity is performed.
 Note that the radar exploration device 3 may be provided on the measurement cart 2 integrally with the ground control point measurement device 6 or the data analysis device 9. In the integrated case, the data storage units 35 and 65 or the imaging units 36 and 66, which will be described later, may be shared.
 [Each configuration in the system of the embodiment]
 <Configuration of the radar exploration device>
 Next, the configuration of the radar exploration device 3a will be described with reference to FIG. 2. FIG. 2 is a configuration diagram of the radar exploration device. The radar exploration device 3a is an example of the radar exploration device 3 in FIG. 1.
 As shown in FIG. 2, the radar exploration device 3a includes a GNSS antenna 30, a clock unit 31, a transmission unit 33, a GPR antenna 32, a reception unit 34, a data storage unit 35, and an imaging unit 36. Although the radar exploration device 3a is a device that performs underground exploration by GPR measurement, exploration means other than GPR measurement may be used as long as buried objects, cavities, and the like can be explored.
 Of these, the GNSS antenna 30 is an antenna that receives GNSS signals transmitted from navigation satellites. The GNSS signals include time information synchronized with UTC (Coordinated Universal Time).
 The clock unit 31 incorporates a GNSS receiver, receives the GNSS signals received by the GNSS antenna 30, and manages time information synchronized with UTC. In order to apply time stamps, the clock unit 31 synchronizes with UTC with high accuracy, for example by time-synchronizing to GNSS, and even when GNSS signals cannot be received temporarily, such as under an elevated structure, it maintains clock accuracy for a certain period of time by a holdover (free-running) operation based on its internal clock. The clock unit 31 may be an atomic clock incorporating a rubidium oscillator or the like.
 The GPR antenna 32 is an antenna that transmits GPR electromagnetic waves and receives the corresponding GPR reflected waves. The GPR antenna 32 transmits exploration GPR electromagnetic waves, conveyed from the transmission unit 33 through a coaxial cable or the like, toward the underground, receives GPR reflected waves from the underground, and conveys the data of the GPR reflected waves (hereinafter referred to as "GPR data") to the reception unit 34 through a coaxial cable or the like. The GPR antenna 32 may be an array antenna in which a plurality of antenna elements are arranged so that a certain width of the ground can be swept at once. In addition, a plurality of transmission units 33 and GPR antennas 32 may be combined in order to cover a plurality of frequency ranges.
 The transmission unit 33 generates GPR electromagnetic waves for exploring the underground and the like and transmits them to the GPR antenna 32. The transmission unit 33 may change the frequency of the transmitted signal in steps along the time axis.
 The reception unit 34 receives the GPR data from the GPR antenna 32 and stores it in the data storage unit 35.
 In a given time epoch, the data storage unit 35 stores the image data obtained by the imaging of the imaging unit 36 and the GPR data obtained from the reception unit 34 in association with each other. Each of these data items is given a time stamp (an example of time information) of the data acquisition time obtained from the clock unit 31. In addition, a survey line identifier (an example of identification information) for identifying which survey line each data item belongs to is recorded, and the times at which measurement was started and ended are stored for each survey line. Note that time information indicating times between the start time and the end time may be discarded and need not be stored.
 The imaging unit 36 is composed of a visible-light camera and photographs the ground in the vicinity scanned by the GPR antenna 32. A single camera may be used, or the photographing range may be shared among a plurality of cameras. Alternatively, an omnidirectional camera equipped with a fisheye lens may be used. The camera may be a monocular camera or a stereo camera. In any case, the range of the ground photographed by the camera covers the area scanned by the GPR antenna 32 and also covers, as the photographing range, areas of a certain width (width in the direction orthogonal to the scanning direction) on both sides of the GPR antenna 32. This certain width is, for example, 10 cm.
 The position of the imaging unit (camera) 36 of the radar exploration device 3a and the position of the GPR antenna 32 are fixed relative to each other, and the relative distance (offset value) between their respective reference points is accurately measured in advance. It is assumed that the imaging unit 36 photographs the ground and that the offset between any pixel in the image data of the ground and the position of the GPR antenna 32 has been measured and calibrated in advance. During measurement, the illuminance necessary for photographing the photographing range of the ground is secured by lighting or the like as necessary. The imaging unit 36 may also use a far-infrared camera or the like instead of visible light.
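 As an illustration of this pre-calibrated relationship, the following is a minimal sketch, not part of the embodiment itself, of how a pixel of the ground image could be converted into the reference frame of the GPR antenna 32; the scale and offset values are hypothetical examples.

```python
# Minimal sketch: map a pixel of the ground image into the GPR antenna frame.
# metres_per_pixel and camera_to_antenna_offset are hypothetical calibration
# values; in practice they come from the prior offset calibration.

def pixel_to_antenna_frame(px, py,
                           image_center=(960, 600),                 # camera reference pixel
                           metres_per_pixel=0.002,                  # ground sampling distance
                           camera_to_antenna_offset=(0.35, 0.00)):  # metres
    # Displacement of the pixel from the camera reference point, in metres.
    dx = (px - image_center[0]) * metres_per_pixel
    dy = (py - image_center[1]) * metres_per_pixel
    # Shift into the antenna frame using the pre-measured camera-antenna offset.
    return dx - camera_to_antenna_offset[0], dy - camera_to_antenna_offset[1]
```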
 <Configuration of a modified example of the radar exploration device>
 Next, the configuration of the radar exploration device 3b will be described with reference to FIG. 3. FIG. 3 is a configuration diagram of a modified example of the radar exploration device. The radar exploration device 3b is an example of the radar exploration device 3 in FIG. 1. The same components as those in FIG. 2 are given the same reference numerals, and their description is omitted.
 As shown in FIG. 3, the radar exploration device 3b further includes a gyro sensor 37, an acceleration sensor 38, and an odometry unit 39 in addition to the components of the radar exploration device 3a.
 The gyro sensor 37 is a sensor that detects the angular velocity of the measurement cart 2 (radar exploration device 3b). The acceleration sensor 38 is a sensor that measures the acceleration of the measurement cart 2 (radar exploration device 3b). The odometry unit 39 is a device that measures the number of rotations of the wheels of the measurement cart 2, for example using rotary encoders, and obtains the travel distance, speed, and turning angle of the measurement cart 2 from the wheel rotations. The odometry unit 39 is a device for further improving the accuracy of self-position estimation and complements the gyro sensor 37 and the acceleration sensor 38, so it does not necessarily have to be provided in the radar exploration device 3b.
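 As a rough illustration of the role of the odometry unit 39, the following sketch shows one common way to derive travel distance, speed, and turning angle from wheel encoder counts; it assumes a differential-drive cart, which is not stated in the embodiment, and all parameter values are hypothetical.

```python
import math

def differential_odometry(left_counts, right_counts, dt,
                          counts_per_rev=1024,   # hypothetical encoder resolution
                          wheel_radius=0.10,     # wheel radius in metres (example)
                          wheel_base=0.50):      # distance between wheels in metres (example)
    """Estimate travel distance, speed, and heading change of the cart from
    left/right wheel encoder counts accumulated over the interval dt."""
    left_dist = 2.0 * math.pi * wheel_radius * (left_counts / counts_per_rev)
    right_dist = 2.0 * math.pi * wheel_radius * (right_counts / counts_per_rev)
    distance = (left_dist + right_dist) / 2.0           # forward displacement [m]
    speed = distance / dt                               # average speed [m/s]
    turn_angle = (right_dist - left_dist) / wheel_base  # heading change [rad]
    return distance, speed, turn_angle
```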
 The data from the gyro sensor 37, the acceleration sensor 38, and the odometry unit 39 are used to complement the position estimation of the radar exploration device 3 that acquired the GPR data, by measuring the relative displacement of the radar exploration device 3 during periods in which the quality of the image data captured by the imaging unit 36 is poor or the image data from the imaging unit 36 is missing.
 <Configuration of the ground control point measurement device>
 Next, the configuration of the ground control point measurement device 6 will be described with reference to FIG. 4. FIG. 4 is a configuration diagram of the ground control point measurement device.
 As shown in FIG. 4, the ground control point measurement device 6 includes a GNSS antenna 60, a GNSS reception unit 61, a correction data reception unit 64, a data storage unit 65, and an imaging unit 66.
 The GNSS antenna 60 is basically the same as the GNSS antenna 30 in FIG. 2, but it may be configured as two antennas separated by a fixed distance in the horizontal plane.
 The GNSS reception unit 61 receives navigation satellite signals from a plurality of satellites via the GNSS antenna 60 and determines the position of the GNSS antenna 60. In this case, it acquires correction data for carrier-phase positioning from the correction data reception unit 64 and performs a positioning calculation by the RTK-GNSS method. The GNSS reception unit 61 then stores the information on the finally determined absolute position (coordinate data) in the data storage unit 65. When the GNSS antenna 60 has a two-antenna configuration, the GNSS reception unit 61 determines the position of each antenna, calculates the absolute position (coordinate data) information and the azimuth of the line connecting the two antenna positions, and stores them in the data storage unit 65.
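 For the two-antenna case, the azimuth of the baseline follows directly from the two positioning results. The following is a minimal sketch that assumes, for illustration only, that the two antenna positions have already been converted into a local east/north frame.

```python
import math

def baseline_azimuth(east1, north1, east2, north2):
    """Azimuth (degrees clockwise from north) of the line joining the two
    GNSS antenna positions, given in a local east/north coordinate frame."""
    azimuth = math.degrees(math.atan2(east2 - east1, north2 - north1))
    return azimuth % 360.0
```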
 The correction data reception unit 64 receives correction data for carrier-phase positioning via a communication network or the like and transmits the correction data to the GNSS reception unit 61. Note that the correction data reception unit 64, or the positioning calculation function of the GNSS reception unit 61, may be provided on a cloud processing platform.
 The data storage unit 65 stores the absolute position information obtained as a result of the positioning calculation by the GNSS reception unit 61, the image data obtained by the imaging unit 66, and information such as offset values in association with each other.
 The imaging unit 66 is basically the same as the imaging unit 36 in FIG. 2. The relative positions of the GNSS antenna 60 and the imaging unit (camera) 66 are fixed and measured. As shown in FIG. 1, the ground control point measurement device 6 is installed on the ground using the support of the tripod 4, and the imaging unit 66 photographs an image of the ground surface including the virtual GCP. The optical axis of the imaging unit 66 does not necessarily have to coincide exactly with the vertical downward direction, but the offset value between the phase center of the GNSS antenna 60 and an arbitrary pixel of the ground-surface image data captured by the imaging unit 66 (for example, the center of the image data) is measured and calibrated in advance. An arbitrary pixel in the captured image data (for example, the center of the image data) becomes the virtual GCP. Alternatively, the whole or a part of the image data may be the virtual GCP. When photographing, the illuminance necessary for photographing the photographing range of the ground is secured by lighting or the like as necessary. The imaging unit 66 may be a far-infrared camera or the like instead of a visible-light camera. In this case, the imaging unit 36 in FIG. 2 is likewise a far-infrared camera or the like instead of a visible-light camera.
 <Configuration of the data analysis device>
 Since the data analysis device is configured as a computer, its electrical configuration and functional configuration will be described separately below.
 (Electrical configuration)
 FIG. 5 is an electrical hardware configuration diagram of the data analysis device.
 As shown in FIG. 5, the data analysis device 9 includes, as a computer, a CPU 901, a ROM 902, a RAM 903, an SSD 904, an external device connection I/F (interface) 905, a network I/F 906, a display 907, an input device 908, a media I/F 909, and a bus line 910.
 Of these, the CPU 901 controls the overall operation of the data analysis device 9. The ROM 902 stores programs used to drive the CPU 901, such as an IPL. The RAM 903 is used as a work area for the CPU 901.
 The SSD 904 reads or writes various data under the control of the CPU 901. An HDD (Hard Disk Drive) may be used instead of the SSD 904.
 The external device connection I/F 905 is an interface for connecting various external devices. External devices in this case include a display, a speaker, a keyboard, a mouse, a USB memory, a printer, and the like.
 The network I/F 906 is an interface for data communication via a communication network such as the Internet.
 The display 907 is a type of display means, such as a liquid crystal display or an organic EL (Electro Luminescence) display, that displays various images.
 The input device 908 is a type of input means for selecting and executing various instructions, selecting a processing target, moving a cursor, and the like. An example of the input device 908 is a pointing device.
 The media I/F 909 controls the reading or writing (storage) of data with respect to a recording medium 909m such as a flash memory. The recording medium 909m also includes a DVD, a Blu-ray Disc (registered trademark), and the like.
 The bus line 910 is an address bus, a data bus, or the like for electrically connecting the components such as the CPU 901 shown in FIG. 5.
 (Functional configuration)
 FIG. 6 is a functional configuration diagram of the data analysis device.
 As shown in FIG. 6, the data analysis device 9 has an acquisition unit 90, an exploration-system image data matching unit 91, a data synthesis unit 93, an absolute-position-system image data matching unit 95, an absolute position calibration unit 97, and an output unit 99. Each of these units is a function realized by instructions from the CPU 901 in accordance with a program stored in the RAM 903 or the like.
 Of these, the acquisition unit 90 acquires image data, radar exploration data (GPR data), identification information, time information, and the like from the radar exploration device 3 via a communication network or the like. The acquisition unit 90 also acquires image data, absolute position information, and the like from the ground control point measurement device 6 via a communication network or the like.
 The exploration-system image data matching unit 91 performs, for example, image data matching between first image data associated in terms of position with the first radar exploration data and second image data associated in terms of position with the second radar exploration data. In this case, the first predetermined surface shown in the first image data and the second predetermined surface shown in the second image data are adjacent to each other on the ground.
 Based on the image data matching by the exploration-system image data matching unit 91, the data synthesis unit 93 calculates, for example, the relative displacement between the first radar exploration data and the second radar exploration data in planes perpendicular to the first predetermined surface and the second predetermined surface, corrects for it, and then synthesizes the data to generate synthetic exploration data.
 The absolute-position-system image data matching unit 95 performs, for example, image data matching between (i) third image data, which is obtained by photographing a specific surface (for example, the specific surface 51 in FIG. 11) on the first predetermined surface or the second predetermined surface and is associated with an absolute position obtained by measuring the ground control point of that specific surface, and fourth image data, which is obtained by photographing another specific surface (for example, the specific surface 52 in FIG. 11) on the first predetermined surface or the second predetermined surface and is associated with an absolute position obtained by measuring the ground control point of that other specific surface, and (ii) specific image data that is at least one of the first image data and the second image data and that includes the specific surface and the other specific surface.
 Note that the third and fourth image data and the information on each absolute position are data acquired from the ground control point measurement device 6. The specific surface is not limited to the case where it is contained within measurement range A, as the specific surface 51 in FIG. 11 is; it may also be contained in both measurement range A and measurement range B, as the specific surface 52 in FIG. 11 is. In order to calibrate the coordinates of the entire surface of the measurement area accurately, it is desirable to set virtual ground control points at five points, namely the four corners and a point near the center of the measurement area. However, as shown in FIG. 11, if the coordinate values of at least two specific surfaces (the specific surfaces 51 and 52, etc.) are measured, the azimuth and coordinate values of the first image data and the second image data are determined, so it suffices to set at least two ground control points. When the GNSS antenna 60 of the ground control point measurement device 6 has a two-antenna configuration, not only the coordinate values but also the azimuth of the image data of the specific surface are measured, so if the azimuth and coordinate values of the first image data and the second image data are determined on that basis, a single ground control point may suffice.
 The specific image data may also be composite image data in which the first image data (measurement range A, etc.) and the second image data (measurement range B, etc.) are combined. The image of the composite image data contains at least two specific surfaces (the specific surfaces 51 and 52, etc.). In this case, the data synthesis unit 93 combines the first image data (measurement range A, etc.) and the second image data (measurement range B, etc.) to generate the composite image data.
 The absolute position calibration unit 97 calibrates the absolute position of the synthetic exploration data based on the image data matching by the absolute-position-system image data matching unit 95.
 The output unit 99 outputs the results of the above analysis.
 [Processing or operation of the exploration system]
 Next, the processing or operation of the exploration system will be described with reference to FIGS. 7 to 12.
 <Processing of the radar exploration device>
 The processing performed by the radar exploration device 3 will be described with reference to FIG. 7. FIG. 7 is a flowchart showing the processing performed by the radar exploration device 3.
 First, the state of the ground to be explored and the movement of the measurement cart 2 equipped with the radar exploration device 3 will be described with reference to FIGS. 10 and 11. FIG. 10 is a diagram showing the relationship between measurement ranges and survey lines. FIG. 11 is a diagram showing the relationship between a measurement range, the measurement area of the radar exploration data, and the imaging area.
 As shown in FIG. 10, a buried object 11 or a cavity 12 exists in the ground 10 beneath the surface (an example of a predetermined surface) of the region M targeted for GPR measurement (hereinafter referred to as the "measurement region"). The surveyor divides the entire measurement region M into a plurality of regions (hereinafter referred to as "measurement ranges") A to D at intervals equal to or narrower than the width over which the radar exploration device 3 (GPR antenna 32) can acquire radar exploration data at one time. Note that the ground surface of measurement range A is an example of the first predetermined surface, and the ground surface of measurement range B is an example of the second predetermined surface.
 The surveyor then draws survey lines a to d, along each of which the measurement cart 2 is moved over the corresponding divided measurement range to perform the exploration. The position of each of the survey lines a to d may be the center line of the corresponding measurement range A to D in the direction of travel, or may be any line drawn parallel to the center line for scanning the measurement range with the radar exploration device 3. In that case, a marker is placed at the corresponding position on the radar exploration device 3 and used as a guide so that the radar exploration device 3 correctly sweeps the measurement range along the survey line.
 In actual measurement, however, it is not always easy to scan the radar exploration device 3 exactly along the survey line because of unevenness and slopes of the ground, road traffic conditions, and so on. Therefore, in order to acquire data throughout the measurement region without omission, as shown in FIG. 11, the width of each measurement range (measurement range B in FIG. 11) is set somewhat smaller than the width of the measurement area 41 over which the radar exploration device 3 can acquire (scan) radar exploration data at one time. That is, the regions scanned by the radar exploration device 3 to obtain radar exploration data may overlap to some extent, but the measurement region M is scanned over its whole area without gaps.
 On the other hand, the width of the ground imaging area 42 is wider than the width of the measurement area 41 for measuring the radar exploration data, and is set so as to cover not only the measurement area 41 but also the adjacent measurement ranges. As an example, the relationship between measurement range B, the measurement area 41 of the radar exploration data, and the ground imaging area 42 in the measurement along survey line b is as shown in FIG. 11. That is, the width of the measurement area 41 for obtaining the radar exploration data (the width in the direction orthogonal to the scanning direction) is narrower than the width of the imaging area 42 for obtaining the image data (the width in the direction orthogonal to the scanning direction), and the width of the predetermined surface such as measurement range A is narrower than the width of the measurement area 41.
 The measurement cart 2 does not necessarily have to move along the survey lines in the order a, b, c, and so on. The radar exploration device 3 repeatedly performs exploration while the measurement cart 2 is moved along each survey line. The exploration of the survey lines does not have to be carried out consecutively in time; each can be carried out at any arbitrary time.
 Under the above conditions, the processing of the radar exploration device 3 will be described with reference to FIG. 7.
 S11: As shown in FIG. 10, by moving the measurement cart 2, the GPR antenna 32 and the imaging unit 36 simultaneously acquire radar exploration data of the ground and image data of the ground.
 S12: The data storage unit 35 stores the radar exploration data and the image data in association with the survey line identification information and the acquisition time information (an illustrative record structure is sketched after step S13).
 S13: If the scanning for imaging and exploration has not been completed (NO), the processing returns to step S11. If the scanning for imaging and exploration has been completed (YES), the processing in FIG. 7 ends.
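 As a minimal sketch of the record stored in step S12, one possible representation is shown below; the field names are hypothetical and only illustrate the association described above.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Any

@dataclass
class ScanRecord:
    line_id: str          # survey line identifier, e.g. "a", "b", ...
    timestamp: datetime   # UTC time stamp supplied by the clock unit 31
    gpr_data: Any         # radar reflection data acquired in this time epoch
    ground_image: Any     # ground image captured in the same time epoch

# One survey line is then a list of ScanRecord objects, bounded by the
# recorded start and end times of the measurement for that line.
```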
 When measuring with the radar exploration device 3, the measurement region (the measurement region M in FIG. 10) may be selected separately on a map displayed on the screen of an operation terminal of the surveyor or the like, and the measurement ranges and survey lines may be set automatically. In addition, positioning results from combined GNSS/INS (Inertial Navigation System) positioning may be used for machine guidance of the manual movement of the measurement cart 2 along the survey line or for automatic travel of the measurement cart 2.
 <Processing of the ground control point measurement device>
 Next, the processing performed by the ground control point measurement device 6 will be described with reference to FIG. 8. FIG. 8 is a flowchart showing the processing performed by the ground control point measurement device 6.
 First, the state of the ground to be measured (explored) and the installation position of the tripod 4 on which the ground control point measurement device 6 is mounted will be described with reference to FIGS. 10 and 11.
 The positions of the GCP measurement points within the measurement region M, such as the specific surface 51 in FIG. 11, may be determined in advance, or arbitrary positions within the measurement region M may be selected as GCP measurement points while checking the RTK-GNSS positioning state (the convergence state of the positioning solution). RTK-GNSS positioning may also be repeated at the same position at different times until a valid GNSS positioning solution is obtained. The timing of the GNSS positioning by the GNSS reception unit 61 of the ground control point measurement device 6 and the timing of photographing by the imaging unit 66 do not necessarily have to coincide, as long as the installation position of the tripod 4 on which the ground control point measurement device 6 is mounted is not moved. For example, RTK-GNSS positioning may be started after the ground near the virtual GCP has been photographed by the ground control point measurement device 6, and positioning may be continued until a valid fixed (FIX) solution is obtained.
 Under the above conditions, the processing of the ground control point measurement device 6 will be described with reference to FIG. 8. Here, the specific surface 51 of measurement range A shown in FIG. 11 is described, but the specific surface is not limited to the specific surface 51.
 S21: The GNSS reception unit 61 measures the absolute position of the specific surface based on the GNSS signals at the specific surface 51 of the ground and the correction data.
 S22: The imaging unit 66 acquires image data of the specific surface 51.
 S23: The data storage unit 65 stores the control point data including the absolute position of the specific surface 51 and the image data of the specific surface in association with each other.
 Since the ground control point (GCP) of this embodiment is a virtual one based on image data of the road surface, there is no need to install physical signs or markers on the ground, and any point within the measurement region M can be set as a GCP. Therefore, a point with as good a GNSS signal reception environment as possible can be selected as a GCP without being constrained by the positions of signs or markers. Furthermore, since the positions of the GNSS satellites in the sky change over time, it is also possible to perform the GCP measurement in a time period in which a good positioning solution is likely to be obtained given the positional relationship with the surrounding structures.
 As a method of predicting the GNSS reception characteristics (positioning feasibility), there is, for example, a method of using three-dimensional map data including building height information and selecting a time period in which the DOP (Dilution Of Precision) value of the visible GNSS satellite signals is small (good). If RTK-GNSS positioning at a GCP measurement point is still difficult, it is conceivable to mount the ground control point measurement device 6 of FIG. 1 on a flying vehicle such as a drone, perform RTK-GNSS positioning in the air where the surroundings of the GNSS antenna 60 are not blocked by structures, and photograph the ground near the GCP measurement point with a telephoto lens. In this case, a sign or marker may be placed on the ground so that the position of the GCP can easily be identified from the air. If positioning at the GCP measurement point is still difficult, an optical distance-measuring means such as a total station may also be used in combination.
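 As one concrete example of how such a time period might be chosen, DOP can be computed from the geometry of the satellites predicted to remain visible after occlusion by the surrounding buildings is taken into account. The sketch below assumes the receiver-to-satellite unit vectors are already available; their derivation from the ephemerides and the 3D map is not shown.

```python
import numpy as np

def dilution_of_precision(unit_vectors):
    """Compute GDOP, PDOP and HDOP from receiver-to-satellite unit vectors
    (an N x 3 array in a local east/north/up frame, N >= 4).
    Smaller values indicate better satellite geometry."""
    e = np.asarray(unit_vectors, dtype=float)
    G = np.hstack([e, np.ones((e.shape[0], 1))])  # geometry (design) matrix
    Q = np.linalg.inv(G.T @ G)                    # cofactor matrix
    gdop = np.sqrt(np.trace(Q))
    pdop = np.sqrt(Q[0, 0] + Q[1, 1] + Q[2, 2])
    hdop = np.sqrt(Q[0, 0] + Q[1, 1])
    return gdop, pdop, hdop
```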
 <Processing of the data analysis device>
 Next, the processing performed by the data analysis device 9 will be described with reference to FIG. 9. FIG. 9 is a flowchart showing the processing performed by the data analysis device. The analysis processing by the data analysis device 9 may be carried out in real time while the radar exploration device 3 is exploring, or may be carried out (offline) after the exploration by the radar exploration device 3.
 S31: The acquisition unit 90 acquires the respective data from the radar exploration device 3 and the ground control point measurement device 6.
 S32: The exploration-system image data matching unit 91 performs image data matching of the data acquired from the radar exploration device 3 (the first and second image data, etc.). For example, as shown in FIG. 12, the exploration-system image data matching unit 91 performs image data matching between image data A1, which is acquired from the radar exploration device 3 and includes the first predetermined surface, i.e. the ground of measurement range A, and image data B1, which includes the second predetermined surface, i.e. the ground of measurement range B adjacent to measurement range A. As shown in FIG. 12, the portion of the ground represented by the radar exploration data A2 is wider than measurement range A, and the extent of the image data A1 is wider than the portion of the ground represented by the radar exploration data A2. Similarly, the portion of the ground represented by the radar exploration data B2 is wider than measurement range B, and the extent of the image data B1 is wider than the portion of the ground represented by the radar exploration data B2. Therefore, the imaging areas of the image data A1 and the image data B1 partially overlap each other. The image data A1 and B1 are two-dimensional planar images, whereas the radar exploration data A2 and B2 are three-dimensional rectangular-parallelepiped data as shown in FIG. 12. Note that the image data A1 is an example of the first image data, and the image data B1 is an example of the second image data. The radar exploration data A2 is an example of the first radar exploration data, and the radar exploration data B2 is an example of the second radar exploration data.
 Since the relative offset between the reference point of the image data (an arbitrary pixel in the image data) and the position of the GPR antenna 32 is measured in advance, the exploration-system image data matching unit 91 can calculate the relative displacement between the radar exploration data A2 and the radar exploration data B2 of the adjacent survey lines when the image data are matched.
 Specifically, the exploration-system image data matching unit 91 matches image data, taken in the measurements of adjacent survey lines, that photograph the same region of the ground. The exploration-system image data matching unit 91 performs the image data matching by extracting from the image data texture information of the road surface, obtained from road surface properties such as unevenness and cracks in the surface layer of the road pavement or from the painted state of road markings. For example, the exploration-system image data matching unit 91 matches the image data within measurement range B that was captured on the left side with respect to the direction of travel (x direction) of the measurement cart 2 (radar exploration device 3) during the scan of survey line a in FIG. 10 (the image data A1 in FIG. 12) with the image data within measurement range B that was captured during the measurement of survey line b (the image data B1 in FIG. 12).
 Likewise, the exploration-system image data matching unit 91 matches the image data within measurement range A that was captured on the left side with respect to the direction of travel (-x direction) of the radar exploration device 3 during the measurement of survey line b (the image data B1 in FIG. 12) with the image data within measurement range A that was captured during the measurement of survey line a (the image data A1 in FIG. 12). At this time, the exploration-system image data matching unit 91 may narrow down the range of image data to be matched by referring to the time-stamp data in addition to the survey line identifier of the measurement data of each survey line.
 In the case of the radar exploration device 3b of the modified example, the exploration-system image data matching unit 91 may further narrow down the range of image data to be matched by referring to the data from the gyro sensor 37, the acceleration sensor 38, or the odometry unit 39. Once image data have been matched, the exploration-system image data matching unit 91 may narrow the matching candidates down to the corresponding image data at the times before and after that match.
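 The embodiment does not prescribe a particular matching algorithm. As one possible realization of the texture-based matching described above, the following sketch estimates the planar shift between two overlapping road-surface images using ORB features and a RANSAC-fitted partial affine transform; the use of OpenCV is an assumption made here for illustration.

```python
import cv2
import numpy as np

def estimate_relative_shift(img_a, img_b, max_matches=200):
    """Estimate the in-plane shift (dx, dy), in pixels, between two overlapping
    road-surface images taken on adjacent survey lines."""
    orb = cv2.ORB_create(nfeatures=2000)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)
    src = np.float32([kp_a[m.queryIdx].pt for m in matches[:max_matches]])
    dst = np.float32([kp_b[m.trainIdx].pt for m in matches[:max_matches]])
    # Robustly fit a rotation + translation (+ uniform scale) between the images.
    M, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    return M[0, 2], M[1, 2]   # translation components of the fitted transform
```

 Converting the pixel shift into metres with the known ground sampling distance, together with the pre-measured camera-antenna offsets, would then give the relative displacement between the radar exploration data of the two survey lines.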
 S33: The data synthesis unit 93 synthesizes the data acquired from the radar exploration device 3 (the first and second radar exploration data, etc.) based on the image data matching, and generates synthetic exploration data. For example, as shown in FIG. 12, the data synthesis unit 93 generates synthetic exploration data AB2 by synthesizing the radar exploration data A2 and the radar exploration data B2 in planes perpendicular to the first predetermined surface and the second predetermined surface.
 Specifically, the data synthesis unit 93 calculates the displacement of the measurement cart 2 (radar exploration device 3) in the direction of travel from the image data and time stamps (acquired time information) within the same survey line, calculates the displacement in the direction orthogonal thereto from the relative displacement of the radar exploration data of the adjacent survey lines obtained by the exploration-system image data matching unit 91, and stitches and synthesizes the radar exploration data on that basis. For radar exploration data at overlapping positions, one of the duplicates is deleted as appropriate when synthesizing.
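 The following is a simplified sketch of the stitching step. It assumes, for illustration, that the two radar data volumes have already been resampled onto a common grid (depth x along-track x cross-track cells) and that the cross-track offset between them has been obtained from the image matching; duplicate cells in the overlap are resolved by keeping one volume, as described above.

```python
import numpy as np

def merge_radar_volumes(vol_a, vol_b, shift_cells):
    """Merge two 3-D radar volumes of identical depth/along-track size, where
    vol_b is offset from vol_a by shift_cells grid cells in the cross-track
    direction. Overlapping cells keep the data of vol_a."""
    depth, along, width_a = vol_a.shape
    width_total = max(width_a, shift_cells + vol_b.shape[2])
    merged = np.zeros((depth, along, width_total), dtype=vol_a.dtype)
    merged[:, :, shift_cells:shift_cells + vol_b.shape[2]] = vol_b  # place B at its offset
    merged[:, :, :width_a] = vol_a   # vol_a overwrites the duplicated overlap
    return merged
```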
 By repeating the above processing, a planar image of the ground surface of the entire measurement region M and three-dimensional radar exploration data (synthetic exploration data) can be obtained. The method of this embodiment is characterized in that the relative positional relationship of the radar exploration data linked to the image data is determined on the basis of the respective image data of the ground surface. Since the relative physical positional relationship between the reference points of the imaging unit 36 of the radar exploration device 3 and the GPR antenna 32 is fixed, radar exploration data measured on different survey lines can be synthesized with high accuracy.
 S34: The absolute-position-system image data matching unit 95 performs image data matching between the data acquired from the ground control point measurement device 6 (for example, each of the third and fourth image data) and the data acquired from the radar exploration device 3 (the specific image data related to the first and second image data). The specific image data related to the first and/or second image data is at least one of the first image data and the second image data and is image data that includes the specific surface (the specific surface 51, etc.) and the other specific surface (the specific surface 52, etc.).
 The synthetic exploration data generated in the steps up to S33, by synthesizing the plural radar exploration data on the basis of the relative displacements, is in a state in which the relative positions have been determined on a coordinate system unique to the measurement region M. Absolute coordinates are then tied to this coordinate system using the absolute position information of the image data of the plural GCP measurement points measured by the ground control point measurement device 6.
 In the matching processing between the image data of the GCP measurement points captured in the ground control point measurement and the image data captured in the GPR measurement, the range of image data to be matched may be narrowed down from the approximate position of the GCP measurement point by referring to the survey line identifier (identification information) and time stamp (time information) of the GPR measurement data.
 S35: The absolute position calibration unit 97 calibrates the absolute position of the synthetic exploration data based on the image data matching.
 When the matching processing of the image data is completed in the procedure of S34, the coordinates of the entire set of radar exploration data are calibrated to absolute coordinates using the coordinate values of the plural virtual GCPs. This is basically the same procedure as calibrating the coordinate data of an entire orthophoto made from aerial photographs using the absolute-coordinate position information of plural GCPs appearing in the photographs. When the GCP measurement and the GPR measurement are performed at the same time, the absolute position calibration is performed by selecting and using data of appropriate GCP positions from the data for which valid positioning solutions were obtained in the GCP measurement.
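 As a minimal sketch of this calibration step, under the assumption that both the local survey coordinates and the RTK-GNSS absolute coordinates are expressed as planar 2-D coordinates, a similarity transform fitted to the GCP correspondences by least squares can be applied to the whole data set. The function below follows the standard Umeyama-style estimation and is not a prescribed part of the embodiment.

```python
import numpy as np

def fit_similarity_transform(local_xy, absolute_xy):
    """Fit scale s, rotation R and translation t so that
    absolute ~= s * R @ local + t, from at least two GCP correspondences."""
    p = np.asarray(local_xy, dtype=float)      # N x 2 local survey coordinates
    q = np.asarray(absolute_xy, dtype=float)   # N x 2 absolute coordinates
    mp, mq = p.mean(axis=0), q.mean(axis=0)
    pc, qc = p - mp, q - mq
    U, S, Vt = np.linalg.svd(pc.T @ qc)
    d = np.ones(2)
    if np.linalg.det(U @ Vt) < 0:              # guard against a reflection
        d[-1] = -1.0
    R = (U @ np.diag(d) @ Vt).T
    s = (S * d).sum() / (pc ** 2).sum()
    t = mq - s * R @ mp
    return s, R, t

# Applying (s, R, t) to every horizontal coordinate of the synthetic
# exploration data converts the local survey frame into absolute coordinates.
```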
 S36: The output unit 99 outputs the analysis results.
 [Application examples of this embodiment]
 This embodiment can be applied not only to exploring a buried object 11 or a cavity 12 in the ground 10 from the ground surface as shown in FIG. 10, but also to exploring the interior of a concrete wall or the like.
 The method of this embodiment may also be applied to the collection of data other than underground exploration data, for example the collection of three-dimensional point cloud data of the ground surface by a stereo camera, LiDAR (Light Detection And Ranging), or imaging RADAR (Radio Detection And Ranging).
 [Effects of the embodiment]
 As explained above, according to this embodiment, there is no restriction on location, and the state of a buried object or cavity can be accurately grasped and its position identified.
 Furthermore, according to this embodiment, the absolute position of radar exploration data can be calibrated with high precision even in environments where GNSS satellite signals are difficult to receive, and highly accurate three-dimensional data of underground conduits can be obtained. In addition, the measurement of each survey line can be carried out in any order and at any time, so work plans can be formulated flexibly.
 Because the position of a manhole shaft can be identified in the radar exploration data image, it is conceivable to use the (center of the) manhole position on the ground as a GCP. However, manholes are not necessarily installed within the measurement area M at a density sufficient for GCPs, and there is a strong possibility that GCPs cannot be taken at ideal positions such as the four corners or the center of the measurement area M. Moreover, a good GNSS signal reception environment is not always obtained at the location of a manhole.
 Another conceivable method is to collect data on the surrounding environment with the imaging unit (camera) 36, LiDAR, RADAR, or the like mounted on the radar exploration device 3 and to identify the position of the radar exploration device 3 by scan matching against an environmental map. However, this approach has the methodological problem that an extremely high-precision environmental map must be prepared in order to achieve the required position estimation accuracy (roughly 10 cm or less).
 In a method that uses an IMU (Inertial Measurement Unit) to measure the displacement accompanying the scanning of the radar exploration device 3, the data of adjacent survey lines must be measured consecutively in order to measure the displacement of the radar exploration device 3 between multiple survey lines, and there is also the problem of cumulative IMU error.
 It is also conceivable to increase the area of overlap between the measurement regions of the radar exploration data of adjacent survey lines, use the radar exploration data images themselves for the matching process, and stitch the radar exploration data together. However, radar exploration data images have lower spatial resolution and fewer feature points than road-surface image data, so the feasibility and accuracy of the matching process are problematic. In addition, increasing the overlap of the measurement regions narrows the spacing between survey lines, with the disadvantage that the overall amount of work increases.
 A method of automatically stitching radar exploration data together by machine learning or the like has a processing-load problem and is difficult to realize.
 In contrast, in this embodiment, the measurement data are combined areally using the texture features of the ground on the predetermined surfaces, and the absolute position is identified using virtual GCPs, so the above problems can be solved.
 [Supplement]
 The present invention is not limited to the above embodiment, and various modifications and applications are possible, for example as described below.
 (1) The data analysis device 9 can be realized by a computer and a program, and this program can be recorded on a (non-transitory) recording medium or provided via a communication network such as the Internet.
 (2) The CPU 901 (microprocessor) may be not only a single CPU but also a plurality of CPUs.
 [Reference Signs List]
 1 Exploration system
 2 Measurement cart (an example of a moving object)
 3 Radar exploration device
 4 Tripod
 6 Ground control point measurement device
 9 Data analysis device
 90 Acquisition unit
 91 Exploration image data matching unit
 93 Data synthesis unit
 95 Absolute position image data matching unit
 97 Absolute position calibration unit
 99 Output unit
 A1 Image data (an example of the first image data)
 B1 Image data (an example of the second image data)
 A2 Radar exploration data (first radar exploration data)
 B2 Radar exploration data (second radar exploration data)
 AB2 Composite exploration data

Claims (8)

  1. A data analysis device for analyzing data on the position of a buried object or a cavity and its depth from a predetermined surface, the data analysis device comprising:
     an exploration image data matching unit that performs image data matching between first image data, obtained by photographing a first predetermined surface and associated in position with first radar exploration data obtained by scanning the first predetermined surface, and second image data, obtained by photographing a second predetermined surface adjacent to the first predetermined surface and associated in position with second radar exploration data obtained by scanning the second predetermined surface;
     a data synthesis unit that generates composite exploration data by combining the first radar exploration data and the second radar exploration data in planes perpendicular to the first predetermined surface and the second predetermined surface, based on the image data matching by the exploration image data matching unit;
     an absolute position image data matching unit that performs image data matching between each of third image data, obtained by photographing a specific surface on at least one of the first predetermined surface and the second predetermined surface and associated with an absolute position obtained by measuring a ground control point of the specific surface, and fourth image data, obtained by photographing another specific surface, different from the specific surface, on at least one of the first predetermined surface and the second predetermined surface and associated with an absolute position obtained by measuring a ground control point of the other specific surface, and specific image data that is at least one of the first image data and the second image data and includes the specific surface and the other specific surface; and
     an absolute position calibration unit that calibrates the absolute position of the composite exploration data based on the image data matching by the absolute position image data matching unit.
  2. The data analysis device according to claim 1, wherein
     the first image data is associated with first identification information for identifying the first predetermined surface, and the second image data is associated with second identification information for identifying the second predetermined surface, and
     the exploration image data matching unit performs the image data matching between the first image data and the second image data based on the first identification information and the second identification information.
  3. The data analysis device according to claim 1, wherein
     the first image data is associated with first time information indicating when the first predetermined surface was scanned, and the second image data is associated with second time information indicating when the second predetermined surface was scanned, and
     the exploration image data matching unit narrows down the range to be matched between the first image data and the second image data based on the first time information and the second time information.
  4. A data analysis device for analyzing data on the position of a buried object or a cavity and its depth from a predetermined surface, the data analysis device comprising:
     an exploration image data matching unit that performs image data matching between first image data, obtained by photographing a first predetermined surface and associated in position with first radar exploration data obtained by scanning the first predetermined surface, and second image data, obtained by photographing a second predetermined surface adjacent to the first predetermined surface and associated in position with second radar exploration data obtained by scanning the second predetermined surface;
     a data synthesis unit that generates composite exploration data by combining the first radar exploration data and the second radar exploration data in planes perpendicular to the first predetermined surface and the second predetermined surface, based on the image data matching by the exploration image data matching unit;
     an absolute position image data matching unit that performs image data matching between third image data, obtained by photographing a specific surface on at least one of the first predetermined surface and the second predetermined surface and associated with an absolute position and orientation obtained by measuring a ground control point of the specific surface, and specific image data that is at least one of the first image data and the second image data and includes the specific surface; and
     an absolute position calibration unit that calibrates the absolute position of the composite exploration data based on the image data matching by the absolute position image data matching unit.
  5. An exploration system comprising:
     the data analysis device according to any one of claims 1 to 3;
     a radar exploration device that obtains the first radar exploration data and the first image data by scanning the first predetermined surface, and obtains the second radar exploration data and the second image data by scanning the second predetermined surface; and
     a ground control point measurement device that obtains the absolute position of the specific surface by measuring the ground control point of the specific surface and obtains the third image data by photographing the specific surface, and that obtains the absolute position of the other specific surface by measuring the ground control point of the other specific surface and obtains the fourth image data by photographing the other specific surface.
  6. The exploration system according to claim 5, wherein the width of the measurement area for obtaining the first and second radar exploration data is narrower than the width of the imaging area for obtaining the first and second image data, and the width of the first and second predetermined surfaces is narrower than the width of the measurement area.
  7. A data analysis method executed by a data analysis device that analyzes data on the position of a buried object or a cavity and its depth from a predetermined surface, the data analysis device executing:
     an exploration image data matching process of performing image data matching between first image data, obtained by photographing a first predetermined surface and associated in position with first radar exploration data obtained by scanning the first predetermined surface, and second image data, obtained by photographing a second predetermined surface adjacent to the first predetermined surface and associated in position with second radar exploration data obtained by scanning the second predetermined surface;
     a data synthesis process of generating composite exploration data by combining the first radar exploration data and the second radar exploration data in planes perpendicular to the first predetermined surface and the second predetermined surface, based on the image data matching by the exploration image data matching process;
     an absolute position image data matching process of performing image data matching between each of third image data, obtained by photographing a specific surface on at least one of the first predetermined surface and the second predetermined surface and associated with an absolute position obtained by measuring a ground control point of the specific surface, and fourth image data, obtained by photographing another specific surface, different from the specific surface, on at least one of the first predetermined surface and the second predetermined surface and associated with an absolute position obtained by measuring a ground control point of the other specific surface, and specific image data that is at least one of the first image data and the second image data and includes the specific surface and the other specific surface; and
     an absolute position calibration process of calibrating the absolute position of the composite exploration data based on the image data matching by the absolute position image data matching process.
  8. A program that causes a computer to execute the method according to claim 7.