Engel et al., 2019 - Google Patents
DeepLocalization: Landmark-based Self-Localization with Deep Neural Networks
- Document ID
- 10045795247411356140
- Author
- Engel N
- Hoermann S
- Horn M
- Belagiannis V
- Dietmayer K
- Publication year
- 2019
- Publication venue
- 2019 IEEE Intelligent Transportation Systems Conference (ITSC)
Snippet
We address the problem of vehicle self-localization from multi-modal sensor information and a reference map. The map is generated off-line by extracting landmarks from the vehicle's field of view, while the measurements are collected similarly on the fly. Our goal is to …
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S3/00—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
- G01S3/78—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
- G01S3/782—Systems for determining direction or deviation from predetermined direction
- G01S3/785—Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system
- G01S3/786—Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system the desired condition being maintained automatically, i.e. tracking systems
- G01S3/7864—T.V. type tracking systems
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/62—Methods or arrangements for recognition using electronic means
- G06K9/6288—Fusion techniques, i.e. combining data from various sources, e.g. sensor fusion
- G06K9/629—Fusion techniques, i.e. combining data from various sources, e.g. sensor fusion of extracted features
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/00624—Recognising scenes, i.e. recognition of a whole field of perception; recognising scene-specific objects
- G06K9/0063—Recognising patterns in remote scenes, e.g. aerial images, vegetation versus urban areas
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in preceding groups
- G01C21/10—Navigation; Navigational instruments not provided for in preceding groups by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in preceding groups by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in preceding groups by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/02—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/66—Radar-tracking systems; Analogous systems where the wavelength or the kind of wave is irrelevant
- G01S13/72—Radar-tracking systems; Analogous systems where the wavelength or the kind of wave is irrelevant for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
- G01S13/723—Radar-tracking systems; Analogous systems where the wavelength or the kind of wave is irrelevant for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar by using numerical data
- G01S13/726—Multiple target tracking
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in preceding groups
- G01C21/20—Instruments for performing navigational calculations
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in preceding groups
- G01C21/26—Navigation; Navigational instruments not provided for in preceding groups specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in preceding groups specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
Similar Documents
Publication | Title
---|---
Engel et al. | DeepLocalization: Landmark-based self-localization with deep neural networks
Huang | Review on LiDAR-based SLAM techniques
CN112639502B (en) | Robot pose estimation
Mutz et al. | Large-scale mapping in complex field scenarios using an autonomous car
Badino et al. | Visual topometric localization
CN110726406A (en) | Improved nonlinear optimization monocular inertial navigation SLAM method
Mueller et al. | GIS-based topological robot localization through LIDAR crossroad detection
Hata et al. | Monte Carlo localization on Gaussian process occupancy maps for urban environments
Guizilini et al. | Dynamic hilbert maps: Real-time occupancy predictions in changing environments
Chiu et al. | Precise vision-aided aerial navigation
CN114088081A (en) | Map construction method for accurate positioning based on multi-segment joint optimization
Zhou et al. | Terrain traversability mapping based on lidar and camera fusion
Stübler et al. | A continuously learning feature-based map using a bernoulli filtering approach
Martinez et al. | Pit30m: A benchmark for global localization in the age of self-driving cars
Mutz et al. | What is the best grid-map for self-driving cars localization? An evaluation under diverse types of illumination, traffic, and environment
Linxi et al. | Human Following for Outdoor Mobile Robots Based on Point‐Cloud's Appearance Model
Chen et al. | SCL-SLAM: A scan context-enabled LiDAR SLAM using factor graph-based optimization
Sujiwo et al. | Localization based on multiple visual-metric maps
Guo et al. | Occupancy grid based urban localization using weighted point cloud
Gao et al. | Vido: A robust and consistent monocular visual-inertial-depth odometry
Gao et al. | Deep masked graph matching for correspondence identification in collaborative perception
Hu et al. | An Enhanced-LiDAR/UWB/INS Integrated Positioning Methodology for Unmanned Ground Vehicle in Sparse Environments
Soleimani et al. | A disaster invariant feature for localization
Zhuang et al. | A biologically-inspired simultaneous localization and mapping system based on LiDAR sensor
Liu et al. | Laser 3D tightly coupled mapping method based on visual information