Le et al., 2013 - Google Patents
Vehicle localization using omnidirectional camera with GPS supporting in wide urban area (Le et al., 2013)
- Document ID
- 5519373106059012735
- Authors
- Le M
- Hoang V
- Vavilin A
- Jo K
- Publication year
- 2013
- Publication venue
- Computer Vision - ACCV 2012 Workshops: ACCV 2012 International Workshops, Daejeon, Korea, November 5-6, 2012, Revised Selected Papers, Part I
Snippet
This paper proposes a method for long-range vehicle localization using fusion of omnidirectional camera and Global Positioning System (GPS) in wide urban environments. The main contributions are twofold: first, the positions estimated by visual sensor overcome …
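The snippet describes combining camera-derived position estimates, which drift over long distances, with GPS fixes, which are noisy but absolute. This record does not say which fusion scheme the paper uses, so the sketch below only illustrates the general idea with a variance-weighted (Kalman-style) update of a 2-D position; the function name, variances, and coordinates are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch, assuming a variance-weighted fusion of two 2-D position
# estimates. The paper's actual fusion method is not given in this record;
# fuse_positions() and the numbers below are illustrative only.
import numpy as np

def fuse_positions(vo_pos, vo_var, gps_pos, gps_var):
    """Fuse a visual-odometry position with a GPS fix (both in metres, local frame).

    vo_var and gps_var are scalar variances (m^2), assumed isotropic.
    Returns the fused position and its variance.
    """
    vo_pos = np.asarray(vo_pos, dtype=float)
    gps_pos = np.asarray(gps_pos, dtype=float)
    # Kalman gain for two independent Gaussian estimates of the same quantity.
    gain = vo_var / (vo_var + gps_var)
    fused = vo_pos + gain * (gps_pos - vo_pos)
    fused_var = (1.0 - gain) * vo_var  # equals vo_var * gps_var / (vo_var + gps_var)
    return fused, fused_var

if __name__ == "__main__":
    vo_estimate = [120.4, 35.1]  # dead-reckoned camera position that has drifted (m)
    gps_fix = [118.0, 36.0]      # noisy but absolute GPS fix in the same local frame (m)
    pos, var = fuse_positions(vo_estimate, 4.0, gps_fix, 9.0)
    print(f"fused position: {pos}, variance: {var:.2f} m^2")
```

The weighting pulls the drifting visual estimate toward the absolute GPS fix in proportion to their relative uncertainties, which is the basic principle behind camera/GPS fusion for long-range localization in wide urban areas.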
Concepts
- localization (title, abstract, description; 13 occurrences)
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in preceding groups
- G01C21/10—Navigation; Navigational instruments not provided for in preceding groups by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in preceding groups by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in preceding groups by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in preceding groups
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
Similar Documents
Publication | Title |
---|---|
Scaramuzza et al. | Visual odometry [tutorial] |
Xiong et al. | G-VIDO: A vehicle dynamics and intermittent GNSS-aided visual-inertial state estimator for autonomous driving |
Gonzalez et al. | Combined visual odometry and visual compass for off-road mobile robots localization |
Sujiwo et al. | Monocular vision-based localization using ORB-SLAM with LIDAR-aided mapping in real-world robot challenge |
Shen et al. | Localization through fusion of discrete and continuous epipolar geometry with wheel and IMU odometry |
Le et al. | Vehicle localization using omnidirectional camera with GPS supporting in wide urban area |
Samadzadegan et al. | Autonomous navigation of Unmanned Aerial Vehicles based on multi-sensor data fusion |
Ramezani et al. | Omnidirectional visual-inertial odometry using multi-state constraint Kalman filter |
Hoang et al. | 3D motion estimation based on pitch and azimuth from respective camera and laser rangefinder sensing |
Pang et al. | Low-cost and high-accuracy LIDAR SLAM for large outdoor scenarios |
Khan et al. | Ego-motion estimation concepts, algorithms and challenges: an overview |
Hoang et al. | A simplified solution to motion estimation using an omnidirectional camera and a 2-D LRF sensor |
Hoang et al. | Combining edge and one-point RANSAC algorithm to estimate visual odometry |
Krejsa et al. | Fusion of local and global sensory information in mobile robot outdoor localization task |
Hoang et al. | Planar motion estimation using omnidirectional camera and laser rangefinder |
Van Hamme et al. | Robust visual odometry using uncertainty models |
Gang et al. | Robust tightly coupled pose estimation based on monocular vision, inertia, and wheel speed |
Aggarwal | Machine vision based self-position estimation of mobile robots |
Burusa | Visual-inertial odometry for autonomous ground vehicles |
Zeng et al. | LTI-SAM: LiDAR-template matching-inertial odometry via smoothing and mapping |
Liu et al. | Stereo-image matching using a speeded up robust feature algorithm in an integrated vision navigation system |
Huntsberger et al. | Sensory fusion for planetary surface robotic navigation, rendezvous, and manipulation operations |
Li et al. | Image-based self-position and orientation method for moving platform |
Kawasaki et al. | Line-based SLAM using non-overlapping cameras in an urban environment |
Liu et al. | 6-DOF motion estimation using optical flow based on dual cameras |